

The Harte Hanks Blog

IoT and Micro-Moments: Optimizing Big and Small Data to drive Omnichannel Marketing

In our last article we discussed how the advent of IoT is bringing marketers an overwhelming amount of data – behemoth data – that can be synthesized into usable knowledge to drive more effective customer journeys. Now that companies have access to all of this data, we'd like to talk about how it can be optimized and utilized to have the largest impact on your organization.

Beware the overzealous who want to board the big data train too quickly, however good their intentions. The "bad data in – bad data out" rule (incorrect insights or conclusions) holds just as true, if not more so, in the world of big data analytics as in traditional statistical analytics. Big data is compiled from an ever-growing number of sources, many of them unstructured. And simple rules of probability apply here: the larger the pool of data, the higher the likelihood that analysts will miss "dirty" data that ultimately leads to false positives or false negatives.

Unlike traditional first party data that historically has lived in relational databases, big data often consists of a tremendous amount of unstructured data. Correctly integrating and/or blending this data with more structured first party data is critical so that analytic outcomes don't end up way off in left field. This problem is only exacerbated by the velocity at which data is created, which can largely be attributed to the growing mobile trends discussed earlier, where data is transmitted on an almost continuous basis. Also, keep an eye on the increasing trend of automobiles coming online – yet another massive pool of data-generating "devices". To help ensure that the "signal" can be correctly extracted from the "noise", it is critical that the appropriate amount of rigor is put behind understanding the quality of the data source, how that data is collected, and how it is integrated and blended with other sources of data.

Just as there is value in big data synthesized and used effectively, there is also extreme value in small data – data that's about people and emotion (in addition to small datasets gathered from a singular historical event). Small data can be ingested into big data sets, merged with behavioral or trending information derived from machine learning algorithms, and provide clearer insights than we've ever had before.

Here’s an example of both: The use of smart labels on medicine bottles is small data, which can be used to determine where the medicine is located, its remaining shelf life, whether the seal of the bottle has been broken, and the current temperature conditions in an effort to prevent spoilage. Big data can be used to look at this information over time for root cause analysis of why drugs are expiring or spoiling. Is it due to a certain shipping company or a certain retailer? Are there recurring patterns that point to problems in the supply chain and help determine how to minimize these events? 1
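The big data side of that example can be sketched in a few lines: aggregate individual smart-label readings across a supply-chain dimension to see where spoilage concentrates. The records, companies and field names below are invented for illustration.

```python
from collections import defaultdict

# Hypothetical smart-label readings: one record per medicine bottle,
# flattened from individual sensor events.
readings = [
    {"bottle": "A1", "shipper": "FastFreight", "retailer": "PharmaMart", "spoiled": True},
    {"bottle": "A2", "shipper": "FastFreight", "retailer": "HealthHub",  "spoiled": True},
    {"bottle": "A3", "shipper": "CoolChain",   "retailer": "PharmaMart", "spoiled": False},
    {"bottle": "A4", "shipper": "CoolChain",   "retailer": "HealthHub",  "spoiled": False},
    {"bottle": "A5", "shipper": "FastFreight", "retailer": "PharmaMart", "spoiled": True},
]

def spoilage_rate_by(dimension, records):
    """Aggregate the spoilage rate across one supply-chain dimension."""
    totals = defaultdict(lambda: [0, 0])  # value -> [spoiled count, total count]
    for r in records:
        key = r[dimension]
        totals[key][0] += r["spoiled"]
        totals[key][1] += 1
    return {k: spoiled / total for k, (spoiled, total) in totals.items()}

print(spoilage_rate_by("shipper", readings))
# In this toy sample every spoiled bottle traveled with FastFreight,
# pointing the root cause analysis at that leg of the supply chain.
```

Swapping `"shipper"` for `"retailer"` asks the same question of a different link in the chain, which is exactly the kind of recurring-pattern hunt the paragraph above describes.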

The risk here is that we become so obsessed with Big Data that we forget about creativity. Remember that Big Data is all about analyzing the past; it says nothing about the future. Small Data can also be defined as the seemingly insignificant observations you identify in consumers’ homes – from how you place your shoes to how you hang your paintings. These small data observations are likened to the emotional DNA we leave behind. Big Data is about finding correlations, but Small Data is about finding causation, the reason why. 2

Optimizing Big and Small data within business processes not only can save companies millions of dollars, but also creates a buyer and customer journey that is seamless, continuous and maintains context regardless of the touchpoint. This omnichannel marketing approach should be the ultimate goal of marketers – creating a conversation with their buyers and customers based on trust and value exchange – which leads to strong relationships in an increasingly connected on- and offline world.

Laura Watson is Strategy Director at Harte Hanks, and Korey Thurber is Chief Analytics & Insights Officer at Harte Hanks. Harte Hanks can help your brand create an omnichannel marketing strategy – contact us for a free assessment.


1 Forbes Tech
2 Small Data: The Tiny Clues That Uncover Huge Trends

IoT and Micro-Moments Marketing: Leveraging Big Data to Improve the Customer Journey

Being connected via wearables without your mobile device is already a reality with untethered tech like Android Wear and the Samsung Gear S2, which both support e-SIMs that tap into your pre-existing cell network at no extra cost. It’s a good bet that every smartwatch brand will have an LTE version by the end of 2016 – which means a vast number of new facts and untold nuggets of information that could surprise even big data’s most ardent followers. Big Data is about to become behemoth data.

Every day, we create 2.5 quintillion bytes of data (that’s 2.5 followed by a staggering 18 zeros!)1 – so much that 90% of the data in the world today has been created in the last two years alone. This data comes from everywhere: sensors used to gather climate information, posts to social media sites, the Curiosity Rover on Mars, your Facebook video from your latest vacation, purchase transaction records, and cell phone GPS signals to name a few. Google alone processes 3.5 billion requests per day and stores 10 exabytes of data (10 billion gigabytes!)2

Whether it’s tracking driving habits for the purpose of offering insurance discounts, using biometric data to confirm an ATM user’s identity, or using sensors to detect that it’s time for garbage pick-up, the era of the IoT – in which “smart” things can seamlessly collect, share and analyze real-time data – is here.

Imagine a world where your watch recognizes that you withdraw cash every Saturday so that you’re ready for the neighborhood lemonade stand and your evening outing – and you haven’t made your usual transaction yet. A helpful alert pops up on your device, and another reminder displays when you’re within a half mile of your bank’s ATM, where a retina scan allows you to withdraw funds. Your Smart Refrigerator identifies that you’re running low on eggs and yogurt, while your wearable identifies an open parking space within 50 feet of your favorite Saturday farm market stop – but cautions you that there’s a marathon starting in two hours, so you’d better get a move on. A “ping” in your email indicates that the killer little black dress you’ve wanted just became affordable with a special discount coupon you received as you drive past the store. While you’re away, the sun comes out, so your Smart Home lowers the window shades, turns the A/C up a few degrees and suggests adding popsicles to the grocery list. Like any fabulous assistant, technology not only aids you but anticipates your needs, helping you make smarter, faster decisions based on “advice” you can trust. This is the best way to use Big Data.

Having the ability to be smarter, faster and always connected without having to carry around a device (or anything at all)…great.

Using Big Data to synthesize all of the fragmented individual data points into an orchestrated, holistic, powerfully intelligent view of the customer to help them during these everyday micro-marketing moments…priceless.

Big Data allows brands to go beyond motivating and engaging customers to drive value exchange: it allows them to foster brand affinity and cultivate customer evangelism in real time, responding to customers’ behaviors even as their activities and likes shift.

Although simple in concept, many brands are struggling to get it right (or get started at all). Leading brands have already gained a powerful competitive advantage by adopting consumer management technology that allows them to understand and engage based on individual consumer preferences and observation of behaviors and buying signals in their Buyer and Customer journey – thus taking a big step toward making Big Data a strategic reality.

Is Big Data – or really behemoth data – the answer all by itself? There is a lot of insight to be garnered from that data, but the key is being able to quickly sift through it all, tuning out the noise to focus on the key patterns and meaningful relationships in that data.

Traditional statistical analytics techniques, which focus on finding relationships between variables to predict an outcome, simply won’t do when the goal is to optimize decisions using massive pools of data that are growing and evolving on a near-continuous basis. This is where machine learning comes into play and brings the needed “giddy-up” to the analytic component. Machine learning evolved from the study of pattern recognition within the field of artificial intelligence. The easy way to think about it: machine learning gives computers the ability to learn and improve without a specific program being written to tell them to “learn and improve”. Machine learning software identifies patterns in the data in order to group similar data together and to make predictions. Whenever new data is introduced, the software “learns” and builds a better understanding of the optimal decision. Think of it as the automation of the predictive analytic process.
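To make the "learns as new data arrives" idea concrete, here is a deliberately tiny sketch – not any particular library's API – of an online learner that groups one-dimensional observations around running centroids and nudges them with every new point, with no retraining pass:

```python
class OnlineCentroids:
    """Toy online learner: assigns each 1-D observation to the nearest of
    k running centroids and updates that centroid incrementally, so the
    model improves as data streams in rather than via a batch rebuild."""

    def __init__(self, seeds):
        self.centroids = list(seeds)
        self.counts = [1] * len(seeds)

    def predict(self, x):
        # Assign x to the index of the nearest current centroid.
        return min(range(len(self.centroids)),
                   key=lambda i: abs(self.centroids[i] - x))

    def learn(self, x):
        # Move the winning centroid toward x (a running mean update).
        i = self.predict(x)
        self.counts[i] += 1
        self.centroids[i] += (x - self.centroids[i]) / self.counts[i]
        return i

# Two seed centroids, then a stream of new observations.
model = OnlineCentroids(seeds=[0.0, 10.0])
for x in [0.5, 9.2, 1.1, 10.4, 0.2]:
    model.learn(x)

print(model.predict(9.0))  # a new point lands in the "high" cluster (index 1)
```

Production systems use far richer algorithms, but the mechanism is the same: each new record refines the model's notion of the groups, with no hand-written rule describing what those groups are.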

There is certainly a lot of overlap between statistical analytics and machine learning, but there is one key difference. The former requires that someone formulate a hypothesis and structure a test to evaluate whether that hypothesis is true – for example, a hypothesis that a particular marketing lever (e.g. a certain offer or message) will generate or “cause” additional account openings or sales. Machine learning does not worry about hypothesis testing; it simply starts with the outcome you are trying to optimize – sales, for example – and uncovers the factors that drive it. As more data is introduced, the algorithm learns and improves its predictions in almost real time.
Interestingly, machine learning has been around for decades. But now, thanks to the massive explosion in data, cheaper cloud-based data stores and huge increases in computing horsepower, machine learning is really starting to hit its stride.

Laura Watson is Strategy Director at Harte Hanks, and Korey Thurber is Chief Analytics & Insights Officer at Harte Hanks. Harte Hanks can help your brand leverage big data – contact us for a free assessment.
Forbes Tech

How to Optimize Spend with Fractional Attribution



When traditional “database marketing” first took off in the early 1990s, marketing performance measurement and attribution were quite simple. We generated sales and direct mail campaign performance reports using a handful of dimensions. Attribution was easily derived through business reply cards (attached to direct mail pieces), phone numbers or tracking codes. We also used indirect attribution rules by making control group comparisons. We were fairly accurate, and the process was easy to execute.

The Current State of Attribution

We all know that the marketing landscape has changed … and it continues to evolve with massive channel proliferation. With so much data and so many options regarding how to best apply a limited marketing budget, how can a CMO receive richer insight to influence tactical decisions that will improve media/channel performance?

Let’s first examine the various states of attribution from the viewpoint of the modern day marketer:

  • Direct Attribution: Still widely used today and still relevant. A specific customer behavior (e.g. a purchase) can be “directly” attributed to a given marketing stimulus via a unique code, landing page/URL, response device, etc. However, other marketing stimuli may have created momentum and been significant contributors to the consumer’s ultimate decision to purchase.
  • Last Touch Attribution: Attributing the desired customer behavior to the last “known” marketing touch. Similar to Direct Attribution, but not always the same. This method is very common when there are no specific tracking codes/tags that tie a desired customer behavior directly to a specific marketing stimulus.
  • Multi-Full Attribution: Channel proliferation has led to individual channel/media silos, each with its own unique attribution rules. The separation of traditional offline data and online data is very common. For example, direct mail data is stored in a traditional customer database, email data is stored with the email service provider, and online data is stored by various DMPs or by vendors/partners contracted to capture it – each often with its own siloed attribution logic taking FULL credit for the same desired behaviors.
  • Rules Based Attribution: Building on the “Multi-Full Attribution” described above, here marketers use what is often called a “common sense approach” to proportionally assign attribution across siloed marketing stimuli. For example, a business recently identified a large overlap between its direct mail and digital channels. For the overlapping purchases identified in both groups, 100% of a given purchase was attributed to direct mail, while simultaneously 100% was also attributed to a combination of digital channels. A rule was then quickly implemented to assign 20% of the attribution to the direct mail channel and proportionally reduce the attribution by 20% across the various forms of digital media. So it is “fractional” by the simplest definition, but no real math or analytics was used to assign the “fraction” to each media/channel.
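That "common sense" rule is simple enough to express in a few lines. The sketch below (channel names and relative shares are invented for illustration) peels 20% of an overlapping purchase off to direct mail and rescales the digital channels' shares to cover the remainder:

```python
def rules_based_attribution(purchase_value, digital_shares):
    """Apply the flat 20% rule described above: credit 20% of an
    overlapping purchase to direct mail and scale the digital channels'
    relative shares down proportionally to split the remaining 80%."""
    direct_mail = 0.20 * purchase_value
    remaining = purchase_value - direct_mail
    total = sum(digital_shares.values())
    digital = {channel: remaining * share / total
               for channel, share in digital_shares.items()}
    return {"direct_mail": direct_mail, **digital}

# Illustrative relative weights among the digital channels that each
# previously claimed full credit.
credit = rules_based_attribution(100.0, {"display": 50, "search": 30, "social": 20})
print(credit)
# {'direct_mail': 20.0, 'display': 40.0, 'search': 24.0, 'social': 16.0}
```

The fractions now sum to the purchase value instead of double-counting it – but notice that the 20% itself is a guess, which is exactly the weakness modeled fractional attribution addresses.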

Each of these options contains significant attribution bias toward particular channels or forms of media that, when taken at face value, will result in less-than-optimal decision-making.


What’s Next and What is Fractional Attribution?

Marketers must now leverage math, science and statistics to analyze and derive insight from large pools of data, much of which can now be integrated across channels to inform decisions across touch points during the customer journey. Fractional Attribution is a necessary tool for understanding campaign performance across a multitude of touch points.

Through advanced (and proven) analytic techniques, a weighting calculation is developed and applied to the various marketing touches during the customer’s buying journey. In short, you are attributing a portion of that customer’s purchase to each of the marketing touches that impacted the customer’s decision to buy.
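A minimal sketch of that weighting step, assuming the analytic model has already produced per-channel weights (the weights and journey below are placeholders, not output from any real model):

```python
def fractional_attribution(touches, weights):
    """Credit one customer's purchase across the touches in their journey,
    normalizing the modeled channel weights so the fractions sum to 1."""
    journey_weights = [weights[t] for t in touches]
    total = sum(journey_weights)
    return {t: w / total for t, w in zip(touches, journey_weights)}

# Hypothetical modeled channel weights and one buyer's journey
# (each channel appears at most once in this simplified journey).
weights = {"email": 1.0, "display": 0.5, "direct_mail": 2.0, "search": 1.5}
journey = ["display", "email", "search", "direct_mail"]

credit = fractional_attribution(journey, weights)
print(credit)
# direct_mail earns the largest fraction (0.4) of this purchase.
```

Multiply each fraction by the purchase value and sum across all customers, and each channel receives the portion of revenue the model says it actually influenced.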

Harte Hanks has a team of analysts that work with marketing organizations to create a fractional attribution model through a collaborative development process:

  1. Define the overall objectives and identify the behavior metrics you want to positively impact (e.g. response, sales, conversion, product registration, etc.).
  2. Define and implement the roadmap including identification of key performance indicators (KPIs) and setting the overall attribution approach. Companies have used both “quick start” fractional attribution solutions and more robust solutions that require dedicated data stores and data integration tools.
  3. Collect and compile the data.
  4. Execute the fractional attribution solution and create the scenario planning tool.

The “scenario planning tool” is what enables the user to optimize media/channel performance. Using the tool, the analyst or marketer can quickly run “what-if” analyses to estimate the impact of reallocating marketing spend across channel/media or removing a channel/media from the mix altogether. The end result is a much more informed decision that can result in significantly higher returns from your marketing budget. Performance data and insights from the optimization exercise are then used to calibrate and refine the attribution engine going forward.
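The "what-if" mechanics can be illustrated with a deliberately simple linear sketch. Real scenario planning tools model diminishing returns and channel interactions, but reallocating spend against attribution-calibrated response rates looks roughly like this (all rates and budgets are invented):

```python
def expected_sales(spend, response_per_dollar):
    """Linear what-if estimate: projected sales for a given spend mix,
    using per-channel response rates calibrated from the attribution
    model. The rates below are purely illustrative."""
    return sum(spend[channel] * response_per_dollar[channel] for channel in spend)

rates = {"direct_mail": 1.75, "email": 3.5, "display": 1.25}

baseline = {"direct_mail": 50_000, "email": 20_000, "display": 30_000}
# Same $100k budget, shifted toward the channel the model says works hardest.
scenario = {"direct_mail": 40_000, "email": 35_000, "display": 25_000}

print(expected_sales(baseline, rates))  # 195000.0
print(expected_sales(scenario, rates))  # 223750.0 -- higher projected return
```

An analyst runs dozens of these scenarios in minutes, then feeds the realized results back in to recalibrate the rates – the feedback loop described above.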

Fractional Attribution rooted in proven math and statistical techniques is a critical tool to accurately improve and optimize the performance of an incredibly fragmented and complex system of channels and media, both online and offline.


It’s not perfect – no marketing science or advanced marketing analytic solution is. But a robust modeled attribution solution is proven marketing science, and those that leverage it appropriately will generate higher return from their marketing spend and outperform their competitors.

Has your company used fractional attribution to better analyze your marketing spend? Tweet us at @HarteHanks and share your experience with us.

How Pharmaceutical CRMs Can Lead to Healthier Relationships

Boosting physician and patient engagement

Customer Relationship Management (CRM) software offers a great deal of potential for the pharmaceutical industry. However, this is a complex sector, riddled with regulations surrounding sensitive data. It is not easy to find a solution that fits business needs while complying with relevant laws. This is especially true at an international level when different rules need to be observed for different countries.

Purchasing a standard CRM solution and trying to adapt it to various business and regulatory requirements is time consuming and difficult. Inevitably it involves compromise and hidden expense.

Instead, many pharmaceutical companies could benefit from international CRM programs that are purpose-built from the ground up by a marketing services provider.

Bespoke CRM for pharmaceuticals

A truly customized approach uses business goals as a starting point and builds a CRM framework around them. This ensures variations across different countries can be accounted for and embraced at an early stage, rather than being bolted on later. The result is a highly specified solution intrinsically optimized to meet business needs. It can have built-in scalability and the flexibility to handle international differences in data laws or standard practice, such as call centre versus nurse-led activity.

Ultimately, custom-built CRM offers better value and efficiency. Adapting existing systems is expensive, license fees can be high and product release cycles can delay the implementation of certain functionalities.

Using an MSP to build, manage and implement the solution brings multiple advantages. Since all aspects – from database management to phone calls, emails and SMS to direct mail – are handled by one organization, the program is more cohesive and affordable. What’s more, sensitive data is all held securely in one place.

Physician and patient communications

The best pharmaceutical CRM programs empower physicians and patients to make better, more informed choices – whether they’re prescribing treatment or following it.

Meeting physicians in person is becoming increasingly difficult for pharmaceutical companies. Physicians are often under pressure to see a certain number of patients per day, leaving limited time for meeting with third parties. Some countries also have complex regulations surrounding personal interaction between pharmaceutical companies and medical professionals. In many cases, direct marketing can play an effective role alongside or in place of face-to-face meetings. It enables physicians to keep abreast of the latest developments in treatments and processes such as pharmaceutical-led patient support.

Patient-focused activity varies depending on the nature of the patient’s condition, where they are in the treatment cycle, the level of data available and nuances of their country of residence. Naturally, when more is known about a patient, activity can be better tailored to their current needs and communications become more meaningful.

A central aim of pharmaceutical CRM should be fostering good relationships between patients and physicians. This means acknowledging the authority of the physician in prescribing drugs, while enabling patients to get more out of their appointments and the overall treatment. Ideally communications should operate progressively, supporting patients as they move from the initial awareness that they may have a certain condition, to actively acknowledging it, then learning to live with it. The latter stage is vital to boost adherence to treatment regimen and enhance overall patient outcomes.

Overcoming challenges

There are many challenges facing the marketing of pharmaceuticals today. However, deeper engagement rooted in custom-built CRM can help navigate many of them.

Direct alignment of patient and physician communications is complex from a data perspective, but with care and attention it can usually be achieved. Bespoke CRM programs can incorporate specific opt-in language to overcome many of the barriers surrounding sensitive data. This ensures that patients who are happy to share their data can access the wider support that is on offer should they need it.

Achieving buy-in from physicians and patients is not easy – nor should it be. Pharmaceutical organizations need to earn trust and loyalty over time. Striving for better, deeper engagement is a critical factor. An effective way to realize this in the short- to medium-term is through the empowerment of patients and physicians, arming them with knowledge and information so they can make informed choices. In the longer term, improved patient outcomes will speak for themselves.


Harte Hanks handles CRM programs for leading global pharmaceutical companies. Patient data is handled sensitively and an integrated approach ensures improved patient support and outcomes. Natalia Gallur has more than ten years’ experience in the sector.


Smarter Demand Gen Awakens

Convergence of Tech and People Will Amplify Demand Generation in 2016

The B2B demand-marketing ecosystem continues to evolve at a rapid pace. It’s driven by emerging technologies, tactics and buyer behaviors, alongside other well-established factors that continue to shape the discipline.

Industry influencers and analysts such as SiriusDecisions and Forrester identified a raft of demand generation trends and requirements in 2015. These range from better use of analytics as a foundation for demand planning to buyer journey alignment and operationalizing personas.

The notion of operationalizing personas involves integrating persona intelligence into demand generation efforts. At a fundamental level, it involves dynamic delivery of persona-based content, messaging and offers across email, landing pages and websites. It was first mooted by SiriusDecisions in 2014, but began to take hold last year. During 2016 it will occupy a more central role as we enter the next stage of the journey: smarter demand generation.

Why do we need Smarter Demand Generation?

Many B2B organizations find their demand generation efforts are characterized by small pipelines, missed targets and failure to respond to the needs of today’s buyers. It’s not surprising when you consider the seismic shift in buyer behavior over the past few years.

B2B sales and marketing is becoming increasingly complex and far less linear in its nature. There are multiple influencers, decision makers and stakeholders. There are multiple online and offline marketing channels. And there are multiple interactions and conversations taking place.

In this fractured, multifaceted landscape we need to find a path to more effective, joined-up demand generation. We need an approach that embraces the complex realities of the B2B sector today and handles them with ease. Smarter demand generation is the answer.

What does it mean?

A central feature of smarter demand generation is the convergence of people and technology. This is true throughout the process. Human insight and expertise facilitates the creation and operationalization of personas. It also shapes the development and substance of programs that are augmented and delivered via sophisticated technologies. Finally, individuals at the receiving end of smarter demand generation are served with optimized, highly personalized communications. Content is relevant to their current and future professional needs and it is delivered at an opportune time via the most appropriate platform. The upshot is finely tuned buyer engagement and a more robust pipeline.

This might sound a world away from traditional demand generation. And it’s true that it requires a deeply analytical and intelligent approach expertly integrated with technical capabilities. But every journey begins with a single step. Marketers who set their sights on smarter demand generation can quickly realize benefits at a micro level that can later be replicated at a larger scale.

Exploring smarter demand generation with one segment of your target audience can be a good place to start. Integrating data, technology, people and tactics for the first time isn’t easy – but it is more manageable and achievable at a smaller scale. Ring-fence a project that leverages insight to improve targeting, messaging and optimization. Then closely monitor the results to track the impact on the sales pipeline. Spotlighting the effectiveness of smarter demand generation in this way, and sharing it at a Board level, can create an appetite for more. It might help secure investment in the technologies and skills required for a wider rollout.

The B2B sector has strived for precision marketing for decades. With the awakening of smarter demand generation, it is finally within reach.


Alex Gill explores this theme in a B2B Marketing webinar on 27 January: How to align your marketing for smarter demand generation and stronger ROI. Book your seat here.

Harte Hanks Announces Data Refinery to Harness Customer Data and Drive Marketing Results

Marketers are increasingly looking for innovative ways to get to know their customers better and to get the most out of the campaigns they create every day. The best way to learn more about your customers is by leveraging data. This isn’t as simple as it sounds. With a plethora of channels at your customers’ disposal, both online and offline, and the growing number of devices that people use, it is difficult to harness all of that data – especially when you’re mining it from multiple sources. Utilizing big data also brings its own complexities: hiring staff to manage the data, ensuring best-in-class quality and governance procedures, and working within constrained budgets across siloed departments. This is no easy feat.

How do we overcome these challenges together? The answer lies in gathering and storing the most current data on your customers through a data refinery. A data refinery is a scalable platform that allows on-demand access to compiled customer views that can be shared by all departments within your organization. The compiled views should be nimble, customizable and rich with proprietary and third-party data sources so they effectively serve the ever-changing marketing demands placed on the various teams that need access – and, as a result, empower marketers to know more and communicate better with their customers.

So how does it work?
At the heart of a good data refinery platform is the aggregation of large amounts of various data types from multiple sources and channels, both traditional and digital. A data refinery platform starts with an ideal customer profile that defines data attributes needed to deliver results. This ideal customer profile serves as your “map,” guiding data profiling and sourcing to bring together and enhance owned data with third-party data. The data refinery then cleanses, validates and standardizes the customer profile for output to any downstream marketing or sales application.

Today we are excited to announce that Harte Hanks is launching its very own Data Refinery℠ solution. Our solution provides access to pre-vetted data sources by vertical and marketing objective – think of it as an app store for data – reducing the time to value. Selecting data based on reliability and performance metrics optimizes data usage and spending, ensuring campaigns don’t become stagnant. To learn more about Harte Hanks’ Data Refinery, click here.

A brand’s success will continue to be dependent on technology, innovation and the ability to connect with the customer in a highly relevant way. A data refinery platform is needed to bring data together and make it foundational to all your marketing and sales efforts.

Next week we’ll review what data sources are available and how best to manage them using the latest open source technologies. In the meantime, start thinking about what you could do if all your data could be harnessed, treated as a single source of the truth and accessed by anyone on demand. The possibilities are almost endless, aren’t they?

Delivering data from all different sources and augmenting it to form purpose-built customer profiles allows you to understand your customers. This insight is powerful: it allows you to acquire new customers, reduce churn within your existing customer base, increase repeat purchases and increase customer satisfaction.

A Data Refinery Platform Helps You:

  1. Better understand existing customer base
  2. Create models and segmentation to find better prospects at scale
  3. Understand existing customer behavior, avoid attrition and encourage growth

Back to the Future: Predictive Analytics


What if you knew what your customers wanted, when they wanted it? With predictive marketing analytics, gazing into the future is entirely possible. While predictive analytics is not a new concept – marketers have often tried to use past performance to predict future behavior – the dawn of the information age has amplified its effectiveness and usability. Predictive analytics allow marketers to focus efforts and maximize their budgets by identifying targets who are ready to buy and by eliminating those who aren’t.

Big Data

 To accurately predict consumer behavior, you need more than focus groups and surveys. The era of Big Data has armed marketers with a deluge of information on consumers – including engagement with marketing automation platforms and “intent” data from across the web. The technology to crunch this data and make sense of it is rapidly evolving, providing marketers with a roadmap to reach the right audience at the right time.

Data in Action

The Big Data era has produced an incredible amount of information about habits, desires and tendencies of consumers. Marketers who follow these digital footprints can optimize their marketing efforts to target individual audience segments and personalize messages to speak directly to potential customers. Predictive analytics can help create incredibly specific buyer personas – marketers no longer need to rely on broad demographic data and guesstimates of what a particular buyer prefers. Enhanced buyer personas lay the groundwork for highly personalized messaging for nurture campaigns, which multiple studies show leads to significant increases in conversion and revenue. Predictive analytics also provide the benefit of targeted spending. Knowing which audiences to target and which platforms to target them through significantly increases the impact of marketing budgets.

B2B Adoption

B2B marketers have lagged behind their B2C counterparts in the adoption of marketing technology – predictive analytics included. And while it’s true that personalized data from individual consumers offers a clearer view into purchasing habits and tendencies, plenty of data exists for B2B customers that can be utilized to implement more intelligent marketing tactics. Purchase history, for instance, is a great predictor of current and future behavior. If a customer has recently purchased a software system that won’t need an upgrade for three years, targeting that customer with marketing messages is not only inefficient, but could negatively affect that customer’s perception of your brand. Existing software licenses, log-in frequency, help desk calls and firmographics can also help B2B companies predict the need and desire for their products. Normally this kind of data will predict the type of customers that buy your products. Add social data sources to the mix, and you can predict which customers are ready to buy.
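A deliberately simple, rule-based sketch of that kind of B2B propensity scoring – the weights, thresholds and account data below are invented, and a real solution would learn them from historical data rather than hand-coding them:

```python
def propensity_score(account):
    """Toy readiness score (0-1) combining the signals mentioned above:
    purchase recency, product usage and support activity. Weights and
    thresholds are illustrative, not a production model."""
    score = 0.0
    if account["months_since_purchase"] >= 30:  # nearing the upgrade cycle
        score += 0.4
    if account["logins_per_week"] >= 5:         # actively using the product
        score += 0.3
    if account["help_desk_calls"] >= 3:         # pushing the tool's limits
        score += 0.3
    return score

accounts = [
    {"name": "Acme",    "months_since_purchase": 34, "logins_per_week": 8, "help_desk_calls": 4},
    {"name": "Initech", "months_since_purchase": 6,  "logins_per_week": 1, "help_desk_calls": 0},
]

ready = [a["name"] for a in accounts if propensity_score(a) >= 0.7]
print(ready)  # ['Acme'] -- Initech just bought, so targeting it now is wasted spend
```

Even this crude filter captures the paragraph's point: the recently purchased account falls out of the target list automatically, protecting both budget and brand perception.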


Depending on the level of sophistication and budget resources, B2B marketers can deploy analyst-led solutions or automated “black box” solutions to perform predictive analytics. For larger, more comprehensive data operations, an analyst-led approach is preferred. Computers are wonderful, but a human touch – specifically when there are oddities in the data – can more accurately utilize the information output to design programs and messaging that take into account both the customer and the nuances of the company. However, there are various automated solutions that are more than sufficient for less sophisticated marketing automation programs. Both approaches have their own merit, but one thing is clear: predictive analytics allow businesses to focus on what’s important and discard what’s not, leading to amplified revenue growth – and happy customers.


Global Patient Support Needs to ‘Think Local’

Patient support programs play a vital role in facilitating better disease management and treatment optimization. Traditionally pharmaceutical companies launched such initiatives on a local level. However, from a regional perspective, this sometimes resulted in patchy and fragmented support. Today, many pharmaceutical companies are driving centralized programs that benefit from a more sophisticated and strategic approach.

This approach brings many advantages around compliance, visibility of success and cost-effectiveness of implementation and maintenance. Yet centralized programs can be inherently complex and unwieldy. This is compounded by the fact that they often need to be coordinated at a global or area level to maximize infrastructure and management efficiencies.

Walking the line between global/regional efficiency and local effectiveness is no mean feat. Patient support is not a ‘one size fits all’ discipline; activity needs to be expertly tailored and carefully orchestrated.

At Harte Hanks, we believe five critical factors underpin patient support that is successful both at a global and a local level.

  1. Gather and leverage local knowledge

Understanding the nuances and intricacies of healthcare provision in different regions is essential. Ideally, you should have people on the ground who have in-depth knowledge of their local system and keep a finger on the pulse of any changes or developments.

Typical patient paths can vary significantly between countries for the same disease. Take the patient touchpoints and interactions for the U.S. healthcare system versus the UK’s NHS or Spain’s Seguridad Social. Prescription behaviours, drug dispensing and the length of time between specialist visits can be entirely different. There can even be differences in the role of healthcare practitioners during treatment, in terms of nurse interaction levels, nurse-led advice, pharmacist involvement and primary or speciality care.

  2. Create space for consultation and collaboration

Regional offices need to have clear channels of communication with the head office, and regular opportunities to report back on the local healthcare environment. They need to know that their observations are taken into account and actively used to shape the delivery of patient support in their territory.

At a strategic level, this collaborative approach enables program goals and objectives to be adapted to the realities of each country and healthcare system. It also needs to work at a tactical level, with regional teams of medical and regulatory professionals reviewing and approving materials before they are issued to healthcare professionals and patients.

Pharmaceutical companies often lack the time and resources required to give adequate attention to each country of a global patient support program. This is especially true when implementation needs to happen in parallel with a product launch or other internal deadlines. Working with a trusted third party can be a mutually beneficial solution for individual countries and the global program as a whole. Such a partner can offer expert guidance as well as coordinate materials distribution and facilitate knowledge sharing.

  3. Ensure processes and training are water-tight

It’s vital that staff delivering the program, especially those with direct patient contact, understand indicators of pharmacovigilance events. Processes need to be in place to ensure that any spontaneous or solicited reports of adverse effects are handled appropriately and escalated in the right timeframes.

A centralized model can ensure that training compliance efforts are optimized and that all pharmacovigilance processes are managed in a cohesive way. A balance needs to be struck to ensure that training and reporting procedures meet certain standards, while respecting any elements or formats that vary between countries.

  4. Coordinate multi-channel communications

Using a CRM suite to facilitate patient and healthcare provider communications boosts efficiency and enables better control of patient support programs. For example, Harte Hanks can act as a multichannel one-stop-shop which is managed centrally but enables local offices to customize activity, such as:

  • Secure data management and hosting, in-line with local privacy rules
  • SMS, email and direct mail assets (drawing on print-on-demand and personalization capabilities)
  • Creation, development and hosting of personalized online portals for patients and healthcare providers, with self-tracking tools to support all digital communications
  • Advanced reporting and analytics to measure success and monitor progress

CRM and digital services should be flexible enough to accommodate multilingual communications and adaptations for the individual needs of each country. For instance, a global program will encounter various regulatory frameworks and the requirements of medical, legal and regulatory teams differ between countries.

  5. Adopt a continual improvement philosophy

If program goals and objectives are tailored to local regions, it follows that KPIs need to be tailored too. For measurement to be meaningful, successes or failures need to be considered in context. And they need to feed into the development of ongoing goals and objectives geared towards a cycle of continual improvement. To facilitate effective management at a macro level, it’s important to ensure global real-time visibility across the entire program, from high-level KPIs to more detailed local perspectives.

The cornerstone of any successful patient support program is recognition that patients are people. They have their own lives, families, work and hobbies, as well as living with a disease or illness. They deserve to be listened to and helped to live their life to the fullest.

Treating patients as people within a program that operates on a global scale is complicated, but with an intelligent, carefully coordinated approach that draws on local knowledge, it is possible to achieve this. Communicating with patients at the right time with the right message via the most appropriate channel is half of the story. Ensuring information and interventions are precisely tailored to their real needs completes the circle, both supporting the treatment and enhancing the overall patient experience.

Harte Hanks handles patient support programs for leading global pharmaceutical companies. Patient data is handled sensitively, and an integrated approach ensures improved patient support and outcomes. Natalia Gallur has more than ten years’ experience in the sector. To learn more about the services we offer, take a look at our case studies.

Taking Your Customers from Anonymous to Known: Introducing Total Customer Discovery

A Deeper Dive into the Solution


Today, we are excited to announce our newest solution to enable smarter customer interactions: Total Customer Discovery. You can learn more about the details through our press release, video and digital guide. In this blog post, I’m going to break down some of the technology components that went into creating it.

In a nutshell, Total Customer Discovery provides a holistic, 360-degree profile of customers, merging data from online and offline channels and across devices. This single customer view encompasses demographics (contact data, social profiles), psychographics (interests), purchase and promotion history, and influencing power (networks, connections). With this richer customer view, marketers can deliver enhanced and personalized customer experiences, leading to increased acquisition, retention and, ultimately, ROI.

So without further ado, here are the different components of the Total Customer Discovery Solution and what they help address:

Solution Component: Cross Screen Identification

With cross-screen identification, each customer has a persistent, unique ID that carries with them, helping marketers track the devices associated with that customer even when the customer deletes their browsing history (and their cookies). With Total Customer Discovery, we can identify and track customers across various devices (mobile phones, tablets, computers, laptops and so on), learning their behaviors, adding to their customer profiles and offering a seamless brand experience across touch points that takes into consideration their past purchase history and preferences.
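To make the idea of a persistent ID concrete, here is a minimal sketch of deterministic identity resolution: devices that authenticate with the same (hashed) login are assigned one customer ID that survives cookie deletion. The events, field names and hashing choice are assumptions for illustration, not the actual Total Customer Discovery mechanism:

```python
import hashlib
import uuid

# Hypothetical login events: (device ID, login email).
login_events = [
    ("phone-123", "jane@example.com"),
    ("laptop-456", "jane@example.com"),
    ("tablet-789", "mark@example.com"),
]

person_ids = {}     # hashed login -> persistent customer ID
device_graph = {}   # device ID -> persistent customer ID

for device, email in login_events:
    # Normalize and hash so the raw address is never stored as a key.
    key = hashlib.sha256(email.strip().lower().encode()).hexdigest()
    person_ids.setdefault(key, str(uuid.uuid4()))
    device_graph[device] = person_ids[key]

# phone-123 and laptop-456 now resolve to the same persistent ID,
# independent of any cookies on either device.
```

Probabilistic device graphs add behavioral and network signals on top of deterministic matches like this one.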

Solution Component: Cross Journey Mapping

To solve the problem of internal silos and overwhelming amounts of data, the cross journey mapping function captures customers’ digital behavior and stores meaningful attributes, such as clicks, searches, interests and preferences, to produce richer, more multi-dimensional customer profiles. These attributes can then be linked with other data sources within an organization, such as a Customer Relationship Management (CRM) database. Total Customer Discovery identifies customer interactions across multiple devices and channels, so that we can track a customer throughout their entire journey, from smartphone, to tablet, to computer, to in-store.

Solution Component: Data Onboarding

A single view of customers provides a comprehensive view of the purchase journey. Integrating both online and offline data helps round out the single view of the customer for a comprehensive picture of customer behavior, enabling better retargeting and personalization. With data onboarding, online and offline data are merged and customer files are created using email or physical address lists that are matched with a database of advertiser tracking parameters. Particularly for brick-and-mortar stores, integrating online and offline data sources is crucial for delivering relevant content across channels based on customer identification, from digital interactions on a smartphone to offline purchases at a retail store.
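A common onboarding pattern is to match hashed email keys rather than raw addresses, so neither party exposes personal data. The sketch below uses invented records and is only one way such a match could work:

```python
import hashlib

def normalize_and_hash(email):
    """Lowercase, trim, and hash so raw addresses never leave either side."""
    return hashlib.sha256(email.strip().lower().encode()).hexdigest()

# Offline CRM file (e.g. in-store purchasers), keyed by hashed email.
offline_file = {normalize_and_hash(e): cid
                for e, cid in [("Ann@Shop.com ", "CRM-1"),
                               ("bob@shop.com", "CRM-2")]}

# Online cookie pool from a tracking partner, keyed the same way.
online_pool = {normalize_and_hash(e): cookie
               for e, cookie in [("ann@shop.com", "cookie-77"),
                                 ("cara@shop.com", "cookie-88")]}

# Intersect the hashed keys to link CRM records to online identifiers.
matches = {offline_file[h]: online_pool[h]
           for h in offline_file.keys() & online_pool.keys()}
# Ann's in-store record is now linked to her browser cookie.
```

Note how normalization matters: "Ann@Shop.com " and "ann@shop.com" hash identically only after trimming and lowercasing.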

Solution Component: Social Linkage

Personalized, relevant content is the key to driving ROI in today’s world of real-time “micro-moments.” With social linkage, customers’ social interactions and behaviors are tracked across sites to enable deeper customer segmentation. Social linkage takes data from over 150 social sites, including Facebook, LinkedIn, Pinterest, Twitter and Google+, and gives marketers insightful social profile data to inform their social investment decisions and make their digital marketing efforts more effective.

We’d love to tell you more about how Total Customer Discovery takes customers from anonymous to known. For more information, you can visit or email

How Data Refinery Helps Companies Transform Raw Data into Gold

Companies are being bombarded by new sources of data faster than they can consume them. The explosion of emerging customer data sources (social, clickstream, transactions, mobile, sensor, etc.) presents both a huge opportunity and a challenge.

The opportunity is that new data sources can reveal insights for applications that can drive competitive advantage. Businesses want to analyze and integrate more complex types of data to add new insights to what they already know about their customers to improve service and add more value to clients.

The challenge is that managing this growing volume and complexity of data is difficult with traditional database technology. As the volume of data grows, performance goes down. As data complexity increases, more administrators are required to organize data into something meaningful.

Apache Hadoop technology has emerged as a powerful and flexible big data platform for companies to store and process vast quantities of raw data over a long period of time. Companies no longer have to set limits on how much and what kind of data they can ingest into their data repositories. The mantra has become, “Keep it all in case it’s needed.” But to make the hoards of data useful, companies need a mechanism to transform the data into a valuable business asset. A new mantra is emerging: “Keep it all and let a data refinery sort it out.”

What is a data refinery?

A data refinery is a critical component of a big data strategy, especially for customer-facing enterprises that want to build robust and accurate customer profiles to improve customer interactions.

Think of it as an oil refinery, where raw material (oil) comes in and is separated into different streams for downstream production and products such as gasoline, motor oil, kerosene, and more. Similarly, a data refinery ingests raw material (data) into Hadoop in its native format at any scale and can then refine it for other downstream systems or customer-facing applications. Raw data must be refined or explored to understand relationships and whether there is meaning in the data (through tools such as Apache Drill). Next, the data refinery cleans, enriches, and integrates the data with other sources of structured data in downstream database or business intelligence solutions to deliver the insights that create more personalized customer relationships.
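In miniature, the clean-then-enrich flow might look like the toy pipeline below. In production these stages would run on a Hadoop cluster at far greater scale; all rows, field names and the CRM lookup here are invented for illustration:

```python
# Raw clickstream rows straight from ingestion, warts and all.
raw = [
    {"email": " JANE@EXAMPLE.COM ", "page": "/pricing", "ts": "2015-06-01T10:00"},
    {"email": "",                   "page": "/pricing", "ts": "2015-06-01T10:01"},
    {"email": "jane@example.com",   "page": "/docs",    "ts": "2015-06-02T09:30"},
]

def clean(rows):
    """Refinery stage 1: drop records missing a join key; normalize the rest."""
    for r in rows:
        email = r["email"].strip().lower()
        if email:
            yield {**r, "email": email}

def enrich(rows, crm):
    """Refinery stage 2: blend refined events with structured CRM data."""
    for r in rows:
        yield {**r, **crm.get(r["email"], {})}

crm = {"jane@example.com": {"segment": "enterprise"}}
refined = list(enrich(clean(raw), crm))
# Two usable, enriched events survive; the keyless row is discarded.
```

The same separation of stages applies whether the "pipes" are Python generators, MapReduce jobs, or Spark transformations.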

Harte Hanks builds a data refinery to improve data quality

To serve their clients better, Harte Hanks wanted to ingest and integrate more types of customer data into their clients’ contact databases. They wanted to gain new insights by getting access to the increasing volumes of data generated by people interacting with their clients’ brands over multiple channels. These insights could then feed into their clients’ marketing processes to help drive more effective marketing programs.

Harte Hanks knew their traditional database technology could not manage this huge increase in data volume and complexity, so they selected the MapR Distribution including Apache Hadoop as their big data platform. A key component of the technology platform is the data refinery that cleanses and enriches the growing stream of new data sources that are ingested into the customer databases.

More data yields higher accuracy and new customer insights

The MapR data platform enables Harte Hanks to enhance the performance, scalability and flexibility of its solutions so its clients can more easily and quickly integrate, analyze and store massive quantities of data for deeper insights to better serve customers. This new solution enriches and enhances customer databases by integrating all kinds of digital data, survey data, reference points and more, all while maintaining the performance and ease-of-use they’ve come to expect.

Performance accelerates turnaround time to clients

Harte Hanks is able to increase customer satisfaction through faster time to value and more accurate data sets. Data processing that used to take one to three days can now be accomplished in hours, if not minutes. Their clients can put marketing insights into action immediately for faster results.

Better data = better marketing

The Hadoop-based data refinery can transform a deluge of data into invaluable company assets. Harte Hanks can now offer its clients faster and more accurate customer insights and more complete customer profiles so they can create smarter, more relevant and effective customer interactions.

If you want to learn more about Hadoop or how to get started, MapR provides free on-demand training and examples of big data industry solutions.

About the author

Steve Wooledge, Vice President, Product Marketing, MapR

Steve brings over 12 years of experience in product marketing and business development to MapR. As Vice President of Product Marketing, he is in charge of increasing awareness and driving demand, as well as identifying new market opportunities for MapR. Steve was previously Vice President of Marketing for Teradata Unified Data Architecture, and has also held roles at Aster Data, Interwoven, Business Objects, Dow Chemical and Occidental Petroleum.

Steve holds an MBA from the Kellogg School of Management at Northwestern University, and a BS in Chemical Engineering from the University of Akron.

Integrated Marketing Through Connected Consumers



Today’s customers are engaging with your brand through an ever-expanding number of devices and channels, giving you unprecedented customer insight.

At least, potentially.

The problem is that data silos in display, email, social, websites, mobile and physical touch points can be tricky to bring together, leaving customers with inconsistent, disconnected experiences.

The good news is that there are plenty of data integration techniques to get rid of silos and create a single view of the customer by connecting all online and offline interactions – ultimately letting you communicate on a one-to-one, relevant basis with your customers and prospects.

A Complete Framework

The greatest benefit comes from an integrated framework that leverages a mix of the following components, customized to your key objectives. Industry-leading providers such as BlueConic, BlueCava, FullContact, and LiveRamp offer these technologies with great success.

1. Cross Site Data Capture: Enable Personalization with Progressive Profiling

Simply put, for every customer visit, their behavior is captured and turned into meaningful attributes. With every click you learn a little more about the needs, interests and behavior of your visitor. It gives you the ability to deliver dynamic, personalized content without changing the site, and it leads to higher conversion rates and a better customer experience.
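Progressive profiling can be sketched as little more than counters that sharpen with every visit. The visitor IDs, pages and topic mapping below are invented for illustration:

```python
from collections import Counter, defaultdict

# One profile per visitor; each counter tracks accumulated topic interest.
profiles = defaultdict(Counter)

# Hypothetical mapping from site pages to interest topics.
page_topics = {"/running-shoes": "running",
               "/gps-watches": "running",
               "/yoga-mats": "yoga"}

def record_visit(visitor_id, page):
    """Turn one click into a meaningful profile attribute."""
    topic = page_topics.get(page)
    if topic:
        profiles[visitor_id][topic] += 1

for page in ["/running-shoes", "/gps-watches", "/yoga-mats"]:
    record_visit("v-1", page)

# The dominant interest can now drive dynamic content selection.
top_interest = profiles["v-1"].most_common(1)[0][0]
```

After three clicks the profile already ranks "running" above "yoga", which is exactly the signal a personalization engine would use to choose content without changing the site itself.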

2. Device-to-Individual Identification: Recognize a Customer Across Devices 

More than 70% of today’s consumers use three or more internet-enabled devices. The challenge with multiple-screen usage is that user identification across screens is tough. But you can funnel data across all screens (mobile, desktops and tablets) into a consolidated view of your audience by tracking, analyzing and organizing incoming device data and then connecting screens, consumers and households.

The key here is that this technology enables websites to keep all of the customer history, even when they switch browsers or devices or delete browser history.

3. Social Network Data and Presence: Identify Unique Individuals across Social Platforms

Imagine how much you would know about your individual customers if you could capture data across all of their social accounts. Well, it is possible to consolidate data from more than 150 social sites such as Facebook, Twitter, LinkedIn, Google+, Pinterest, etc. to match and create a complete view of a given customer—in real-time. You can enrich bits of data, like an email address, Twitter username, Facebook ID or phone number, into full-blown individual social profiles.

4. Offline-to-Online Match: Lines Between Traditional & Digital Channels Blur!

Now that you have all of this powerful, integrated data, you can combine it with your CRM database to match individuals to both offline and online behavior. The acquired social intelligence in your CRM enables you to design targeting, messaging, segmentation, experience and scoring strategies around consumer interests, rather than simply relying on purchase history. You could also recognize an offline customer when they visit you online, with no login required. Through this you open up new opportunities for retargeting and understanding attribution at every touch point.

5. Influencer and Topic of Interest: Identify Brand Advocates and Their Interests

Brand advocates are powerful, but you need to know how to find them and harness their power effectively. By gathering data on not only WHO your brand advocates are but also WHAT they are interested in, you can customize a strategy for each, defined by preferences, likes and interests. This will help you to nurture your brand advocates for unbiased reviews and word-of-mouth promotion.

6. Email Consolidation to Individual: Identify Customers with Multiple Email Addresses

Do you have multiple email addresses between personal and professional use? Maybe you even have multiple emails just for personal use? So do lots of your targets. This technology lets you identify customers across all of their email addresses and figure out which is their primary address, improving campaign response.
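Once several addresses are resolved to one individual, a simple heuristic for picking the primary address is most recent engagement. The records, IDs and dates below are made up; real systems would weigh open rates, recency and channel together:

```python
# Three addresses known to belong to the same (hypothetical) person P-9,
# each with the date of its last email open.
emails = [
    {"person": "P-9", "address": "jo@work.com",    "last_open": "2015-08-30"},
    {"person": "P-9", "address": "jo@home.net",    "last_open": "2015-02-11"},
    {"person": "P-9", "address": "jo.old@isp.com", "last_open": "2013-05-01"},
]

def primary_address(records, person):
    """Pick the most recently engaged address for a person.

    ISO-format date strings compare correctly as plain strings.
    """
    mine = [r for r in records if r["person"] == person]
    return max(mine, key=lambda r: r["last_open"])["address"]

primary_address(emails, "P-9")
```

Sending to the primary address first, with the others held as fallbacks, is what improves campaign response rather than the consolidation by itself.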

How to Do Data Integration Right: Bring a Few Techniques Together

Using any one of these techniques will bring you a step closer to integrated customer data, a connected customer experience, and ultimately more revenue. However, the ultimate goal should be to create an integrated framework that utilizes multiple data integration techniques—the whole is greater than the sum of its parts! If you have any questions or need help creating this integrated framework, get in touch.

Who Wants to Waste Time or Money on Data? Not me.

In my last post, I discussed how building an ideal customer profile is the first step to successful inference marketing—using data from a variety of sources to learn about the customer without requiring him or her to fill out a form. In this post, I’ll go into a little more depth about why you should take the time to build an ideal customer profile and how to get it done.

Why Build an Ideal Customer Profile

By ideal customer profile, I’m not referring to creating a picture of your best customer. I’m talking about determining what customer data points are most important to collect in order to achieve your marketing and business goals. Instead of trying to perfectly complete every contact or account record, data should be fit for its intended purpose, such as more effectively segmenting your email lists or better customizing web, email or other content. Deciding up front what specific information you need about your customers and prospects allows you to prioritize your data acquisition activities, only buying or remediating the data that you really need. You should strive for a balance between what’s needed to improve marketing and sales effectiveness and the costs of acquiring, using, and maintaining additional data sources.

Who wants to waste time or money on data? Not me (and probably not you). Take the time to build an ideal profile so you don’t. 

How to Build an Ideal Customer Profile

Overall, it’s pretty simple:

  1. Audit what you have. Come up with an inventory of the different data points you currently collect for each profile.
  2. Determine if there are other data points that would allow you to create better segmentations for marketing and sales.
  3. Adjust for a region, country or segment. It’s no secret that data availability, depth and quality vary by geography. Additionally, data for larger companies is generally more complete and up-to-date than data for small and medium businesses.  Please be mindful of data usage regulations.
  4. Add the data points from steps one and two together to complete your ideal customer profile.
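Step one of that checklist, the audit, can start as a simple fill-rate report over your existing records. The records and field names below are illustrative:

```python
# Hypothetical contact records with gaps, as pulled from a CRM export.
records = [
    {"name": "Ann", "email": "ann@x.com", "title": "CTO", "revenue": None},
    {"name": "Bob", "email": None,        "title": None,  "revenue": "10M"},
]

def fill_rates(rows):
    """Fraction of records with a non-empty value, per field."""
    fields = rows[0].keys()
    return {f: sum(1 for r in rows if r.get(f)) / len(rows) for f in fields}

fill_rates(records)
# name is fully populated; email, title and revenue are each half empty,
# which tells you where acquisition or remediation money should go first.
```

Low fill rates on fields your segmentation actually needs (step 2) are the ones worth buying or remediating; a sparse field nobody uses can stay sparse.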

A common set of desirable data includes (but remember to keep in mind your specific objectives):

  • Core Contact and Account Attributes: standard contact profile (name, email, address, title, company) and account firmographics (company revenue, industry, location, number of employees), plus relevant account-level transactional data
  • Extended Attributes: supplemental or derived data, such as installed base, wallet size, role, cross-channel shopping, white space, propensity or other modeled scores
  • Social Attributes: includes data on sentiment, interest and intent derived from social interactions or social networks; can be at the contact or account level
  • Behavioral Attributes: engagement activities that may include sites visited, content consumed, campaign response, events attended, etc. Much of this data will come from your own web analytics and response tracking tools, but there are external providers as well.
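One way to organize the four attribute groups above into a single record is a small typed structure like the sketch below; the field names are illustrative examples from each group, not a fixed schema:

```python
from dataclasses import dataclass, field

@dataclass
class IdealProfile:
    # Core contact and account attributes
    name: str
    email: str
    company: str
    industry: str = ""
    # Extended (derived) attributes
    wallet_size: float = 0.0
    propensity_score: float = 0.0
    # Social attributes
    sentiment: str = ""
    # Behavioral attributes
    pages_visited: list = field(default_factory=list)

# Core attributes come from the CRM; the rest fill in over time
# from modeling, social linkage and web analytics.
p = IdealProfile(name="Ann Lee", email="ann@x.com", company="Acme")
p.pages_visited.append("/whitepaper")
```

Keeping the groups distinct in the schema makes it easy to audit which acquisition source is responsible for which gaps.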

Sources for web and social data are becoming more available and easier to access, allowing you to build out your profiles. Well-known providers of digital and social data include Leadspace, DataSift, WorkDigital, FullContact, Profound Networks, and Gnip. You can also obtain information on content consumption through companies such as Madison Logic. The company tracks and reports content use across a network of over 450 B2B publications. This type of information can be helpful at the account level and, on a permissioned basis, at the contact level to understand what topic or solution areas your prospects find interesting.

Need Some Proof?

One of our clients recently used this approach as part of a data remediation program. The result? They achieved an ROI of well over 500 times their investment in data. While results can vary, I am confident that this strategy works to deliver a better marketing ROI.

What Next?

Now that you have a template for your ideal customer profile, you need to collect the data to complete it. For suggestions on tactics for updating, appending and enriching your records, check out this white paper.

Solving the mystery: How does your clean data get so dirty?

By: Traci Varnum

By now, you are probably aware that a lack of accurate, clean data can be a huge problem for us marketers. According to an infographic from Trillium Software, 50% of all companies overestimate the quality of their data. Even more troubling, 50% of companies have absolutely no plan for managing their data problem.

So where the heck does all this bad data come from?

A recent article from Direct Marketing News explains that “dirty data” can rear its ugly head in a number of different ways.  Some common examples include:

  • Consumers failing to update their information
  • Brands failing to update their database as prospect/consumer information changes
  • A lack of communication/sharing between internal departments regarding the data necessary to create consistent or complete profiles
  • Third parties providing data without first performing a proper quality check

Given that the average person will change both jobs and living situations approximately 11 times over the course of their life, clean data can get dirty FAST.

With the large amount of data coming from new, quickly evolving sources like mobile and social media, it is now more important than ever to keep on top of your data. While social networks can provide a plethora of valuable data points, this social information can quickly become outdated. For instance, every single time a user creates a new social profile or updates an existing one, this personal information is stored in a secure database, offering up new information to marketers.

Fortunately, despite all of these extra records to worry about, there are ways to keep your database sparkly clean. Okay, maybe there will be just a tiny bit of dust or an occasional smudge–there’s no such thing as a perfect database. But with the right approach, you can substantially improve your data, and as a result, your overall bottom line.

To learn how small data enhancements can lead to big ROI, download our latest white paper: “Good Data: A Marketer’s Best Weapon.”

Temporal Messaging: the hot marketing trend that never was

Every few years a new term or catchphrase comes along that catches fire. Omnichannel marketing is one, for example.

A catchphrase I heard a few years ago—which didn’t catch fire, but still smolders (especially in Europe)—is temporal messaging and its close cousin, temporal rhythms. Simply put, when do consumers send messages to one another? What patterns do their messaging behaviors follow? And how can we use this data?

Why temporal messaging matters

Messaging is just one online behavior, but peak messaging times and patterns are vital to know, and here’s why:

  • Messaging is a very good proxy for other online activities. People who are online messaging are also shopping, watching streaming media, etc.
  • Messaging is becoming device agnostic, and people can send or accept messages on a laptop, tablet, smartphone, etc.
  • People message one another with word-of-mouth product recommendations.
  • People message one another when it is convenient to do so, so they are likely to share product recommendations within social media.

When is it best to reach them online? Well, it seems people fall into patterns and rhythms of messaging one another: temporal rhythms.

The best study of temporal messaging was performed in 2006 by HP Labs. They studied student use of Facebook messaging, and at the time, only college students had access to Facebook (and 90% of them used it). The title was nearly as long as the study itself—Rhythms of Social Interaction: Messaging Within a Massive Online Network. The researchers tracked 362 million messages exchanged by 4.2 million users over 26 months—an absolute mountain of data by 2006 standards.

It surprised the researchers just how consistent and predictable student behavior was. For example:

  • Facebook messaging peaked during class time and study hours. That made sense—students used computers to socialize, study and exchange information about classes.
  • Facebook messaging tanked on Friday and Saturday nights, as well it should. Presumably, students were socializing face-to-face during those periods.
  • Weekend “rhythms” ran from mid-Friday to mid-Sunday, while weekday behavior ran the five days in between.
  • Seasonal messaging behavior was pretty consistent, including over summer and winter breaks.
  • About six in 10 messages were between students at their own school, the remainder to students at other schools. So, the word-of-mouth “reach” was very significant, both on-campus and on other campuses.

Behold the temporal database

Those findings were surprising and groundbreaking—eight years ago. But HP Labs promoted something very innovative at the time, which was the temporal database. It was (at the time) the latest dimension of online behavior and the least utilized. We knew already just who bought what, where and why they bought it, but now we could measure when—when they talked about it, when they researched it, when they purchased it and when brands could reach them online.

Temporal data would include:

  • Peak online hours for whichever demographic you choose, be it college students, parents of college students, Hispanic consumers, single women between 18 and 34, etc. When are they online most? Least?
  • Peak online hours for specific activities, like consuming streaming media, playing massive-multiplayer online games, online shopping.
  • Peak hours for brick-and-mortar activity, like when consumers (choose the demographic) are in stores with their smartphones or mini-tablets.
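Finding a segment's "temporal rhythm" from such data can be as simple as bucketing event timestamps by hour and reading off the peak. The timestamps below are invented; a real analysis would span millions of messages, as the HP Labs study did:

```python
from collections import Counter
from datetime import datetime

# Hypothetical message timestamps for one audience segment.
timestamps = ["2006-03-01T14:05", "2006-03-01T14:40",
              "2006-03-01T14:55", "2006-03-01T21:10"]

# Bucket by hour of day to expose the rhythm.
by_hour = Counter(datetime.fromisoformat(t).hour for t in timestamps)

# The busiest hour is the candidate window for reaching this segment.
peak_hour, count = by_hour.most_common(1)[0]
```

The same histogram, cut by day of week or season, yields the weekday/weekend and seasonal rhythms the study describes.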

This all sounds familiar

But, you say, we do know all of that. Yes, we do—or, at least, we know much of it. That does not mean taking advantage of it is a piece of cake. For temporal data to work, there must be sufficient data to make the necessary determinations on communications and applicable layered segmentation. Then, the trick is using it wisely: for example, to target someone in your demographic sweet spot enough times to intrigue rather than annoy, to target them in the hours before their typical brick-and-mortar shop, or when they’re typically relaxing with their devices at home. All temporal data.

I thought the term temporal database would catch fire—it didn’t. Instead, the phrase big data did, and temporal data was wrapped into it.

To summarize what temporal messaging is, it is the construct of hitting the right audience with the right message through the right channel—at the right time. The end goal is to achieve the best possible response.

Sound familiar?

Sure it does. It just wasn’t called that.

Do you track temporal messaging? How do you use temporal data to improve your reach?

10 Tips to Avoid Costly B2B Data Purchase Mistakes

Purchasing B2B data isn't rocket science. The common areas can be learned quickly, and vendors can help with less common queries. However, once you expand your requirements beyond your home country, you might be surprised at how complex buying becomes. These 10 commonly overlooked areas require careful consideration, or your data purchase decisions could sink an otherwise fantastic campaign.

1)      Turnaround times vary greatly!

In Western markets, a 24-48 hour turnaround for counts is the norm. Other markets respond more slowly. Far Eastern vendors, for example, can take 5-10 days to return a count. Work this into your timelines.

2)      Adhere to local data legislation.

Be careful to adhere to local law and best practice, and ensure your data suppliers follow regulations too.  In Germany, double opt-in rules mean there is no such thing as a cold email. Conversely, the UK operates opt-out for B2B, so you can have a broader reach with email campaigns.  This is not just important from a data perspective – there is no point creating a fantastic campaign if it cannot be deployed.

3)      No database is perfect.

Some databases are fresher than others, but none are 100% accurate.  Business data decays rapidly (Watch this video to see how rapidly!), so you need to know local benchmarks and the vendors’ guarantees. That way you can expect certain inaccuracies, order over-supply when necessary and identify if the quality of the data you purchased is genuinely unsatisfactory.

4)      Language.

Can non-English data be handled correctly? Can your systems cope with the accented and special characters found in many European languages such as German or Spanish? What about the multi-byte characters used for Russian and Far Eastern languages?
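One quick sanity check is whether your pipeline's character encoding can round-trip sample vendor data unchanged. The sketch below (sample company names are invented) shows how a legacy single-byte encoding silently fails on Cyrillic and CJK text while UTF-8 handles everything:

```python
# Sample vendor records with accented, Cyrillic and CJK characters.
samples = ["Müller GmbH", "Peña y Asociados", "Общество", "株式会社日立"]

def survives(text, encoding):
    """True if text round-trips through the given encoding unchanged."""
    try:
        return text.encode(encoding).decode(encoding) == text
    except UnicodeEncodeError:
        return False

for name in samples:
    print(name, "latin-1:", survives(name, "latin-1"),
          "utf-8:", survives(name, "utf-8"))
```

Running a check like this against a vendor's sample file before purchase is far cheaper than discovering mangled names after load.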

5)      Variation of variables – do they meet your needs?

Not all vendors collect, manage and store data consistently. Variables like employee size and turnover can be banded or actual, and the latter could be local currency or US Dollars.  Check how vendors report these variables early in the planning process.

6)      NACE vs. SIC vs. NAICS – ensure consistency of selection.  

There are different ways an organization's industry can be categorized. In Europe, NACE codes are used, whereas in the USA, US SIC or NAICS codes are used. While there are similarities between the systems, there are also subtle differences. Aim for consistent use of industry codes, especially when using multiple vendors.
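In practice, consistency usually means maintaining a crosswalk table and converting every vendor file to one house standard. A minimal sketch, with a deliberately tiny and simplified mapping (a real crosswalk has thousands of entries and is not always one-to-one):

```python
# Illustrative crosswalk from US SIC codes to NACE Rev. 2 codes.
# Two simplified example mappings, not an authoritative table.
SIC_TO_NACE = {
    "7372": "62.01",  # Prepackaged software -> Computer programming
    "5045": "46.51",  # Computers, wholesale -> Wholesale of computers
}

def to_nace(sic_code):
    """Return the NACE code for a SIC code, or None if unmapped."""
    return SIC_TO_NACE.get(sic_code)

print(to_nace("7372"))  # 62.01
```

Unmapped codes returning None is the useful part: they flag records that need manual classification rather than silently landing in the wrong segment.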

7)      Put data volumes into context. 

If you listen to vendors' claims, every database is the biggest and best on the market. But don't worry, a bit of common sense will help you identify genuine datasets. If a vendor claims they have 40m businesses in the USA, it's probably not true. Why? Research shows there are only around 20m businesses, so the 40m figure is more likely contact volume, not sites.

8)      Lack of data quality standards.  

In the UK, we have an established association, the DMA, which produces guidelines and member Codes of Practice on acceptable data quality benchmarks. However, in some developed markets – including North America – there are no comparable benchmarks, and vendors set their own standards. Don't make any assumptions; ask suppliers what their guarantees are and why, and ask probing questions about their data collection methods and quality processes.

9)      Know the cost and usage terms.

How do you want to be billed: €, £ or $? If it's different from the vendor's currency, ensure you work with the correct exchange rates and include caveats allowing for fluctuations. How do you access the data – an annual subscription licence or per-record purchase? Must data be downloaded from a portal, or can it be transferred by SFTP?

10)   Data formats vary.

With 180+ countries globally, many with their own address standards, there are many different ways to represent an address. Tell the vendor exactly what you need for the campaign. Take international phone numbers as an example: should the country code be a separate field? Does the number need leading zeroes?
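The phone-number questions above are worth pinning down in a spec before delivery. As a deliberately simplified sketch (real-world parsing needs per-country rules, which libraries like Google's libphonenumber encode), splitting country code from national number might look like:

```python
import re

def normalize_phone(raw, default_cc="44"):
    """Split a raw phone string into (country_code, national_number).

    Strips punctuation, converts a leading '00' international prefix
    to '+', and drops the national trunk zero for local numbers.
    Naively assumes a two-digit country code; a sketch only.
    """
    digits = re.sub(r"[^\d+]", "", raw)
    if digits.startswith("00"):
        digits = "+" + digits[2:]
    if digits.startswith("+"):
        cc, national = digits[1:3], digits[3:]
    else:
        cc, national = default_cc, digits.lstrip("0")
    return cc, national

print(normalize_phone("+44 20 7946 0958"))  # ('44', '2079460958')
print(normalize_phone("020 7946 0958"))     # ('44', '2079460958')
```

Agreeing the target format with the vendor up front avoids this cleanup landing on your team after delivery.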

Buying data can be complex, particularly for international campaigns in markets where you are unfamiliar. The 10 areas above are the tip of the iceberg. Don’t hesitate to contact us if you have any questions about the above, or need guidance on how to apply these tips to your marketing programs.

Analyze First, Buy Second: A Methodical Approach to Data Enrichment

By: Donna Belanger


You already know the important role that high quality data plays in your marketing campaigns. But do you know how to efficiently enrich your data? What data points do you need? What are the best sources for this information? And how do you measure data enrichment efforts? These are all great questions. The key is that there should be a systematic method to enrichment—and randomly throwing data at a problem is not the best approach.

Analyze First

To make the most out of informatics, you need to take a methodical approach of analyzing first, buying second. The strategy should begin with a basic data assessment, which includes examining your current business information and profiling it against a comprehensive contact database. Analyze the results to identify the key characteristics that best describe your customers (such as SIC, company size, turnover, number of sites, etc.). This insight allows you to focus on buying only the data elements that enrich your priority market segments.

After assessing the data you already own, the next step is to expand your prospect universe. Identify what we call “white space,” or areas where there is untapped opportunity, by overlaying prospect data with key customer attributes. This helps to find pockets within the white space that resemble the ideal customer, and these are the areas you should target with your prospecting efforts.
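The profile-and-overlay step above can be sketched very simply: build a profile of attribute combinations that describe existing customers, then keep the prospects that match it. All records and field names below are hypothetical, and a real profile would weight attributes rather than require exact matches:

```python
# Hypothetical records: (company, industry_code, size_band)
customers = [
    ("Acme Ltd",  "62.01", "50-249"),
    ("Beta GmbH", "62.01", "50-249"),
    ("Gamma SA",  "46.51", "250+"),
]
prospects = [
    ("Delta Inc", "62.01", "50-249"),  # resembles existing customers
    ("Epsilon",   "10.11", "1-9"),     # outside the customer profile
]

# Profile customers by (industry, size); keep prospects that match.
profile = {(ind, size) for _, ind, size in customers}
white_space = [p for p in prospects if (p[1], p[2]) in profile]
print(white_space)  # [('Delta Inc', '62.01', '50-249')]
```

The matching prospects are the "white space" worth prioritizing in acquisition spend.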

Buy Second

As you know, buying data is expensive, so it is important to focus spending where it will be most effective. Take a close look at your goals and objectives. What are your target markets? What size company makes for the most profitable engagements? Is there a particular geography where the company wants to expand? The answers to these questions will help determine what to purchase, and an experienced planning team can identify the best sources to optimize your investment. For example, if you are eager to expand with enterprise businesses in Latin America, some data sources will be more valuable than others; a good data planning team knows which sources to consider and can facilitate purchases that deliver value.

Measure Data Enrichment Success

Finally, make sure your data enrichment efforts are worthwhile. Through our approach, we analyze each record after the new information has been incorporated into your database in order to determine if it is a marketable record. Can the marketing team reach this prospect with an email campaign? Can a sales person make a visit? Make sure the data supports your campaign goals.

What does enrichment success look like? For every 1% improvement in data quality, marketing can generate 5-6% incremental revenue.[1] Improving your data will also improve marketing efficiency, reducing the cost of staff overhead, analytics, postage, printing and more.


In summary, the best data enrichment strategy requires the following steps:

  1. Analyze your current data to know what you’re working with and where the opportunities are;
  2. Consider your goals and best opportunities to determine which data to buy from which sources;
  3. Analyze the new data for marketability to determine the success of your enrichment.

The most important thing to remember is that you should remediate and enrich your data in a way that is going to yield an improvement in sales—and this requires a strategy with a targeted acquisition process.

Check out our on-demand webinar to learn more about creating an effective data enrichment strategy: Maximize Marketing Effectiveness with Enriched Data.

[1] SiriusDecisions. “The Impact of Bad Data on Demand Creation.”