So much data, so few data scientists
The market research industry will continue to advance the science and art of research in 2018. This article discusses nine trends Q2 Insights believes will significantly influence the research discipline in 2018:
- Design Thinking
- Customer Experience/User Experience
- Neuroscience-Based Communications Evaluation
- Segmentation and Personalization
- Predictive Analytics is Surpassing Simple Analytics
- Existing Databases are Rising in Prominence
- Data Dashboards and Other Real-Time Digital Reporting
Machine versus human
- Artificial Intelligence
- Scarcity of Data Scientists
Design Thinking, sometimes referred to as Human-Centered Design, has come to the forefront in business and marketing recently, and this trend is going to continue to grow. It is no longer just for designers: it is an approach to innovation being adopted by corporations, government entities, and marketers alike. Research plays a huge role in Design Thinking and has much to add to the discipline.
Russ Klein, American Marketing Association CEO, had this to say about Design Thinking in early 2017:
“Of course, everyone is talking about the role of Design Thinking as part of the marketer’s expanding skillset.”
Design Thinking is solution-based thinking that involves evaluating a problem or situation and determining a reasonable, practical plan to attack the issue. In seven steps, problems are framed, the right questions are asked, ideas and solutions are generated, and the best answers are chosen. The steps are not necessarily linear, can occur simultaneously, and may be repeated; the process is often iterative to ensure ongoing innovation.
Design Thinking is a process often described as having seven stages: define, research, ideate, prototype, choose, implement, and learn. The research profession contributes its expertise in four of the seven stages—define, research, choose, and learn. The research employed by some Design Thinking practitioners is somewhat rudimentary; this creates a huge opportunity for Design Thinking and research practitioners working together to make large forward leaps in 2018.
Customer experience/user experience (CX/UX/customer journey)
Customer Experience/User Experience remains a hot topic in marketing, and business in general, and therefore also in research. The business and marketing playing fields have changed substantially, and the ecosystems are now exceptionally vast due to digital transformation and generational shifts. Competition for the hearts and minds of customers is fierce. The competitive landscape has changed dramatically in the last decade and is considerably more complex.
Companies that focus on Customer Experience or User Experience to win, keep, surprise, and delight customers reap substantial rewards such as:
- A relevant, personalized Customer (User) Experience
- Brand differentiation
- Increased revenue
- Reduced customer churn
- Improved Key Performance Indicators (KPIs) due to enhanced customer satisfaction
- Greater employee satisfaction
- Increased collaboration across the company
Customer Journey development has emerged as a key way to provide a framework for understanding the Customer Experience/User Experience in both digital and physical environments.
Mapping the Customer Journey typically includes defining the major and minor touchpoints between the brand, product or service, customers/users, and the environments in which the touchpoints occur. A full commitment to improving and capitalizing on Customer (User) Experience involves not just brand employees getting together to describe the experience. It also involves research with customers to truly understand each touchpoint in the Customer Journey, describing the emotional and rational aspects of:
- Customer Predispositions
- Need States
- Pain Points
- Customer Reaction
- Perceptions of Brand Performance
Neuroscience-based communications evaluation
Traditional techniques such as Focus Groups, Depth Interviews, and Surveys used to measure response to communications are being supplanted by techniques derived from neuroscience, which provide a window into the subconscious, where most of our decision making originates. Asking questions and relying on self-reports is not a reliable way to access the human subconscious. Neuro-tools provide a much more sophisticated and deeper look into the subconscious response to marketing stimuli; by using them, marketers come much closer to reading the minds of consumers. This is exciting new territory for researchers.
Specifically, electroencephalography (EEG) is used to measure brain responses directly while research respondents view print, television, or web communications. With EEG technology, electrodes are placed on the subject’s scalp and “brain waves” are recorded while the subjects view the material. The EEG data are analyzed using pattern recognition algorithms ranging from very simple (in most cases) to sophisticated, and the effectiveness of the communication is assessed. It all sounds simple, but it’s not: the methods are based on an accumulation of academic study in neuroscience. It also sounds expensive, but it’s not; the cost of a study is similar to that of a traditional advertising effectiveness study.
While several groups have developed variations on using EEG to evaluate communications materials, a technology called Steady State Topography (SST), a subcategory of EEG, is the most accurate way to measure response to a marketing stimulus, as it offers two major advantages:
- The ability to measure a reliable signal from the brain – SST measures the speed of propagating activity in the brain, one of the most useful brain signals.
- The ability to measure the response in the very first viewing. SST is the only brain-mapping technology that allows researchers to understand responses to a stimulus in the very first exposure.
Relative to traditional techniques used to measure response to communications, EEG, and more specifically SST, has been validated and shown to be reliable. As with artificial intelligence, researchers and marketers should be aware of this approach to predicting consumer behavior, as it is somewhat revolutionary in the discipline of research.
Segmentation and personalization
Segmentation and personalization are powerful tools in the marketing toolbox and are being used more and more frequently by marketers. Researchers/data scientists are increasingly called upon to assist in these areas.
When brands use segmentation and personalization in combination, they can positively influence their marketing Return on Investment (ROI). The bridge between segmentation and personalization is often a set of detailed personas that outline the demographic and geographic characteristics, behavioral tendencies, cultural influences, and psychographics of segments. The best inputs to personas are the outputs of a segmentation exercise.
Rather than targeting the masses, segmentation allows companies to divide their existing and potential customer base into segments, allowing them to target specific needs, wants, behaviors, and opinions. Personalization gives customers a sense of identity instead of treating them as one of the masses.
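To make the idea concrete, here is a minimal sketch of one common segmentation technique, k-means clustering, run on a hypothetical customer base. The customer data, the two variables, and the cluster count are all invented for illustration; real segmentations typically use many more variables and purpose-built statistical tools.

```python
import random

def kmeans(points, k, iters=20, seed=42):
    """Toy k-means over 2-D points: (annual spend $, visits per month)."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)          # pick k customers as seeds
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # assign each customer to the nearest centroid (squared distance)
            j = min(range(k),
                    key=lambda c: (p[0] - centroids[c][0]) ** 2
                                + (p[1] - centroids[c][1]) ** 2)
            clusters[j].append(p)
        for j, members in enumerate(clusters):
            if members:  # recompute each centroid as the mean of its members
                centroids[j] = (sum(p[0] for p in members) / len(members),
                                sum(p[1] for p in members) / len(members))
    return centroids, clusters

# hypothetical customers: (annual spend in dollars, store visits per month)
customers = [(200, 1), (220, 2), (250, 1),     # low-spend, infrequent
             (1500, 8), (1600, 9), (1450, 7)]  # high-spend, frequent
centroids, clusters = kmeans(customers, k=2)
```

On this toy data the two clusters recover the low-spend/infrequent and high-spend/frequent groups, and the resulting centroids are the kind of summary that feeds directly into persona development.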
Predictive analytics is surpassing simple analytics
Market researchers who are also data scientists have many fortune-telling talents, and fortune telling, also known as predictive analytics, will be hot property in marketing in 2018. Marketers wanting to hedge their bets will no longer be satisfied with analytics that report how a campaign performed. Rather, they will want to predict the outcome of their marketing activities ahead of time in order to get buy-in from management and the C-suite.
The word “analytics” is bandied around with great gusto in marketing circles these days. Fundamentally, market analytics is all about measuring and analyzing marketing performance to improve its effectiveness and optimize return on investment (ROI). Like most disciplines, it ranges from simplistic frequency counts (e.g., the number of people who read, liked, or shared a LinkedIn post) and A/B testing to sophisticated measurement using multivariate statistical methods. Most good research companies with solid data science teams provide terrific resources for sophisticated analytics.
Predictive analytics differs from analytics in that it makes predictions about future events—answering questions like: What is the optimal price for our widget to maximize profitability? Who is most likely to purchase our product/service? Who is our most valuable customer? The discipline is mostly science but also involves some degree of art, and it draws on data mining, text analytics, basic statistics, and multivariate statistical modeling to identify patterns and relationships in structured and unstructured data. An example of structured data is a customer database or an Excel spreadsheet containing survey data. An example of unstructured data is the collection of all the comments made in the last six months about your brand on Facebook.
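As a toy illustration of the first question, optimal pricing, suppose past sales data have been fitted to a straight-line demand curve. All the numbers below are invented for illustration; real pricing studies estimate demand from survey or transaction data and rarely assume it is linear.

```python
# hypothetical linear demand curve fitted from past sales: units = a - b * price
a, b = 1000.0, 40.0   # e.g. at a $10 price we would expect to sell 600 units
unit_cost = 5.0       # cost to produce one widget

def profit(price):
    units = max(a - b * price, 0.0)   # demand cannot go negative
    return (price - unit_cost) * units

# for linear demand, profit (price - cost) * (a - b * price) is a downward
# parabola, maximized halfway between the break-even price (unit_cost) and
# the "choke price" a / b at which demand falls to zero
optimal_price = (unit_cost + a / b) / 2   # = (5 + 25) / 2 = 15
```

Even this simple model shows the predictive flavor of the discipline: instead of reporting what last quarter’s price earned, it estimates what a price we have never charged would earn.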
Existing databases are rising in prominence
Perhaps as a response to the Big Data craze in marketing (“Big Data” refers to huge volumes of structured and unstructured data), using existing databases to obtain insights is rising in prominence in 2018. Technically, existing databases have been important to business and marketing for as long as businesses have been collecting data on customers. However, with the increased focus on Big Data, many are wondering how they should deal with its analysis when they should perhaps be concentrating on more manageable existing databases, leaving Big Data to data scientists wielding sophisticated analytic tools.
The focus of existing databases is on targeted data. Existing databases can be employed for simple analytics or predictive analytics (if the datafile is sufficiently large), and they are useful for quick analysis on demand.
Moving forward, companies will increasingly seek to understand their own existing databases rather than having a laser focus on structured and unstructured Big Data.
Data dashboards and other real-time digital reporting
Data dashboards and other real-time digital reporting are powerful tools in the researcher’s toolbox that save time and money for end-users. Data visualization allows end-users to quickly see data analytics summarized in easily digestible formats. Programmatic platforms allow the creation of research reports based on machine learning. Market researchers are now expected to provide data dashboards and other real-time digital reporting as a matter of course.
Machine versus human
Artificial intelligence (AI)
Artificial Intelligence (AI) is reshaping parts of the research industry by making it possible for machines to perform human-like tasks by learning from experience and adjusting to new inputs. Essentially, computers are trained to process large amounts of data and recognize patterns in the data to accomplish specific tasks.
Examples of areas in which AI is likely to continue to have impact on market research in 2018 include:
- Automated text analytics
- Personalized question creation
- Measuring the previously unmeasurable elements of human behavior using customized analysis categories
- Sophisticated analysis of Big Data (structured and unstructured data) such as:
- Real time analysis of sentiment in social media (unstructured Big Data)
- Quick analysis of video, audio and text
- Predicting behavior
- Purchase intent and tracking it over time
- Shopper behavior
- Price optimization
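To give a flavor of automated text analytics, here is a deliberately tiny lexicon-based sentiment scorer. The word lists and comments are invented for illustration; production systems use machine-learned models trained on large labeled corpora rather than hand-picked word lists.

```python
# tiny illustrative sentiment lexicons (invented for this sketch)
POSITIVE = {"love", "great", "excellent", "happy", "recommend"}
NEGATIVE = {"hate", "terrible", "broken", "disappointed", "slow"}

def sentiment(comment):
    # strip basic punctuation, then count positive vs. negative words
    words = (comment.lower()
                    .replace(".", "").replace(",", "").replace("!", "")
                    .split())
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

# hypothetical social media comments about a brand
comments = ["Love this brand, great service!",
            "Terrible packaging and slow shipping.",
            "Arrived on Tuesday."]
labels = [sentiment(c) for c in comments]   # → positive, negative, neutral
```

Applied continuously to a social media feed, even this crude approach hints at how real-time sentiment tracking works; AI-based systems do the same job with far more nuance, handling sarcasm, negation, and context.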
While AI is automating some aspects of data science, understanding the “why” of behavior and opinion remains somewhat elusive. The human touch will likely still be required for rich insight gathering in 2018 and beyond. Additionally, AI is not necessary or recommended in all situations, such as analyzing existing datasets or developing surveys. AI is definitely impacting market research, but it is not replacing the human researcher.
So much data and a scarcity of data scientists
Artificial Intelligence (AI), the Internet of Things (IoT), and social media are just some areas in which we are generating an enormous quantity of useful data for marketing. Whatever the challenge – Big Data analysis, exploiting small data, predictive analytics, multivariate statistical analysis, understanding how relational databases work, customer relationship management (CRM), etc. – the sheer quantity of data we humans have generated in the last couple of years alone likely exceeds the entirety of our prior collective data accumulation. We have so much data and so much potential to use that data not only for business and marketing decision making but also for general human benefit.
The problem is that globally, domestically, and regionally, we are not feeding the data scientist funnel fast enough; data scientists such as those at Q2 Insights are in increasingly short supply. It is important that we all encourage the up-and-coming professionals in our networks to consider a career in data science. For all those with an aptitude for math, statistics, or data analysis, or an interest in getting into the field, the datafication of the planet is creating wonderful career opportunities.
Facing 2018 with a mix of optimism and some concern
2018 is full of promise for researchers and the marketers who work with them. A number of areas are rising in prominence in 2018, although several of them have been around for a while. A key concern to all of us in data science is the dearth of data scientists to address the sheer quantity of data at our disposal. Artificial Intelligence (AI) is one solution but even the discipline of AI requires data scientists.
Kirsty Nunez is the President and Chief Research Strategist at Q2 Insights, Inc., a research and innovation consulting firm with offices in San Diego and New Orleans. Kirsty and her team are fully versed in all the areas outlined in this article. If you would like to learn more, please reach out to Kirsty and her team at (760) 230-2950 ext. 1 or [email protected].