Don’t apply wishful thinking to your data

Good data can still generate bad insights unless users guard against bias and adopt sound measurement techniques.



Data is just a pile of numbers until you figure out what they mean. We devise all sorts of metrics and KPIs to find truth in our numbers. But even smart people can unknowingly deceive themselves by trying to see what they want to see in their data.


So how is it that smart people can misinterpret their data? How do they get it wrong?


It’s human nature


The problem with humans is that they are…human. They will make mistakes, and not always consciously.


“Most mistakes are made by people not peeling back the onion,” said T. Maxwell, owner of digital marketing agency eMaximize and a member of the Forbes Agency Council.


“For instance, they look at their Monthly Visitors and use it as a key metric to measure growth. When you dig a little deeper you notice that 40% of those visitors are from India and Russia and are most likely bots.”
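

To make that concrete, here is a minimal sketch of the kind of "peeling back" Maxwell describes: breaking a headline visitor count down by geography and flagging segments whose behavior looks automated. The column names, bounce-rate threshold and figures are purely illustrative, not drawn from any particular analytics export.

```python
# Minimal sketch: break a headline "monthly visitors" number down by country
# and flag segments whose behavior suggests bot traffic. All names, numbers
# and the 90% bounce-rate threshold are hypothetical.
import pandas as pd

visits = pd.DataFrame({
    "country": ["US", "UK", "Germany", "India", "Russia"],
    "sessions": [42_000, 9_500, 6_000, 31_000, 18_000],
    "bounce_rate": [0.41, 0.44, 0.39, 0.93, 0.96],
})

total = visits["sessions"].sum()
suspect = visits[visits["bounce_rate"] > 0.90]  # behavior that looks non-human

print(f"Headline monthly visitors: {total:,}")
print(f"Share from suspect segments: {suspect['sessions'].sum() / total:.0%}")
```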


“Unlike computers, humans are emotional creatures that have embedded experiences, which means their interpretation of data can be led by things like assumptions and recency bias. Interpreting data is more about relinquishing those thought processes,” said Erica Magnotto, director of SEM at Accelerated Digital Media.


“Bias is anything that changes the outcome of a model when it should not have,” added Mark Stouse, chairman and CEO of Proof Analytics. “The best way to break free of bias is to realize that you have it, and then take the steps to enlarge the circle in terms of what you think is relevant to a decision,” Stouse continued.  “This is really the practical value of diversity and inclusion in an operational sense.  It enlarges your perspective and bears against missing something important that could introduce bias into your thinking.  Bias is usually the result of thinking too narrowly.”


Many ways to measure


Which leads to the data itself. There are many ways to measure it, but the fact that something can be measured does not mean it creates greater understanding. Some metrics are meaningful, others not so much.


“Often working with a client that has a ‘little’ knowledge of digital marketing is painful and slows things down,” Maxwell said. “[T]hey fall for every gimmick they see on social media and ask their agencies to investigate hoax marketing strategies instead of deploying sound digital strategies with proven, best practices. It’s the job of the agency and the owner to choose which metrics are important,” Maxwell added.


A single source of truth is needed, Magnotto echoed. “[B]oth the client and marketer need to agree on the platform that is considered the source of truth for tracking primary KPIs and other performance.” That way, all parties are looking at the same information the same way.


“It’s the marketer’s responsibility to create reporting that marries both their platform data and the source of truth data for optimal visibility into performance,” Magnotto continued. “For large accounts the primary KPIs should be discussed frequently, and reporting should be consistent so that the client is bought into the agency’s methodology and can acknowledge/agree with the performance being presented. For a client with multiple goals, it’s important to categorize primary and secondary KPIs so there is clear prioritization when looking at reporting.”
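

As a rough illustration of what “marrying platform data with the source of truth data” can look like in practice, the sketch below joins a hypothetical ad-platform export with a hypothetical CRM export on campaign name and reports cost against the primary KPI. The columns, campaign names and figures are assumptions for the example only.

```python
# Minimal sketch: join ad-platform metrics with source-of-truth (CRM)
# conversions so one report shows both views of performance. Campaign
# names, columns and figures are made up for illustration.
import pandas as pd

platform = pd.DataFrame({
    "campaign": ["brand", "prospecting", "retargeting"],
    "spend": [5_000.0, 12_000.0, 3_000.0],
    "platform_conversions": [120, 260, 180],
})

source_of_truth = pd.DataFrame({
    "campaign": ["brand", "prospecting", "retargeting"],
    "crm_conversions": [95, 170, 150],  # primary KPI
    "qualified_leads": [40, 90, 30],    # secondary KPI
})

report = platform.merge(source_of_truth, on="campaign")
report["cost_per_crm_conversion"] = report["spend"] / report["crm_conversions"]
print(report.sort_values("cost_per_crm_conversion"))
```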


“Data is always the numerator in the equation, not the denominator,” Stouse said. “The question or decision dictates the model, and the needs of the model dictate the data required to arm the model.  In general terms, there are two kinds of potentially relevant data — what measures what you are doing (what you control), and what measures what is a relevant headwind or tailwind (what you don’t control).”
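

One way to read Stouse’s two kinds of data in modeling terms: put both the variable you control (spend) and the headwind or tailwind you don’t (here, an invented market index) into the same model, or the uncontrolled factor’s effect gets credited to your own actions. The sketch below uses simulated data and ordinary least squares purely to illustrate the point; the numbers are not from any real account.

```python
# Minimal sketch: fit revenue against a controllable driver (ad spend) and an
# uncontrollable headwind/tailwind (a market index). Leaving the uncontrolled
# variable out lets its effect get credited to spend. All data is simulated.
import numpy as np

rng = np.random.default_rng(0)
n = 48                                                   # four years of monthly data
market_index = rng.normal(0, 1, n)                       # what you don't control
spend = 50 + 15 * market_index + rng.normal(0, 10, n)    # budgets tend to follow the market
revenue = 5.0 * spend + 200.0 * market_index + rng.normal(0, 25, n)

X_full = np.column_stack([np.ones(n), spend, market_index])
X_naive = np.column_stack([np.ones(n), spend])
coef_full, *_ = np.linalg.lstsq(X_full, revenue, rcond=None)
coef_naive, *_ = np.linalg.lstsq(X_naive, revenue, rcond=None)

print(f"Spend effect, market index included: {coef_full[1]:.2f}")   # close to the true 5
print(f"Spend effect, market index omitted:  {coef_naive[1]:.2f}")  # inflated
```

In this toy example, dropping the market index lets the tailwind get credited to spend, so the spend coefficient comes out well above its true value.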


Measuring change or changing the measure


And make some allowance for the unpredictable, like pandemics, bad weather, economic recessions or supply chain issues, Maxwell noted. The more often these disruptions occur, the more optimization is needed to keep a campaign on track.


“Savvy marketers are adjusting marketing campaigns in real-time,” Maxwell said. “It may take two to three months to get an advertising campaign dialed in, so that it only needs minimal tweaks going forward.”


“There does need to be enough data to indicate next steps in optimization; that amount of data is dependent on client KPIs, spend, and time needed for data collection,” Magnotto noted. “In an account with a large threshold of conversions you may only need to run an A/B experiment for a week to collect enough data to confidently pivot into new strategies, whereas smaller accounts may need 30 to 60 days.”
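

Magnotto’s one-week-versus-30-to-60-days point can be sanity-checked with a standard two-proportion sample-size calculation. The sketch below is only an illustration: the baseline conversion rate, detectable lift and daily traffic figures are assumptions, not numbers from any real account.

```python
# Minimal sketch: estimate how long an A/B test must run before there is
# enough data to act, using the standard two-proportion sample-size formula.
# Baseline rate, target lift and traffic volumes are hypothetical.
from scipy.stats import norm

def visitors_per_variant(p_base, relative_lift, alpha=0.05, power=0.80):
    """Visitors needed per variant to detect a relative lift in conversion rate."""
    p_test = p_base * (1 + relative_lift)
    z_alpha = norm.ppf(1 - alpha / 2)
    z_power = norm.ppf(power)
    variance = p_base * (1 - p_base) + p_test * (1 - p_test)
    return (z_alpha + z_power) ** 2 * variance / (p_test - p_base) ** 2

needed = visitors_per_variant(p_base=0.04, relative_lift=0.15)  # detect a 15% relative lift

for label, daily_traffic_per_variant in [("large account", 2_500), ("small account", 400)]:
    days = needed / daily_traffic_per_variant
    print(f"{label}: ~{needed:,.0f} visitors per variant, roughly {days:.0f} days")
```

Under these assumed numbers, the high-volume account clears the threshold in about a week while the smaller account needs well over a month, which is consistent with the range Magnotto describes.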


“Research tells us that the unaided human brain cannot handle more than three to four variables. After that, things go tilt,” Stouse said. “And when time lag is a big part of the equation, it makes things even harder. This is why most people default to very short-term evaluation, and they justify it because they’ve always heard that if you manage the minutes the days will take care of themselves,” Stouse continued.


“That’s true if you understand the causality that’s driving the overall situation, but if you don’t, managing the short game will not mean that you’ll win the long one.”




About the author


William Terdoslavich is a freelance writer with a long background covering information technology. Prior to writing for MarTech, he also covered digital marketing for DMN. A seasoned generalist, William covered employment in the IT industry for Insights.Dice.com, big data for Information Week, and software-as-a-service for SaaSintheEnterprise.com. He also worked as a features editor for Mobile Computing and Communication, as well as feature section editor for CRN, where he had to deal with 20 to 30 different tech topics over the course of an editorial year. Ironically, it is the human factor that draws William into writing about technology. No matter how much people try to organize and control information, it never quite works out the way they want to.
