Wednesday, July 14, 2010

Statistics Don't Lie, People Fudge

Bumper-sticker slogans are cute and memorable, but that doesn’t mean they are accurate or used in the appropriate context. Take, for example, a quote an accountant friend of mine loves to use to dismiss research he doesn’t like: “there are lies, damn lies and statistics.” It always gets a chuckle, though it’s especially a bit troubling when an accountant uses it.

But it isn’t the statistics that lie, it's the people who misuse and abuse them including those who would rather dismiss them than inform their opinions.


I'm not talking about the blatant misuse of polling, such as the leading questions in a recent poll commissioned by the billboard industry. I'm talking about good data used inappropriately or out of context. More often it is people who lie, not statistics.

A good example is the set of studies claiming that “40% of tourism is driven by or due to cultural and heritage activities.” The studies are often accurate as far as they go and in context, but the way the information is later applied often isn’t. In fact, nearly 18% of visitor person-stays involve participation in one of four areas of culture and heritage, and a little over 2% cite the four areas as the “main reason” behind the trip.


Of course, the 18% is a nationwide tally covering all activity participation by domestic travelers, but it would be a stretch for even the most culturally rich state or community to reach 40%. The gap typically exists because the responses behind the 40% haven’t been weighted to be applied the way they are being cited.
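A toy calculation makes the denominator problem concrete. All figures below are invented for illustration only, not from any actual tourism study; the point is simply that the same count of cultural person-stays looks much larger when the base is quietly narrowed from all stays to “leisure” or “vacation” stays:

```python
# Hypothetical figures for illustration only -- not from any actual study.
total_person_stays = 1000   # all domestic visitor person-stays
cultural_stays = 180        # stays involving any of the four culture/heritage areas (~18%)
leisure_stays = 550         # assumed "leisure" subset of all stays
vacation_stays = 400        # assumed even smaller "vacation" subset

share_of_all = cultural_stays / total_person_stays
share_of_leisure = cultural_stays / leisure_stays
share_of_vacation = cultural_stays / vacation_stays

print(f"Share of all stays:     {share_of_all:.0%}")      # 18%
print(f"Share of leisure only:  {share_of_leisure:.0%}")  # 33%
print(f"Share of vacation only: {share_of_vacation:.0%}")  # 45%
```

Nothing about the visitors changed; only the denominator did, and the cultural share nearly matches the headline 40% figure.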


For instance:
  • Typically they use only a subset of visitors, which makes the percentage seem more impressive but also out of sync. This can be done by limiting the base to just those person-stays that are “leisure,” or an even smaller subset, “vacation.”


  • Rarely do they distinguish participation from motivation, and when they do, the responses are typically not weighted (the same reason some political polls are so far off), which means the percentage applies only to that sample and can’t be generalized.



  • They typically haven’t been “quantity weighted” for multiple responses, e.g., when responses on activities add up to more than 100%.



  • And/or they typically haven’t been weighted for propensity, to make sure the sample is truly representative of the general population and not skewed toward one segment or another.



  • And/or they typically haven’t been weighted for nonresponse bias, which arises when those responding to an outcome variable differ significantly from those who don’t respond. It is a mistake not to adjust for those who state they did not participate in any activity.
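The “quantity weighting” point above can be sketched with a toy example. The activity shares below are invented for illustration: respondents could pick several activities, so the raw percentages sum to well over 100% and cannot honestly be cited as shares of trips until they are rescaled:

```python
# Hypothetical multiple-response survey results -- invented figures,
# not from any actual tourism study. Respondents could pick several
# activities, so the raw shares sum to well over 100%.
raw_shares = {
    "culture/heritage": 0.40,
    "outdoor": 0.55,
    "shopping": 0.70,
    "dining": 0.85,
}

total = sum(raw_shares.values())  # 2.50, i.e. 250% of respondents

# "Quantity weighting": rescale so the activity mix sums to 100%.
quantity_weighted = {k: v / total for k, v in raw_shares.items()}

for activity, share in quantity_weighted.items():
    print(f"{activity}: raw {raw_shares[activity]:.0%} -> weighted {share:.0%}")
```

Under these assumed numbers, culture/heritage drops from a headline-friendly 40% of respondents to a 16% share of the overall activity mix, which is why quoting raw multiple-response percentages as if they were shares of tourism is misleading.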

Sometimes the information hasn’t been weighted because those commissioning the study don’t understand the importance of weighting, don’t want it done, don’t want to go to the expense, or never included the necessary steps in their request to make the information applicable in the way it is later used. I’ve personally given agencies a heads-up only to see them go ahead and misuse the information to serve an agenda.


People in any activity area who deliberately do this are either idiots or playing agenda politics, hoping never to be discovered or believing that if they are, they can run for cover and the misinformation will stick in people’s minds. I’ve also checked with the people who conduct these studies, only to learn they are horrified the information is being misused.


It isn’t statistics that lie. It's people. Unfortunately, the manipulation creates credibility issues and when it backfires, blood gets spattered everywhere not just on the perpetrators.


I’m no genius at this stuff, but I’ve learned enough from people who are to be suspicious. I’ve also been careful, when involved with studies myself, to circulate findings with which I might not agree. If the science is right, it’s right.


Information shouldn’t be about agendas. I was once on a board where the organization rejected and suppressed some new calculations because they didn’t jibe with information it had used for years, drawn from much smaller samples that hadn’t been appropriately weighted. Some egos around the table just couldn’t face explaining the differences and opted for suppression instead.


It is the responsibility of DMOs at the local and state levels to vet information and have the courage to challenge it, especially when something “tourism” is being commissioned by non-tourism entities. It may initially result in a bit of friction, but if the folks who unwittingly or purposely misuse the information aren’t confronted, it will only get worse. Read or re-read the book Crucial Confrontations if you need to know how or where to start.


It is also up to elected officials to have agencies with expertise in a particular area, e.g., tourism, vet research by other agencies on that topic and then reconcile issues, even if behind closed doors, to prevent misuse and misinformation from reaching legislators, the news media and the public.


It is also important for the scientists conducting the studies to insist on how the information is to be used and to intervene when it isn't used that way. It isn't statistics that are at risk; it is the credibility of the people who commission, conduct or interpret the information.
