It happened again.
I attended a speech by a consultant with a current Amazon bestseller, who has consulted with an impressive roster of companies and government agencies. Most of what she had to say was scientifically accurate. Much overlapped with my own preachings in her area of expertise. Then, she reported on a study.
She said Hewlett-Packard, a past client of mine, wanted to find out the impact of digital distractions on productivity. So HP contracted with a researcher at a college in England to conduct a study of 300 people. All were given a standard intelligence test. One-third were told to focus on the test. A second group was allowed to check social media and e-mail. The third, she said, was also asked to focus on the test—after smoking marijuana. She reported that the first group did the best, of course. The surprise was that the pot smokers did better than the multi-taskers. As she said that, titters of laughter ran through the room.
I wanted to believe this. But something smelled, and there wasn’t a haze of pot smoke at the luncheon.
Social science researchers do not typically encourage subjects to break the law, much less publicly admit they conspired to do so. (Medical studies are different, and are tightly controlled.) Even if the researchers had wanted to, every respectable university has a human research ethics panel that would have balked, and HP's public relations department would have had a fit. After the talk I asked the speaker privately whether the study had been published, so I could get the citation from her. She said she wasn't sure; she had found it on the Internet. My heart sank. To her great credit, she took the time to find a couple of links—not the ones she had found earlier, though—and send them to me. One was to a BBC report. As you can see, however, it reports on a survey, not a lab study, and does not say pot was involved. Strike one.
The article lists the university the researcher worked for. I went to its site and plugged in his name. Nothing came up. As someone who does this all the time for Teams Blog, I can tell you that is unusual. Even if he left six years ago, around the time this story came out, if he was a regular faculty member he would have left a trace. Strike two.
Then I conducted another search on his name and a keyword from the study. Almost every researcher has a list of publications on a Web page somewhere. Instead what popped up, along with links repeating versions of the speaker's report, were a couple of respectable science writing blogs slamming the coverage of the study. One included a link to the researcher’s personal site, specifically to a rebuttal (Microsoft Word doc) he felt compelled to write debunking the lies that had grown up around the study. Strike three. This myth is outed.
There is a kernel of truth. Dr. Glenn Wilson, an accomplished psychology researcher, did run a tiny study at the request of HP's London publicists. Apparently they wanted additional information to support a survey of 1,100 people. Wilson's project involved eight people, not 300. They were split into two groups, not three. They were employees of the publicity firm, hardly a large sample, and the study was not associated with the college, where Wilson held only a junior post. Finally, it did not involve pot.
"This study was widely misrepresented in the media," Wilson writes, "with the number of participants for the two aspects of the report being confused and the impression given that it was a published report (the only publication was a press release from Porter-Novelli). Comparisons were made with the effects of marijuana and sleep loss based on previously published studies not conducted by me. The legitimacy of these comparisons is doubtful..."
As politely as I could, I e-mailed the bad news to the speaker along with the relevant links, suggesting she "might want to change" her presentation. You now know why I am not identifying her. She graciously thanked me and said she was motivated to go find her "original research." A doubly ironic phrase, that. In the scientific world, "original research" refers to an actual study, which even I don’t do, or at least the first study on a particular topic, which this certainly would have been if the story had been true!
In summary, a former executive with a top-school MBA and a best-selling book, who gets paid big bucks by big companies, repeated a myth that the slightest skepticism would have debunked. The advice columnist Ann Landers said it best: "If it sounds too good to be true, it probably is." When something seems improbable or questionable, question. You can't stop at asking about the source's background or character or popularity. You also have to ask where the person got his or her information.
As you know, the same thing that makes the Internet so handy magnifies the problem. The Web makes bad information just as easy to spread as good information. Newspapers can be nearly as guilty, and I say that as a former newspaper editor. Especially in these days of newsroom cutbacks, the reporter assigned to cover a new study may have no prior experience with science writing, much less the specific topic. Often they do not know how to judge the quality or significance of a study. And reporters don't even write the headlines, which are all that many readers will read.
At least in this case, unlike other work myths I debunk in my classes, the basic lesson is accurate. There is plenty of research evidence that multitasking harms the quantity and quality of whatever you are trying to do, whether it is getting a task done at work or driving while yakking on a cell phone. You don’t need a hit of weed to get the same results.
A Good Speech Goes to Pot