This story is not yet trending on Google News. (I’ll be curious to see if it does, although I won’t draw any conclusions from it.)
Robert Epstein is a research psychologist. (I’ve drawn this conclusion by using Google to look him up and peruse his publications.) Here’s his Wikipedia page. (It was the top-ranked entry when I searched for “research psychologist Robert Epstein.”)
It says he was born on June 19, 1953. He’s “an American psychologist, professor, author, and journalist.” It also says he has a doctorate from Harvard. I believe all of that, although I’ve confirmed none of it independently. I also believe this, though I haven’t confirmed it independently:
In 2012, Epstein publicly disputed with Google Search over a security warning placed on links to his website. His website, which features mental health screening tests, was blocked for serving malware that could infect visitors to the site. Epstein sent an email to “Larry Page, Google’s chief executive; David Drummond, Google’s legal counsel; Dr. Epstein’s congressman; and journalists from The New York Times, The Washington Post, Wired, and Newsweek.” In it, Epstein threatened legal action if the warning concerning his website was not removed, and denied that any problems with his website existed. Several weeks later, Epstein admitted his website had been hacked, but still blamed Google for tarnishing his name and not helping him find the infection.
Epstein has just published a piece in Politico warning that Google might throw the 2016 election:
America’s next president could be eased into office not just by TV ads or speeches, but by Google’s secret decisions, and no one—except for me and perhaps a few other obscure researchers—would know how this was accomplished.
Research I have been directing in recent years suggests that Google, Inc., has amassed far more power to control elections—indeed, to control a wide variety of opinions and beliefs—than any company in history has ever had. Google’s search algorithm can easily shift the voting preferences of undecided voters by 20 percent or more—up to 80 percent in some demographic groups—with virtually no one knowing they are being manipulated, according to experiments I conducted recently with Ronald E. Robertson.
You knew about this research already, of course. I’ve brought it up before.
Funny thing is that the first four or five times I saw a reference to him, it didn’t occur to me to look him up. After seeing this reported again today, I finally decided to look into it more carefully.
Here’s the full paper on PNAS: The search engine manipulation effect (SEME) and its possible impact on the outcomes of elections. His co-author is Ronald E. Robertson; if you throw his name in Google, you’ll find his LinkedIn page first, and you probably won’t do any further searching. (I basically believe everything Robertson says about himself there, too, although I’ve confirmed none of it.) My willingness to believe what I find in the top-ranked search result confirms what they suggest in their paper:
Studies using eye-tracking technology have shown that people generally scan search engine results in the order in which the results appear and then fixate on the results that rank highest, even when lower-ranked results are more relevant to their search. Higher-ranked links also draw more clicks, and consequently people spend more time on Web pages associated with higher-ranked search results. A recent analysis of ∼300 million clicks on one search engine found that 91.5% of those clicks were on the first page of search results, with 32.5% on the first result and 17.6% on the second. The study also reported that the bottom item on the first page of results drew 140% more clicks than the first item on the second page. These phenomena occur apparently because people trust search engine companies to assign higher ranks to the results best suited to their needs, even though users generally have no idea how results get ranked.
That more or less describes what I did when I decided that I’d like to know more about their research, so no conflict with intuition there.
I’ve now read the whole paper. I do think they’re raising a serious concern. That said, there’s a limit to the amount of research I can do quickly. You see, when I search for more information about Robert Epstein, for example, I hit a wall very quickly:
Some results may have been removed under data protection law in Europe. Learn more
Just curious: What happens when those of you who aren’t in Europe search under these researchers’ names? Do you learn the same things I do? Do you draw similar conclusions?