Could We Start Trusting the Internet Again?
On the Internet, no one knows whether you ever ate at that restaurant or not. Or stayed in that hotel. Or read that book. But that doesn't stop people from writing “reviews” online.
As untrustworthy as the Net may seem sometimes, there are some hopeful signs that maybe, just maybe, the era of fake reviews may be coming to an end.
Right now, the fake review problem is endemic. You'll find them on Yelp. You'll find them on TripAdvisor. You'll find them on Amazon. Every time you click on a “reviews” button at a retail outlet, on a corporate site, or especially before you download a mobile app, up pop critiques of products and services that at best are skewed and at worst are downright deceptions. It's a problem with what Internet firms call user-generated content, a wonderfully blissful business model that takes freely supplied “information” or “reviews” from anonymous folks online and then peddles it back to us, along with ads, of course.
This information isn't really free, or often very useful, because, well, not everyone who has an opinion about curry is an Indian food gourmet. But the main problem is that people will always game the system. Even in the days of CompuServe, communications and PR professionals scoured the bulletin boards, sussing out critical comments and emphasizing positive remarks about the companies they worked for. Conversely, businesses have been accused of writing digital poison-pen reviews in an attempt to undermine a competing business.
Then there's the out-and-out fraud. You'll find classified ads offering money (usually less than $10) to anyone who will post a positive review on a popular crowd-sourced site. Add to this the “incentive” programs: At least one hotel in England offered discounts for positive reviews.
Finally, there's the friends and family problem. Who hasn't been asked by a friend or acquaintance to post a positive review on Amazon to help out with their new e-book or novel? Hey, we're not being dishonest; we're just being, er, supportive.
Most reviews are anonymous, making it impossible to tell who wrote them or whether they're reliable. Some writeups are obviously scams. (Could that cheap hotel really be better than the Waldorf at half the price? Come on.) Of course, sites like Yelp and TripAdvisor try to cull out the suspicious reviews, much in the same way that sites block posts that contain expletives. But it's an uphill battle, just like trying to block spam. Researchers at Cornell recently published a paper describing how their software could do a better job than human readers at uncovering fake or planted reviews, but for decades researchers have also been saying they could eliminate computer viruses, and look how successful those efforts have been.
But all may not be lost. There are some signs that people are beginning to notice the extent of the problem and moving to models that use professionals and vet reviews more carefully. Blogs, for example, were once touted as the harbingers of citizen journalism and fiercely independent reviews. But blogs in this idealized, independent sense have faded away, supplanted by “blogs” from official news organizations that are really just columns by professionals masquerading as off-the-cuff posts. Turns out most people didn't read most blogs, and the authors found it easier to jot down 140 characters on Twitter than to stay up all night writing a witty, diaristic post that no one was going to read anyway.
Furthermore, the biggest company online, Google, founded on the idea that information wants to be free and predicated on the concept that the masses know best, has been increasingly moving away from that idea, whether the company realizes it or not. Its search algorithms still follow what we do and link to online and then rank results according to the crowd. But Google's most advanced products, such as Street View, are increasingly professionally crafted to the core. Street View is a stunning service that puts you, virtually speaking, right on the ground, so that you can walk the avenues of New York City or Prague before you leave home. Google's photo trucks piece the globe together, but imagine what a crowd-sourced version of Street View would look like. You'd get a billion views of the Empire State Building and none of the ATM down the street.
The company's latest purchase also points toward a curated and vetted future rather than a crowd-sourced utopia. Last week, Google announced it was acquiring the venerable Zagat guides. Even though Marissa Mayer, vice president of local, maps, and location services at Google, wrote in a blog post (or should we call it a press release) that Zagat's “surveys may be one of the earliest forms of UGC (user-generated content),” that's not quite accurate. Zagat isn't a pure content aggregator or crowd-sourcing venture. It collects surveys from specific consumers, then culls and distills them to create a coherent review. If Google hews to the established methods and applies its considerable resources to expand beyond the 100 countries and 350,000 annual surveys Zagat currently does, it could be a real relief from crowd-sourced “reviews.” (Incidentally, Google did try to buy Yelp a couple of years ago.)
Ultimately, I'm not saying that all the information people put online is suspect and worthless. I'm indebted to one anonymous blogger who put detailed instructions online on how to fix a common problem on my truck (right down to what hex wrenches to use). But most of us are tired of skimming through dozens of restaurant or app reviews we know we can't trust. We're simply tired of getting fooled again.