When big isn’t better: How the flu bug bit Google

Share
Posted March 17, 2014

Numbers and data can be critical tools in bringing complex issues into a crisp focus. The understanding of diseases, for example, benefits from algorithms that help monitor their spread. But without context, a number may just be a number, or worse, a misleading number.

“The Parable of Google Flu: Traps in Big Data Analysis” is published in the journal Science and was funded, in part, by a grant from the National Science Foundation. The authors examine Google’s data-aggregating tool Google Flu Trends (GFT), which was designed to provide real-time monitoring of flu cases around the world based on Google searches that matched terms for flu-related activity.
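At its core, GFT fit a statistical relationship between how often people searched for flu-related terms and the rate of influenza-like illness (ILI) reported through traditional surveillance. The sketch below illustrates that kind of model; the logit-on-logit regression form follows the originally published GFT description, but the weekly numbers are invented here purely for illustration.

```python
import numpy as np

# GFT-style nowcasting sketch: regress the log-odds of the ILI rate on the
# log-odds of the flu-related share of search queries. The weekly series
# below are synthetic placeholders, not real GFT or CDC data.

def logit(p):
    return np.log(p / (1.0 - p))

query_share = np.array([0.010, 0.014, 0.022, 0.035, 0.028, 0.016])  # flu queries / all queries
ili_rate = np.array([0.012, 0.017, 0.026, 0.041, 0.033, 0.019])     # fraction of doctor visits for ILI

# Ordinary least squares on the logit scale: logit(ili) ~ b0 + b1 * logit(queries)
X = np.column_stack([np.ones_like(query_share), logit(query_share)])
b0, b1 = np.linalg.lstsq(X, logit(ili_rate), rcond=None)[0]

# "Nowcast" the ILI rate for a new week from the current query share alone
new_share = 0.030
nowcast = 1.0 / (1.0 + np.exp(-(b0 + b1 * logit(new_share))))
print(f"estimated ILI rate this week: {nowcast:.2%}")
```

The appeal, and the risk, is that the model sees only search behavior; anything that shifts searching without shifting illness, such as media coverage or changes to Google’s own search service, feeds directly into the estimate.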

“Google Flu Trends is an amazing piece of engineering and a very useful tool, but it also illustrates where ‘big data’ analysis can go wrong,” said Ryan Kennedy, University of Houston political science professor. He and co-researchers David Lazer (Northeastern University/Harvard University), Alex Vespignani (Northeastern University) and Gary King (Harvard University) detail new research about the problematic use of big data from aggregators such as Google.

Even with modifications to GFT over many years, the tool that set out to improve the response to flu outbreaks has overestimated peak flu cases in the U.S. over the past two years.

“Many sources of ‘big data’ come from private companies, who, just like Google, are constantly changing their service in accordance with their business model,” said Kennedy, who also teaches research methods and statistics for political scientists. “We need a better understanding of how this affects the data they produce. Otherwise we run the risk of drawing incorrect conclusions and adopting improper policies.”

GFT overestimated the prevalence of flu in the 2012-2013 season, as well as actual flu levels in 2011-2012, by more than 50 percent, according to the research. Additionally, from August 2011 to September 2013, GFT over-predicted the prevalence of flu in 100 out of 108 weeks.
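Error figures like these are straightforward to compute once weekly GFT estimates are lined up against the rates reported by the CDC. A minimal sketch, using invented weekly numbers in place of the real series:

```python
import numpy as np

# Hypothetical weekly series: search-based nowcasts vs. officially reported
# ILI rates (percent of doctor visits). All numbers are made up.
gft = np.array([2.4, 3.1, 4.8, 5.6, 4.2, 2.9])
cdc = np.array([1.5, 2.0, 3.0, 3.5, 2.8, 2.0])

# How often did the nowcast exceed the reported rate?
over_weeks = int(np.sum(gft > cdc))
print(f"over-predicted in {over_weeks} of {len(gft)} weeks")

# Relative error at the season's peak week
peak = np.argmax(cdc)
print(f"peak-week overestimate: {(gft[peak] - cdc[peak]) / cdc[peak]:.0%}")
```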

The team also questions data collected from platforms such as Twitter and Facebook for purposes like tracking polling trends and market popularity, since campaigns and companies can manipulate these platforms to ensure their products are trending. Still, the article contends there is room for data from the Googles and Twitters of the Internet to be combined with more traditional methodologies, in the name of creating a deeper and more accurate understanding of human behavior.

“Our analysis of Google Flu demonstrates that the best results come from combining information and techniques from both sources,” Kennedy said. “Instead of talking about a ‘big data revolution,’ we should be discussing an ‘all data revolution,’ where new technologies and techniques allow us to do more and better analysis of all kinds.”
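One simple version of that combination is a regression that predicts this week’s reported ILI rate from both the previous week’s official report (traditional data, available with a lag) and the current search-based signal. This two-predictor linear model is an assumption chosen for illustration, not the authors’ exact specification, and the numbers are again synthetic.

```python
import numpy as np

# Combine lagged official reports with a current search-based signal.
cdc = np.array([1.5, 2.0, 3.0, 3.5, 2.8, 2.0, 1.6])  # weekly ILI %, reported with a lag
gft = np.array([2.4, 3.1, 4.8, 5.6, 4.2, 2.9, 2.2])  # same weeks, search-based nowcast

# Design matrix: intercept, CDC rate lagged one week, current GFT signal
X = np.column_stack([np.ones(len(cdc) - 1), cdc[:-1], gft[1:]])
y = cdc[1:]
coef = np.linalg.lstsq(X, y, rcond=None)[0]

# Nowcast the most recent week from last week's report and this week's searches
print(f"combined nowcast: {X[-1] @ coef:.2f}% ILI (reported: {y[-1]}%)")
```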

Source: UH
