A Comprehensive List of Big Data Statistics

This article was originally posted on Wikibon. Here I have selected a few of the dozens of statistics. Enjoy the read, and visit the original article, which also features a nice infographic on big data. It would be interesting to add statistics about sensor data, or data used in engineering (NASA, etc.). For instance, how many data points are used to make weather forecasts? How many synthetic molecules are simulated each year, using computational chemistry models, to create new drugs? On how many videos (from security cameras) does each person in the US show up each year?

Big Data in Today’s Business and Technology Environment

  • 2.7 Zettabytes of data exist in the digital universe today. (Source)
  • 235 Terabytes of data had been collected by the U.S. Library of Congress as of April 2011. (Source)
  • The Obama administration is investing $200 million in big data research projects. (Source)
  • IDC estimates that by 2020, business transactions on the internet – business-to-business and business-to-consumer – will reach 450 billion per day. (Source)
  • Facebook stores, accesses, and analyzes 30+ Petabytes of user-generated data. (Source)
  • Akamai analyzes 75 million events per day to better target advertisements. (Source)
  • 94% of Hadoop users perform analytics on large volumes of data not possible before; 88% analyze data in greater detail; while 82% can now retain more of their data. (Source)
  • Walmart handles more than 1 million customer transactions every hour, which are imported into databases estimated to contain more than 2.5 petabytes of data. (Source)
  • Decoding the human genome originally took 10 years to process; now it can be achieved in one week. (Source)
  • In 2008, Google was processing 20,000 terabytes of data (20 petabytes) a day. (Source)
  • AT&T’s largest database holds two titles: the largest volume of data in a single database (312 terabytes) and the second-largest number of rows in a single database (1.9 trillion), comprising AT&T’s extensive calling records. (Source)

The Rapid Growth of Unstructured Data

  • YouTube users upload 48 hours of new video every minute of the day. (Source)
  • 571 new websites are created every minute of the day. (Source)
  • Brands and organizations on Facebook receive 34,722 Likes every minute of the day. (Source)
  • 100 terabytes of data are uploaded to Facebook daily. (Source)
  • According to Twitter’s own research in early 2012, it sees roughly 175 million tweets every day, and has more than 465 million accounts. (Source)
  • 30 billion pieces of content are shared on Facebook every month. (Source)
  • Data production will be 44 times greater in 2020 than it was in 2009. (Source)
  • In late 2011, IDC Digital Universe published a report indicating that some 1.8 zettabytes of data would be created that year. (Source)
    In other words, the amount of data in the world today is equal to:
    • Every person in the US tweeting three tweets per minute for 26,976 years.
    • Every person in the world having more than 215m high-resolution MRI scans a day.
    • More than 200bn HD movies – which would take a person 47m years to watch.
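The last equivalence is easy to sanity-check. Assuming roughly 9 GB and two hours of runtime per HD movie (figures I am supplying for illustration; they are not in the source), the numbers line up:

```python
# Back-of-the-envelope check of the 1.8-zettabyte equivalences above.
# Assumed figures (not from the source): ~9 GB and ~2 hours per HD movie.
ZETTABYTE = 1e21                      # bytes
total_bytes = 1.8 * ZETTABYTE

movies = total_bytes / 9e9            # number of ~9 GB HD movies
print(f"{movies:.0e} movies")         # ~2e+11, i.e. 200 billion

years_to_watch = movies * 2 / (24 * 365)   # 2 hours each, watched nonstop
print(f"{years_to_watch / 1e6:.0f} million years")   # ~46 million years
```

At roughly 46 million viewing-years, the estimate lands close to the article's 47m figure.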

Big Data & Real Business Issues

  • According to estimates, the volume of business data worldwide, across all companies, doubles every 1.2 years. (Source)
  • Poor data can cost businesses 20%–35% of their operating revenue. (Source)
  • Bad data or poor data quality costs US businesses $600 billion annually. (Source)
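The 1.2-year doubling estimate can be translated into an annual growth rate. This is a minimal sketch of that arithmetic, nothing more:

```python
# Convert "doubles every 1.2 years" into an annual growth factor.
doubling_time = 1.2                        # years, per the estimate above
annual_factor = 2 ** (1 / doubling_time)   # ≈ 1.78, i.e. ~78% per year
print(f"{(annual_factor - 1) * 100:.0f}% growth per year")

# Compounded over a decade, that is 2**(10/1.2), roughly a 320x increase.
print(f"{annual_factor ** 10:.0f}x in ten years")
```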
