|
{{Update|type=article|date=October 2019}}{{short description|Multiple of the unit byte}}
|
|
{{Use dmy dates|date=January 2015}}
|
|
{{Quantities of bytes}}
|
|
A '''petabyte''' is 10<sup>15</sup> ]s of ]. The unit symbol for the petabyte is '''PB'''.
|
|
|
|
|
|
|
The name is composed of the ] ] (P) and the non-] unit byte.

:1 PB = {{gaps|1|000|000|000|000|000|B}} = {{gaps|10<sup>15</sup>|bytes}} = {{gaps|1|000|]s}}

:1000 PB = 1 ] (EB)
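The decimal relationships above can be sanity-checked with a short Python sketch (illustrative only; the constant names are ad hoc, not from any standard library):

```python
# Decimal (SI) multiples of the byte, as used in this article.
TB = 10**12   # terabyte
PB = 10**15   # petabyte
EB = 10**18   # exabyte

# 1 PB = 1 000 000 000 000 000 bytes = 1000 terabytes
assert PB == 1_000_000_000_000_000
assert PB == 1000 * TB

# 1000 PB = 1 exabyte
assert 1000 * PB == EB
```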
|
|
|
|
A related unit, the ] (PiB), using a ], is equal to 1024<sup>5</sup> bytes, which is more than 12% greater (2<sup>50</sup> ]s = {{gaps|1|125|899|906|842|624|bytes}}).
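The roughly 12.6% gap between the binary pebibyte and the decimal petabyte can likewise be checked in a few lines of Python (an illustrative sketch, not part of any standard):

```python
PB = 10**15        # petabyte (decimal, SI)
PiB = 2**50        # pebibyte (binary prefix); equals 1024**5

assert PiB == 1024**5 == 1_125_899_906_842_624

# The pebibyte exceeds the petabyte by a bit more than 12%.
excess = (PiB - PB) / PB
assert 0.12 < excess < 0.13
```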
|
|
|
|
|
==Usage examples==

Examples of the use of the petabyte to describe data sizes in different fields are:
|
|
<!-- To avoid an infinitely long list, limited to one example per field-->
|
|
* The ] indexer ] seeder archive is about 2.5 petabytes in size.
|
|
* Telecommunications (capacity): The world's effective capacity to exchange information through two-way ] networks was 281 petabytes of information in 1986, 471 petabytes in 1993, 2,200 petabytes in 2000, and 65,000 petabytes in 2007 (the informational equivalent of every person exchanging 6 newspapers per day).<ref name="HilbertLopez2011">Martin Hilbert and Priscila López (2011), ], 332(6025), 60–65.</ref>
|
|
* Telecommunications (usage): In 2008, ] transferred about 30 petabytes of data through its networks each day.<ref>{{cite web|url=http://www.att.com/gen/press-room?pid=4800&cdvn=news&newsarticleid=30623 |title=AT&T- News Room |publisher=Att.com |date=23 October 2008 |accessdate=16 August 2009}}</ref> That number grew to 197 petabytes daily by March 2018.<ref>{{cite web |url=https://theintercept.com/2018/06/25/att-internet-nsa-spy-hubs/ |quote=As of March 2018, some 197 petabytes of data – the equivalent of more than 49 trillion pages of text, or 60 billion average-sized mp3 files – traveled across its networks every business day. |title=The NSA's Hidden Spy Hubs in Eight U.S. Cities |website=] |first=Ryan |last=Gallagher |first2=Henrik |last2=Moltke |date=June 25, 2018}}</ref>
|
|
* Internet: ] processed about 24 petabytes of data per day in 2009.<ref>{{cite web|url=http://portal.acm.org/citation.cfm?doid=1327452.1327492 |title=MapReduce |publisher=Portal.acm.org |accessdate=16 August 2009}}</ref> The ]'s ] is reported to have transferred up to 7 petabytes each month in 2010.<ref>{{cite web |url=http://crave.cnet.co.uk/software/iplayer-uncovered-what-powers-the-bbcs-epic-creation-49302215/ |title=Article |publisher=CNET UK |accessdate=11 January 2010 |archive-url=https://web.archive.org/web/20110615225805/http://crave.cnet.co.uk/software/iplayer-uncovered-what-powers-the-bbcs-epic-creation-49302215/ |archive-date=15 June 2011 |url-status=dead }}</ref> In 2012, ] transferred about 4 petabytes of data per month.<ref>{{cite web|url=https://www.reddit.com/r/IAmA/comments/y81ju/i_created_imgur_ama/ |title=I created Imgur. AMA. |publisher=Alan Schaaf |accessdate=15 August 2012}}</ref>
|
|
* Supercomputers: In January 2012, Cray began construction of the ], which has "up to 500 petabytes of tape storage".<ref>{{cite web|url=http://www.ncsa.illinois.edu/enabling/bluewaters|title=About Blue Waters}}</ref>
|
|
* Data storage system: In August 2011, IBM was reported to have built the largest storage array ever, with a capacity of 120 petabytes.<ref>{{cite news|url=http://www.technologyreview.com/computing/38440/|title=IBM Builds Biggest Data Drive Ever|last=Simonite|first=Tom|date=25 August 2011|work=Technology Review|accessdate=18 October 2011}}</ref>
|
|
* Digital archives: The ] surpassed 15 petabytes {{as of|2014|05|lc=on}}.<ref>{{cite news|last=Brownell|first=Brett|title=Meet the People Behind the Wayback Machine, One of Our Favorite Things About the Internet|url=https://www.motherjones.com/media/2014/05/internet-archive-wayback-machine-brewster-kahle|accessdate=29 May 2014|newspaper=Mother Jones|date=22 May 2014}}</ref>
|
|
* Email: In May 2013, ] announced that, as part of its migration of Hotmail accounts to the new Outlook.com email service, it had migrated over 150 petabytes of user data in six weeks.<ref>{{cite web|url=http://blogs.office.com/b/microsoft-outlook/archive/2013/05/02/outlook-com-400-million-active-accounts-hotmail-upgrade-complete-and-more-features-on-the-way.aspx|title=Outlook.com: 400 million active accounts, Hotmail upgrade complete and more features on the way}}</ref>
|
|
* File sharing (centralized): At its 2012 closure of file storage services, ] held about 28 petabytes of user-uploaded data.<ref>{{cite news|url=http://tech.wp.pl/kat,1009785,title,Byc-moze-odzyskasz-swoje-pliki-z-Megaupload,wid,14990730,wiadomosc.html |title=Być może odzyskasz swoje pliki z Megaupload - Tech - WP.PL |newspaper=Tech |accessdate=14 April 2013}}</ref>
|
|
* File sharing (]): By November 2013, BitTorrent Sync had transferred over 30 petabytes of data since its pre-alpha release in January 2013.<ref name="nofilmschool.com">{{cite web|url=http://nofilmschool.com/2013/11/bittorrent-sync-1-million-users-version-1-2-free-file-syncing/|title=Version 1.2 of BitTorrent Sync Now Available as Free File Syncing Tool Reaches 1 Million Users|date=6 November 2013|accessdate=19 February 2018}}</ref>
|
|
* National library: The ] digital archive of public domain resources hosted by the United States ] contained 15 million digital objects in 2016, comprising over 7 petabytes of digital data.<ref name="loc">{{Cite web|url=https://nplusonemag.com/online-only/online-only/the-library-of-last-resort/|title=The Library of Last Resort|last=Chayka|first=Kyle|date=2016-07-14|website=|publisher=n+1 Magazine|language=en-US|access-date=2016-07-19}}</ref>
|
|
* Film: The 2009 film ] is reported to have taken over 1 petabyte of local storage at ] for the rendering of the 3D CGI effects.<ref>{{cite web|url=https://thenextweb.com/2010/01/01/avatar-takes-1-petabyte-storage-space-equivalent-32-year-long-mp3/ |title=Believe it or not: Avatar takes 1 petabyte of storage space |publisher=Thenextweb.com |date=1 January 2010 |first=Zee|last=Kane|accessdate=14 January 2010}}</ref><ref>{{cite web|url=http://www.information-management.com/newsletters/avatar_data_processing-10016774-1.html |title=Processing AVATAR |publisher=Information-management.com |date=21 December 2009 |first=Jim|last=Ericson|accessdate=14 January 2010}}</ref>
|
|
* Video streaming: {{As of|2013|05}}, ] had 3.14 petabytes of video "master copies", which it compresses and converts into 100 different formats for streaming.<ref>{{cite news|last=Vance|first=Ashlee|title=Netflix, Reed Hastings Survive Missteps to Join Silicon Valley's Elite|url=http://www.businessweek.com/articles/2013-05-09/netflix-reed-hastings-survive-missteps-to-join-silicon-valleys-elite#p4|accessdate=22 May 2014|newspaper=Businessweek|date=9 May 2013}}</ref>
|
|
* Photos: {{As of|2013|01}}, ] users had uploaded over 240 billion photos,<ref>{{cite web|last=Miller|first=Rich|title=Facebook Builds Exabyte Data Centers for Cold Storage|url=http://www.datacenterknowledge.com/archives/2013/01/18/facebook-builds-new-data-centers-for-cold-storage/ |publisher=Datacenterknowledge.com|accessdate=21 May 2014}}</ref> with 350 million new photos every day. For each uploaded photo, Facebook generates and stores four images of different sizes, which translated to a total of 960 billion images and an estimated 357 petabytes of storage.<ref>{{cite web|last=Leung|first=Leo|title=How much data does x store?|url=http://techexpectations.org/2014/05/17/hovsdaDSqwrmwqwfEqw-much-data-does-x-store/|publisher=Techexpectations.org|accessdate=21 May 2014}}{{deadlink|date=November 2018}}</ref>
|
|
* Music: One petabyte of average ]-encoded songs (for mobile, roughly one megabyte per minute) would require 2000 years to play.<ref name="computerweekly.com">{{cite web|url=http://www.computerweekly.com/feature/What-does-a-petabyte-look-like|title=What does a petabyte look like?|accessdate=19 February 2018}}</ref>
|
|
* ], a digital distribution service, delivers over 16 petabytes of content to American users weekly.<ref>{{cite magazine|title=Steam ISP stats lay Australia's dire internet connectivity bare|url=http://www.pcgamer.com/steam-isp-stats-lay-australias-dire-internet-connectivity-bare/|magazine=PC Gamer}}</ref>
|
|
* Physics: The ] in the ] produce about 15 petabytes of data per year, which are distributed over the ].<ref>{{cite web|url=http://www.interactions.org/cms/?pid=1027032 |title=3 October 2008 - CERN: Let the number-crunching begin: the Worldwide LHC Computing Grid celebrates first data |publisher=Interactions.org |accessdate=16 August 2009}}</ref> In July 2012 it was revealed that ] had amassed about 200 petabytes of data from more than 800 trillion collisions in its search for the ].<ref>{{cite news| url=http://www.itbusinessedge.com/cm/blogs/lawson/the-big-data-software-problem-behind-cerns-higgs-boson-hunt/?cs=50736|title=Big Data Software Problem Behind CERN's Higgs Boson Hunt}}</ref> The Large Hadron Collider is also able to produce 1 petabyte of data per second, but most of it is filtered out.<ref>{{cite web|url=http://home.cern/about/updates/2017/07/cern-data-centre-passes-200-petabyte-milestone|title=CERN Data Centre passes the 200-petabyte milestone|publisher=CERN|accessdate=6 July 2017}}</ref>
|
|
* Neurology: It is estimated that the ]'s ability to store memories is equivalent to about 2.5 petabytes of binary data.<ref>{{cite magazine|last=Reber |first=Paul |url=http://www.scientificamerican.com/article.cfm?id=what-is-the-memory-capacity |title=What Is the Memory Capacity of the Human Brain? |magazine=Scientific American |date=2 April 2013 |accessdate=14 April 2013}}</ref><ref>{{Cite news|url=http://www.slate.com/articles/health_and_science/explainer/2012/04/north_korea_s_2_mb_of_knowledge_taunt_how_many_megabytes_does_the_human_brain_hold_.html|title=Your Brain's Technical Specs|last=Wickman|first=Forrest|date=2012-04-24|work=Slate|access-date=2017-03-31|language=en-US|issn=1091-2339}}</ref>
|
|
* Climate science: The ] (DKRZ) has a storage capacity of 60 petabytes of climate data.<ref>{{cite web|url=http://www.treehugger.com/files/2009/12/meet-the-worlds-most-powerful-weather-supercomputer.php|title=Meet the World's Most Powerful Weather Supercomputer|accessdate=19 February 2018}}</ref>
|
|
* Sports: A petabyte of data stored on one-inch-long 1 GB flash drives, laid end to end, would stretch over 92 football fields.<ref name=":0">{{Cite web|url=https://info.cobaltiron.com/blog/petabyte-how-much-information-could-it-actually-hold|title=Petabyte - How Much Information Could it Actually Hold?|last=Spurlock|first=Richard|website=info.cobaltiron.com|language=en|access-date=2019-11-04}}</ref>
|
|
* ] holds around half a ] of digitised data (as of ]), stored in ] (]) on two SpectraLogic T950 tape libraries sited 500 m apart; one is an LTO-5 (]) tape library, the other an IBM ].<ref>{{cite web|url=https://www.computerweekly.com/news/450413621/IWM-digitises-vast-collection-in-SpectraLogic-tape-archive|title=IWM digitises vast collection in SpectraLogic tape archive|website=ComputerWeekly}}</ref>
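The back-of-the-envelope arithmetic behind the "Music" entry's 2000-year figure can be sketched in Python (assumes the article's rough rate of one megabyte of MP3 audio per minute; illustrative only):

```python
PB = 10**15                 # bytes in a petabyte
bytes_per_minute = 10**6    # ~1 MB of MP3 audio per minute (mobile quality)

minutes = PB / bytes_per_minute       # 1e9 minutes of playback
years = minutes / 60 / 24 / 365.25    # minutes -> hours -> days -> years
assert 1900 < years < 1905            # about 1900 years, i.e. roughly 2000
```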
|
|
|
|
|
==References==

{{Reflist}}
|
|
|
|
|
{{Computer Storage Volumes}}
|
|
|
|
|
|
]

]