IDC determined that the world generated 161 billion gigabytes (161 exabytes) of digital information last year, and forecasts a staggering 988 billion gigabytes (988 exabytes) of digital information created in 2010. It is an amount of information that cannot really be imagined, not even with stacks of books or iPods. The previous best estimate came from researchers at the University of California, Berkeley, who put the globe's information production at 5 exabytes in 2003. Data-storage company EMC Corp., a sponsor of the report, commissioned IDC's new look.
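The figures imply a steep growth curve. As a rough back-of-the-envelope check (taking the 161 exabytes as the 2006 total and the 988 exabytes as the 2010 forecast, so four years of growth), the implied compound annual growth rate can be sketched like this:

```python
# Rough sketch: implied compound annual growth rate of the digital universe,
# assuming 161 EB refers to 2006 and 988 EB to 2010 (four years apart).
start_eb, end_eb, years = 161, 988, 4
rate = (end_eb / start_eb) ** (1 / years)
print(f"~{(rate - 1) * 100:.0f}% growth per year")
```

That works out to well over 50 percent growth per year, which is why the method dispute below hardly changes the picture.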
Of course there was a fight over the methods immediately. The Berkeley researchers had taken a different approach: they also counted non-electronic information, such as analogue radio broadcasts or printed office memos, and tallied how much space that would consume if digitised. And they examined original data only, not all the times things got copied. The IDC numbers, on the other hand, ballooned because content was counted both as it was created and as it was reproduced; a digital TV file, for example, was counted when it was made and again every time it landed on a screen. Had IDC tracked original data only, its result would have been 40 exabytes.
But whichever method you take, it does not really matter: the amount is growing very fast. When our first grandsons were born I nicknamed them the Tera Kids, boys who will have a terabyte of digital information by the time they are eighteen years old. They will have built up a personal file of documents, forms, music, movies and medical scans. This file will be scattered all over the place, from hospitals to schools to the computers of their providers. How will they keep track of it all and access it?
Another big problem is that information gets copied roughly three times, and that looks to me like a conservative estimate. Look at the many newspaper and broadcast sites that copy the news items of wire services like Reuters and AP, often without adding another word or observation. These news sites have to be complete to attract traffic, and if they do not attract traffic they will not get any revenue from advertisements; just linking to the wire services is not good enough. A deduplication service would be a solution: keep only the original source and delete all the copies.
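Such a deduplication service could, in its simplest form, key each item by a hash of its content and keep only the first copy it sees. A minimal sketch (the `dedupe` function and the sample wire stories are my own illustration, not anything from the IDC report):

```python
import hashlib

def dedupe(items):
    """Keep only the first occurrence of each item, keyed by a content hash."""
    seen = set()
    originals = []
    for item in items:
        digest = hashlib.sha256(item.encode("utf-8")).hexdigest()
        if digest not in seen:
            seen.add(digest)
            originals.append(item)
    return originals

wire_copy = "Reuters: markets closed higher today."
stories = [wire_copy, wire_copy, "Local paper adds its own analysis.", wire_copy]
print(dedupe(stories))  # only the two unique stories remain
```

Real-world deduplication is harder, of course: sites rarely copy a wire item byte-for-byte, so in practice one would need near-duplicate detection rather than exact hashing.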
But it is not really the text items that are driving up digital storage. Music, photographs and video are far more byte-intensive, so we will need compression, storage and distribution tools for them. Undoubtedly people are working on these algorithms. But who will come up with an impressive solution, and when?
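The gap between text and media is easy to demonstrate: general-purpose compressors shrink repetitive text dramatically, but media files are usually already compressed and behave like random bytes, which barely shrink at all. A small sketch using Python's standard `zlib` (the random bytes here merely stand in for already-compressed media):

```python
import os
import zlib

# Repetitive text compresses extremely well...
text = b"the quick brown fox jumps over the lazy dog " * 200
# ...while random bytes (a stand-in for already-compressed media) do not.
noise = os.urandom(len(text))

print(len(text), "->", len(zlib.compress(text)))
print(len(noise), "->", len(zlib.compress(noise)))
```

This is why the real storage pressure comes from photos, music and video: the easy compression wins have already been taken inside those formats.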

Blog Posting Number: 689


