New Data Storage Technologies To Keep Up With Internet Demands
It is estimated that there are nearly 30 billion web pages on the internet, holding about 1,000 exabytes of data. (An exabyte is one million terabytes; a terabyte is one thousand gigabytes.) A recent Cisco study found that nearly 10 exabytes of video are uploaded worldwide each month. The amount of information created and stored by consumers and businesses is doubling every two years. This presents unique challenges in how to retain and retrieve vast amounts of digital data.
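The unit conversions and the "doubling every two years" growth rate above can be checked with a short script. The 1,000-exabyte starting figure is the article's estimate, and the `projected_exabytes` helper is illustrative, not from any cited study:

```python
# Decimal (SI) storage units: 1 kB = 1000 bytes.
GIGABYTE = 10**9
TERABYTE = 10**12   # one thousand gigabytes
EXABYTE = 10**18    # one million terabytes

assert EXABYTE // TERABYTE == 1_000_000
assert TERABYTE // GIGABYTE == 1_000

def projected_exabytes(start_eb, years, doubling_period=2):
    """Project total data if volume doubles every `doubling_period` years."""
    return start_eb * 2 ** (years / doubling_period)

# Starting from 1,000 EB, ten years is five doublings:
print(projected_exabytes(1000, 10))  # 32000.0
```

Five doublings in a decade turns 1,000 exabytes into 32,000, which is why storage capacity and not just access speed is under pressure.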
Moore’s law – the observation that the number of transistors on an integrated circuit doubles approximately every two years – is barely keeping pace with the amount of information being created. Hard disk drives (HDDs) have increased in capacity nearly 20-fold over the past ten years, and drives today can store about 3 terabytes. However, they are relatively slow compared to flash-based solid state drives (SSDs). The tradeoff is that SSDs are nearly 10 times more expensive than HDDs and are limited to about 250GB. Intel is already building chips with 100 million transistors in an area smaller than this “0”. Its goal is to create electronic structures at the level of individual molecules.
Researchers are continually looking for new ways to ensure that ever-increasing amounts of data can be stored reliably, retrieved quickly, and kept as close to free as possible. One relatively new area involves biological storage in DNA strands (see associated article – Biological Data Storage). Another focuses on quantum mechanical effects, in which a single electron is used to store a binary one or zero.
A third method concentrates on structures called nanotubes: carbon cylinders less than a nanometer in diameter, thousands of times thinner than a human hair.
Yet another technology gaining attention is the memory resistor, or memristor. It was first theorized in the 1970s but was not achievable at mass scale at that time. Memristors exploit the relationship between electrical charge and magnetic flux to store and retrieve a single bit of data.
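For readers who want the underlying relationship, the standard textbook statement of memristance (from Leon Chua's 1971 formulation, not spelled out in the article) defines it as the rate of change of flux with charge, giving a resistance that depends on the charge that has flowed through the device:

```latex
M(q) = \frac{\mathrm{d}\varphi}{\mathrm{d}q},
\qquad
v(t) = M\bigl(q(t)\bigr)\, i(t)
```

Because $M$ depends on the accumulated charge $q$, the device "remembers" past current even when powered off, which is what makes it attractive for non-volatile storage.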
All of these techniques are being investigated to meet the twin requirements of storing more data and recalling it quickly. In some applications, such as streaming music or digital photography, speed is critical; in other cases, the amount of available storage is key. Some researchers are working to increase storage density, while others are looking to decrease the time needed to access the information. A parallel effort is underway in developing software that can predict what data may be requested next and place it in a prioritized location.
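The predictive-placement idea in the last sentence can be sketched as a toy prefetcher: it tracks which item most often follows each item, and stages the predicted successor in a notional "fast tier". All class and variable names here are illustrative, not a real product or API:

```python
from collections import Counter, defaultdict

class SequentialPrefetcher:
    """Toy predictor: remembers which key most often follows each key,
    and keeps the predicted next key staged in a small fast tier."""

    def __init__(self):
        self.followers = defaultdict(Counter)  # key -> counts of successor keys
        self.last = None
        self.fast_tier = set()                 # keys staged for quick access

    def access(self, key):
        # Record that `key` followed the previous access.
        if self.last is not None:
            self.followers[self.last][key] += 1
        self.last = key
        # Stage the most frequently observed successor of `key`, if any.
        if self.followers[key]:
            predicted = self.followers[key].most_common(1)[0][0]
            self.fast_tier = {predicted}
        else:
            self.fast_tier = set()

prefetcher = SequentialPrefetcher()
for k in ["song_a", "song_b", "song_a", "song_b", "song_a"]:
    prefetcher.access(k)
print(prefetcher.fast_tier)  # {'song_b'}: song_b has always followed song_a
```

Real systems use far richer signals (access recency, file layout, user behavior), but the principle is the same: move the data you expect to need next into the fastest available storage before it is requested.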
It is certain that the demand for data storage will not decrease, and requests for faster access will grow along with the amount of available information. A combination of the techniques above should keep pace.
If you found this article interesting and informative, please be sure to sign up for our weekly e-newsletter as well as daily email / RSS feeds at SourceTech411.