

IBM to tear down Moore's Law

13 January 2012


Cuts bit size down to 12 atoms

IBM announced on Thursday that its boffins have managed to shrink the physical requirements for a bit of data, cutting the number of atoms needed from a million down to just 12.

It goes without saying that this means higher density and more storage space. Indeed, 1TB drives would quickly become old news, with 100TB or 150TB drives becoming commonplace.
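As a rough back-of-the-envelope illustration of that claim (the 100x and 150x multipliers are the article's figures, and the assumption that a density gain translates one-to-one into drive capacity is a simplification), the scaling looks like this:

```python
# Back-of-the-envelope capacity scaling. Assumes the density gain maps
# directly onto drive capacity, ignoring real-world overheads.
def scaled_capacity_tb(current_tb: float, density_factor: float) -> float:
    return current_tb * density_factor

for factor in (100, 150):
    print(f"1 TB drive at {factor}x density -> {scaled_capacity_tb(1, factor):.0f} TB")
```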

For its research, IBM used antiferromagnetism to achieve 100 times denser memory. Antiferromagnetism refers to an ordering in which the magnetic moments of neighbouring atoms or molecules align with their spins pointing in opposite directions. Note that current devices use ferromagnetic materials.
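A toy sketch of the difference (this is only an illustration of the two orderings, not a model of IBM's actual experiment): in a ferromagnetic bit all spins point the same way and produce a large net moment, while in an antiferromagnetic bit the moments cancel, which is part of why the bits can be packed so much more tightly.

```python
# Each spin is +1 (up) or -1 (down); the net magnetic moment is just the sum.
ferromagnetic     = [+1] * 12                       # all 12 spins aligned
antiferromagnetic = [(-1) ** i for i in range(12)]  # neighbours point opposite ways

print(sum(ferromagnetic))      # 12: large net moment that can disturb neighbouring bits
print(sum(antiferromagnetic))  # 0: moments cancel, so bits tolerate much closer packing
```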

Antiferromagnetism is of course quite tricky: exceeding a certain temperature, called the Néel temperature, forces the bit size well beyond 12 atoms. Thankfully, even then it remains much better than what current technology offers.

For its experiments, IBM used iron atoms on copper nitride. However, it is said that other materials could in theory do even better, i.e. use fewer atoms per bit.

IBM researcher Andreas Heinrich said: "Moore's Law is basically the drive of the industry to shrink components down little by little and then solve the engineering challenges that go along with that but keeping the basic concepts the same. The basic concepts of magnetic data storage or even transistors haven't really changed over the past 20 years (…) The ultimate end of Moore's Law is a single atom. That's where we come in."

More here.

