IBM announced on Thursday that its boffins have slashed the physical requirements for storing a bit of data, cutting the number of atoms needed from roughly a million down to just 12.
It goes without saying that this means denser storage and more capacity. Should the technique make it out of the lab, 1TB drives would quickly become old news, with 100TB or 150TB models the common thing.
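Some quick back-of-the-envelope arithmetic (our sketch, not IBM's figures) shows just how dramatic the atoms-per-bit reduction is. Note it is far larger than the 100x density figure IBM quotes, presumably because real areal density depends on more than raw atom count:

```python
# Rough atoms-per-bit comparison (illustrative only, not IBM's numbers)
atoms_per_bit_old = 1_000_000   # conventional ferromagnetic bit, ~a million atoms
atoms_per_bit_new = 12          # IBM's antiferromagnetic demonstration

reduction = atoms_per_bit_old / atoms_per_bit_new
print(f"~{round(reduction):,}x fewer atoms per bit")  # ~83,333x fewer atoms per bit
```

The gap between that ~83,000x atom reduction and the quoted 100x density gain hints at how much engineering overhead sits between a lab demo and a shipping drive.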
For its research, IBM used antiferromagnetism to achieve memory 100 times denser. Antiferromagnetism describes an ordering in which the magnetic moments of neighboring atoms or molecules align with their spins pointing in opposite directions. Current devices, by contrast, use ferromagnetic materials, whose spins all point the same way.
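The difference between the two orderings can be sketched with a toy spin chain (a conceptual illustration, not a model of IBM's experiment):

```python
# Toy 12-atom spin chains: +1 / -1 stand for "spin up" / "spin down".
ferro = [+1] * 12                            # ferromagnet: all spins aligned
antiferro = [(-1) ** i for i in range(12)]   # antiferromagnet: neighbors anti-aligned

# Net magnetic moment of each chain
print(sum(ferro))      # 12 -> large net moment, with a stray field that disturbs neighbors
print(sum(antiferro))  # 0  -> no net moment, so bits can be packed much closer together
```

That zero net moment is the key: antiferromagnetic bits produce no stray field to interfere with adjacent bits, which is what allows the tighter packing.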
Antiferromagnetism is quite tricky, though: above a certain temperature, called the Néel temperature, the antiferromagnetic order breaks down, and keeping bits stable at higher temperatures requires considerably more than 12 atoms. Thankfully, even then the result is still much better than what current technology offers.
For its experiments, IBM used iron atoms on copper nitride. However, it is said that other materials could in theory do even better, i.e. use fewer atoms per bit.
IBM researcher Andreas Heinrich said: "Moore's Law is basically the drive of the industry to shrink components down little by little and then solve the engineering challenges that go along with that, but keeping the basic concepts the same. The basic concepts of magnetic data storage or even transistors haven't really changed over the past 20 years (…) The ultimate end of Moore's Law is a single atom. That's where we come in."