Artificial intelligence continues to generate excitement for its promise to revolutionize how computing and machines function. And that promise is already unfolding as AI slowly becomes an everyday facet of life — guiding smart cars, tackling previously unsolvable scientific problems, streamlining consumer transactions and more.
But beyond the buzz is a little-discussed reality. AI consumes a lot of energy and leaves a Bigfoot-sized carbon footprint. That flip side has spurred researchers to look at reducing AI’s energy intake, and physicists in London believe they have a solution: nanomagnets.
By using nanoscale magnets, the physicists removed the energy-draining software that’s often behind AI processing. Instead of turning to machinery that needs a boost from an electrical grid, they relied on the basic laws of physics to “pose a question” and “receive an answer” from nanomagnets. If they and others can build on this work, nanomagnetic computing could be the green solution for future AI-driven tasks.
Eliminating the Power-Hungry “Middleman”
Put simply, AI is machines attempting to perform the thinking done by humans, which is known as natural intelligence. But there are many categories and classifications to explain how AI works and specify its different aims. One such type of AI, known as deep learning, uses neural networks to mimic how the human brain works. Thousands or millions of processing nodes can form a neural network that, as MIT News explains, assigns and weighs mathematical values as it is trained to “learn” concepts from data.
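To make that weighting concrete, here is a toy illustration (not from the article) of a single neural-network node: it computes a weighted sum of its inputs, and training adjusts those weights until the network’s outputs match the concepts it is meant to learn.

```python
import math

def node_output(inputs, weights, bias):
    """One neural-network node: weighted sum of inputs plus a bias,
    squashed into the range (0, 1) by a sigmoid activation."""
    z = sum(i * w for i, w in zip(inputs, weights)) + bias
    return 1 / (1 + math.exp(-z))

# Invented example values; training would tune the weights and bias.
print(round(node_output([0.5, 0.2], [0.8, -0.4], 0.1), 3))
```

A real deep network simply stacks many layers of such nodes, which is why training one involves so many repeated calculations — and so much energy.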
As a research team of physicists at Imperial College in London observed in their attempt to slash the energy used in AI, much of the math that powers neural networks today was originally used to describe the way magnets interact. Applying magnets themselves to AI seemed like a great idea, but there was one hitch: The physics of interacting magnets proved too complex to compute with directly, and no one knew how to enter data into a magnetic system or extract answers from it. Eventually, silicon-based computers instead simulated how magnets would have mimicked the human brain.
Pulled to the potential of what was left behind on that old drawing board, the Imperial College researchers went to work as they attempted to “cut out the middleman” of silicon-based computers.
Physics, Instead of Electricity, Powers AI Modeling
Nanomagnets can occupy various states, behaving differently depending on their orientation. Or, as Analytics Insight described it, when a group of nanomagnets sits in an energy field, each magnet passes through different states of spin. How the magnets interact with one another creates a pattern across the nano-patterned array.
In their experiments, the Imperial College researchers created a technique to make an AI-type prediction: Count the number of magnets in each state after the energy field has been applied, and the resulting tally yields the answer.
“How the magnets interact gives us all the information we need; the laws of physics themselves become the computer,” said Kilian Stenning, one of the researchers. The research team says these tiny magnets can be used for time-series prediction tasks such as predicting and regulating the insulin levels of diabetic patients. They also believe the approach could spell the end of energy-draining silicon-based computing.
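The counting step above can be sketched in software, hypothetically: let the physical system settle, count how many magnets ended up in each state, and combine those counts with a simple trained weighting to produce a prediction. All of the names and numbers below are invented for illustration; the real readout from the nanomagnetic array is more involved.

```python
from collections import Counter

def readout(magnet_states, weights, bias):
    """Linear readout: weight the count of magnets in each spin state,
    then add a bias to form the prediction."""
    counts = Counter(magnet_states)
    return sum(weights[s] * counts[s] for s in counts) + bias

# After the energy field passes, suppose each magnet settled 'up' or 'down':
states = ["up", "down", "up", "up", "down", "up"]
weights = {"up": 0.5, "down": -0.3}  # values that training would tune
print(readout(states, weights, bias=0.1))
```

The key point of the physicists’ approach is that the expensive part — the interaction of all those spins — happens for free in the physics; only this cheap counting-and-weighting step is left for a conventional processor.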
The End of Nuclear-Powered Rubik’s Cube Solutions?
A few years ago, researchers from the University of Massachusetts at Amherst assessed the energy consumed in training several large AI models. As MIT Technology Review reported, the researchers discovered that the process can emit the equivalent of more than 626,000 pounds of carbon dioxide — nearly five times the lifetime emissions of an average car, including those generated during manufacturing.
The more capable an AI model, the more energy its training consumes. Analytics Insight pointed to the power drawn by the Megatron-Turing natural language machine learning model (named in part after Alan Turing, an unsung hero of World War II and early computer scientist) as it trained on 45 terabytes of data. The model ran 512 V100 GPUs for nine days, consuming as much as 27,648 kWh of energy, much more than the 10,649 kWh an average household burns through in a year. The Imperial College researchers offered an even starker example: Training AI to solve a Rubik’s cube used as much energy as two nuclear power stations running for an hour.
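A quick back-of-the-envelope calculation shows where the Megatron-Turing figure comes from. The per-GPU power draw below is an assumption (not from the article), chosen because roughly 250 W average draw per V100 reproduces the quoted total.

```python
gpus = 512
watts_per_gpu = 250          # assumed average draw per V100 (not from the article)
hours = 9 * 24               # nine days of training

kwh = gpus * watts_per_gpu * hours / 1000
print(kwh)                   # → 27648.0

household_kwh_per_year = 10_649
print(round(kwh / household_kwh_per_year, 1))  # → 2.6 households' annual usage
```

In other words, one nine-day training run draws on the order of two and a half households’ annual electricity.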
Bringing AI Computing to a Tiny Scale
Considering those comparisons, it’s no wonder the Imperial College team was excited to say nanomagnetic computing “paves the way for getting rid of the computer software that does the energy-intensive simulation.”
According to the researchers, much of the energy that silicon-chip computers use to perform AI computing is wasted in the inefficient transport of electrons during processing and memory storage. Nanomagnets don’t rely on the physical transport of particles such as electrons. They instead process and transfer information in the form of a “magnon” wave where each magnet affects the state of the surrounding magnets — with much less lost energy.
Conventional computing must process and store information in separate steps; nanomagnetic computing combines them. That advantage, the researchers contend, could make nanomagnetic computing up to 100,000 times more efficient than conventional computing.
The Imperial College researchers now want to teach their nanomagnets how to process other data and eventually turn them into a real computing device. They view nanomagnetic computing as a more effective solution for powering AI modeling on the edge — for instance, enabling computers to efficiently process data where it’s collected instead of sending it back to big data centers. With tiny magnets crunching complex datasets, your workplace could step up its AI game, and your smartphone could solve Rubik’s cube-type problems wherever you are.
Are you interested in all things related to technology? We are too. Check out Northrop Grumman career opportunities to see how you can participate in this fascinating time of discovery.