How can machine learning be made more efficient in the future? This is what a recent study published in Nature Communications hopes to address, as a team of researchers from The Ohio State University investigated how digital twins (copies of a system) can improve the machine learning-based controllers currently used in self-driving cars, controllers which require large amounts of computing power and are often challenging to use. This study holds the potential to help researchers better understand how future machine learning algorithms can deliver improved control and efficiency, thus improving their products.
“The problem with most machine learning-based controllers is that they use a lot of energy or power, and they take a long time to evaluate,” said Robert Kent, who is a graduate student in the Department of Physics at The Ohio State University and lead author of the study. “Developing traditional controllers for them has also been difficult because chaotic systems are extremely sensitive to small changes.”
For the study, the researchers created a fingertip-sized digital twin that can function without an internet connection, with the goal of improving the efficiency and capabilities of a machine learning-based controller. In the end, the researchers found that the controller’s power needs dropped thanks to a machine learning method known as reservoir computing, in which incoming data drives a fixed network (the reservoir) and only a simple readout is trained to map the reservoir’s activity onto the target output. According to the researchers, this new method can be used to simplify complex systems, including self-driving cars, while decreasing the amount of power and energy required to run them.
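To make the reservoir computing idea concrete, below is a minimal echo state network sketch in Python. The network size, input signal, and parameter values are illustrative assumptions for demonstration only, not the architecture or hardware from the study; the point is that the reservoir's weights stay fixed and only a small linear readout is trained, which is why this approach needs so little computing power.

```python
import numpy as np

# Minimal echo state network (one common form of reservoir computing).
# All sizes, signals, and parameters below are illustrative assumptions,
# not the setup used in the Nature Communications study.

rng = np.random.default_rng(0)

n_inputs, n_reservoir = 1, 200
spectral_radius, leak_rate = 0.9, 0.3

# Fixed random input and recurrent weights -- these are never trained,
# which is what keeps reservoir computing cheap and fast.
W_in = rng.uniform(-0.5, 0.5, (n_reservoir, n_inputs))
W = rng.uniform(-0.5, 0.5, (n_reservoir, n_reservoir))
W *= spectral_radius / max(abs(np.linalg.eigvals(W)))

def run_reservoir(u_seq):
    """Drive the reservoir with an input sequence and collect its states."""
    x = np.zeros(n_reservoir)
    states = []
    for u in u_seq:
        pre = W_in @ np.atleast_1d(u) + W @ x
        x = (1 - leak_rate) * x + leak_rate * np.tanh(pre)
        states.append(x.copy())
    return np.array(states)

# Toy task: read in a signal and map it to a target (here, the next value).
t = np.linspace(0, 40, 2000)
u_seq = np.sin(t) + 0.5 * np.sin(2.3 * t)   # stand-in for sensor data
target = np.roll(u_seq, -1)                  # desired output, one step ahead

X = run_reservoir(u_seq)

# Only the linear readout is trained, via ridge regression -- a single
# matrix solve rather than iterative backpropagation over many layers.
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_reservoir), X.T @ target)

prediction = X @ W_out
print("training error:", np.mean((prediction - target) ** 2))
```

Because training reduces to fitting that one readout matrix, a system like this can, in principle, "learn on the fly" on small, low-power hardware rather than on a data-center-scale machine.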
“Big machine learning models have to consume lots of power to crunch data and come out with the right parameters, whereas our model and training is so extremely simple that you could have systems learning on the fly,” said Kent.
Going forward, the team aspires to extend their model to new applications, including quantum information processing.
What new discoveries will researchers make about machine learning in the coming years and decades? Only time will tell, and this is why we science!
As always, keep doing science & keep looking up!
Sources: Nature Communications, EurekAlert!, Ohio State News, Wikipedia