
The rapid advancement of artificial intelligence has led to increasingly sophisticated models, but these systems still face fundamental efficiency challenges. A team of researchers led by Dr. Suin Yi, assistant professor in the Texas A&M University College of Engineering, has developed a new approach called Super-Turing AI, which mimics the human brain's ability to learn and adapt. This innovation could significantly improve AI by reducing computational costs and energy consumption.
Current AI models are built on architectures that separate data storage from processing, requiring enormous computing power and energy to move information between the two components. The human brain, by contrast, integrates learning and memory through neural connections called synapses, which strengthen or weaken dynamically with experience, a process known as synaptic plasticity.
Dr. Yi's team drew inspiration from neuroscience to develop AI systems that work more like biological brains. Traditional AI models depend heavily on backpropagation, an optimization algorithm used to adjust neural networks during training. Although effective, backpropagation is computationally intensive and biologically implausible.
To address this, the team is exploring alternative mechanisms such as Hebbian learning, often summarized as "cells that fire together, wire together," and spike-timing-dependent plasticity (STDP). These biologically inspired learning processes allow AI systems to strengthen connections based on activity patterns, reducing the need for constant retraining and excessive computational resources.
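To make the contrast with backpropagation concrete, the sketch below shows what such local learning rules can look like in simplified form. This is an illustrative toy example only, not the team's circuit or code: the function names, learning rates, and time constants are hypothetical, and practical STDP implementations track many spike pairs rather than a single one. The key point it illustrates is that each weight is updated using only locally available activity, with no global error signal propagated backward through the network.

```python
import numpy as np

# Minimal sketch of two local, biologically inspired weight-update rules.
# Unlike backpropagation, both rules use only information available at the
# synapse itself (pre- and post-synaptic activity or spike times).

def hebbian_update(w, pre, post, lr=0.01):
    """Hebbian rule: strengthen a weight when pre- and post-synaptic
    activity coincide ("cells that fire together, wire together")."""
    return w + lr * np.outer(post, pre)

def stdp_update(w, t_pre, t_post, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Pairwise STDP: potentiate if the presynaptic spike precedes the
    postsynaptic spike, depress if it follows, with an exponential decay
    in the spike-time difference (times in milliseconds)."""
    dt = t_post - t_pre
    if dt > 0:   # pre fires before post -> potentiation
        return w + a_plus * np.exp(-dt / tau)
    else:        # pre fires after post -> depression
        return w - a_minus * np.exp(dt / tau)

# Toy usage: one Hebbian step on a 2x3 weight matrix, one STDP step on a synapse.
pre_activity = np.array([1.0, 0.0, 0.5])
post_activity = np.array([0.8, 0.2])
W = hebbian_update(np.zeros((2, 3)), pre_activity, post_activity)

w_synapse = stdp_update(0.5, t_pre=10.0, t_post=15.0)  # pre before post
print(W)
print(w_synapse)
```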
One of the most promising aspects of Super-Turing AI is its ability to process information efficiently in real time. In a recent test, a circuit based on these learning principles allowed a drone to navigate a complex environment without prior training. Unlike traditional AI models that require extensive datasets and pre-training, this approach enabled the drone to adapt and learn on the fly, demonstrating faster response times and lower energy consumption.
The integration of neuromorphic computing, hardware that mimics the brain's processing, further extends the potential of Super-Turing AI. By embedding these learning algorithms in specialized hardware, the researchers aim to develop AI systems that require minimal power while maintaining high levels of adaptability and intelligence.
The AI industry is growing rapidly, pushing companies to develop ever larger and more powerful models. However, scalability remains a pressing challenge because of hardware limitations and rising energy demands. Some AI applications already require entire data centers, driving up economic and environmental costs.
Dr. Yi emphasizes that hardware progress is just as crucial as improvements in AI software. "Many people think AI is just about algorithms, but without efficient computing hardware, AI cannot truly advance," he explains. Super-Turing AI offers a paradigm shift by combining software and hardware innovations to create sustainable and scalable AI solutions.
By reimagining AI architectures to reflect the efficiency of the human brain, Super-Turing AI represents an important step toward sustainable AI development. This technology could lead to a new generation of AI that is both smarter and more environmentally responsible.
"Modern AI like ChatGPT is powerful, but it is too expensive and too energy-intensive. We are working to make AI that is both smarter and more sustainable," explains Dr. Yi. "Super-Turing AI could reshape how AI is built and used, ensuring that its progress benefits both people and the planet."
You can explore the team's published research in Science Advances.
