
AI Mimics the Human Brain to Understand Moving Scenes


Scientists at Scripps Research have developed an AI model called MovieNet, which processes videos in a way similar to how the human brain interprets real-life scenes. Unlike traditional AI, which focuses on still images, MovieNet can recognize complex, moving scenes. This breakthrough could improve areas like medical diagnostics and self-driving cars, where understanding subtle changes over time is essential.

The AI is inspired by how neurons in the brain process vision. Researchers studied neurons in tadpoles, which respond to visual changes over time, such as shifts in brightness and image rotation. These neurons assemble visual details into a sequence, much like a movie clip. MovieNet applies similar principles, encoding video into recognizable temporal cues and distinguishing subtle differences between dynamic scenes.
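The sequence-encoding idea can be sketched in a toy example (this is an illustration of the general principle, not MovieNet's actual architecture, which the article does not detail): each frame-to-frame transition is reduced to a coarse motion cue, so a whole clip becomes a short temporal signature rather than a stack of still images.

```python
import numpy as np

def encode_clip(frames, threshold=0.1):
    """Encode a video clip as a sequence of coarse motion cues.

    Hypothetical illustration: each pair of consecutive frames is
    reduced to the fraction of pixels whose brightness changed
    noticeably, yielding a compact temporal signature of the clip.
    """
    frames = np.asarray(frames, dtype=float)
    diffs = np.abs(np.diff(frames, axis=0))       # frame-to-frame change
    return (diffs > threshold).mean(axis=(1, 2))  # one cue per transition

# Two toy 4x4 "clips": one static, one with growing bright regions
static = [np.zeros((4, 4))] * 3
moving = [np.zeros((4, 4)), np.eye(4), np.ones((4, 4))]

print(encode_clip(static))  # [0.0, 0.0]  -- no motion at all
print(encode_clip(moving))  # [0.25, 0.75] -- increasing change
```

A classifier comparing the two signatures can tell the clips apart even though individual frames (an all-zero frame appears in both) are ambiguous, which is the advantage of sequence-level encoding over still-image analysis.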

In tests, MovieNet outperformed both human observers and leading AI models such as Google's GoogLeNet, achieving 82.3% accuracy in identifying tadpole swimming behaviors under different conditions. It also uses less energy and processing power, making it a more sustainable AI option.

MovieNet also shows promise in medicine. It could detect early signs of diseases like Parkinson's by spotting small, hard-to-notice changes in movement, and it might improve drug testing by analyzing how cells respond to chemicals in real time.

Scientists believe MovieNet is a step toward creating AI that thinks like living organisms, opening the door to more efficient and sustainable technologies.

Source: indiaai