Researchers have built an artificial intelligence system that detects the moment a fruit fly starts singing its courtship song and immediately shuts down the neurons that produce the song.
As a result, it is now possible to directly investigate which brain cells drive fleeting social interactions between animals.
AI stops flies from singing
In a small courtship chamber, a male fruit fly began to stretch its wings and sing, but the system blocked the movement before the sound could develop.
Using these recordings, Professor Azusa Kamikawachi of Nagoya University and colleagues showed that even when multiple flies move closely together, the software can pick out which male is courting.
Each time the wing began to rise, the program instantly identified that behavior and triggered a light pulse that silenced the targeted neurons without affecting nearby animals.
This ability to link individual behavior to instantaneous selective neural control sets the stage for testing how the brain drives social behavior in real time.
AI detects behavior quickly
Rather than tracking individual legs or wings over time, the new AI system, called YORU, identifies an animal's whole posture as a single behavior within one video frame.
After training on labeled examples, the software drew boxes around the actions and named them as they appeared.
Across flies, ants, and zebrafish, YORU achieved 90% to 98% accuracy for several demanding social behaviors.
These quick calls were important because the system had to act before a brief wing flick or head turn was over.
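For a concrete picture of this approach, here is a minimal sketch of single-frame, whole-posture detection using a generic YOLO-style detector from the ultralytics Python package. The weights file, the wing_extension class name, and the confidence threshold are placeholders for illustration; YORU's own code is not shown in this article.

```python
from ultralytics import YOLO  # generic YOLO-style detector, not YORU itself
import cv2

# Hypothetical weights trained on labeled frames; the class name
# "wing_extension" and the 0.8 threshold are assumptions for illustration.
model = YOLO("fly_behaviors.pt")

cap = cv2.VideoCapture(0)  # arena camera
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # One forward pass per frame: the whole posture is the "object",
    # so no per-limb keypoints need to be tracked over time.
    result = model(frame, verbose=False)[0]
    for box in result.boxes:
        label = model.names[int(box.cls)]
        conf = float(box.conf)
        if label == "wing_extension" and conf > 0.8:
            x1, y1, x2, y2 = map(int, box.xyxy[0])
            print(f"{label} at ({x1},{y1})-({x2},{y2}), conf={conf:.2f}")
cap.release()
```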
Overall posture detection works
When animals overlap, tracking of body parts can cause limbs to be lost or identities to be swapped, leading to disjointed behavioral labels.
Traditional tools tracked key points frame by frame, but social contact obscured those points and made the estimates unstable.
Widely used marker-free tracking tools also struggled in crowded arenas, where overlapping bodies hid limbs and made it unclear which animal was which.
By treating general posture as a cue, YORU continued to work in crowds, but long sequences still presented challenges.
Controlling the brain in real time
In closed-loop systems, which react immediately to detected behavior, speed was as important as accuracy.
The entire test loop, from camera frame to trigger pulse, averaged about 31 ms, fast enough for many experiments.
Compared with a typical pose-tracking pipeline, the same setup ran about 30% faster, cutting the average latency from roughly 47 ms.
With such low latency, YORU can turn on the lights while the action is still occurring, rather than after the fact.
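As a rough sketch of what that timing constraint looks like in code, the loop below measures its own frame-to-trigger latency against the roughly 31 ms budget. Only that figure comes from the study; the grab_frame, detect, and fire_pulse hooks are hypothetical stand-ins for real hardware.

```python
import time

LATENCY_BUDGET_S = 0.031  # the ~31 ms frame-to-pulse loop reported above

def closed_loop_step(grab_frame, detect, fire_pulse):
    """One closed-loop iteration: camera frame -> detection -> trigger.

    grab_frame, detect, and fire_pulse are hypothetical hooks standing
    in for a real camera, a trained detector, and light-pulse hardware.
    """
    t0 = time.perf_counter()
    frame = grab_frame()
    if detect(frame):      # e.g., wing extension seen in this frame
        fire_pulse()       # stimulate while the behavior is still ongoing
    latency = time.perf_counter() - t0
    if latency > LATENCY_BUDGET_S:
        print(f"warning: loop took {latency * 1e3:.1f} ms, over budget")
    return latency
```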
Light silences certain neurons
To control neurons on cue, the researchers first genetically modified flies so that selected brain cells responded to green light.
This approach, known as optogenetics, uses light-sensitive proteins to switch specific neurons on or off and thereby control their signaling.
When YORU sensed wing elongation, it sent a signal to a lamp, and the light silenced courtship neurons, reducing mating success.
“The fly’s courtship neurons can be silenced the moment YORU detects wing extension,” said Kamikawachi, lead author of the study.
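The article does not specify the trigger hardware, but the final step could look something like the sketch below, assuming a microcontroller that drives the green LED and accepts a simple one-line serial command. The port name, baud rate, and PULSE protocol are invented for illustration.

```python
import serial  # pyserial; the port, baud rate, and protocol are invented

# A long-lived connection avoids paying reconnection cost on every pulse.
led_controller = serial.Serial("/dev/ttyUSB0", 115200, timeout=1)

def fire_green_pulse(duration_ms=50):
    """Ask a hypothetical microcontroller to flash the green LED that
    silences the light-sensitized courtship neurons."""
    led_controller.write(f"PULSE {duration_ms}\n".encode())
```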
Targeting flies in a crowd
Previous brain-control setups illuminated the entire arena at once, so all animals received the same commands at the same time.
By sending each frame's position data to a projector, YORU kept the light on one fly while the others moved freely.
During the two-fly test, the moving light remained on the intended target for 89.5% of the stimulus time.
This precision allowed the researchers to change one animal’s neural input during a social moment without disrupting the rest of the animals in the group.
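One standard way to steer a projector from camera detections, offered here as an illustration rather than the study's own calibration method, is a homography fitted from a few reference points visible to both devices:

```python
import numpy as np
import cv2

# Four reference points seen by both camera and projector during a
# one-time calibration; these coordinates are placeholders.
cam_pts = np.float32([[0, 0], [640, 0], [640, 480], [0, 480]])
proj_pts = np.float32([[35, 20], [1245, 28], [1238, 772], [30, 760]])

# Homography mapping camera pixels to projector pixels.
H, _ = cv2.findHomography(cam_pts, proj_pts)

def camera_to_projector(x, y):
    """Map a detected fly's bounding-box center into projector space so
    the light spot lands on that fly and not on its neighbors."""
    pt = np.float32([[[x, y]]])
    px, py = cv2.perspectiveTransform(pt, H)[0, 0]
    return float(px), float(py)
```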
Brain activity and behavioral patterns
In addition to controlling behavior, the same software helped interpret brain activity by matching an animal's actions to the signals recorded from its cortex.
Using calcium imaging, which tracks the glowing signals that follow neuronal activity, the researchers linked running and grooming in mice to distinct patterns of cortical activity.
The map constructed from the YORU labels then matched the map constructed from human scoring, supporting the tool as a reliable readout.
These links can help scientists determine which neural signals reflect actual behavior, rather than treating every brain flicker as meaningful.
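A simplified version of that comparison can be sketched with synthetic data standing in for real calcium traces and behavior labels: average the neural signal under each label, then compare the resulting map with one built from human scoring.

```python
import numpy as np

# Synthetic stand-ins for a real recording: one dF/F calcium value and
# one behavior label per video frame.
rng = np.random.default_rng(0)
n_frames = 1000
dff = rng.normal(size=n_frames)
labels = rng.choice(["running", "grooming", "still"], size=n_frames)

# Average neural activity under each behavior label; if the automatic
# labels are reliable, this map should match one built from human scoring.
for behavior in np.unique(labels):
    mask = labels == behavior
    print(f"{behavior:>8}: mean dF/F = {dff[mask].mean():+.3f} "
          f"({mask.sum()} frames)")
```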
System limitations
Some social acts only look different over a few frames, so a single-frame detector may miss their start or end.
Without built-in identity tracking, YORU can recognize a behavior but cannot always tell which individual is performing it from one moment to the next.
Hardware also sets limits: projectors and controllers introduce extra delays that can let fast-moving animals escape the light.
Although the approach could be scaled up with better prediction and lower-latency instruments, each laboratory will still need to calibrate its own setup carefully.
Future research directions
Making the system usable was important because many biology labs lack the staff to code and calibrate models.
Through a graphical interface, users can train a new behavior detector from a small set of labeled frames and launch an experiment with a few clicks.
Because YORU treats behaviors as detectable objects, it can be connected to lights, cameras, and other equipment already on the bench.
Broader access could accelerate research linking circuits to social choice, but ethical standards will need to keep pace.
By combining instantaneous behavioral detection with equally rapid neural control, the system allows scientists to test for cause and effect at the very moment a behavior unfolds.
Future work will focus on capturing longer and more complex behaviors and trimming hardware delays, so that individual animals can be precisely targeted even within larger, more dynamic groups.
The research is published in the journal Science Advances.