The Houston Astros' José Altuve steps up to the plate on a 3-2 count, studies the pitcher and the situation, gets the go-ahead from third base, tracks the ball's release, swings ... and gets a single up the middle.
Altuve has honed natural reflexes, years of experience, knowledge of the pitcher's tendencies, and an understanding of the trajectories of various pitches.
What he sees, hears, and feels seamlessly combines with his brain and muscle memory to time the swing that produces the hit.
A robot trying to make that same hit, on the other hand, needs a linkage system to slowly coordinate data from its sensors with its motor capabilities.
A paper by University of Maryland researchers, just published in the journal Science Robotics, introduces a new way of combining perception and motor commands using so-called hyperdimensional computing theory. The approach could fundamentally alter and improve the basic artificial intelligence (AI) task of sensorimotor representation: how agents like robots translate what they sense into what they do.
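The core idea of hyperdimensional computing is to represent percepts and actions as very long random vectors ("hypervectors") and to fuse them with simple element-wise operations. The sketch below is an illustrative toy, not the authors' implementation: it assumes bipolar (+1/-1) hypervectors and uses element-wise multiplication as the "binding" operation, which makes a bound pair recoverable by multiplying again with either factor.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000  # hypervector dimensionality; high D keeps random vectors near-orthogonal

# Random bipolar hypervectors standing in for a percept and a motor command
percept = rng.choice([-1, 1], size=D)
action = rng.choice([-1, 1], size=D)

# Binding: element-wise multiplication fuses the two into a single vector
# that is dissimilar (near-orthogonal) to both of its inputs.
bound = percept * action

# Unbinding: bipolar vectors are their own inverses under this binding,
# so multiplying by the percept recovers the associated action exactly.
recovered = bound * percept

def similarity(a, b):
    """Normalized dot product in [-1, 1]; 1.0 means identical."""
    return a @ b / D

print(similarity(recovered, action))  # 1.0: the action is recovered exactly
print(similarity(bound, action))      # near 0: the bound pair hides its parts
```

Because binding is a cheap, uniform operation over one vector format, sensory data and motor commands can live in the same representational space, which is the kind of tight perception-action coupling the paper argues for.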
"Learning Sensorimotor Control with Neuromorphic Sensors: Toward Hyperdimensional Active Perception" was written by computer science Ph.D. students Anton Mitrokhin and Peter Sutor, Jr.; Cornelia Fermüller, an associate research scientist with the University of Maryland Institute for Advanced Computer Studies; and Computer Science Professor Yiannis Aloimonos.