A team at Stanford University in California designed motorized computer monitors that can mimic the movements of a person on screen.
The robotically actuated screen technology, as it’s called, uses software to make a motorized monitor mimic a person’s gestures, such as head nods and hearty laughs, with the motion driven by a Wii game controller.
Researchers from Stanford’s Center for Design Research also linked a robotic arm to the screen to add gestures such as waving and tapping on a table.
But could this technology be used for anything more than a parlor trick? Some say yes.
New Scientist reports:
“At last month’s Human Robot Interaction conference in Boston, the team revealed that volunteers found the idea beneficial. With the proxy motion switched on, people were perceived to be ‘more friendly, less dominant and more involved’ in conversations. ‘Consistency between physical and on-screen action improved understanding of the messages that remote participants communicated,’ said the researchers.”
This new technology could benefit people who telecommute to work or school, giving them a better sense of presence from a distance.