Researchers from Georgia Tech, Google and Microsoft have collaborated on a system that lets people with disabilities control wearables such as Google Glass using only tongue, ear and jaw movements.
Ron Arkin weighs in on Amazon's Alexa and other "hearing robots."
Larry Sweet analyzes advances in automation and offers a prediction for the future.
© 2005 — 2011 Georgia Tech College of Computing :: School of Interactive Computing