Avatars With Swagger

One day, the avatars now inhabiting the virtual worlds of interactive computer games will look as old-fashioned as the herky-jerky rhythm of a Charlie Chaplin film. In fact, that day is fast approaching—thanks to research under way in Karen Liu's lab at the College of Computing.

Liu, an assistant professor in the School of Interactive Computing, is taking computer-generated animation to a new level by designing computational models of "truly agile and responsive virtual humans who can move and manipulate autonomously and realistically in a physically simulated world."

Present modeling techniques (such as motion-capture technology, which relies on sensors worn by actors and fixed cameras feeding data into a computer) provide only a superficial sample of human movement because the information is kinematic rather than dynamic, according to Liu.

"In order to realistically simulate a character, you have to understand the biomechanical workings of the human body," she notes. The problem is that biomechanical knowledge is quite limited, and even if it wasn't, the volume of data would be impractical for computers to handle. The action of a single muscle could involve hundreds of variables. "We have a lot of computer resources today, but there's still not quite enough to simulate something as complex as the human body," Liu says.

Liu manages the limits on computational power by mining motion-capture data for a relatively small amount of well-chosen biomechanical data. She developed algorithms based on the "optimality theory of movement," which holds that people naturally move in ways that waste as little energy as possible. By inserting specific motion-capture segments into the algorithms—a person walking happily, for example—the program identifies certain biomechanical parameters and the most energy-efficient motion values associated with a happy walk. The algorithms also extract information pertinent to muscle strength and stiffness. This combination—of motion-capture data and biomechanical data derived from the motion-capture process and other sources—provides the foundation for a remarkably realistic reproduction of movement.
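
To make the idea concrete, here is a minimal, purely illustrative sketch of that kind of energy-based fitting: a single simulated joint is driven to track a synthetic motion-capture trace while an effort penalty encodes the assumption that people move efficiently, and a stiffness-like parameter is then read off the result. The one-joint model, the synthetic data and every name below are assumptions for illustration, not Liu's actual algorithms.

```python
# Toy sketch of the optimality idea described above: fit joint torques to a
# captured trajectory while penalizing effort, then read off an implied joint
# stiffness. The 1-DOF model and the synthetic "mocap" data are illustrative
# assumptions only.
import numpy as np
from scipy.optimize import minimize

dt, steps = 0.02, 50
t = np.arange(steps) * dt
mocap_theta = 0.5 * np.sin(2.0 * np.pi * t)   # stand-in for one captured joint angle

def simulate(torques):
    """Integrate a unit-inertia 1-DOF joint driven by the given torques."""
    theta, vel = 0.0, 0.5 * 2.0 * np.pi        # match the mocap's initial state
    out = []
    for tau in torques:
        vel += tau * dt
        theta += vel * dt
        out.append(theta)
    return np.array(out)

def objective(torques, w_track=100.0, w_effort=0.1):
    """Tracking error keeps the motion faithful; the effort term encodes the
    assumption that people move in energy-efficient ways."""
    theta = simulate(torques)
    track = np.sum((theta - mocap_theta) ** 2)
    effort = np.sum(torques ** 2)
    return w_track * track + w_effort * effort

res = minimize(objective, np.zeros(steps), method="L-BFGS-B")
torques = res.x
theta = simulate(torques)

# A crude "stiffness" estimate: regress torque against joint angle, the sort of
# biomechanical parameter the text says can be pulled out of the fit.
stiffness = -np.polyfit(theta, torques, 1)[0]
print(f"tracking RMSE: {np.sqrt(np.mean((theta - mocap_theta) ** 2)):.4f}")
print(f"implied stiffness (toy estimate): {stiffness:.3f}")
```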

The technique can also be applied to an individual's particular style of walking or moving and then incorporated into a computer-generated human model. "There's something unique about the way each of us walks," Liu explains. "By looking at one motion sequence of a person walking, we can extract that style and apply it to a new motion, like walking up stairs."
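
The quote suggests one crude way to picture style transfer. The sketch below assumes a simple additive model in which "style" is the offset between a person's walk and a neutral walk, applied to a new base motion; the data and the additive rule are assumptions for illustration, not the actual method.

```python
# A minimal sketch of the style-transfer idea in the quote: extract the offset
# between one person's walk and a neutral walk, then add it to a different base
# motion (say, climbing stairs). All data here is synthetic and illustrative.
import numpy as np

frames, joints = 60, 4
phase = np.linspace(0.0, 2.0 * np.pi, frames)[:, None]

neutral_walk = np.sin(phase + np.arange(joints))              # generic walk cycle
happy_walk   = 1.2 * np.sin(phase + np.arange(joints)) + 0.1  # the same cycle, "with style"
stair_climb  = np.sin(phase + np.arange(joints)) * 0.8 + 0.3  # new base motion

# The "style" is whatever distinguishes this person's walk from the neutral one.
style_offset = happy_walk - neutral_walk

# Apply the extracted style to the new motion.
styled_stair_climb = stair_climb + style_offset

print("styled stair climb, first frame of joint angles:", styled_stair_climb[0].round(3))
```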

Motion-tracking sensors provide raw data that Karen Liu, assistant professor in Interactive Computing, turns into realistic, mobile virtual characters.


Human animators remain integral to the process

To further improve the performance of computer-generated models, Liu is investigating ways to capture dynamic motion data associated with more challenging biomechanisms, particularly balance.

No matter how sophisticated the data may be, the flesh-and-blood human animator's role remains the essential bridge between data and performance. Liu has developed a physics-based algorithm that enables animators to quickly and easily synthesize complex dynamic motions, such as gymnastics, from basic pose sketches. Another tool simplifies the animation of complex interactions between humans and their environment. Dynamic controllers designed with Liu's novel algorithms enable animators to create complex control strategies by combining simple ones.
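
As a rough illustration of building complex control strategies out of simple ones, the sketch below sequences two basic proportional-derivative controllers into a crouch-then-reach behavior on a single simulated joint. The controllers, gains and sequencing rule are assumptions made for this example, not taken from Liu's work.

```python
# Composing simple controllers into a more complex strategy, in the spirit
# described above. The PD controllers, targets, and the sequencing rule are
# all illustrative assumptions.
import numpy as np

def pd_controller(target, kp=50.0, kd=10.0):
    """Return a proportional-derivative controller that drives a 1-DOF joint
    toward a target angle."""
    def control(theta, vel):
        return kp * (target - theta) - kd * vel
    return control

# Two "simple" controllers: crouch, then reach upward.
crouch = pd_controller(target=-0.8)
reach  = pd_controller(target=1.2)

def combined_strategy(theta, vel, t):
    """A longer behavior built by sequencing the simple controllers in time."""
    return crouch(theta, vel) if t < 1.0 else reach(theta, vel)

# Forward-simulate a unit-inertia joint under the combined strategy.
dt, theta, vel = 0.01, 0.0, 0.0
for step in range(300):
    t = step * dt
    torque = combined_strategy(theta, vel, t)
    vel += torque * dt
    theta += vel * dt
print(f"joint angle after crouch-then-reach sequence: {theta:.3f}")
```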

A performance-based interface allows users to control a virtual character with their body movements in a manner familiar to Nintendo Wii players, but in a freeform or "express" fashion that maintains both physical realism and the user's personal style. Further research is aimed at devising techniques to allow several users to interact virtually to achieve collaborative or, in many games, adversarial tasks.
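
One way to picture such an interface, under the simplifying assumptions below, is a per-frame filter that pulls the character toward the user's raw pose while enforcing joint-range and joint-speed limits so the result stays physically plausible. The limits, pose format and filtering rule are hypothetical, chosen only to make the idea concrete.

```python
# A hedged sketch of the performance-driven idea: a user's raw pose stream
# drives the character, but each frame is filtered through simple physical
# limits (joint range and maximum joint speed) so the result stays plausible.
# The limits, the pose format, and the filtering rule are assumptions here.
import numpy as np

JOINT_LIMITS = (-1.5, 1.5)     # radians, assumed range for every joint
MAX_SPEED = 4.0                # radians per second, assumed actuator limit
dt = 1.0 / 30.0                # assumed 30 Hz tracking rate

def physically_filtered(user_pose, prev_character_pose):
    """Move the character toward the user's pose without exceeding the limits."""
    step = np.clip(user_pose - prev_character_pose, -MAX_SPEED * dt, MAX_SPEED * dt)
    return np.clip(prev_character_pose + step, *JOINT_LIMITS)

# Simulate a user making an abrupt, physically implausible jump in pose.
character = np.zeros(3)
user_stream = [np.zeros(3), np.array([2.0, -2.0, 0.5]), np.array([2.0, -2.0, 0.5])]
for user_pose in user_stream:
    character = physically_filtered(user_pose, character)
    print(character.round(3))
```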

"This system could also be used to train a team of specialists to work synergetically in conditions that involve uncertainty, stress and risks," Liu says. "It can also facilitate empirical studies on human problem solving and decision making."

Besides its applications in interactive video games, Liu's research could be useful in simulations, the motion picture industry, robotics and biomedicine.