March 1996 - Volume 12:4
By Norman I. Badler
Most virtual worlds have been populated by fairly simple objects with simple appearance or motion. With the increase in rendering and computational power of modern workstations, more interesting inhabitants can now be added to our virtual worlds: simulated humans. Creating simulated human agents that behave realistically is one of the research goals of Penn's Center for Human Modeling and Simulation. The Center's Jack visualization software contains a powerful and extraordinarily interactive 3D human model that is used to analyze how people will interact with a wide variety of systems or environments.
One application for real-time human models is human factors analysis, which involves visualizing the appearance, capabilities, and performance of humans as they execute tasks in a simulated environment. Human factors applications serve a broad population that knows how to design things but does not usually have prior skill in computer animation of people. Human models can also be applied to training situations. For example, in medical training Jack can be both patient and medic in an emergency care simulation.
The Jack model contains almost all the essential human skeletal joints, and it can be scaled to different body sizes based on population data. The figure can be manipulated so that it moves in several directions simultaneously: for example, it can grip a moving steering wheel with both hands while sitting in a car, glancing at the rear-view mirror and pressing the floor pedals with its feet. Jack can walk and turn naturally, grasp objects, and follow objects with his eyes. He can even tell you if the load he is carrying exceeds NIOSH guidelines or his strength limits.
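The NIOSH check mentioned above is presumably based on the revised NIOSH lifting equation (1991), which is public. The following is a minimal sketch of that equation in Python, independent of Jack's actual implementation; the frequency and coupling multipliers are passed in directly rather than looked up from the NIOSH tables.

```python
# Sketch of the revised NIOSH lifting equation (1991, metric form).
# This illustrates the kind of guideline a system like Jack can evaluate;
# it is not taken from Jack's code or API.

def recommended_weight_limit(h_cm, v_cm, d_cm, a_deg, fm=1.0, cm=1.0):
    """Recommended Weight Limit (RWL) in kilograms.

    h_cm:  horizontal distance of hands from the midpoint between the ankles
    v_cm:  vertical height of the hands at the lift origin
    d_cm:  vertical travel distance of the lift
    a_deg: asymmetry angle of the load relative to the sagittal plane
    fm, cm: frequency and coupling multipliers, normally from NIOSH tables
    """
    LC = 23.0                              # load constant, kg
    HM = 25.0 / max(h_cm, 25.0)            # horizontal multiplier
    VM = 1.0 - 0.003 * abs(v_cm - 75.0)    # vertical multiplier
    DM = 0.82 + 4.5 / max(d_cm, 25.0)      # distance multiplier
    AM = 1.0 - 0.0032 * a_deg              # asymmetry multiplier
    return LC * HM * VM * DM * AM * fm * cm

def lifting_index(load_kg, rwl_kg):
    """A lifting index above 1.0 means the load exceeds the recommended limit."""
    return load_kg / rwl_kg
```

Under ideal conditions (hands 25 cm out, origin at 75 cm, minimum travel, no asymmetry, ideal frequency and coupling) every multiplier is 1.0 and the RWL equals the 23 kg load constant.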
For a virtual reality experience, a Jack system can be configured with immersive VR glasses, digitizing glove, and 3D magnetic body tracking, permitting the user to visualize and move his or her entire body (not just a "disembodied" hand) in the virtual environment.
The Jack software runs on Silicon Graphics workstations, which have 3D graphics features that aid interaction with highly articulated figures. The environment provides state-of-the-art 3D rendering through hardware, ray-trace, or RenderMan interfaces. There is also an API (Application Programmer Interface) through which Jack acts as a server for human motion for other software or CAD systems.
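The article does not describe the API's actual interface, but the client/server pattern it mentions can be sketched abstractly: an external tool posts high-level motion goals, and the human-motion server interprets them. Everything below is hypothetical illustration; the command names and message format are invented for this sketch, not taken from Jack.

```python
# Illustrative sketch of a motion-goal message a CAD tool might send to a
# human-motion server. The command vocabulary ("walk_to", "grasp", "gaze_at")
# and the JSON encoding are assumptions, not Jack's real protocol.

import json
from dataclasses import dataclass

@dataclass
class MotionGoal:
    command: str   # hypothetical command name, e.g. "walk_to"
    params: dict   # command parameters, e.g. target coordinates

def encode_goal(goal: MotionGoal) -> str:
    """Serialize a goal for transmission to the motion server."""
    return json.dumps({"command": goal.command, "params": goal.params})

def decode_goal(message: str) -> MotionGoal:
    """Reconstruct a goal on the server side."""
    data = json.loads(message)
    return MotionGoal(data["command"], data["params"])
```

A client would encode a goal such as "walk to (x, y)" and let the server resolve it into joint motions for the articulated figure.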
Graphics available from Jack's Web site. Above: Simulated driving tasks such as gripping a moving steering wheel and braking. Inset: Modeling an emergency care situation on a battlefield.
Jack is presently the virtual employee of choice at institutions as varied as heavy equipment manufacturers, vehicle designers, and the military, making him clearly a "Jack-of-all-trades." And along with his physical development, Jack's cognitive capabilities are expanding. He can play "hide and seek" and engage copies of himself in limited conversation. Among the next steps in Jack's evolution are speech synthesis and understanding, nonverbal communication, and personality development.
Jack has enjoyed funding from numerous sources, including ARPA, NSF, Army, Air Force, ONR, NLM, and several industrial sponsors. The software is licensed commercially by the HMS Center and is available to the Penn community at no charge except for manuals and training costs. For more information, contact Karen Carter, Associate Director, HMS, Computer and Information Science, Moore Building, Philadelphia, PA 19103-6389 (215/898-1488) or visit Jack's home page on the Penn Web (http://www.cis.upenn.edu/~hms/jack.html).
NORMAN I. BADLER is Director of the Center for Human Modeling and Simulation in the School of Engineering and Applied Science.