In Partial Fulfillment of the Requirements for the Degree of
Doctor of Philosophy
Will defend his dissertation
Animating large-scale crowds has emerged in recent years as a challenging problem in film, game, virtual training, and education applications. Traditional crowd simulation approaches typically focus on navigational path-finding (waypoint generation) and local perception (collision avoidance) using simple steering rules and social forces, which can successfully synthesize the local movement trajectory of each agent within a crowd. However, relatively few existing efforts explore how to optimally control individual agents' detailed motions and global strategic formations throughout a crowd. The presented research proposes an agent-based framework for embedding realistic motion variations and group formation behaviors into crowd simulations. In the low-level simulation, the key idea for adding motion variation is to dynamically control each agent's motion style, maximizing both the style variety among local neighbors and the utilization of all available styles, while keeping each agent's own style as consistent and natural as possible. In the high-level formation control, instead of hard-coding formation topology as key-frames or scripts, the global configurations of different group formations are parameterized from arbitrary user sketches, which are interactively specified during the simulations.
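The low-level style-assignment idea described above can be illustrated with a minimal greedy sketch. This is not the dissertation's actual algorithm; the style names, weights, and cost terms below are hypothetical stand-ins for the abstract's three criteria: local style variety, universal style utilization, and per-agent style consistency.

```python
from collections import Counter

# Hypothetical motion-style set and neighborhood radius (assumptions, not from the source).
STYLES = ["walk_casual", "walk_brisk", "stroll", "march"]
NEIGHBOR_RADIUS = 2.0

def neighbors(agents, i, radius=NEIGHBOR_RADIUS):
    """Agents within `radius` of agent i (simple O(n) query for illustration)."""
    ax, ay = agents[i]["pos"]
    return [a for j, a in enumerate(agents)
            if j != i and (a["pos"][0] - ax) ** 2 + (a["pos"][1] - ay) ** 2 <= radius ** 2]

def assign_styles(agents, switch_penalty=1.5):
    """Greedily pick each agent's style to maximize local variety and
    global style utilization, penalizing switches away from the current style."""
    global_use = Counter(a["style"] for a in agents if a["style"])
    for i, agent in enumerate(agents):
        local_use = Counter(n["style"] for n in neighbors(agents, i) if n["style"])

        def cost(s):
            # Styles common among neighbors or globally overused cost more;
            # changing an agent's existing style costs a consistency penalty.
            c = 2.0 * local_use[s] + 0.5 * global_use[s]
            if agent["style"] and s != agent["style"]:
                c += switch_penalty
            return c

        best = min(STYLES, key=cost)
        if agent["style"]:
            global_use[agent["style"]] -= 1
        agent["style"] = best
        global_use[best] += 1
    return agents
```

Running this on a few agents shows the intended effect: nearby agents are pushed toward distinct styles, while isolated agents receive whichever style is globally least used.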