In Partial Fulfillment of the Requirements for the Degree of
Doctor of Philosophy
will present his pre-defense
Animating large-scale crowds has emerged in recent years as a challenging problem in movie, game, virtual training, and education applications. Traditional crowd simulation approaches typically focus on navigational path-finding (waypoint generation) and local perception (collision avoidance) using simple steering rules and social forces, which can successfully synthesize the local movement trajectory of each agent within a crowd. However, relatively few existing efforts explore how to optimally control individual agents' detailed motions and the global strategic formations of a crowd. This project proposes a hierarchical data-driven scheme for embedding realistic motion variations and group formation behaviors into crowd simulations. In the low-level simulation, the key idea for adding variation to agent motions is to dynamically control each agent's motion style by maximizing the style variety among local neighbors and the utilization of all available styles, while keeping each agent's own style as consistent and natural as possible. In the high-level formation control, instead of hard-coding formation topology as key-frames or scripts, the global configurations of different group behaviors are parameterized models learned from real-world example data, which capture more sophisticated dynamics than heuristic rules.
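The low-level idea of trading off neighbor style variety, global style utilization, and per-agent consistency can be sketched as a greedy scoring pass over the agents. This is a hypothetical illustration, not the dissertation's actual algorithm: the function name `assign_styles`, the three weight parameters, and the linear scoring rule are all assumptions made for the sake of the example.

```python
def assign_styles(agents, styles, neighbors, prev_style,
                  w_local=1.0, w_global=0.5, w_consist=2.0):
    """Greedily pick a motion style for each agent.

    Each candidate style is scored by:
      - variety:  how many of the agent's neighbors currently use a
                  different style (encourages local variation),
      - underuse: negative global usage count of the style
                  (encourages universal style utilization),
      - consistency: zero if the agent keeps its previous style,
                  a penalty otherwise (discourages jarring switches).
    The weights w_local, w_global, w_consist balance the three terms.
    """
    usage = {s: 0 for s in styles}
    for s in prev_style.values():
        usage[s] += 1

    assignment = {}
    for a in agents:
        best_style, best_score = None, float("-inf")
        for s in styles:
            # Neighbors already processed use their new style; others keep the old one.
            neigh_styles = [assignment.get(n, prev_style[n]) for n in neighbors[a]]
            variety = sum(1 for t in neigh_styles if t != s)
            underuse = -usage[s]
            consistency = 0.0 if s == prev_style[a] else -1.0
            score = (w_local * variety + w_global * underuse
                     + w_consist * consistency)
            if score > best_score:
                best_style, best_score = s, score
        # Update global usage counts to reflect the new choice.
        usage[prev_style[a]] -= 1
        usage[best_style] += 1
        assignment[a] = best_style
    return assignment
```

For instance, if three mutually adjacent agents all start in the same style, a low consistency weight lets some of them switch so that local variety rises, while a high consistency weight keeps the crowd stable between frames.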