SIGGRAPH 2024 CG research trailer
SIGGRAPH has released its annual trailer previewing the research to be presented at this year’s conference.
Among the many interesting papers are a progressive simulation method for cloth and thin shells, and a wildfire simulation that incorporates convection, combustion, and heat transfer. Fluid dynamics research will be represented by a new LRAN fluid solver, along with new multiphase-flow techniques that handle complex fluids more effectively. There is also a position-based dynamics approach for simulating quasi-static hyperelastic materials.
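For context on that last item: position-based dynamics predicts particle positions and then iteratively projects them onto constraints, rather than integrating forces directly. The sketch below is a minimal, textbook illustration of that loop (a single distance constraint in plain NumPy); the function names and parameters are placeholders of our own, and this is not the quasi-static hyperelastic formulation from the paper.

```python
import numpy as np

def project_distance_constraint(p1, p2, rest_length, w1, w2, stiffness=1.0):
    """Project one distance constraint, textbook PBD style.

    p1, p2      -- particle positions, arrays of shape (3,)
    rest_length -- target distance between the two particles
    w1, w2      -- inverse masses (0 for pinned particles)
    """
    d = p2 - p1
    dist = np.linalg.norm(d)
    if dist < 1e-9 or (w1 + w2) == 0.0:
        return p1, p2
    n = d / dist
    c = dist - rest_length                       # constraint violation
    corr = stiffness * c / (w1 + w2) * n
    return p1 + w1 * corr, p2 - w2 * corr

def pbd_step(x, v, inv_mass, edges, rest_lengths, dt, iters=10,
             gravity=np.array([0.0, -9.81, 0.0])):
    """One position-based dynamics step: predict, project, update velocities."""
    x_pred = x + dt * v + dt * dt * gravity * (inv_mass > 0.0)[:, None]
    for _ in range(iters):
        for (i, j), r in zip(edges, rest_lengths):
            x_pred[i], x_pred[j] = project_distance_constraint(
                x_pred[i], x_pred[j], r, inv_mass[i], inv_mass[j])
    v_new = (x_pred - x) / dt
    return x_pred, v_new

# Toy usage: one pinned particle, one free particle hanging below it.
x = np.array([[0.0, 0.0, 0.0], [0.0, -1.0, 0.0]])
v = np.zeros_like(x)
inv_mass = np.array([0.0, 1.0])                  # first particle is pinned
x, v = pbd_step(x, v, inv_mass, edges=[(0, 1)], rest_lengths=[1.0], dt=1 / 60)
```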
In the modeling and manufacturing area, a new mesh simplification algorithm balances accuracy, triangle quality, and feature alignment, and there’s a new design system for crafting intricate free-form surfaces from fabric through smocking. A new method for generating surface-filling curves on meshes aids robotic path planning for tasks such as painting, and inflatable surfaces can now be created from inextensible sheets of material.
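As a rough illustration of the “accuracy” term that simplification algorithms balance, the classic Garland–Heckbert quadric error metric scores how far a collapsed vertex would drift from the original surface. The snippet below sketches that standard metric in NumPy; it is offered for orientation only and is not the new algorithm shown in the trailer.

```python
import numpy as np

def face_quadric(v0, v1, v2):
    """Fundamental error quadric of a triangle's supporting plane (Garland-Heckbert)."""
    n = np.cross(v1 - v0, v2 - v0)
    norm = np.linalg.norm(n)
    if norm < 1e-12:
        return np.zeros((4, 4))                  # degenerate triangle contributes nothing
    n = n / norm
    p = np.append(n, -np.dot(n, v0))             # plane [a, b, c, d] with ax + by + cz + d = 0
    return np.outer(p, p)                        # 4x4 quadric Q = p p^T

def collapse_error(Q_sum, x):
    """Squared-distance error of placing a collapsed vertex at x, given summed endpoint quadrics."""
    xh = np.append(x, 1.0)
    return float(xh @ Q_sum @ xh)

# Toy usage: collapsing an edge of a flat triangle to its midpoint costs ~0.
v = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
Q = face_quadric(v[0], v[1], v[2])
print(collapse_error(Q + Q, (v[0] + v[1]) / 2.0))
```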
It goes without saying that there’s a heavy neural-network and AI presence this year, including neural style transfer that applies 2D image styles to 3D meshes, even in dynamic simulations. Neural fields have been extended to encode discrete surface geometry, and expressive facial animations can now be generated from a single portrait. High-resolution frames can be generated from low-resolution inputs in real time using neural networks, and 3D NeRF generation combines text descriptions with sketches for photorealistic results. A GPT-based architecture generates sewing patterns and textures them using Stable Diffusion.
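A quick note on terminology: a neural field is simply a coordinate-based network that maps positions to values such as signed distance or color. The toy sketch below (random, untrained weights, plain NumPy) shows only that structure; the actual papers train and extend such fields in ways the trailer does not detail.

```python
import numpy as np

rng = np.random.default_rng(0)

# A "neural field" is a coordinate-based MLP: here it maps a 3D point to one
# scalar (e.g. a signed distance). Weights are random purely to show the
# structure; the papers in the trailer train such fields on real geometry.
sizes = [3, 64, 64, 1]
weights = [rng.normal(0.0, np.sqrt(2.0 / m), size=(m, n))
           for m, n in zip(sizes[:-1], sizes[1:])]
biases = [np.zeros(n) for n in sizes[1:]]

def neural_field(points):
    """Evaluate the field at an (N, 3) array of points -> (N,) scalar values."""
    h = points
    for W, b in zip(weights[:-1], biases[:-1]):
        h = np.maximum(h @ W + b, 0.0)           # ReLU hidden layers
    return (h @ weights[-1] + biases[-1]).ravel()

# Query the (untrained) field at a couple of sample points.
print(neural_field(np.array([[0.0, 0.0, 0.0], [0.5, -0.2, 0.1]])))
```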
Research in robotics and animation includes a method for generating stylized walks for physical robots, and a computational design pipeline for soft robots that optimizes their shapes for specific tasks. A character-control framework based on a diffusion model allows for real-time animation, while another diffusion model facilitates world-building.
Texturing and painting techniques are also covered, with a diffusion model that allows for painting textures on UV-mapped 3D surfaces. Hand-drawn trajectories can now guide animations generated from a still image.
Finally, research into hair and fluid simulation includes real-time simulation of complex hair, achieved by interpolating from a small set of simulated guide hairs, and enhanced simulation of magnetic ferrofluids using an induce-on-boundary solver.
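For readers curious how guide-hair schemes work in general: only a sparse set of strands is simulated, and the full groom is reconstructed by blending nearby guides. The snippet below is a bare-bones weighted blend with hypothetical parameter names; the paper’s real-time method adds considerably more than this.

```python
import numpy as np

def interpolate_strand(guide_strands, weights):
    """Blend one dense render strand from a few simulated guide strands.

    guide_strands -- shape (k, n_points, 3): k guide strands, each with
                     n_points vertices (only these are actually simulated)
    weights       -- shape (k,): blend weights for this render strand,
                     e.g. based on scalp-space distance to each guide
    """
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return np.tensordot(w, guide_strands, axes=(0, 0))   # -> (n_points, 3)

# Toy usage: a render strand halfway between two straight guide strands.
guides = np.stack([
    np.column_stack([np.zeros(5), -np.arange(5.0), np.zeros(5)]),
    np.column_stack([np.ones(5), -np.arange(5.0), np.zeros(5)]),
])
print(interpolate_strand(guides, [0.5, 0.5]))
```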
You can watch the trailer on YouTube and find out more on the SIGGRAPH website.
This is why I love this field. The inspirational work in these papers stands out amidst a sea of shit releases, letting us breathe some fresh air through new research.
It’s nice to see AI growing together with traditional methods. Exciting times.