Suggestion: use accelerometer data on mobile and use that to directly replace gravity. I expect to be able to tip the phone to drape the cloth, and shake the phone to get waves of motion.
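Something like the following could work in a browser-based demo. This is only a sketch: it assumes the sim reads a mutable `gravity` vector each step (the name is mine), and it uses the standard DeviceMotion API, which reports acceleration including gravity in the device frame, so tilting reorients the vector and shaking adds transient spikes. (On iOS you would also need to request motion permission first.)

```typescript
// Sketch only: `gravity` is a hypothetical global the cloth solver reads each step.
let gravity = { x: 0, y: -9.8 };

function enableTiltGravity(): void {
  window.addEventListener("devicemotion", (e: DeviceMotionEvent) => {
    const a = e.accelerationIncludingGravity;
    if (!a || a.x == null || a.y == null) return;
    // Map device axes into the sim's 2D world; signs depend on screen orientation.
    gravity = { x: -a.x, y: a.y };
  });
}
```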
I think the little tears were fine, but I didn't expect the weight of the cloth to make it start ripping on its own past a certain point. At that point it felt more like wet dough than cloth.
I wonder if cloth simulation could be integrated as a CAD primitive that somehow outputs reasonable BRep geometry?
Could you take an AI 3D scan of someone's face, virtually lay a heavy cloth over it, then add whatever you wanted to make a mask?
Could you make the deformed cloth surface into one side of a cube, where the other side was flat for easily working with it, and use that to make custom pseudo-vacuum-formed cases for things?
Or just stack up boxes and simple shapes, and use the cloth simulation to build organic looking industrial design within a more traditional CAD workflow?
I highly recommend watching the relevant section of that video (4:38 to 8:59) and then implementing it yourself in whatever system you know that can draw lines and circles (I did it in Godot; it took only a few minutes to learn enough Godot to start on the algorithm).
It's absolutely mind-blowing that so little code can produce such a beautiful result. It's also fun to play with the parameters and see how they affect how the cloth feels.
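For anyone who wants a starting point without watching first: the usual scheme in these demos is a grid of points integrated with Verlet (velocity is implicit in the last two positions) plus a few relaxation passes over distance constraints. A minimal sketch of that idea, with invented names and constants rather than anything taken from the video:

```typescript
interface Point {
  x: number; y: number;    // current position
  px: number; py: number;  // previous position (encodes velocity)
  pinned: boolean;         // pinned points (e.g. the top row) never move
}
interface Stick { a: Point; b: Point; length: number; }

const GRAVITY = 980;    // px/s^2, tune to taste
const ITERATIONS = 3;   // constraint relaxation passes per frame

function step(points: Point[], sticks: Stick[], dt: number): void {
  // Verlet integration: carry last frame's motion forward and add gravity.
  for (const p of points) {
    if (p.pinned) continue;
    const vx = p.x - p.px;
    const vy = p.y - p.py;
    p.px = p.x; p.py = p.y;
    p.x += vx;
    p.y += vy + GRAVITY * dt * dt;
  }
  // Repeatedly nudge each connected pair back toward its rest length.
  for (let i = 0; i < ITERATIONS; i++) {
    for (const s of sticks) {
      const dx = s.b.x - s.a.x, dy = s.b.y - s.a.y;
      const dist = Math.hypot(dx, dy) || 1e-6;
      const offset = (dist - s.length) / dist / 2;
      if (!s.a.pinned) { s.a.x += dx * offset; s.a.y += dy * offset; }
      if (!s.b.pinned) { s.b.x -= dx * offset; s.b.y -= dy * offset; }
    }
  }
}
```

Draw a line per stick and a circle per point and you have the whole demo; the parameters worth playing with are GRAVITY, ITERATIONS, and the grid spacing.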
Reminds me of a great video from not long ago that went over the main ideas behind weaving and knitting. Feels like you almost certainly have to keep some of those ideas in mind when doing a simulation like this. I'd be curious to read a breakdown of how this was made and how it incorporates the concepts that go into different fabrics.
AFAIK more advanced realism-focused cloth sims are still mostly bundles of spring constraints, and most fabric behaviors are encoded as different spring tolerances, forces, and friction.
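Roughly, the per-material knobs that make silk feel different from denim on top of the same spring solver look something like this (illustrative names and made-up values, not taken from any particular engine):

```typescript
interface FabricParams {
  structuralStiffness: number; // resists stretching along warp/weft links
  shearStiffness: number;      // resists diagonal skew
  bendStiffness: number;       // resists folding (silk low, denim high)
  damping: number;             // how fast motion dies out
  tearThreshold: number;       // stretch ratio at which a link breaks
}

const silk: FabricParams  = { structuralStiffness: 0.9, shearStiffness: 0.4, bendStiffness: 0.05, damping: 0.02, tearThreshold: 2.5 };
const denim: FabricParams = { structuralStiffness: 1.0, shearStiffness: 0.9, bendStiffness: 0.6,  damping: 0.05, tearThreshold: 4.0 };

// A link tears once it is stretched past the material's threshold.
function shouldTear(restLength: number, currentLength: number, fabric: FabricParams): boolean {
  return currentLength / restLength > fabric.tearThreshold;
}
```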
I made this a bit ago for fun and funnies to test the idea of tearaway ads. It's very much a prototype but still pretty satisfying (desktop only, but there's a gif on the repo).
Nice first approximation. The cloth has no momentum: a piece of cloth that would clearly swing down past vertical and then swing back up instead just damps out and stops at vertical.
The falling pieces also don't accelerate downward, which looks unnatural.
I was curious and was able to build something very similar quickly using Gemini 3 via Google AI Studio. Never would have imagined a few years ago how easy some of this has become to prototype.
This is great! The only part that broke the immersion (for me) was that the cloth bits fell at a constant rate - I'd expect them to accelerate due to gravity, and maybe flutter as they fell.
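One simple way to get both effects (just a sketch of the idea, not how this demo actually works): give each detached fragment a real velocity that gravity keeps accelerating, plus a small sideways drift to read as flutter.

```typescript
interface Fragment { x: number; y: number; vx: number; vy: number; }

const G = 980; // px/s^2

function updateFragment(f: Fragment, dt: number, t: number): void {
  f.vy += G * dt;                      // accelerate downward every frame
  f.x += f.vx * dt;
  f.y += f.vy * dt;
  f.x += Math.sin(t * 6 + f.y * 0.02); // cheap flutter: small oscillating drift
}
```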
Of course cloth sims of varying fidelity are everywhere. Even games have had cloth sims for decades at this point.
But doing it efficiently and with good results remains a research problem; pretty much every year at SIGGRAPH you see a couple of new papers on cloth sims. For example, this year we got this: https://youtu.be/d9TZhtXeMio
At least the solver seems faster (if not better) in later versions? P.S. My try at a 'flag in the wind' in Blender from around 2022:
https://0x0.st/s/aJ6DNj2pEHzRdBiscEIsbQ/KCsK.mp4
I do remember it took me all day to get somewhat realistic motion.
I'm going to assume it is "more than you think; not as much as you'd like" because I don't have the time to burn this morning to replicate your research.