Encounter at a Procedural Android Hive City
https://youtu.be/07Ik2ZIx2lM?si=Wugtg8dIUpCey9wG
https://redd.it/1nwt6pw
@proceduralgeneration
YouTube
Infinicity - Android Hive City Encounter
The androids soak up natural resources, stacking them in the form of cities; whether to mimic or mock their creators is yet to be determined.
How to do metaball-like morphing with procedural meshes
Hey, I know questions aren't really posted here much, but I was hoping to get some thoughts from folks.
I am doing some mesh generation of characters/creatures that can be animated; the closest thing I can think of is something like Spore, where you have mesh parts that can be connected together: body, legs, heads, claws, etc.
The parts for me are defined as a center spline with loops of splines going around it that define the shape and size at any given point along the central spline. This makes it easy to generate a decently high quality mesh for a given part by walking the center spline and sampling the spline loops to create edge loops.
The issue with this approach of course comes when trying to combine the meshes of parts, like a leg with a body, or toes on a paw, or spikes/horns/fur clumps with anything else. I want them to morph like metaballs so it looks like a single solid mesh and more natural.
I thought about just using metaballs and remeshing, but that produces poor topology and poor edge loops.
I've been looking at different research papers that have been published, but nothing has really come up that feels like it could be adapted.
Converting my splines to SDFs shouldn't be too hard, so that can be sampled at least. But even with that, I haven't been able to figure out how you would approach actually connecting up the loops/parts.
Anyone here have any ideas for different approaches I could use or maybe things to look up?
(For clarification, I just mean the concept of creating creatures with parts is like Spore)
https://redd.it/1nxywuw
@proceduralgeneration
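One standard trick for the metaball-like merging asked about above is to combine the parts' SDFs with a smooth minimum instead of a hard `min`, which inflates the junction into a continuous blend. A minimal sketch, assuming the parts have already been converted to SDFs; the sphere SDFs and the quadratic smooth-min are illustrative stand-ins, not the poster's actual spline-based parts:

```python
import math

def sdf_sphere(p, center, radius):
    """Signed distance from point p to a sphere (negative inside)."""
    return math.dist(p, center) - radius

def smooth_min(a, b, k):
    """Polynomial smooth minimum: blends two distances over a band of width k."""
    h = max(k - abs(a - b), 0.0) / k
    return min(a, b) - h * h * k * 0.25

def blended_sdf(p):
    """Union of a 'body' and a 'leg' sphere with a soft junction."""
    body = sdf_sphere(p, (0.0, 0.0, 0.0), 1.0)
    leg = sdf_sphere(p, (1.2, 0.0, 0.0), 0.5)
    return smooth_min(body, leg, 0.4)

# Far from the junction the blend behaves like a plain union:
# blended_sdf((0, 0, 0)) == -1.0 (deep inside the body sphere).
# Near the junction the surface bulges outward: at (0.85, 0, 0) both
# part distances are -0.15, but the blended distance is -0.25, so the
# point lies further inside the merged surface than inside either part.
```

This gives the organic merge without remeshing, but it does not by itself solve the topology problem: one would still need to march or re-project the existing edge loops onto the blended surface near the junction to keep clean quads there.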
Organ-based damage system with procedural body destruction
https://redd.it/1nyrgig
@proceduralgeneration
EvoMUSART 2026: 15th International Conference on Artificial Intelligence in Music, Sound, Art and Design
The 15th International Conference on Artificial Intelligence in Music, Sound, Art and Design (EvoMUSART 2026) will take place 8–10 April 2026 in Toulouse, France, as part of the evo* event.
We are inviting submissions on the application of computational design and AI to creative domains, including music, sound, visual art, architecture, video, games, poetry, and design.
EvoMUSART brings together researchers and practitioners at the intersection of computational methods and creativity. It offers a platform to present, promote, and discuss work that applies neural networks, evolutionary computation, swarm intelligence, alife, and other AI techniques in artistic and design contexts.
Submission deadline: 1 November 2025
Location: Toulouse, France
Details: https://www.evostar.org/2026/evomusart/
Flyer: http://www.evostar.org/2026/flyers/evomusart
Previous papers: https://evomusart-index.dei.uc.pt
We look forward to seeing you in Toulouse!
https://preview.redd.it/9f1wlp3auitf1.png?width=4167&format=png&auto=webp&s=2678237f805dceaa63f67e175f1dad717d57b34b
https://redd.it/1nzoisv
@proceduralgeneration
Was looking for a bug in my generator that did not exist
https://redd.it/1nzwopd
@proceduralgeneration
Chunk loading system for procedural terrain - looking for LOD strategies
https://redd.it/1o0bpqn
@proceduralgeneration
Terrain generation system inspired by LayeredProcGen, featuring raymarched terrain shadows, LOD stitching etc.
https://redd.it/1o18vb9
@proceduralgeneration