If you want to learn how and why stable diffusion works: https://www.paepper.com/blog/posts/how-and-why-stable-diffus...
> at market price $600k
I don’t think they are really spending $600k. Chances are they own a GPU cluster, so they don’t need to pay the cloud premium.
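A rough sketch of where a "market price" number like that could come from, and why owning hardware changes it (the GPU-hour count and per-hour rates below are placeholder assumptions, not figures from this thread):

```python
# Back-of-the-envelope training cost: GPU-hours multiplied by a per-hour rate.
# All numbers here are illustrative assumptions.
gpu_hours = 150_000     # assumed total A100 GPU-hours for one training run
cloud_rate = 4.00       # assumed on-demand cloud price per A100-hour (USD)
owned_rate = 1.50       # assumed amortized cost per hour on owned hardware (USD)

print(f"Cloud estimate:  ${gpu_hours * cloud_rate:,.0f}")
print(f"Owned hardware:  ${gpu_hours * owned_rate:,.0f}")
```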
Could training be crowdsourced among consumer GPUs, like Folding@home?
Is that relatively cheap or expensive?
I have no context for this; do other model developers publish how much it cost to train them?