Since when was the discussion purely feature films? Our little studio can't afford racks full of Xeon CPU render nodes, so we do as GP said: use Redshift etc. and just have a ton of GPUs.
Because the commenter I replied to said that only old school artists render on the CPU.
People pick the renderer that suits their needs; it's not just a matter of old school people picking CPU path tracers because that's what they know. You picked GPU rendering because, as you said, you're a smaller studio. That suits your needs. It's not "old school" for big studios to use CPU rendering, however; GPU renderers just can't handle the required scene complexity yet.
The reason most artists strongly prefer GPUs is that it gets their cycle time down. Farms often still use CPUs because of cost and because there's less pressure on single-image render time.
Usually one breaks a shot down into individual elements rather than rendering everything at once, and then composites them together. Rendering it all in one go can prevent touch-ups to individual elements. But I am much more VFX than pure animation; I think animation does more one-shot, one-render work.
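For anyone unfamiliar with how those separately rendered elements get recombined, here's a minimal sketch of the standard Porter-Duff "over" operator, assuming premultiplied-alpha RGBA images as NumPy arrays (the element/background names are just for illustration):

```python
import numpy as np

def over(fg: np.ndarray, bg: np.ndarray) -> np.ndarray:
    """Composite a premultiplied-alpha RGBA foreground over a background."""
    alpha = fg[..., 3:4]            # foreground coverage per pixel
    return fg + bg * (1.0 - alpha)  # Porter-Duff "over"

# Two tiny 2x2 "renders": an opaque red element and a blue background plate.
element = np.zeros((2, 2, 4)); element[...] = [1.0, 0.0, 0.0, 1.0]
background = np.zeros((2, 2, 4)); background[...] = [0.0, 0.0, 1.0, 1.0]

comp = over(element, background)  # element wins wherever its alpha is 1
```

The point of the workflow is that if the red element needs a touch-up, only that one layer gets re-rendered and the comp is rebuilt, instead of re-rendering the whole frame.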
Funny thing I mostly know what I am talking about.
> Funny thing I mostly know what I am talking about.
However, you are only talking about your subjective experience and projecting it onto the entire industry.
I've worked in both animated features and VFX at a few of the bigger studios. What sort of studio do you work at?
I'd suggest looking at the landscape of big feature films and seeing what studios are using GPU rendering. It's rare to see any of the big studios using GPU rendering, not just because of existing farm hardware, but because none of the current GPUs and GPU renderers can handle the larger scenes required, and they also lose out on shader functionality such as full OSL support, among other things.
If you look at the ACM breakdown of production renderers in their special issue on rendering, there's not a single GPU-focused renderer there.
Individual artists may want GPU rendering. However, your statement that only old-school people use CPU rendering simply ignores the realities of production.
I wrote and sold software into the VFX industry, most notably Deadline, a fairly popular render manager. I also wrote/sold another dozen or so tools into the industry.
While CPU renderers are popular among the old crowd, no one wants to use them, because they are super slow: the cycle time is brutal and it kills artist productivity. Artists want Redshift, Octane, GPU Cycles, and hybrid Pixar RenderMan -- basically anything with multi-GPU support. Wherever they can, artists want GPU-based renders because they are fast.
Large feature-film shops are laggards in adopting new technology and often have a ton of workflows/plugins that cannot easily be changed. They have their tried-and-true pipeline. But up-and-coming studios tend to be nearly completely GPU-based, because it cuts costs, even if it limits things a bit (though much less than before).
This is also why you see Unreal Engine getting into architectural rendering -- again because it is a much nicer workflow than waiting around for a few hours for V-Ray to finish up.
Many children's TV shows are now GPU-rendered right out of game engines, partly because of the cycle time and partly because of the lower visual complexity of the scenes.
So I think we are both right. Large studios are using CPU-based rendering -- you are correct. Artists and more nimble studios are using GPU-based rendering as much as they can, because it reduces cycle time at roughly equivalent cost (well, except for the last two years, with crypto screwing up GPU prices).
Remember, the original statement you took offence at was:
"3D artists these days want their render boxes to be filled with gpus so they can use cycles or redshift or octane to render fast."
and
"The reason most artists strongly prefer GPUs is that it gets their cycle time down. Farms often still use CPUs because of cost and because there's less pressure on single-image render time."
I was speaking about what artists want, and I am absolutely correct about that. And what artists want will filter into the rest of the industry -- although a bit more slowly now because of stupid GPU prices.
That is not the part of your statement I was saying is incorrect. The part I still say you're wrong about is the claim that people who use CPU renderers do so because they're old school. Claiming that ignores the realities of how GPU renderers handle large scenes.
You again try to say the big studios are laggards in adopting tech and have tired pipelines. That's not why they're using CPU rendering; it's because GPU renderers didn't scale to meet their needs.
A lot of the big VFX studios are renderer-agnostic (e.g. ILM) and would have no issue adopting GPU renderers if they met their needs.
You're continuing to project your subjective (and frankly incorrect) opinion that GPU renderers have already superseded CPU renderers onto the industry, without understanding the limitations involved.
You even mention "hybrid RenderMan", by which I think you mean XPU, but that's yet another example of a GPU renderer that isn't at parity with its CPU counterpart. The same goes for Arnold GPU, etc.
"I was speaking about what artists want" is fine, but you're also dismissing the very real reasons people are still on CPU renderers by writing it off as a matter of legacy.
Okay, but you presented it as a whole statement, and I clarified multiple times what I was contradicting in your post. I also presented a nuanced argument based on real-world usage, whereas you went off projecting your uninformed opinions as fact.