CG at 8K resolution and 48 FPS; will it ever become possible on consumer hardware?

Animators on this forum are going to have a heart attack at this prospect.

Peter Jackson has made headway toward a possible 48 FPS standard for Hollywood VFX, and some high-end TVs at this year’s CES are showing off 8K resolution.

When you break down the math, it’s hard not to think that it may be ten years or more before consumer hardware can render animation (or even still images) at such massive resolutions and frame rates in a reasonable timeframe. 8K has 16 times the pixels of 1080p, which means that for Blender, the Cycles render engine would have to spit out images 16 times faster if one wanted to do it today (or it will be a very long wait, given generational gains as low as 4 percent from some of Intel’s recent chips).
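For anyone who wants to sanity-check that figure, here’s a quick back-of-the-envelope sketch (assuming 8K UHD at 7680x4320; DCI 8K is slightly wider):

```python
# Sanity check of the "16x" claim (resolutions are the common UHD definitions).
res_1080p = (1920, 1080)
res_8k = (7680, 4320)  # 8K UHD; DCI 8K would be 8192 x 4320

pixels_1080p = res_1080p[0] * res_1080p[1]  # 2,073,600 pixels
pixels_8k = res_8k[0] * res_8k[1]           # 33,177,600 pixels

print(pixels_8k / pixels_1080p)             # 16.0 -> 16x the pixels per frame
```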

Even if Cycles could be optimized to be 4 times faster in a future release, consumer hardware would still need a 4-fold jump to make 8K at 48 FPS viable for people who don’t have 10K in disposable income to throw at the highest-end boutique rendering machines. That hints we might be looking at a time beyond 2020 before even the big-name studios have render farms powerful enough to process such huge resolutions, and even then it would demand much denser storage for these massive files, much denser RAM, and the final release of a new generation of optical storage measured in terabytes that is still largely confined to the lab. Sure, there’s the possibility of using new algorithms to upscale normal 1080p content, but it might be quite difficult for any algorithm to approach a 16x resolution increase without major blur at the edges.
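To put the storage point into rough numbers (a sketch only; the uncompressed RGBA half-float frame format and two-hour runtime are my own assumptions):

```python
# Rough storage estimate for a feature-length film kept as raw 8K frames.
width, height = 7680, 4320
bytes_per_pixel = 4 * 2              # RGBA, 16-bit half float, no compression (assumed)
frame_bytes = width * height * bytes_per_pixel

runtime_s = 2 * 60 * 60              # assume a two-hour film
frames = runtime_s * 48              # at 48 fps

total_tb = frames * frame_bytes / 1e12
print(f"{frame_bytes / 1e6:.0f} MB per frame, ~{total_tb:.0f} TB total")
# -> roughly 265 MB per frame and ~92 TB before any compression,
#    which is why terabyte-class storage keeps coming up.
```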

Is this type of future for the average 3D user even possible, what would it take for us to get there, and will such things as 8K resolution at 48 FPS even take off?

As a shortsighted person, I can’t even see the pixels on my 1080p desktop monitor, even with glasses on. No idea why we would need more than 4K at a reasonable viewing distance. They’d be better off doing 60 fps than 8K :P

But if 8K really does take hold, I assume the render farms will just grow sixteenfold in size, and all will be well.

Jackson has millions to spare, and he’s a techno geek having a blast with the latest and shiniest toys. Although I applaud people pushing technology, there is a point where it’s a bit, well, pointless!

speaking of shortsighted:

Technology will always advance, first at the very expensive high end (Hollywood), then it will gradually trickle down to consumers. To imply any other outcome is very shortsighted.

why is this a topic of conversation?

With a 16-32x increase in computational demand (32x if you also go from 24 to 48 fps), the only plausible way for A DESKTOP PC to deal with it is probably through GPU advancements (GPGPU, or game-style rendering in some cases), unless some other multiprocessing contender (e.g. Xeon Phi) and software to take advantage of it all come along. Otherwise it remains a luxury of high-end production studios, who always had and will have the resources. Even so, I doubt the standard will catch on anytime soon in most movies (it bloats production costs too much), let alone TV. Quote from Wikipedia: “In the United States, 1080p over-the-air broadcasts still do not exist as of January 2015; all major networks use either 720p60 or 1080i60 encoded with MPEG-2.” - so that’s 15+ years after HD became relevant? In other words, my prediction is that none of us has to worry about it for at least a decade, as the graphics we output are for standard devices (PC, TV, mobile) where resolution will be limited by physical device size. If you work in a production studio, it’s no concern at all :) Furthermore, 8K could also just be a temporary marketing thing, as history is littered with tech and standards that failed to take off (such as 3D, several times).
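Just to spell out where that 16-32x range comes from (a simple sketch, assuming render cost scales roughly linearly with pixel count and frame count):

```python
# Demand multiplier when moving from 1080p to 8K, with and without a frame-rate bump.
def demand_factor(old_res, new_res, old_fps, new_fps):
    old_px = old_res[0] * old_res[1]
    new_px = new_res[0] * new_res[1]
    return (new_px / old_px) * (new_fps / old_fps)

print(demand_factor((1920, 1080), (7680, 4320), 24, 24))  # 16.0 (resolution only)
print(demand_factor((1920, 1080), (7680, 4320), 24, 48))  # 32.0 (plus 24 -> 48 fps)
```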

As a side note, I’ve always wondered when it will be possible to use Unreal 4.5+-like technology (maxed out) to actually render production graphics for video - the quality is there already. Let’s not forget that biased renderers (such as RenderMan) have dominated up until now; only now are we leaning towards full raytracing solutions. Biased renderers and game rendering technology are far less of a problem render-time-wise and would make sense at these high resolutions.

I don’t see what the problem is. In 10 years, the price of 8K technology will have come down enough for the average artist to comfortably afford it and do 8K rendering on whatever updated technology we’ll have then. I’m currently peering into a UHD 4K monitor that didn’t cost too much at all now that they’ve been out for a while, and running a rig that cost me somewhere between 4 and 5 thousand Shiny Limey Sterlings when I first started building it last summer but has since dropped to only half that value.

In ten years’ time we’ll see fewer CG artists comparing how their GPUs perform with Cycles and more comparing the performance of their very own render farms. Ten years ago, when Blender was still young and Cycles was still itching in its daddy’s pants and yet to be born, people would no doubt have wondered whether their rigs could render a single frame in Cycles in less than a million hours. Now the average for people on a budget seems to be about 10 minutes, on rigs that people ten years ago would have seen as pure technoVoodoo, wondering what kind of ungodly blasphemy had infested their PCs to be able to do such things.

People were probably asking the same about colour television back when everyone was used to black & white, wondering whether adding colour to these moving-image shows was worthwhile or pointless. Nowadays, people with a B&W TV set are about as rare as rocking horse sh!t. It’ll take off. Quicker than a ferret with a rocket up its backside. Look at the way the average pillock queues up around the corner for the latest gimmicky must-have iPhone that’s no different from the previous model. Now people use them to talk to people around the world instead of a CB radio. With 8K TV sets and monitors released for general consumption, the dog will hunt.

Until recently, the rate at which resolution was increasing roughly matched the pace at which consumer PCs were gaining performance. Now resolutions are again making major leaps in an era where silicon-based computing looks to be hitting a wall (the performance gains of Intel’s latest generations of chips being much smaller than in the past, for instance).

The only way to really move computing forward right now (so as to make 4K/8K animation and the like feasible for a single person and their PC) would be a fundamental transformation of the CPU away from silicon transistors to new types such as optical or graphene-based ones, which have long remained confined to the research phase and are still not realized as consumer products. It would indeed be a fundamental transformation, because it would mean moving away from what computers have been built on for over 50 years.

Once we do make that transformation, machines with the performance needed for these things will get here; but the big leaps we used to see won’t return until we get computing off of silicon. Even then, the massive jump in performance is not going to happen overnight - it might be a bit more like the current slow transition away from the Edison bulb for lighting.

I personally think the technology push is getting a bit out of hand, and Peter Jackson is doing the pushing. During the test viewings of the first Hobbit film, the reviews of 48 fps were quite bad. And still, all the Hobbit movies were shot at 48 fps. Why? Why do we need 8K? Is it truly because it gives a better viewing experience? I find it pretty lame when the innovation in a VFX movie is that the resolution and framerate are higher than in the previous one. I’m done with Peter Jackson!

While 8K, 48 fps content is currently possible, the infrastructure to deliver that content to homes, at least here in the US, is a bit more problematic. Streaming 4K content requires a connection capable of delivering 15-20 Mbps per device; uncompressed 8K content runs to roughly 24 Gbps. The average connection speed in the US is 7.6 Mbps.
This isn’t predictive regarding the adoption of 8K, but it does suggest that there are obstacles beyond content creation.
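For anyone curious how those numbers break down, here’s a rough sketch; the chroma-subsampling options and the scaled streaming estimate are my own assumptions:

```python
# Raw 8K/48fps bit rate under a few common chroma-subsampling assumptions.
width, height, fps = 7680, 4320, 48

for name, bits_per_px in [("4:4:4 8-bit", 24), ("4:2:2 8-bit", 16), ("4:2:0 8-bit", 12)]:
    gbps = width * height * bits_per_px * fps / 1e9
    print(f"{name}: ~{gbps:.0f} Gbps uncompressed")  # ~38 / ~25 / ~19 Gbps

# Scaling the quoted 15-20 Mbps for streamed 4K by 4x the pixels and 2x the frames
# (and ignoring codec improvements) puts a streamed 8K/48fps feed at very roughly
# 120-160 Mbps -- still far beyond a 7.6 Mbps average connection.
print(f"guesstimated 8K stream: {15 * 4 * 2}-{20 * 4 * 2} Mbps")
```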

Will it ever be possible? Of course, that’s a silly question. Moore’s Law and all that. The neverending march of technology. Our kids will laugh at what we were forced to render on, just like we laugh at the render systems of the 80s and early 90s.

Is it possible right now? No, of course not. I’d argue that 720p is BARELY possible at high qualities on consumer hardware. Hollywood doesn’t spend money on renderfarms just because it’s fun. It’s because it’s a necessity. Your piddling 4-16 thread home PC can’t possibly hope to keep up, and that’s been the case forever.

CG at 8K resolution and 48 FPS; will it ever become possible on consumer hardware?

You forgot stereo, so the hardware requirements become doubly insane… by today’s standards. Of course it will become possible some day, if people need/want it. I think it’s weird to compare what Peter Jackson is doing with what a normal consumer PC can do. Even a high-end workstation today (48 threads, 256 GB RAM) would take a long time to render 5 minutes of Avatar in 2K stereo (if at all), so forget about gamer PCs. Today.

Sony may have jumped the shark when they proclaimed that the PS2 could render Toy Story in realtime (in PAL), which obviously wasn’t true, but the PS4 certainly can. Technology moves forward because we can, and the side effect is extreme consumerism, which requires more content creation, which needs faster hardware, and so on - and CG artists around the world cry in pain. That said, I have experienced 8K in Japan and it was much better than I could have imagined, so I’m definitely a fan. Japan also plans to broadcast the 2020 Olympics in 8K, by which time TVs with that resolution will be more common in Asia (not so much the rest of the world, I think). VR will push the 8K realtime side of things around that time, and offline rendering will still take hours per frame for Avatar 4…

What is the point of 8K on a normal home screen? At about 2K, the pixels are already so small that, unless you sit right in front of the monitor, you can never see them. There is no need to go beyond the capabilities of human senses.

This is also true for cinema: try to see a pixel on an IMAX screen at 4K - good luck with that. Now imagine 8K on a 60-inch screen at home; it’s a bit redundant!

There is a limit to what we can perceive, and current technology has pretty much taken care of that!

I’d rather see advancements in technologies like 3D, or even holograms. Harvard and MIT are even working on hard-light technology; maybe we’ll be able to render solid 3D holographic pictures with Blender in 10 years, who knows. For me at least, this is way more interesting than adding more resolution to movies.

And yeah, I believe that technologies like MMP are the way to go, at least for the near future!

This ultrasound-based tech is a bit further along.

You apparently still need a screen to ‘show’ the shape, but it’s a lot closer to holodecks than where we were before.

Yea, just wait for the quantum computers. Imagine rendering with Blender using a quantum computer.

Where does this thinking come from that says it should even be possible to produce that level of animation quality solo on one computer? More importantly, did you think of the animators’ time it will take to make those frames look good at this resolution and frame rate? Computer time will be negligible compared to that. You can probably churn out empty black 8K frames in realtime right now if you’d like.

I’m not that optimistic. Look at what NVIDIA from 10 years ago imagined the future of hardware to be like:

(image credit: @fedyac)

The increasing problems in moving to ever-smaller manufacturing techniques are a sign that Moore’s Law is going to peter out sooner rather than later. Technology will settle at a point which is reasonable from an economic standpoint.

Let me again stress that up until now, BIASED rendering has dominated the market, as raytracing was too slow/expensive. The lines between realtime entertainment and VFX are constantly edging closer, and the GPU is the best bet we have (utilizing highly parallel processing).

Today 4K gaming is becoming a topic, and game-engine quality is already quite excellent at 30-60+ fps with biased GPU-based rendering (not raytracing like Cycles). You can imagine 8K is not out of reach at all in the near future (even now). On one desktop PC you would want one frame of animation to render in under 2-5 minutes; that means you can increase the engine’s rendering quality and complexity by ten to several hundred times and still get beautiful, plausible 8K renders with that approach.
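To make that frame-budget argument concrete, here’s a small sketch; the assumed one-second 8K game-engine frame time is purely a guess:

```python
# How much extra quality headroom a 2-5 minute/frame budget buys over a realtime pass.
realtime_8k_frame_s = 1.0                      # assumed: engine draws one 8K frame in ~1 s
budget_low_s, budget_high_s = 2 * 60, 5 * 60   # the 2-5 minute per-frame budget

print(budget_low_s / realtime_8k_frame_s,      # 120x
      budget_high_s / realtime_8k_frame_s)     # 300x
# i.e. a couple of orders of magnitude more shading/sampling/AA work per frame than
# realtime, which is where the headroom for "beautiful, plausible 8K renders" comes from.
```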

I have actually used such an approach with Maya VP2.0 to get away with cinematic rendering on a very tight deadline. With some passes and heavy compositing, it really saved my skin. I have also tried CryEngine and Unreal 4.5, however their mesh importing is extremely slow, and the pipeline setup would have been a pain in the neck, with some data being very hard to export.

Game engines and traditional 3D DCC apps are also somewhat merging and beginning to overlap. Blender is an almost ideal application in that sense, with its own game engine/renderer integrated, so data sharing is not a problem (unlike with external engines). All you need is quality.

8K images of UE renderings:
http://images.gamersyde.com/image_unreal_engine_4-21778-2539_0001.jpg
http://cdn.overclock.net/7/7f/7fbc2424_iyG8WBQuyOlMc.jpeg
http://i6.minus.com/iFInQAkeisfL1.jpg (cryengine)

You should take into account that the hardware market is developing fast, very fast, so GPUs and CPUs that are $1000 now cost a few hundred in half a year. Right now there is no way you can run 8K without a high-end SLI or Crossfire solution. But wait: soon 4K will become normal and 6K will be the top end, so by the end of 2015, 8K will be getting affordable.

But you have to take into account that sooner or later even GPUs such as the Titan Z will become weak, low-end GPUs, because some other GPU will outrun them.

The same goes for CPUs, but in CG the CPU does not matter too much (a 2-core 3 GHz chip is fine).

This looks indeed promising, thanks for the link.

BeerBaron, it is a well-known fact that CPUs are getting to a point where they just can’t break a certain physical barrier, which is why CPU makers are adding more and more cores to circumvent the problem - just think of the new Xeon with 18 cores and 36 threads as a good example.

MMP, which is actually an old technology, is the way of the future, at least for the foreseeable future, which is why I am hoping that Cycles GPU development will continue; otherwise we’ll be seeing a plateau in rendering power, and real soon!