GPU rendering does not work on OS X - “CUDA error at cuCtxCreate: Out of memory”

I am trying to set up GPU rendering in Blender.

  • Blender 2.73
  • OS X 10.9.4
  • Memory: 32 GB 1600 MHz DDR3
  • Graphics card: NVIDIA GeForce GTX 660M 512 MB

I installed the latest NVIDIA driver from the NVIDIA website. After restarting Blender, the GPU is now available in the settings, but when I switch to Rendered shading in the 3D View it does not work, and I see the message “CUDA error at cuCtxCreate: Out of memory”.
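For reference, the settings I enabled correspond roughly to this in Blender's Python console (just a sketch, assuming the 2.7x preference names):

```python
import bpy

prefs = bpy.context.user_preferences.system

# Pick CUDA as the compute device type (2.7x preference names assumed;
# later Blender versions moved this into the Cycles add-on preferences)
prefs.compute_device_type = 'CUDA'
print(prefs.compute_device)              # e.g. 'CUDA_0' if the GTX 660M was detected

# Tell Cycles to use the GPU for this scene
bpy.context.scene.cycles.device = 'GPU'
```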


It also does not work if I press “Render” or “Animation” in the Properties panel; the same message appears:


What could be the cause?

  • Memory cache limit at maximum - 30098
  • Feature Set - Supported

How can I fix it?
Thanks in advance!

Your graphics card has very little memory. You may not have enough for your scene.

Does it render on CPU?
Does it render the default Blender scene?

I am on a Mac too, with 32 GB RAM; my NVIDIA card has 2 GB, and still I often get out-of-memory errors when rendering on the GPU. It began with Blender 2.70; before that I could render almost everything on the GPU. I was told that scenes now need more memory on the graphics card, since something in Cycles changed…
Still, everything renders on the CPU, since 32 GB of RAM is plenty…

Richard Marklew
Yes, CPU works fine (but takes longer :))
No, the GPU does not render the default Blender scene, and this message appears again!

Doris
Yes, it looks like I have the same problem, but in my case the GPU does not work at all, even with the default Blender scene.

In my case a scene does not render on the GPU if it uses more than 1.3 GB to render on the CPU. Even just 1.3 GB does not render on the GPU, which kind of implies that GPU rendering needs at least 0.7 GB more memory than CPU… so if you have only 0.5 GB, that might not be enough to render anything on the GPU at all, sadly… I am angry about this too, as I bought the larger card instead of the 1 GB card to be safe for rendering… alas, 2 GB is not enough in many cases… sad

Doris, yes, I understand you. Did you buy the graphics card for a PC? I ask because, as I understand it, Macs have a major drawback: it is impossible to replace their hardware components, such as the graphics card. So I’m going to start working on a PC.

No, I am on a Mac… I meant that when I bought my Mac new, I could choose between two cards, and I chose the one with more memory, because I thought I could always render on the GPU with it…

When using the GPU to render in Blender, your system memory does not matter one bit; even if you had 1000 exabytes of RAM it wouldn’t matter unless you were using the CPU to render. Blender uses the onboard video card memory when rendering with CUDA; it won’t even look at system memory. If your video card has under 2 GB of memory, don’t bother rendering a full scene with it.

I specifically bought an NVIDIA GT 740 with 4 GB of memory on a budget. It’s a low-end card, but it runs surprisingly well with CUDA rendering. I can render full scenes because of the amount of memory on the card.

I encourage anyone who is looking to buy a video card for CUDA rendering to prioritize memory over power, since Blender is becoming more and more complex with each release, which means you need much more memory to render even simple scenes. Blender also continues to optimize its rendering engine, so having a mid-range or low-end video card isn’t a deal breaker anymore as long as it has a good number of CUDA cores (500 and up). The next video card I get will be 4 GB and up.

Yes, so true, but when I bought my card, its 2 GB was high end, and only half a year later it is low end :( … now it is already down in the deep sea…

It seems that outside of the camera viewport you have a scene with a sailing ship and the sea. While your polygon count is quite low, if you are using the Ocean modifier, at high resolutions it can use a lot of VRAM. I would start very low, maybe Resolution 5, do a viewport render in Cycles, and then increase the resolution one step at a time. You will see the maximum you can go to when it crashes or you get the out-of-memory message again.
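If you want to step it from the Python console, something like this works (just a sketch; “Ocean” is a placeholder for your actual object and modifier names):

```python
import bpy

# "Ocean" is a placeholder for your own object/modifier names
mod = bpy.data.objects["Ocean"].modifiers["Ocean"]
mod.resolution = 5   # start low, then try 6, 7, ... until VRAM runs out
```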

I have the same issue on a 2 GB Asus card. The way I understand how Blender renders on the GPU is that it puts the whole scene to be rendered in GPU memory, with no paging file (I think it is a GPU thing). So if your GPU is using part of its memory for the display, that memory is not available for rendering at all; hence you never get to use the full 2 GB. Let’s see what a new GTX 980 Ti costs… OMG!!! Sorry, I fell out of my chair. The textures kill you: a 4k texture takes up 64 MB of memory on the GPU, and an 8k takes a whopping 256 MB. I know that the Blender developers have the GPU scene size limit on their radar, but I have no idea when they plan to enable rendering larger scenes. If you can get by with 1k textures, they only take up 4 MB each.
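Those figures are just the size of an uncompressed RGBA image; here is a quick back-of-the-envelope check (assuming 8 bits per channel; Cycles may keep some images as float internally, which would be 4x larger):

```python
# Approximate VRAM footprint of an uncompressed RGBA texture,
# assuming 8 bits per channel (float images would be 4x larger).
def texture_mb(size_px, channels=4, bytes_per_channel=1):
    return size_px * size_px * channels * bytes_per_channel / (1024.0 * 1024.0)

for size in (1024, 2048, 4096, 8192):
    print("%dk: %.0f MB" % (size // 1024, texture_mb(size)))
# 1k: 4 MB, 2k: 16 MB, 4k: 64 MB, 8k: 256 MB
```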

I’ve been mostly using 2.73a, as it seems to work better with CUDA. 2.74 and 2.75 RC1 do seem to have some problems.

I haven’t pinned it down exactly, but it seems that the Rendered viewport mode on the GPU may be leaking CUDA memory. If I use only the CPU for viewport rendering and the GPU only for full renders, I can go a lot longer before memory is exhausted.
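As a rough sketch of that workflow (assuming the 2.7x Python API), I only flip the Cycles device to GPU for the final render and switch it back afterwards:

```python
import bpy

scene = bpy.context.scene

# Viewport preview work stays on the CPU device (the scene default here);
# switch to the GPU only for the final frame, then back again afterwards.
scene.cycles.device = 'GPU'
bpy.ops.render.render(write_still=True)
scene.cycles.device = 'CPU'
```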

When the CUDA memory is jammed up I have to reboot or log out and back in to get the driver back in a clean state.

I was also getting a lot of hard crashes on 2.74 (hanging the machine completely), though 2.75 RC1 seems to have regained some stability there.

I’d like to report this as a bug, but it still seems a bit vague.

Hi, please add your system specs, we don’t even know your OS.
I have no problems with CUDA on my system; specs are in my signature.

Cheers, mib

Latest Yosemite version 10.10.3
Late 2013 27" iMac, 3.4 GHz, 24 GB RAM
NVIDIA 775M 2 GB
Latest CUDA drivers 7.0.36

Hi JiggyWig, I spoke with the Blender OS X maintainer, and the only thing that may cause this error is running the render preview and F12 at the same time.
Do you have problems with every file?
If you can reproduce it, it would be an important bug report.

Cheers, mib