Parametrically controlled materials

Is there a way to control materials parametrically? If you’re not familiar with parametrics, it essentially means rule-based design.

This short animation illustrates what I’m interested in doing:

The size and color of those rectangular prisms are defined by their distance from that moving dome.

Let’s say I want to create a grid of 100x100 light bulbs on the floor. A beach ball is thrown into that grid and bounces around. By some sort of magic, each light bulb senses how far away the beach ball is, growing brighter as the ball gets closer and dimmer as it bounces away. Is there a way to do this without manually animating 10,000 individual materials?
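The distance-to-brightness rule itself is simple to express. Here's a minimal pure-Python sketch (outside Blender, with made-up names and a hypothetical falloff law) of what each bulb would compute:

```python
import math

def bulb_brightness(bulb_pos, ball_pos, max_strength=10.0, falloff=2.0):
    """Brightness grows as the ball approaches; the falloff law here is arbitrary."""
    d = math.dist(bulb_pos, ball_pos)
    return max_strength / (1.0 + d) ** falloff

# 100x100 grid of bulbs on the floor (z = 0), spaced 1 unit apart
grid = [(x, y, 0.0) for x in range(100) for y in range(100)]
ball = (50.0, 50.0, 2.0)  # current ball position for this frame

brightness = {pos: bulb_brightness(pos, ball) for pos in grid}

# The bulb directly under the ball is the brightest
print(max(brightness, key=brightness.get))  # → (50, 50, 0.0)
```

The point is that one rule evaluated per bulb replaces 10,000 hand-animated materials; the question is just where in Blender that rule can live.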

Alternatively, I want to use image sampling to control that grid of lights. This is an example of what image sampling looks like:

http://www.sean-madigan.com/wp-content/uploads/2012/05/Image_Sampler2.jpg

In the example above, each circle in that grid has a radius that is controlled by how dark its corresponding point in the image is. Instead of circles, I want light bulbs, and instead of radius, I want emissive-ness. And instead of an image, I want an animated texture.
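The sampling step is just a lookup plus a remap. A minimal sketch in plain Python (toy data and hypothetical names; in Blender this would be a texture feeding the Emission shader's Strength input):

```python
# Toy 4x4 grayscale "image": 0.0 = black, 1.0 = white
image = [
    [0.0, 0.2, 0.4, 0.6],
    [0.2, 0.4, 0.6, 0.8],
    [0.4, 0.6, 0.8, 1.0],
    [0.6, 0.8, 1.0, 1.0],
]

def emission_strength(col, row, max_strength=5.0):
    """Darker pixel -> stronger emission (same inverse rule as the circle radii)."""
    darkness = 1.0 - image[row][col]
    return darkness * max_strength

print(emission_strength(0, 0))  # black pixel → 5.0
print(emission_strength(3, 3))  # white pixel → 0.0
```

For the animated-texture case, the same lookup would simply be re-evaluated each frame against the current frame of the movie or image sequence.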

Does anyone know if this is possible to do in Blender?

You can definitely do it with drivers, but that might require giving each object a different Python expression, and that’s a big pain in the butt. (I don’t think it’s possible for a driver to read relative data paths, only absolute ones … I could be wrong, though.)

I think the newest version of Blender lets a material read in the object coordinates of another object. Maybe that will do the trick?

Perhaps when Blender’s new dependency-graph is done, your question will suddenly have a much more obvious answer.

Now in 2.74, you can use texture coordinates from external objects. You can solve your bouncing ball situation with something like this:


For your second example, you can just use a common projection of the texture across all your light bulbs (it can also come from another object, like a plane), and connect the texture to the Strength input of your Emission shader. For the animated texture, just open a movie or an image sequence in your Image Texture node.

Oh!
How do you give your node-connectors those little control points so that you can shape and organize the connections?

Add => Layout => Reroute

Or just drag the mouse over an existing connection while pressing SHIFT+LMB, which will place a reroute automatically. :wink:

Aw yisssss.
Thanks!

These are great responses! Definitely tangible things for me to sink my teeth into. Thanks!

Thanks for the very helpful responses. I will look into Sverchok; it looks pretty promising.

Secrop’s screenshot also looks very close, but I would need each cube to be one uniform color. Some of them, at the boundary of the white circle, are half light and half grey. I’ll be toying around with that node setup, though.

It’s possible to achieve that by rounding each coordinate axis of the geometry position (with the help of some adding and multiplying)! (‘Round’ and ‘Modulo’ in the Math node are quite useful for this.)
This keeps the distance measured not from the point being rendered, but from the nearest point in a grid, with the ‘Add’ vector as an offset and the ‘Multiply’ vector as a scale… (not very versatile)
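The trick described above amounts to snapping the shading point to a grid cell center before measuring the distance, so every point in a cell gets the same value. A sketch of the math in plain Python (the operation names mirror Blender's Math node; the rest is illustrative):

```python
import math

def snap_to_grid(p, scale=1.0, offset=0.0):
    """Round each axis to the nearest grid point, as Round (plus add/multiply) does in the node tree."""
    return tuple(round((c - offset) / scale) * scale + offset for c in p)

def cell_distance(p, target, scale=1.0):
    """Distance measured from p's cell center, so an entire cube shades uniformly."""
    return math.dist(snap_to_grid(p, scale), target)

# Two different points inside the same cell report the same distance
print(cell_distance((3.2, 0.1, 0.0), (0.0, 0.0, 0.0)))   # → 3.0
print(cell_distance((2.9, -0.3, 0.0), (0.0, 0.0, 0.0)))  # → 3.0
```

That uniform-per-cell value is exactly what removes the half-light/half-grey cubes at the circle's boundary.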

Sverchok is the exact solution for this kind of task.
There is a node to define vertex colors, and you can then use those vertex colors for rendering.