Nodes input output compatibility

Hi

I’m learning the node system of the Cycles engine. Sometimes I see connections between sockets like:
color socket (output) -> fac (input)
image socket (output) -> color (input)

It is still not intuitive to me which sockets are compatible for an output-to-input connection.

A color socket with a color socket, or an image with an image, is of course compatible.

In the node documentation I didn’t find a description of which socket types a chosen socket can be connected to.

Is it well described somewhere?

For example on the following page

http://wiki.blender.org/index.php/Doc:2.6/Manual/Composite_Nodes/Types/Convertor

we have an Image output connected to a Fac socket. An image is 2D and Fac is 1D (a value). How do nodes make such a mapping work?

Thank you

…and fac is 1D (value)
…for each 2D pixel position there is one 1D element. I don’t know if it is 100% correct, but you can think of it as a photo editor layer mask being used for an operation between two other layers. Usually that would be b/w image values; if you use color on a Fac input it is converted automatically (which is not quite what you would want, but it does work).
In general any type of output can be seen as an image.
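To make the layer-mask analogy concrete, here is a minimal sketch in plain Python (not the Blender API; the `mix` helper is made up) of how a 1D Fac value drives a blend between two colors, the way a MixRGB node uses its Fac input:

```python
# Hypothetical helper, not part of Blender's API: a per-channel linear mix,
# where fac acts like a layer mask (0 = all of color_a, 1 = all of color_b).
def mix(fac, color_a, color_b):
    return tuple((1.0 - fac) * a + fac * b for a, b in zip(color_a, color_b))

mix(0.5, (1.0, 0.0, 0.0), (0.0, 0.0, 1.0))  # halfway between red and blue: (0.5, 0.0, 0.5)
```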

First, images are color information, and in this sense they are 3D (red, green and blue channels). The 2D coordinates of an image are used by the vector socket, which is 3D (U, V and W coordinates), but in some situations (like UV maps) the W axis is simply discarded. In the case of an image, whatever the value of W is, the color output will always be determined by the U and V coordinates alone.

As an overview of all the inputs and outputs:

Fac is a single float. When connected to a color or a vector input, the value is broadcast to 3D (e.g. Fac=0.5 >> Color=[0.5, 0.5, 0.5])

Colors and Vectors are arrays of 3 floats. In practice they are exactly the same, and it is possible to connect vectors into colors, and vice versa.
They are represented as [float Red, float Green, float Blue] or [float X, float Y, float Z]. When connected to a Fac input, Cycles will convert the color information into grayscale using the following formula: Value = R*0.2126 + G*0.7152 + B*0.0722
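The two implicit conversions above can be sketched in plain Python (an illustration, not Blender’s actual code; the weights are the ones given in the formula):

```python
# Sketch of the implicit socket conversions (illustrative, not Blender's code).
def fac_to_color(fac):
    # A float connected to a color/vector input is copied to all 3 components.
    return (fac, fac, fac)

def color_to_fac(r, g, b):
    # A color connected to a Fac input is reduced to grayscale by luminance.
    return r * 0.2126 + g * 0.7152 + b * 0.0722

fac_to_color(0.5)             # (0.5, 0.5, 0.5)
color_to_fac(0.0, 0.0, 1.0)   # 0.0722
```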


Thank you for the responses. The explanation from Secrop is very precise. Maybe it would be good to put it in the documentation. Do we already have it somewhere in the docs?

Everything is in the documentation… I think. The formula for the RGBtoBW I looked up in the source code, but I’ve seen it somewhere in the docs on blender.org (the different weights for each color depend on the luminance strength of each color… green appears brighter than the others, which is why it has a stronger influence).

The only thing I didn’t cover was the shaders’ output, but those are not understandable in terms of values.
And vectors can sometimes mean different things depending on their reference space. If you want, I can expand this topic even further. :wink:

Hi

And the Vectors can sometimes mean different things depending their reference space

You could extend this topic. It would be good to gain some new knowledge.

The formula you gave converts RGB to BW when you use the RGB to BW node. That formula is luminance based.
When you plug an RGB output into a Value input directly (yellow socket to green socket), a different formula is used.
RGB gets converted to a single value by using “Value”, not “Luminance”.
The proper formula is: Value = max(R, G, B)

Humm, I’ve tested it, and plugging a color into a value input still gives the luminance, not the max value. Just try it with [0,0,1]: you’ll see that the result will not be 1 (which is max(0,0,1)), but 0.0722.
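Writing out the arithmetic behind that test (plain Python, just to compare the two candidate formulas for pure blue):

```python
# Pure blue: the luminance formula and max() give very different results.
r, g, b = 0.0, 0.0, 1.0
luminance = r * 0.2126 + g * 0.7152 + b * 0.0722  # 0.0722, which is what the test shows
maximum = max(r, g, b)                            # 1.0, which is not what Cycles returns
```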

(About your socket colors, I’m a bit confused… my green socket is the shader socket, not a value socket; you might have some other template… because connecting a color to a shader socket produces no result)

Sorry for that. You are right. I tested this just now. I was sooooo sure about what I said that I wrote my post without testing. I was wrong.

About socket colors: I wrote “green”, but I meant “gray”.

I wrote my post too quickly. I’ll have to be more careful next time.

No problem, BartekSkorupa… before I did the first test and looked at the code, I was sure the value would be the median of the three values… :slight_smile:

Normally, you will use two types of vectors (they look pretty much the same):
-Location vectors
-Direction vectors

Location vectors are points, or positions. They are normally used by texture nodes to determine which color should be computed. They can represent different spaces… for example, Position will be a location in the world coordinate system in Blender units, Object will be relative to the object’s origin/scale/orientation, Window returns the coordinates in the rendered image, and UV returns a translated 2D coordinate from the point on the surface being hit.

Direction vectors are normalized (their length is 1), and their origin is normally the point being rendered. Normals, Tangents, Incoming and Reflection vectors are represented in world coordinates, with all axes parallel to the world axes. This is why we need nodes like ‘Normal Map’ to transform a normal texture into world coordinates.

There’s a lot one can do with vectors, and nodes like ‘Vector Math’, ‘Vector Transform’ or even ‘MixRGB’ are great tools for working with them. You can always decompose a vector and use the ‘Math’ node on each component if you need something more exotic.
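As an illustration of normalizing a vector and then decomposing it (plain Python with made-up helper names, not the Blender API):

```python
import math

# Normalize a vector, since direction vectors have length 1.
def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

# Dot product, one of the operations the 'Vector Math' node offers.
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

n = normalize((3.0, 0.0, 4.0))  # (0.6, 0.0, 0.8)
x, y, z = n                     # like Separate XYZ: feed each component to a Math node
```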