Game Engine Resource Kit 2.7x

Proposal:
As a long-time GE user, I feel one of the main problems with the GE is the lack of good components for people to reuse; it currently takes way too long just to set up some simple behaviours or effects. Looking at some of the Unity and UE4 demos made me realize there is a lot we can do in Blender to make the development experience more friendly.

So I am going to try and create a library of free, reusable components that people can just drop into their own projects.

Benefits:

  • Provide a faster way to bootstrap a project. Import a model, drop in a mouse look camera module. Boom. Done.
  • Avoid the need to ‘reinvent the wheel’. Nobody wants to write another resolution changer/launcher script.
  • Showcases ‘best practices’. By open-sourcing this library, hopefully we can create some high-quality components.
  • Acts as a test suite and/or demos for the BGE.


The Library:

You can access the library here on Github.

As of November 2014, the library is under constant development and will target the latest version of Blender.

This is where I need your help. I have some ideas regarding what I should include, but input from the community would be great!

So far, I am thinking of including:

  • Basic starter framework Blend file (fly-through, object viewer, etc.)
  • Pre-scripted cameras (FPS, orbit, side-scroller tracking, top-down, etc.); see the mouse-look sketch after this list
  • Material library
  • 2D user interface widgets (probably going to be BGUI-based)
  • GLSL fragment and vertex shaders (shader template, hardware skinning, BRDF??)
  • 2D filters (ambient occlusion, color grading, etc.)
  • FX (particle effects, smoke, etc.)
  • Various game-related Python scripts (launcher, resolution changer, joystick parser, network parser, etc.)
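To give a concrete idea of the kind of drop-in component I have in mind, here is a minimal mouse-look sketch in BGE 2.7x Python. It assumes an Always sensor in pulse mode wired to a Python module controller; the 'sensitivity' property name and the object setup are purely illustrative.

```python
# Minimal mouse-look sketch (BGE 2.7x). Assumes an Always sensor (pulse mode)
# runs this function every frame on the camera or its parent.
from bge import logic, render

def mouselook(cont):
    own = cont.owner
    sensitivity = own.get('sensitivity', 2.0)   # optional game property

    x, y = logic.mouse.position                 # normalized 0..1 screen coordinates
    dx = (0.5 - x) * sensitivity                # offset from the screen centre
    dy = (0.5 - y) * sensitivity

    own.applyRotation((0.0, 0.0, dx), False)    # yaw in world space
    own.applyRotation((dy, 0.0, 0.0), True)     # pitch in local space

    # Re-centre the cursor so the next frame measures a fresh delta.
    render.setMousePosition(render.getWindowWidth() // 2,
                            render.getWindowHeight() // 2)
```

The point is that a user should only have to append the camera, parent their character to it, and optionally tweak one property.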

Your Thoughts
So what do you think of this project? What would you like to see in this library? Will you contribute?

Sounds great, and yes, I will contribute, but the first thing I would like to see is a shader that gives proper global illumination. After all, that's the reason why the BGE looks a little bit worse than Halo when it comes to real-time visuals. Halo was released in 2001. We are not in 2001 anymore, people. Here is a short list of games from 2001 that the BGE is not able to match in 2014:

  1. Battlefield 1942
  2. Halo Combat Evolved

The reason it is not able to match those is performance, or bugs that mess with the performance. Even basic mouse look and keyboard input does not run smoothly yet because of stuttering issues.

??

:expressionless:

I want to see what he has, and the BGE does just fine for me.

Can you clarify what you mean by GI in this case? Not many graphics engines today support GI, and almost all of them require extensive support from the underlying graphics pipeline. I.e. global illumination with a forward renderer such as Blender’s OpenGL engine is just not going to happen. I don’t think that’s something you can do with a single fragment shader (efficiently).

I do agree that having access to fancy visual effects can probably attract a lot of new users. However, to keep these users, I think we’ll need to improve BGE’s production pipeline. This is where the Kit comes in: it will have some good examples of how to set up a project, use linked assets, and do everything ‘right’ so that the scene won’t collapse under its own complexity. The long-term goal is to make things more modular.

  • GLSL fragment and vertex shaders (shader template, hardware skinning, BRDF??)

This would put the armature rasterizer load onto the GPU?

Deformed meshes eat up more resources than anything.

If we could get armature skinning done on the GPU (not sure if it is already?)…
For some reason armatures seem to cost the most in a scene.

Yup! Multiple armatures are definitely a performance killer. Having more than half a dozen characters on screen will murder your framerate. Moguri already has a multithreaded BGE armature branch with experimental GPU skinning support: https://github.com/Moguri/blender/tree/thesis

I had done some proof of concept hardware skinning stuff with dfelinto, it’s not as ‘native’ as Moguri’s implementation, but I think it can still improve things a bit.

I usually design my components in such a way that they can be used like this:

A) Append a scene/group …
B) Add an instance of the group to your level …
C) (optional) Configure some settings …
done.

  • They are (usually) very easy to use.
  • This means the hard work is in the component itself. The designer of the component needs to ensure the component works as expected. Overall it is not that complicated, just more work (for the designer).
  • The .blend files are available and everyone can look at the implementation.
  • By linking groups (rather than appending) it ensures you can update the library files (if they ever get an update) just by replacing the file.
  • It fits very well into multi-file distributions (lots of small files in several folders), which is typical for larger projects.
  • It does not fit into single-file distributions (everything in a few files).

You can find my library in my signature.

Remarks:
There could be some more support for this method from the BGE:

  • establishing preset configuration parameters - currently properties that need to be set up manually
  • showing related documentation - currently webpages and text blocks, which does not bundle component and documentation very well
  • better interface to link groups (currently via a “loading” view - as text)
  • better interface to add instances (currently via text menu - creates conflicts with groups of the same name)
  • usually you need some Python to run a group instance (this concerns the component designer, not the component user)

Since the instance and the instance members are accessible (groupObject/groupMembers), the support for groups is much better than before.
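For illustration, here is a rough sketch of how a component can pull its settings from the instance object via groupObject and hand results back the same way. The property names 'speed' and 'result' are placeholders, not part of any existing component.

```python
# Sketch: component logic reading configuration from the group instance empty.
def component_logic(cont):
    own = cont.owner
    instance = own.groupObject              # None if not part of a group instance
    if instance is None:
        return

    speed = instance.get('speed', 1.0)      # user-facing setting on the instance
    # ... component behaviour using 'speed' goes here ...

    instance['result'] = speed * 2.0        # hand a value back for the user's logic
```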

I have been working on a project to teach the BGE, but I got pretty into developing gameplay elements and have not gone through and labelled everything.

It has a physical actor rig, meaning if you can animate it, he will try and do it.

This could make workflow INSANELY fast.

Have you seen Wrectified?

I/we have made a few very reusable things:
MouseForce -> apply torque with the mouse
ForceMoveTo -> apply force to move an object to a point or object (see the sketch below)
TorqueTrackTo -> apply torque to match an object's orientation
3D dynamic object manipulation and assembly.
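To illustrate the idea behind something like ForceMoveTo (a hedged sketch, not the actual Wrectified code; the 'target' and 'strength' properties are made up for the example):

```python
# Sketch of a "move to" behaviour: push the owner toward a named target object.
from bge import logic

def force_move_to(cont):
    own = cont.owner
    scene = logic.getCurrentScene()
    target = scene.objects.get(own.get('target', ''))   # target name stored in a property
    if target is None:
        return

    direction = target.worldPosition - own.worldPosition
    if direction.length > 0.1:                           # stop when close enough
        own.applyForce(direction.normalized() * own.get('strength', 10.0), False)
```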

But the idea is to make the game 100% modular, MIT/CC 3.0 licensed, with all in-game ad revenue going to blender.org.

So if we make a really good game and give it away on Steam, and have it pump ads into splash screens unless you donate… (proceeds to blender.org or direct BGE development)

What I mean by G.I. is, at the very least, some proper-looking real-time lighting and shadows that cover a bigger area than just the player, with no zigzags in those shadows, either with shadow filtering or post-processing to eliminate the aliasing. I think Unity is using a distance-controlled crossfade between baked lightmaps and the real-time lights, but that part sounds too complicated, probably not worth adding. Plus, how are other games able to put 64 animated characters on the screen (as in BF1942 or BF3) and still have performance left for logic, sounds and physics?

In this file, the monkey is pushing the logic to 8% or 20% sometimes. As soon as I remove the monkey, logic goes back to zero. It must be some kind of bug. Can anybody figure out what’s wrong?

Attachments

AtlasBakeMirrored.blend (1.57 MB)

Shape keys as animations are heavy; that is all CPU-side vertex transformation, isn't it?

It's not done on the GPU, so moving vertices one at a time * 60 times per second = bog down.

Monster, I wasn’t aware of your library before, but it looks like you have a great start. If you are willing to contribute some of those toward the Resource Kit, it would be more than welcome!

I agree with everything you said about development. My ideal workflow is the same:

  1. Link an object into the scene
  2. Configure some parameters to add variation or functionality to the object
  3. Run the game

Currently there are some limitations in how the Blender library system is set up, such that I can’t edit game properties on a linked object. This severely limits the usefulness of a library. I will need to work out a solution to that.

That brings up another thing… While there is a lot of cool stuff we can do, and better light and shadow support is definitely possible even without touching Blender’s source code, we need to strike a balance between features and reusability.

For example, martinsh’s GLSL demos are cool (and I have the utmost respect for him), but many of the scenes are notoriously hard to dissect and reuse. That is something I want to avoid. If the user can’t drag a material onto an object directly to apply it, then I don’t want to include that material in the library. They shouldn’t have to create 20 empties and paste in 7 shaders to get some water effect.

Maybe I am aiming too high, I don’t know… Will have to try and see what’s possible.

No problem, you can take it from the above URL.

Are you thinking of a separate website to keep the resources?

That is no big deal. The user should never ever access the internals of a component. The user should access and provide data via properties on the instantiating object and/or any other objects (e.g. child objects). The component needs to access this data and, if necessary, update it on the fly. [It would be an improvement if the component could establish these properties beforehand (in the GUI) so that a user does not need to type them.]

As a demo, please have a look at the MouseOrbiter. It takes its configuration from the instantiating object and its children, and it returns values the same way. This enables a user to apply their own logic on top of the component.
[The current implementation of MouseOrbiter has not been updated to the latest API, which would simplify the usage a bit.]

Is it possible to create a world-space ambient occlusion GLSL shader, as in dynamic projective texturing from a render-to-texture buffer, to get some good-looking soft shadows in real time? At least to cover the area around the player and only draw the part that is inside the camera frustum. If we can figure out that part, we can add emissive materials later using the same trick.
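The render-to-texture half of that could probably be done with bge.texture; the projective AO/shadow shading itself would still need a custom GLSL material. A hedged sketch follows, where the 'ProjectorCam' name, the material slot, and the property used to keep the texture alive are all assumptions for the example:

```python
# Sketch: render the scene from a second camera into the owner's texture each frame.
from bge import logic, texture

def init(cont):
    own = cont.owner                                  # object whose texture gets replaced
    scene = logic.getCurrentScene()
    projector_cam = scene.objects['ProjectorCam']     # assumed second camera

    tex = texture.Texture(own, 0)                     # first material/texture slot
    tex.source = texture.ImageRender(scene, projector_cam)
    own['_rtt'] = tex                                 # keep a reference alive

def update(cont):
    cont.owner['_rtt'].refresh(True)                  # re-render every frame
```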

I agree with Mike. There are some nice GLSL examples out there, but they are very complicated to set up and not very reusable.

BluePrint, thanks for the help. I removed the animation action brick and logic is still going too high. Maybe there is some bug with the ray or random module.

If you want all of one item type to use the same linked data, maybe have an item set and a “Brain” that they all copy their properties from?

Property sensor (On = False) -> AND controller -> Property actuator: copy property X from “Brain”, then set On = True
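Roughly the same thing in Python, if someone prefers a module controller over bricks (the 'On'/'X' property names and the 'Brain' object name just follow the post above):

```python
# Sketch: copy a shared setting from a "Brain" object once per object.
from bge import logic

def copy_from_brain(cont):
    own = cont.owner
    if own.get('On', False):
        return                                       # already initialised
    brain = logic.getCurrentScene().objects.get('Brain')
    if brain is not None:
        own['X'] = brain['X']                        # pull the shared value
        own['On'] = True                             # mark as done
```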

I don’t see any spikes; maybe it was a workaround you did?


This sounds like a great idea: a thread like this one with a bunch of development kits to pick up and start working with. A user-friendly system, so people who don’t like to mess with code could just replace the character in the blend file and his/her animations and have a nice working camera for whatever game they’re making. I’d help in any way I can; I know modeling, sculpting, texturing and animation.

As a non-code guy myself, I could give you some hints on what people want or need as well; the last thing they want is to mess around with anything that has to do with coding. Ideally, for a camera setup for example, the guy would just unparent the character from its bound, then put his own character in there instead. Hit start and his character would be running around. That is what people want. When we get enough game setups we could put them on a site, and through them we’d see more nice little games made with the Blender Game Engine.

I can’t stress this enough: a setup isn’t a super complicated thing with a bunch of properties and values and stuff. It should be simple enough that very little has to be changed for it to be implemented in any game. The instructions for each setup should be below 100 characters in length.

I have a basic AI hull with an attack animation, using steering, in Resources. It uses a ray to see.
Just replace the gfx mesh and adjust the property sensors for the animation length.
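Roughly, the “ray to see” part could be wired like this (a sketch only; the sensor and actuator names are assumptions, not necessarily what the actual file uses):

```python
# Sketch: a Ray sensor spots a target and a Steering actuator chases it.
def ai_see(cont):
    ray = cont.sensors['Ray']               # ray sensor pointing forward
    steer = cont.actuators['Steering']      # seek-type steering actuator

    if ray.positive and ray.hitObject is not None:
        steer.target = ray.hitObject        # chase whatever the ray hit
        cont.activate(steer)
    else:
        cont.deactivate(steer)
```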
