Bliss 3D Toolkit

Wait, we can post non-BGE games with blender models in the WIP section?

Personally, I think this is by now the most promising of the alternative engines that look like they can read .blend files and thus serve as a suitable replacement for the BGE.

The only thing I’m really waiting for (along with seeing when complex games are possible) is support for Windows; I could see the userbase for this engine go up quite a ways as a result.

Also, will it be possible to have true commercial freedom with the games you make (I see AGPL, for instance), and what does this mean for encrypted game runtimes?

Progress:

  • Lots of refactoring/bug fixing.

  • Frustum culling implemented

      - Details: extract clip planes and test bounding spheres against them; bounding boxes are also available, but spheres are fastest. This can be optimized further, but I went with the simplest approach (see the sketch below this list).

  • I would like to implement actions next (shapekeys/armatures later).
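To illustrate the idea (just a sketch, not the engine’s actual code), the sphere test boils down to checking the signed distance of each bounding sphere against every clip plane; the plane layout and function name here are assumptions:

```python
# Sketch of sphere-vs-frustum testing as described above; illustrative only.
# Assumes each clip plane is stored as (nx, ny, nz, d) with the normal
# pointing into the frustum, so a large negative signed distance means the
# sphere lies completely behind that plane.

def sphere_in_frustum(center, radius, planes):
    cx, cy, cz = center
    for nx, ny, nz, d in planes:
        # Signed distance from the sphere's center to the plane.
        if nx * cx + ny * cy + nz * cz + d < -radius:
            return False  # entirely behind one plane -> cull it
    return True  # intersecting or inside all planes -> keep it
```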

I’m porting my old stuff and working on physics (ODE). This also gives me an idea of how the transition experience will be for BGE users. It’s mostly a matter of replacing the BGE module imports and the attribute names. However, overall it’s starting to take shape!

I’m not really against Windows. One of the reasons I put it on the backburner is that PyPy3 is not fully supported on Windows 64-bit, although 32-bit is supported. Technically, the engine (all Python code) can run on regular Python, which makes for some interesting comparisons. All the libraries I use are available on Windows, so everything should run on Windows (and Mac).

I link against Blender, so this entails the code has to be GPL’d as well, but at the same time it makes things easier… we’ll see. I see this limiting commercial games more so than, say, hobbyists, who are probably most users of open-source engines.

Just some background info: the BGE both embeds and extends Python. The BGE is a C++ program that has a Python interpreter. The BGE loads all the scripts and passes them to the interpreter to execute. In addition, the BGE provides an API through the BGE Python module, which allows users to access the BGE C++ API. So from the BGE’s point of view, it is embedding Python, and from the Python point of view (user scripts), it is being extended by C extension modules (the BGE module). Blender itself does this in the same manner with BPY, etc.

Embedding an interpreter makes more sense if the user does not have access to the underlying code, like in proprietary engines. It can also provide a higher-level way to work with the program. This method allows for supporting multiple scripting languages: if the BGE had a Lua interpreter, Lua bindings for the C++ API would need to be provided, just as with Python.

However, I believe there are some significant downsides. IMHO, embedding creates unnecessary complexity, at least for an open-source game engine. It creates a strong division between the low-level code and the high-level code, which requires more boilerplate code and makes things more difficult because the two sides don’t communicate well; they are too isolated.

Tim Sweeney provided some really good insight on this with UnrealScript/Unreal Engine:

The first three generations of the Unreal Engine included a sandboxed scripting language, UnrealScript, which provided a simple interface for gameplay programming that was shielded from the complexity of the C++ engine.

The scripting approach is very welcoming to new programmers, but eventually it breaks down and becomes an obstacle to innovation and shipping. We experienced this over time as the Unreal Engine grew until finally, in 2011, we moved to a pure C++ architecture. The causative factors were both pervasive and general:

  • As an engine and its community grows, there is increasing pressure to expose more of its native C++ features to the scripting environment. What starts out as a sandbox full of toys eventually grows into a desert of complexity and duplication.
  • As the script interface expands, there is a seemingly exponential increase in the cost and complexity of its interoperability or “interop” layer where C++ and script code communicate through a multi-language interface for calling functions and marshaling data. Interop becomes very tricky for advanced data types such as containers where standard scripting-language idioms differ greatly in representation and semantics from their templated C++ counterparts.
  • Developers seeking to take advantage of the engine’s native C++ features end up dividing their code unnaturally between the script world and the C++ world, with significant development time lost in this Interop Hell.
  • Developers need to look at program behavior holistically, but quickly find that script debugging tools and C++ debugging tools are separate and incompatible. Seeing where script code had gone wrong is of little value if you can’t trace the C++ code that led to it, and vice-versa.

It is these reasons, ultimately, that led to Epic’s move to pure C++. And the benefits are numerous: UE4 is a unified and fully-debuggable code base, freed from Interop Hell and totally open to programmers to study, modify, and extend. There are side-benefits, too, such as increased performance in gameplay code, and ease of integrating other middleware written in C++.

Building Unreal Engine 4 as a unified C++ codebase has been very freeing, giving engine and gameplay programmers enormous flexibility to write code without unnecessary interop barriers.

This isn’t to say that C++ is the ideal language for simple gameplay code. It is more complex and more dangerous than UnrealScript, C#, and JavaScript. But that is another way of saying that it’s more powerful.

By making peace with complexity and writing code in C++, there is absolutely no limit to what you can accomplish, whether it involves debugging your entire codebase in-context, interfacing to low-level engine systems, modifying them, or talking to the operating system or advanced third-party libraries.

Now, I take a different approach for various reasons including the above. I prefer to extend Python rather than embed it. PyPy itself doesn’t fully support C extensions, rather it prefers CFFI, which in fact is a different and IMHO better way of writing C extensions. It’s best to simply think of your game as being in Python, all the way from the engine to the user code. So write in Python and extend with C as needed. For example, with the BGE, overriding the game loop through Python is not a simple task.

With simpler games, I can get away with Python, and with PyPy I can probably cover most cases. Of course, with engines that demand squeezing every drop of performance, it’d make more sense to do it in C++ albeit with the added complexity.

This is a good read on extending versus embedding.

One of the reasons why the BGE is so easy to use is because you can just start writing Python scripts in the text editor and have it run in the engine (no compiler needed).

If I had to actually get the source and use a compiler to code logic with it, I might just as well grab the source of Blender (including the BGE) and start developing on that, fixing up and enhancing the old engine, since learning to use a compiler could be seen as a gateway to full-on development.

I think if one were to spend about a year coding logic nodes, SCA style, they could make a BGE-like logic system that was superior to UE4’s:

https://wiki.unrealengine.com/Blueprints,_Creating_C%2B%2B_Functions_as_new_Blueprint_Nodes

C++-level support derived from image- and link-based coding could write code for the user.

Artists are not the best coders, coders are not the best artists,

I would like a “Bridge”

Ray -Property Player -Set property own[‘Target’]----------and----------Steer to target

or

Message - set property target---------------and-------Track to target

AI and complex tasks could be insanely fast, as they would be Pure C++ after initializing,

another system that would be quite useful

Lists as properties

and list actuators and list sensors

if property target is in property List1 set property index-------and-----------edit list at index

I’ve set up the initial stubs for something like a component system (at the user level); however, I think the architecture of the BGE made it a hassle to implement. In fact, this method is the intended way to use Harmony and what I’m doing right now. Anyhow, it’ll probably be the closest thing to logic bricks.

It works as such:
Blender “Object” C struct --(a)--> Harmony Python “Object” class --(b)--> User-defined Python subclass of “Object” --(c)--> Attach component objects (OOP: HAS-A)

  • a - Harmony loads the blend file, converts structs to Python objects, then frees the blend file.

  • b - The user specifies their “Object” subclass in Blender in the custom properties panel; similar to importing a module/class.

      - This subclass is defined in the engine’s “user” folder (which contains the user’s code etc., separate from the standard modules provided by the engine, although they can access those as well).

      - Blend files can really go anywhere (outside the engine’s directory), although it makes sense for users to put their blend files in the user directory with the code. Currently, this is what I do.

  • c - (tentative) The engine’s “user” directory contains a component folder with objects the user can attach to their “Object” subclass.

In theory, components can be created for any data type, not just objects, so this includes materials, textures, lamps, etc. Components (Python modules) could be distributed between users, kind of like plugins/DLLs/shared libraries. It’s hard to visualize this right now, but it’ll come together later.
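To give a rough feel for the HAS-A idea from the user’s side, here is a minimal sketch; the names GameObject, Component, attach(), and Health are made up for illustration and are not Harmony’s actual API:

```python
# Illustrative only: these class and method names are invented for this
# sketch and do not reflect the engine's real interfaces.

class Component:
    def __init__(self, owner):
        self.owner = owner          # HAS-A: the component knows its object

    def update(self, dt):
        pass

class Health(Component):
    def __init__(self, owner, amount=100):
        super().__init__(owner)
        self.amount = amount

class GameObject:                   # stands in for the engine's "Object" class
    def __init__(self, name):
        self.name = name
        self.components = []

    def attach(self, component_cls, **kwargs):
        component = component_cls(self, **kwargs)
        self.components.append(component)
        return component

    def update(self, dt):
        for component in self.components:
            component.update(dt)

# User-defined subclass, the kind of thing named in Blender's custom
# properties panel (step b), attaching components as in step c.
class Player(GameObject):
    def __init__(self, name):
        super().__init__(name)
        self.attach(Health, amount=50)
```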


The text editor is convenient for simpler purposes since it hides Python’s module system from the user, but larger projects eventually need proper modules so you can encapsulate things properly. Personally, I don’t like the text editor method since it gives new users bad habits and doesn’t encourage understanding the language; it kind of bastardizes it by fostering improper notions.

However, it makes sense for one-off scripts, Blender GUI editing, and perhaps for encrypting your game. Understandably, it’s tailored towards artists/non-programmers in the context of Blender/3D creation. With game engines, it’s hard to avoid programming (C, Python, or otherwise) unless your game is really simple, in which case you could use logic bricks. I think this also relates to the divide I mentioned in my previous post, trying to isolate things too much, and somewhat alludes to the idea of trying to do two things at once equally well.

So it’s either that or creating a file where the blend file resides, which is not as convenient but not so complicated either. Maybe if the text editor could support directories, that would alleviate its flat structure, or the text editor could somehow create those files for you. In terms of Blender, trying to give it IDE features might be like trying to fit a square peg into a round hole. The text editor wasn’t designed for that, nor was Blender.

As simple as it sounds, I think one of the reasons the text editor is so convenient is you don’t have to deal with two windows, which I have a solution for… :wink:

You do understand that if the sensor could set a property on trigger, and then use that property in the actuator, without python, this would be fast right?

Ray enemy set property target--------------and---------------OBJECT target -property Health minus 5

This code below is pythonic as I don’t know C

Ray->own[‘target’]sens.hitObject----------and---------------scene.object[own[‘target’]][‘Health’]+=(-5)

So the Property actuator needs an object, and then a property (own is also a choice)

and a ray can set a property on TRUE or a Radar or near etc.

or

on mouse over any-set property hitPosition-----and-----------add object to position Property hitPosition(would be awesome!)

if you are moving away from logic in the game panel,

why not code logic in game with actual 3d objects :smiley:

3d logic nodes is a core concept in Wrectified,

You need to think in terms of OOP concepts, including classes, methods, attributes. What is the primary responsibility of the class? Game properties are really attributes of the object’s class. Since logic bricks are higher level, it’s not difficult to convert it to code.

For example, a Ray sensor can be implemented as an object that shoots a ray every frame. All those options in the Ray sensor would be passed to the function that shoots the ray every frame. You also have more flexibility: while the Ray sensor can only check one property, you could pass a tuple containing multiple properties to the function to check for. With logic bricks, you’d need a second Ray sensor to do this.
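As a rough sketch of that idea (the raycast callable and attribute names are placeholders, not the engine’s real API), a Ray “sensor” becomes an ordinary object polled once per frame that can check any number of properties:

```python
class RaySensor:
    """Sketch of a Ray 'sensor' as a plain object polled every frame.

    `raycast` is any callable (origin, direction, distance) -> hit-or-None;
    it stands in for the engine's real ray query, which is not shown here.
    """

    def __init__(self, owner, raycast, direction, distance, properties=()):
        self.owner = owner
        self.raycast = raycast
        self.direction = direction
        self.distance = distance
        self.properties = tuple(properties)   # check several, not just one
        self.hit_object = None

    def update(self):
        hit = self.raycast(self.owner.position, self.direction, self.distance)
        if hit and (not self.properties
                    or any(p in hit.properties for p in self.properties)):
            self.hit_object = hit
        else:
            self.hit_object = None
```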

When the C structs are “converted” to Python classes, they pretty much retain their structure, names, etc. So for example, with the BGE you have KX_GameObject and KX_Camera, which is a subclass of KX_GameObject (IS-A); with Blender, however, Object has a data attribute whose contents depend on its type: with a camera it’d contain a Camera class; with a mesh, a Mesh class; with an empty, an Empty class; etc. Object contains the non-camera-related attributes and methods (SRP).
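A tiny sketch of the difference (only the naming mirrors the BGE/Blender; the class bodies here are illustrative):

```python
# BGE style: inheritance (IS-A).
class KX_GameObject: ...
class KX_Camera(KX_GameObject): ...

# Blender style: composition (HAS-A); Object keeps the generic attributes
# and its .data points at a type-specific datablock.
class Camera:
    def __init__(self, lens=35.0):
        self.lens = lens

class Object:
    def __init__(self, name, data):
        self.name = name
        self.location = (0.0, 0.0, 0.0)
        self.data = data  # Camera, Mesh, etc., depending on the object's type

cam = Object("Camera", Camera(lens=50.0))
```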

If you are talking about a visual programming system, I haven’t really seen any that can take the place of programming with words - they’re still the best visual programming system, in a sense of the phrase :p.

I see you haven’t touched the materials yet! How will a user create a new material/shader: do you have to code them, or will there be some material editor?

I’m still working things out but the user specifies the shaders as a custom property of a material/texture. The shader source files reside in a folder and are automatically loaded. If it’s not specified, a default shader is selected based on the material and texture settings.

With modern OpenGL, materials and textures are now represented solely as shaders, so every object/material requires a shader. Right now, Blender will get the job done, so there’s no need to create another editor; eventually I’ll build a 3D one :wink:
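Roughly, the lookup could go something like this sketch; the “shader” property name, the shaders/ folder layout, and choose_default() are assumptions for illustration, not the engine’s actual code:

```python
# Sketch of picking a shader for a material; names and layout are assumed.
import os

SHADER_DIR = "shaders"

def load_shader_source(name):
    # Shader source files live in a folder and are loaded by name.
    with open(os.path.join(SHADER_DIR, name + ".vert")) as v, \
         open(os.path.join(SHADER_DIR, name + ".frag")) as f:
        return v.read(), f.read()

def choose_default(material, textures):
    # Fall back to a built-in shader based on material/texture settings.
    return "textured" if textures else "flat"

def shader_for(material, textures):
    # material is assumed to behave like a dict of custom properties.
    name = material.get("shader") or choose_default(material, textures)
    return load_shader_source(name)
```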

Another method is to subclass the object and customize the behavior similar to how with the BGE, you use an Always sensor -> Python controller or a Filter 2D actuator. I still haven’t implemented multipass or FBOs.

Nice to see more updates. :yes:

I bet your solution is you can load another application window inside viewport or something. Isn’t that so? :wink:

I see, sounds nice. So in theory you’ll be able to compile the engine even if you are on 32-bit Windows?

I can’t wait till I have a pc that can develop in it,
Keep up the good work!

http://www.vmware.com/products/player Virtual machines may lack power, but they are a wonderful way to sandbox an operating system without disrupting your native install.

You could always dual boot! :yes:

The C bindings are compiled on the fly the first time and cached after that. Technically, source code is produced from the bindings and a shared library (.so) is compiled, which is like a conventional C extension module with CPython. If the source code is changed (just for the bindings), it is recompiled. However, compiling is short, somewhere from seconds to a minute, depending on how large the source and the resulting shared library are.
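For anyone unfamiliar with CFFI, this is roughly what its API mode looks like in general (a generic example, not Harmony’s actual bindings): set_source() emits the C source for the module, compile() builds the shared library once, and later runs just import the cached module until the binding source changes.

```python
# build_example.py - generic CFFI "API mode" example, not Harmony's bindings.
from cffi import FFI

ffibuilder = FFI()
ffibuilder.cdef("double clamp(double x, double lo, double hi);")
ffibuilder.set_source(
    "_example",  # name of the generated extension module
    """
    double clamp(double x, double lo, double hi) {
        return x < lo ? lo : (x > hi ? hi : x);
    }
    """,
)

if __name__ == "__main__":
    ffibuilder.compile(verbose=True)  # writes _example.c and builds the .so

# Afterwards, user code just does:
#     from _example import lib
#     lib.clamp(5.0, 0.0, 1.0)
```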

I haven’t tested Windows yet, but off the top of my head, running on Windows will require some slight tweaking to the code. It does run on CPython so if 64-bit is truly needed on Windows, CPython is there.

In terms of performance, in an unscientific test running a simple scene with vsync off, PyPy reaches a minimum frame time of 0.4 ms and CPython reaches a minimum frame time of 3.18 ms, which means PyPy is ~8x faster. When I profile it, to no surprise, I see matrix multiplication as one of the more time-consuming calls, which means I could move to using C for that (probably chopping that time down to a quarter). Anyhow, there are some other optimizations that can be done before that, like not recomputing matrices for things that don’t change.
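The “don’t recompute what didn’t change” part can be as simple as a dirty flag on the transform; a sketch under assumed names (compose_matrix stands in for the actual matrix math, which isn’t shown):

```python
# Sketch of caching a matrix behind a dirty flag; names are illustrative.

class Transform:
    def __init__(self, compose_matrix):
        self._compose = compose_matrix  # callable: position -> matrix
        self._position = (0.0, 0.0, 0.0)
        self._matrix = None
        self._dirty = True

    @property
    def position(self):
        return self._position

    @position.setter
    def position(self, value):
        self._position = value
        self._dirty = True  # invalidate the cached matrix

    @property
    def matrix(self):
        if self._dirty:  # only recompute when something actually changed
            self._matrix = self._compose(self._position)
            self._dirty = False
        return self._matrix
```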

However, keep in mind PyPy is not a magic bullet!

If you were to plot the curves, as the scene gets more complex (before GPU bottleneck), my guess is CPython would slow down quickly (linearly) compared to PyPy (logarithmic).

For the visually inclined:

CPython would be the blue line. PyPy would be the red line. And to be fair, C/C++ would be fairly straight and by x=30, the y would be significantly lower than the red line.

You could always dual boot! :yes:

but I have a 32-bit processor.

I guess someone with Windows could make the binaries. :slight_smile:

Update:

  • Implemented Text datablocks

  • Added partial support for Groups

      - I’m still figuring out the best way to do this. In Blender, a Group instance is internally like an Empty that references a Group; thus changes to the Group are reflected in the instance. In Python, Groups are equivalent to any container object like lists, sets, tuples, etc. However, what I’m planning to do is convert the Group into cloned objects (that still reference their respective data).

  • Implemented checks for LOD

      - Currently, this relies on Groups, so until I figure that out… Anyhow, there are various ways to implement LOD.

      - The way it works: in Blender, the user specifies a Group of objects as a custom property, and each object contains a distance property. Users can override the default functionality by subclassing the main object and handling the specified function. This allows users to use LOD for more than just meshes: shaders, materials, etc., anything really (see the sketch after this list).
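To make the LOD description above concrete, here’s a sketch of distance-based selection with an overridable hook; select_lod() and the attribute names are assumptions, not the engine’s real API:

```python
# Illustrative LOD selection sketch; not the engine's actual code.
import math

def distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

class LODObject:
    def __init__(self, levels):
        # levels: (distance, payload) pairs; payloads can be meshes,
        # shaders, materials, etc., as described above.
        self.levels = sorted(levels, key=lambda level: level[0])

    def select_lod(self, camera_position, own_position):
        """Default behaviour; a user subclass may override this."""
        d = distance(camera_position, own_position)
        chosen = self.levels[0][1]
        for threshold, payload in self.levels:
            if d >= threshold:
                chosen = payload
        return chosen
```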

I feel satisfied with the design so far. I’m currently going over the code with a fine-tooth comb and working on documentation; alpha release is imminent!

As for licensing…it will remain GPL/AGPL for now.

A preview of targets for beta release:
Beta Goals:

  • Miscellaneous

      1. More CLI options - currently I only have window size arguments; need fullscreen arguments, etc.; pretty simple to implement

  • Database

      1. Actions - for various RNA properties; however, armatures/shapekeys/drivers/hardware skinning are slated for later.

      2. Mesh
         - Edit mesh in realtime - this requires updating VBOs etc.; however, if creating new geometry, geometry shaders could be used, or edit in Blender and replace the entire mesh.
         - Finish LOD.
         - Implement cloning functions for other data types. Currently, only Object is implemented.

  • Rasterizer

      1. Hardware instancing - personally I need this :stuck_out_tongue:

  • Scene graph

      1. Parenting
         - Types - Object, per-vertex, anything with a location…

  • Audio

EDIT:
I just realized writing documentation will take longer than I anticipated, about 2 weeks…:o

“Hardware instancing”, well now that’s a big feature you don’t see in other free engines, cool!

Apart from Panda3D, of course! :wink:

Keep us informed Mahalin, always interesting.