Ideas for BGE GSOC projects

I agree with the camelCase changes. I would also remove getCurrentController in favour of module mode only.
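For context, a minimal sketch of the difference, using a simple frame-counter property; the module path "mymodule.update" is only an example name:

```python
import bge

# Script mode: the controller brick runs this file top to bottom, and the
# controller has to be fetched from the module-level API.
cont = bge.logic.getCurrentController()
own = cont.owner
own["frames"] = own.get("frames", 0) + 1


# Module mode: point the Python controller at "mymodule.update" instead
# (name is just an example), and the controller is passed to the function
# directly, with no getCurrentController call needed.
def update(cont):
    own = cont.owner
    own["frames"] = own.get("frames", 0) + 1
```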


Is cloth simulation possible, or is it too difficult to implement?

Anyone here know of a good source editor?

I want to get started looking into the logic brick -> Bullet linkages, and how constraints are handled.

I would like all constraints to have a Python API,

like Target, etc.

It would almost make sense that we make a sensor to collect data from constraints and trigger logic, as well as an actuator to modify the constraint,

like
Sensor-Constraint-RBJ/6dof X min X max------------------and---------------(Motion)

so if the object's X offset is between min and max (or inverted) ----------> trigger logic

The property X offset will be accessible,

like

sens.offset.x
sens.rotOffset.x

etc.
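A rough sketch of how that proposed sensor might be read from Python; none of these offset attributes exist in the current BGE, they are just the names suggested above:

```python
import bge

def watch_joint(cont):
    # "ConstraintSensor" is the proposed (hypothetical) sensor type.
    sens = cont.sensors["ConstraintSensor"]

    # Proposed behaviour: the sensor goes positive when the joint's X offset
    # leaves the min/max band configured on the brick.
    if sens.positive:
        # Hypothetical attributes: the linear and angular offsets that
        # Bullet already computes for the 6DOF joint every frame.
        print(sens.offset.x, sens.rotOffset.x)
```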

Is this a terrible idea?

I think people don’t understand why I wish to do this:

1 - grappling hooks etc. in game
2 - real-time balancing via logic
3 - better ability to make physical rigs that behave just like a regular armature
4 - custom camera controls, physics based
5 - using a sensor linked with the keyboard to not allow triggers when the RBJ is too far out of alignment, so if I am tilting forward, play “death 1” etc.

A 6DOF constraint linked to another item with no limits set would ONLY feed back the offset angle and distance,

so this is to get more power over the RBJ and receive more output data back from it.

Anyone here know of a good source editor?

Notepad, or if you want something more complete, Notepad++.

I would like all constraints to have a Python API

Pretty sure they do, if you create them at runtime.
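For instance, something along these lines creates a 6DOF joint at runtime and then adjusts it through the returned wrapper; the object name is made up, and the setParam axis index should be checked against the bge.constraints docs for your build:

```python
import bge

def add_joint(cont):
    own = cont.owner
    scene = bge.logic.getCurrentScene()
    other = scene.objects["Anchor"]  # assumed object name

    # Generic 6DOF constraint between the two physics ids, pivoting at the
    # owner's origin with no axis offset.
    joint = bge.constraints.createConstraint(
        own.getPhysicsId(), other.getPhysicsId(),
        bge.constraints.GENERIC_6DOF_CONSTRAINT,
        0.0, 0.0, 0.0,   # pivot x, y, z
        0.0, 0.0, 0.0)   # axis x, y, z

    # Limit rotation around X: index 3 is the first angular axis of a
    # 6DOF joint (assumed; verify against the docs).
    joint.setParam(3, -0.5, 0.5)
```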

  1. What’s wrong with parenting for this?
  2. Uh, Python works fine.
  3. It will take years of research, and very hard programming, to get physical rigs to act like real animation. Try the game QWOP if you want a challenge.
  4. Change the camera from type ‘no collision’ to whatever you want it to be.
  5. Python

If you know the source and target object of a RBJ, you can easily get the rotation values and compare them yourself in python. No need for the RBJ to have an API to support that.
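A minimal sketch of that, assuming you know the names of the two objects the RBJ connects; worldOrientation and getDistanceTo are standard KX_GameObject members, while the object name and thresholds are made up:

```python
import bge

def check_joint(cont):
    own = cont.owner
    scene = bge.logic.getCurrentScene()
    target = scene.objects["JointTarget"]  # assumed name of the other RBJ object

    # Target's rotation expressed in the owner's local frame.
    rel = own.worldOrientation.inverted() * target.worldOrientation
    euler = rel.to_euler()

    # The linear "offset" is just the distance between the two objects.
    distance = own.getDistanceTo(target)

    # Example thresholds: flag the rig when the X tilt or separation gets too big.
    if abs(euler.x) > 0.5 or distance > 2.0:
        own["out_of_alignment"] = True
```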

I would like more logic bricks that make it easier to change the amount of ammo and things like that, and also some that make the object stay snapped to a grid :slight_smile:

The idea that ‘this software is perfect, no more development is needed’ is a myth; any software you use will always lack some features or pieces of functionality that you would want.

This thread is just more proof of that; even the Maya users on CGTalk have long lists of things that they want, and that’s for the 3D program that arguably has the largest feature set.

  1. What’s wrong with parenting for this?
  2. Uh, Python works fine.
  3. It will take years of research, and very hard programming, to get physical rigs to act like real animation. Try the game QWOP if you want a challenge.
  4. Change the camera from type ‘no collision’ to whatever you want it to be.
  5. Python

1 - parenting does not allow for motion to be imparted on the rigid body, only on the object as a whole.

2 - Python works, but you can’t get the offset (which Bullet is calculating every frame anyway), and why would you want to set up 100s of items to initiate on frame 1?

3 - check out my rig.

4 - by using a rigid body as a camera, and using a rigid body joint, a floating physics-based camera system that collides with obstructions is possible.
Exposing get/set of the offset angle would allow a physical cam to be rotated with ease.

5 - C

What’s “module mode only”?

So does that mean that Cython will also be faster on the Android platform?

I think it would be great to see the Hive system finally implemented into Blender.

It is implemented - it’s in ADDON form at the moment, but it works.

One of the less fancy things, but one that should be considered the most important, is a good, flexible time-step, as it’s the foundation of a good game engine. It seems the BGE has quite an old fixed time-step. The more commonly used time-step in game engines is a variable one. Now, I’m not saying implement a variable time-step, as that may require major changes and break many people’s projects.

Most game engine programmers I’ve spoken to have preferred a fixed time-step for their engines, so as to keep things much more predictable logic- and physics-wise. The only thing is, it means the render frame rate is locked to the logic/physics, and it also stops your render fps from ever syncing up with your monitor’s refresh rate, meaning frame drops, stuttering, and no smooth movement.

You can test this by having a scene in the BGE that you can render at a solid 60/61 frames: have your monitor refresh rate set to 60 Hz, set your scene’s fps to 61, and you will see one frame get dropped every 60 frames no matter what. It’s most jarring when you have a player controller allowing you to move and look around.

A recent example was the release of the newest Need for Speed game on PC. The fps was locked at 30 and it used a fixed time-step. You could override this fps limit, but that then doubled the speed of the physics and logic of the game, thereby breaking it. The same would happen in the BGE if you made your game at a certain fps and then tried to increase it beyond what you made your project on.

Now, an idea would be to make your project with the fps set to 120, so people with up to 120 Hz monitors will be able to see the full 120 frames. However, everyone with monitors that max out at, say, 60, or anything in between 60 and 120, will never sync their render fps to their refresh rate, and even if they can render 120 frames it will not be very smooth at all and will cause micro frame stutter/drops, and an un-smooth game isn’t an enjoyable one to lots of people.

What I’ve seen to combat this downside of fixed time-steps is to add interpolation, so that the render rate can be higher or lower than the physics/logic rate and can sync with your monitor’s refresh rate. I’ll add a few write-ups from devs as examples of something that could maybe be looked into for the BGE.

Sorry for the essay, but I felt it needed quite a few examples of why this is something to be improved:
http://www.danielborgmann.eu/2010/09/fixed-timestep-with-interpolation-part.html
http://www.flipcode.com/archives/Main_Loop_with_Fixed_Time_Steps.shtml

http://gafferongames.com/game-physics/fix-your-timestep/
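The core of the “fix your timestep” approach from those links is a loop like the one below; this is a generic Python sketch (the `game` object and its methods are placeholders), not something that drops straight into the current BGE main loop:

```python
import time

FIXED_DT = 1.0 / 60.0  # fixed logic/physics step

def run(game):
    # `game` is a placeholder object with running/step/render members.
    previous = time.perf_counter()
    accumulator = 0.0
    while game.running:
        now = time.perf_counter()
        accumulator += now - previous
        previous = now

        # Advance logic/physics in fixed increments, however long the last
        # render took, so the simulation stays deterministic.
        while accumulator >= FIXED_DT:
            game.step_logic_and_physics(FIXED_DT)
            accumulator -= FIXED_DT

        # Render a blend between the last two physics states, weighted by the
        # leftover fraction of a step, so the frame rate can float freely and
        # sync with the monitor instead of the physics rate.
        alpha = accumulator / FIXED_DT
        game.render(alpha)
```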

This is great information; it helped me some with my game, thank you.

I agree with this, and I’ve reported this as a bug. I don’t mean to post this regularly, but I’ve addressed this in my GameLoop patch, which offers the ability to delegate the responsibility for GameLoop management to Python.

Bullet does offer support for state interpolation, and it’s very easy to implement a simple fixed step as Glenn demonstrates. It’s what I’m using for my network games.

In terms of relevancy to GSOC, I intend to support a proposal made to clean up the BGE source code, otherwise I shall continue by myself.

Is there a way to have a GFX frame rate, linked to animation updates, synced with refresh,

and game logic and physics in its own time step?

Thread 1 - game

Thread 2 - “render grabber”?

That’s exactly what’s being proposed above.

Never Mind :smiley:

@agoose77
I’m with you on the cleanup. A clean and robust API is crucial, otherwise features won’t be utilized to their full potential. Anyhow, that is my upcoming task…

@SolarLune
If speed is crucial or the bottleneck, why not integrate the particle system into the BGE? For example, Panda3D has a built-in particle system. I can’t help but think a particle system would be best implemented in C or integrated into the game engine. Hey, your setup could become the official BGE particle system, eh? :stuck_out_tongue:

Something I wish I could do would be to describe the BGE as a modern game engine. TBH, this is my own personal ultimate wish for the BGE. Some things that come to mind are multithreading and OpenGL support. IMHO, Python is much more convenient, and replacing Python with something like Lua would have minimal advantages, especially since Blender itself is deeply intertwined with Python. Mobile support would be nice but it’s not necessary.

My OpenGL skills are still a WIP, but hopefully one day we can replace some of the glBegin() and glEnd() calls. I know some of the infrastructure and backends are in place. IMHO, the best solution would be to create a legacy rasterizer (OpenGL 1.x) and a modern rasterizer (with fully programmable pipelines).
http://blenderartists.org/forum/archive/index.php/t-286738.html

Although, based on what I read in this thread, the BGE still has some fundamental things to work on like the timing and the API.

That assumes that transferring his particle logic into C++ with its own Python functions and UI would be feasible. Don’t forget about Ndee’s easy emit addon; you might find some ideas there as well, and some parts might make more sense to copy (compared to X-emitter) when writing the C++ code.

Anyway, perhaps you and Agoose77 should start working off of the same build and submit patches from time to time for inclusion into master if it’s complete enough; being able to do that is one of the nice things you hear about with Git.

You want to make sure to profile before doing blind optimizations. First off, most of the BGE doesn’t even use glBegin/glEnd; I think 2D filters and blf are about it (aside from fallback code). Also, the way OpenGL works means you get little performance improvement unless you fix the actual bottleneck in the pipeline, and I don’t remember profiling a game that was bottlenecked at vertex transfer (which is what glBegin/glEnd, vertex arrays, VBOs, etc. affect). Instead, usually shaders (specifically fragment shaders) were to blame.

So, how can we make these faster? Better use of static lighting would be one thing. BGE games usually have all of their lighting dynamic, and dynamic lights are expensive. Better (e.g., simpler-to-use) light map support would help here (especially with tasty Cycles baking on the horizon); this would encourage users to use more static lighting. Some games can benefit from a pre-Z pass to reduce overdraw, which reduces unnecessary fragment shader executions.

Now, could our OpenGL usage be cleaned up? Yes. I think the first step for that is to make sure there are no OpenGL calls outside of the rasterizer. Also, it is good to keep in mind that “cleaner” OpenGL doesn’t automatically mean faster rendering; it just makes things easier to maintain.
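On the “profile first” point, the engine’s built-in overlay is usually the quickest check of where a frame actually goes (logic, physics, rasterizer, and so on); a minimal sketch using the standard bge.render toggles:

```python
import bge

def enable_profiling(cont):
    # Draw the frame rate and the per-category profile overlay while the
    # game runs, so you can see which stage is the real bottleneck.
    bge.render.showFramerate(True)
    bge.render.showProfile(True)
```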

Making a system that handles movement of polygons over time (particles etc.),

one that continues doing one thing driven by C unless you tell it to do something else, may be good…