Can't track time.time() with a game property?

from datetime import datetime
from time import time


from bge import logic


c = logic.getCurrentController()
o = c.owner


o['micro'] = datetime.now().microsecond
o['second'] = time()
print(time())

With the above script, running on a pulsing Always sensor, ‘micro’ constantly changes while ‘second’ sticks to a single value. Strangely, printing time() by itself shows that the system has no problem updating its clock. In the editor, ‘second’ is set as a float, so that’s not the issue. What am I doing wrong here?

For clarification, I’m trying to keep track of the timestamp to log player input in a somewhat framerate-independent manner. The idea is to send inputs across a network, with a timestamp attached, for the sake of synchronization and for indexing snapshots of each frame.

Typically a frame is seen as one point in time, regardless of when exactly an operation is performed. This simplifies the design quite a lot, so the frame is the smallest time unit. The game can’t deal with shorter times anyway.

To synchronize network participants, it is better to use a single time for a single frame too. Typically these timestamps differ between members, they will always be too late, and you need an initial synchronisation to identify the chronological order of messages, if that matters.
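
For illustration, here is a toy sketch of that idea (the names frame_number, begin_frame and queue_event are my own, nothing BGE-specific): every event produced during one frame gets tagged with the same frame number, so the frame really is the smallest time unit the protocol ever talks about.

frame_number = 0
outgoing = []

def begin_frame():
    # Called once at the start of each logic tick.
    global frame_number
    frame_number += 1

def queue_event(event):
    # All events queued during this tick share one timestamp: the frame number.
    outgoing.append({'frame': frame_number, 'event': event})

begin_frame()
queue_event('jump')
queue_event('move_left')
print(outgoing)   # both events carry frame 1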

Regardless of that, I can’t help you with the above issue. It has been quite a while since I played with the time module.

This is such a weird problem. I can keep track of the value if I convert it to a string, but holding onto it as a float causes the property to freeze?

I might be going completely mad, but I recall experiencing something like this before. Allow me to check what’s going on (i.e. some overflow).
Concerning framerate, you can’t use varying frame rates across clients. It just doesn’t work. You lose a significant amount of determinism and make the problem far more complex. Instead, it is common practice to enforce a baseline simulation frame rate even if you update the graphics at a different rate.
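
If I remember the BGE API correctly (worth double-checking against the docs for your version), the logic tick rate can be pinned from Python, which is one way to enforce that baseline:

from bge import logic

# Pin the logic/simulation rate to a fixed baseline, e.g. 60 ticks per second,
# so every client steps the simulation at the same rate even if rendering differs.
logic.setLogicTicRate(60.0)
print(logic.getLogicTicRate())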

Yes, AFAICT there is a limit to the precision that a float game property can hold. This is because the stored value is converted to the native C type. (Try storing [time()] and you’ll notice it keeps updating, because a list is not represented as easily in C.)
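
To make that concrete, here is a minimal sketch of the effect, assuming the property ends up as a 32-bit C float (my reading of the above, not something I’ve checked in the source):

import struct
from time import time

t = time()
# Round-trip the value through a 32-bit C float to mimic what the property stores.
# At roughly 1.5e9 seconds since the epoch the resolution is on the order of a
# hundred seconds, so the stored number barely moves from frame to frame.
as_c_float = struct.unpack('f', struct.pack('f', t))[0]
print(t)            # full Python double, changes every call
print(as_c_float)   # truncated value, effectively frozen

# The workaround from this thread: wrap it so the property holds a Python object.
# o['second'] = [time()]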

Thanks for the tip, goose~! Storing the value in a one-element list keeps it updating properly.

How would I go about tracking along that simulation rate, eh? I can set Bullet’s frame rate, but how can I tell when I’ve reached a “tick”? Faster systems seem like they’ll be simple to manage, but what about slower ones? If the physical FPS is controlled, as is the speed of animation, wouldn’t having an input record of finer granularity be harmless?

A tick occurs when the game logic is updated. Using a finer granularity is not sensible, because everywhere you use a time unit you will need to interpolate, which is more costly and assumes that you have such precision available. In reality, the network conditions are not stable enough for you to make such assumptions. In addition, a frame may actually take less than 1 / frame rate seconds, or more, because each individual frame’s duration is not regulated; it is the average frame rate we can control.
Setting a baseline simulation rate is much like other hardware requirements: it’s necessary to ensure a consistent user experience across machines.
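
A rough sketch of what counting ticks (rather than reading the wall clock) could look like - the function and property names are mine, assuming a Python module controller wired to a true-pulse Always sensor:

_tick = 0

def tick(cont):
    # Runs once per logic update, so the counter below is simulation time
    # measured in ticks rather than wall-clock seconds.
    global _tick
    _tick += 1
    cont.owner['tick'] = _tick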

It seems as though Blender already has the tick rate controllable, locked in with the physics? Anyway, I suppose I ought to clarify further.

I’m not using the timestamps as an exact measurement. When an input/timestamp combo is received over the network, it’s applied to the frame with the closest timestamp. Since the snapshots are only taken on a tick-by-tick basis, the baseline is maintained. An alternative to timestamps would be to give clients a specific “zero frame” and count from there. The same functionality applies, but the consistency would be stronger. How’s that sound?
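
In case it helps, here is a hypothetical sketch of the “closest timestamp” matching I mean (the data structures are invented for illustration):

def closest_frame(snapshots, remote_timestamp):
    # snapshots: {frame_number: local_timestamp}, one entry recorded per tick.
    # Pick the frame whose local timestamp is nearest to the remote one.
    return min(snapshots, key=lambda f: abs(snapshots[f] - remote_timestamp))

snapshots = {100: 10.000, 101: 10.016, 102: 10.033}
print(closest_frame(snapshots, 10.02))   # -> 101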

Yes, we can control the tick rate, but the value we set doesn’t ensure that a frame takes exactly 1 / tick rate seconds to process. Using timestamps, you’ll notice a discrepancy in how long each frame took to process: if you print the delta time value, it changes between frames. So timestamps aren’t always the best way to represent ticks, and using them as the baseline time value also has its difficulties.
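
A quick way to see that for yourself (only a sketch; run it as a module controller behind a true-pulse Always sensor):

from time import time

_last = None

def measure(cont):
    # Print how long the previous frame actually took. Even with the tic rate
    # set to 60, this will wobble around 1/60 s instead of hitting it exactly.
    global _last
    now = time()
    if _last is not None:
        print('frame took %.4f s' % (now - _last))
    _last = now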

Another few points to consider:
Server time and client timestamps may differ. How would you handle this? (Hint: no one can simply tell them to set the time, due to varying network conditions.)
Furthermore, packets may arrive at different times due to network jitter - you may receive zero packets one frame and two the next. The server wants to maintain a smooth simulation, simulating in real time because it is *the man*, and the other clients need to receive smoothly updated data, which can only happen if the server applies inputs at a regular interval…
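
One common answer to the jitter point is a small input buffer on the server: hold incoming inputs briefly and drain exactly one per simulation tick. Entirely a sketch of that pattern, not anything BGE provides:

from collections import deque

class InputBuffer:
    """Packets arrive irregularly, but the simulation pops at most one input
    per tick and reuses the last known input when nothing has arrived yet."""

    def __init__(self):
        self.queue = deque()
        self.last = None

    def push(self, player_input):
        # Called whenever a packet arrives - zero or more times per frame.
        self.queue.append(player_input)

    def pop_for_tick(self):
        # Called exactly once per simulation tick.
        if self.queue:
            self.last = self.queue.popleft()
        return self.last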

You might find this an interesting read:

It’s about how Age of Empires synchronized all of the players using turn counters and very low bandwidth. The tl;dr is that they queued commands for a few turns in the future and made sure all of the clients were waiting for the slowest computers to finish their turns before continuing.
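
The core of that scheme fits in a few lines. This is only a toy sketch (the player IDs, the two-turn delay, and the dict layout are all assumptions of mine): commands issued on turn T are scheduled for turn T + delay, and nobody advances past a turn until every player’s commands for it have arrived.

COMMAND_DELAY = 2        # commands run two turns after they were issued
scheduled = {}           # turn -> {player_id: [commands]}

def issue(player_id, current_turn, commands):
    # Queue this player's commands for a future turn, as in the AoE scheme.
    scheduled.setdefault(current_turn + COMMAND_DELAY, {})[player_id] = commands

def can_advance(turn, player_ids):
    # Lockstep rule: only advance once every player's command list for this
    # turn has arrived (an empty list still counts as "arrived").
    return all(p in scheduled.get(turn, {}) for p in player_ids)

issue('alice', 0, ['move'])
issue('bob', 0, [])
print(can_advance(2, ['alice', 'bob']))   # True: both lists for turn 2 are in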