Update propagation #495

@Korijn

Description

Introduction

This started out titled "update modes", but the topic quickly escalated, touching everything that relates to how interaction is handled and how updates are propagated. We've felt discomfort with the API in several places, and kept referring back to this issue.

On 19-07-2022 Korijn and I got together for a day to discuss this, and we came up with a rough plan. We agree on core principles and the outline, but the details will surface as we implement things.

Problem statement / scope

We are re-envisioning everything in wgpu-py and pygfx that relates to how updates are propagated. This includes gui events, event loop, controllers, the renderer, etc.

We want it to be possible/trivial to use pygfx purely statically. That's great for static applications, but it also makes it possible for users to handle update propagation themselves.

At the same time, we want it to be possible to opt in to the built-in update propagation mechanics, and make that work more naturally, without having to manually update things in the animation loop.

Sub-topics

Scheduling

To render a frame, a certain set of tasks must be performed. These must be scheduled, preferably in such a way that the CPU and GPU don't wait for each other.

Tasks:

  • Handle events
  • Allow user-code to perform simulations (ticks)
  • Other updates (not sure how to structure yet):
    • Allow controllers to update the camera (can be considered a simulation?).
    • Allow world objects to update themselves (e.g. ruler)
    • (this is what is now the pre-draw hook)
  • Resolve transforms (?).
  • Sync buffers and textures.
  • Do the actual rendering.
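The task list above could be sketched as a per-frame pipeline. This is only an illustration of the ordering, not pygfx API; the `Scheduler` class and its method names are hypothetical.

```python
# Minimal sketch of the per-frame task pipeline described above.
# The class and method names are assumptions, not pygfx/wgpu-py API.

class Scheduler:
    """Runs the per-frame tasks in a fixed order."""

    def __init__(self):
        self.trace = []  # record of performed tasks, for illustration

    def handle_events(self):
        self.trace.append("events")

    def tick(self):
        # User simulations, controller updates, self-updating objects (ruler).
        self.trace.append("tick")

    def resolve_transforms(self):
        self.trace.append("transforms")

    def sync_resources(self):
        # Upload pending buffer/texture changes to the GPU.
        self.trace.append("sync")

    def draw(self):
        self.trace.append("draw")

    def perform_frame(self):
        self.handle_events()
        self.tick()
        self.resolve_transforms()
        self.sync_resources()
        self.draw()


scheduler = Scheduler()
scheduler.perform_frame()
print(scheduler.trace)  # ['events', 'tick', 'transforms', 'sync', 'draw']
```

The point of centralizing this ordering is that tasks like "sync buffers" happen exactly once per frame, after all updates and before the draw.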

Update modes

initial post by Korijn

I was able to pool the collective memory of HN on this topic: https://news.ycombinator.com/item?id=35435669

What really struck me as an interesting option is the way Bevy handles it. The reactive mode seems like a very nice middle ground that should make it easier to integrate pygfx in projects like that of @berendkleinhaneveld, and also make it easier to implement things such as camera smoothing.

  • Unlimited
  • Target FPS
  • On demand (maybe with a minimum fps of 1 or so)
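The three modes mostly differ in how long the scheduler waits before the next draw. A rough sketch (the function, mode names, and the 1 fps fallback are assumptions for illustration, not an actual API):

```python
# Hypothetical frame-pacing helper for the three update modes listed above.

def seconds_until_next_draw(mode, last_draw, now, fps=30, draw_requested=False):
    """Return how long the scheduler should wait before the next draw.

    mode: "unlimited" | "target-fps" | "ondemand"
    last_draw / now: timestamps in seconds
    """
    if mode == "unlimited":
        return 0.0  # draw as fast as possible
    if mode == "target-fps":
        return max(0.0, last_draw + 1.0 / fps - now)
    if mode == "ondemand":
        if draw_requested:
            return 0.0  # something changed, draw now
        # Fall back to a minimum of ~1 fps so the canvas stays alive.
        return max(0.0, last_draw + 1.0 - now)
    raise ValueError(f"Unknown update mode: {mode}")
```

Custom update modes could then be supported by letting users plug in their own version of this policy.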

The renderer

We want the renderer to do one thing: render stuff to a texture. It should be unaware of events or updates.

The viewport

This also relates to #492. The idea is that the viewport takes over the update-related roles from the renderer. That way the renderer can become much simpler (low level), while the viewport becomes a convenient (and central) component to tie things together.
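The envisioned split could look roughly like this: a renderer that only knows how to render, and a viewport that holds the pieces together. Class names and signatures below are hypothetical sketches, not the actual pygfx API.

```python
# Sketch of the renderer/viewport split: the renderer only renders,
# the viewport ties renderer, rect, scene and camera together
# (and later also events, controller, picking).

class Renderer:
    """Low level: renders a scene to (a region of) a texture. No events."""

    def render(self, scene, camera, rect=None):
        return f"rendered {scene} with {camera} in {rect}"


class Viewport:
    """Convenience object that ties the pieces together."""

    def __init__(self, renderer, scene, camera, rect=(0, 0, 640, 480)):
        self.renderer = renderer
        self.scene = scene
        self.camera = camera
        self.rect = rect

    def render(self):
        return self.renderer.render(self.scene, self.camera, rect=self.rect)


vp = Viewport(Renderer(), scene="scene", camera="camera")
vp.render()
```

Because the renderer takes scene, camera, and rect as plain arguments, one renderer can serve multiple viewports (subplots), and the viewport is the natural place to attach events and a controller later.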

Async

This also relates to #615. As in, making the API async will affect our options for all this work.

Use cases / things to consider

  • Being able to do a draw manually.
  • Closed loop / bring your own event loop.
  • Fix that the ruler lags behind, because controller updates later.
  • Controller damping.
  • Ruler.update() -> don't want to have to call this manually, rather hook to an event or something.
  • Same for Gizmo and other (custom/future) objects.
  • camera.update_proj_matrix
  • Transform propagation
  • Skeleton needs an update (in which it needs access to the bones, currently done via bone.parent)
  • Our transform system also has an update/callback mechanism. Can we get rid of that?
  • Async
  • Qt event loop
  • Multiple subplots
  • Multiple canvases
  • Multiple canvases of different gui systems, and event loops
  • Multiple threads
  • ...
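On transform propagation specifically: a "resolve transforms" task combined with lazy dirty flags could replace the current update/callback mechanism. A minimal sketch (all names hypothetical; a scalar stands in for a matrix):

```python
# Sketch of lazy transform propagation with dirty flags, instead of
# an update/callback mechanism. A scalar stands in for a 4x4 matrix.

class Node:
    def __init__(self, local=1.0):
        self.local = local      # stand-in for the local matrix
        self.parent = None
        self.children = []
        self._world = None      # cached world "matrix"
        self._dirty = True

    def add(self, child):
        child.parent = self
        self.children.append(child)
        child._mark_dirty()
        return child

    def set_local(self, value):
        self.local = value
        self._mark_dirty()

    def _mark_dirty(self):
        # Invalidate this node and everything below it.
        self._dirty = True
        for child in self.children:
            child._mark_dirty()

    @property
    def world(self):
        # Resolve lazily: recompute only if this node or an ancestor changed.
        if self._dirty:
            parent_world = self.parent.world if self.parent else 1.0
            self._world = parent_world * self.local  # stand-in for matmul
            self._dirty = False
        return self._world
```

With this scheme, objects like the skeleton or the ruler read up-to-date world transforms whenever the scheduler's "resolve transforms" task (or any later task) accesses them, with no callbacks involved.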

The plan

We want wgpu-py to support rendering to a surface provided by a GUI, and also to consume events from GUIs in a generic way, but these two concepts should preferably be separated better.

We want the core of pygfx to stay simple, and static, unaware of events. The core consists of renderer, scene (world objects, geometry, materials), camera. People can build scenes with these components without using any kind of event system.

So the core will feel more like building blocks: more flexibility and freedom, but also more typing. Hard-core devs happy! The viewport fills this gap and makes common cases easier: common users happy!

One particular use-case that illustrates the improvement is when a scene is shown in two subplots (e.g. with different cameras). In the old scenario, the renderer(s) would emit before_draw twice for the same scene. With the scheduler emitting that event, there'd be just one such event.
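The subplot case can be sketched as the scheduler de-duplicating per-scene events across viewports. The function and the dict-based viewports are purely illustrative assumptions:

```python
# Sketch: the scheduler emits before_draw once per unique scene per frame,
# even when multiple viewports (subplots) show the same scene.

def emit_before_draw(viewports, handler):
    seen = set()
    for vp in viewports:
        scene = vp["scene"]
        if id(scene) not in seen:
            seen.add(id(scene))
            handler(scene)


calls = []
shared_scene = object()
viewports = [
    {"scene": shared_scene, "camera": "cam1"},
    {"scene": shared_scene, "camera": "cam2"},
]
emit_before_draw(viewports, calls.append)
print(len(calls))  # 1: one event for the shared scene, not two
```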

Tasks

  • In wgpu perhaps separate the logic required to draw vs logic for events.
  • In wgpu introduce a scheduler / loop mechanic that does the tasks mentioned above, and supports different update modes. Possibly also supporting custom update modes. A way to make it work for all GUI toolkits.
  • In pygfx we may subclass the scheduler to make it aware of pygfx stuff like viewports.
  • The renderer becomes unaware of events, and no longer directly associated with a canvas (see Proposal for renderer / viewport improvements #492).
  • In pygfx world objects also no longer emit events. Picking will happen via the viewport.
  • Extend the scope of the `ViewPort` object (perhaps rename it). It won't do anything complicated, but it will tie a lot of things together (e.g. renderer, rect, scene, camera, controller, events, picking).
  • The Display object will be replaced with the ViewPort.
  • Refactor or remove gfx.show().
  • Refactor for updates: controllers, ruler, gizmo, ...
  • Refactor transform update propagation.

Related issues

Work

Other refs
