Reanimated Refuse (One Game a Month, April)

Published on 2018-05-03.

I was first informed about the perils of waste disposal in early grade school. I remember going to the gym with the rest of the school (they called it an “assembly”) where a group of adults gave a presentation urging us to put glass and plastic into a different container. The reasoning, as I remember it, was that we would run out of space in landfills to put things if we didn’t recycle them instead. We must have been convinced; I remember us all chanting “REDUCE, REUSE, RECYCLE!” in unison with the presenters.

These days, I would ask more questions.

The United States is decidedly not running out of places to put our garbage in a way that creates a sense of urgency. Since the 1970s, the country has moved from a large number of small, low-tech municipal hole-in-the-ground dumps to a small number of highly engineered landfill sites. Long-term waste management seems like an interesting engineering challenge. The organic waste component of a landfill gives it some properties of an enormous compost pile. We can capture the methane gas and use it for energy. The harmful leachate runoff from liquid waste is now caught in liners installed in modern landfills, so it doesn’t contaminate the ground.

Though I’m satisfied I won’t be living in piles of trash anytime in my next few lifetimes, there’s more to the issue than space. The trend toward a small number of large, strategically placed landfill sites increases the likelihood of regional issues. Since the closest landfill is farther away than the “town dump” was, cities must manage their temporary capacity more carefully. Regional refuse must be transported to a landfill, which contributes to carbon dioxide emissions. And once that toxic leachate is collected, where does it go that isn’t as bad as putting it in the ground?

I just wanted to know how guilty I should feel when I throw things in the trash, but things get complicated quickly. I don’t think it’s controversial to suggest our best bet is to avoid putting non-biodegradable stuff into holes when there’s no reason to do so. On the other hand, it looks like we’re putting talented engineering resources into landfill technology, and we can accept the cost of storing something that had a net positive, albeit temporary, effect on its owner’s life.

The Project

April’s creation for OGAM is called Reanimated Refuse. This game is about unchecked waste disposal bringing consequences in a more immediate way than environmental destruction. No longer satisfied being relegated to landfills, garbage wants to bury the world. Only you, the mild-mannered, journeyman waste collector, can stop the onslaught.

A first for April is a title screen, which gives the game a more complete feel. I will try to include one in every game going forward.


RR is a wave-based fixed shooter with mechanics similar to Space Invaders and Galaga. The collector must avoid being hit while shooting through multiple missions, each with progressively more challenging waves of garbage. Side scrolling was cut from the original idea (a good call), because ng doesn’t support moving cameras yet.

Here’s an animation showing the gameplay.


The best development moment of the month was the first time I watched the spinning six-pack ring come down and take the collector out.

While this is another minimalist effort, I’m satisfied with the completeness of its gameplay and graphics. RR also struck a good balance of reusing existing features (frame animation, tile map support, text rendering) with new features for April (nui improvements, texture rotation and scaling, mouse support and picking).

The Tech

ng received several major features in March, the biggest being frame animation and texture atlas support. April’s major feature is nui rendering improvements. There was also a lot of under-the-hood work to support the entity graph feature, which will be finished in early May.

I’ve done a lot of light refactoring – reorganizing modules, renaming, improving logging – to make things easier to understand for my present and future self.

Here’s the ng diff since the last update:

diff --git a/doc/plan/plan.rst b/doc/plan/plan.rst
index 4e23e3e..d60752d 100644
--- a/doc/plan/plan.rst
+++ b/doc/plan/plan.rst
@@ -1,14 +1,10 @@
 To do

-- Fix pick functionality when device-independent resolution doesn't
-  match actual window size.
-- Reusable logging configuration for each module.
+- Add camera system to support scrolling playfields.
 - Support multiplicative color modification of entities.
-- Add collision info to event system.
 - Implement Python configuration DSL.
 - Add/remove debug and performance statistics with 'z' key.
-- Support center point (pivot point) for rotation.
 - Simple IPython integration.
 - Annotation-based profiler.
 - Add copyright/licensing info to files on build.
@@ -16,6 +12,59 @@ To do

+- Reusable logging configuration for each module.
+- Make nui use ng to do its rendering, instead of using an independent
+  rendering step.
+  - Separate ng local texture rendering step from output rendering.
+  - Make nui also do local rendering to textures.
+  - Make ng do final output rendering of nui gadgets.
+- Support nui font customization.
+- Change picking and collision detection to use world AABB instead of primitive
+  texture box.
+- When objects collide, dispatch a collision event with information about the
+  pair.
+- Implement vector length, normalize, and scalar multiplication operations.
+- Add support for mouse button clicks. Send button up/down events.
+- Fix pick functionality when device-independent resolution doesn't
+  match actual window size.
+- Support center point (pivot point) for rotation.
+- Implement entity graph (parent/child relationships).
+- Send an event after renderer is cleared, so the caller can hook.
+- Support non-animated entity loading from Aseprite JSON format.
+- Support animation loading from Aseprite JSON format.

Let’s look at some highlights for the development month of April.

Entity Graph

Also known as a scene graph, the entity graph allows entities to be transformed relative to one another, in addition to being transformed relative to the world. This feature is a work in progress, so it will be covered in depth in a future update.
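The core idea can be sketched in a few lines. This is a hypothetical illustration, not ng’s actual API, and it ignores rotation and scale for brevity:

```python
# Minimal sketch of parent-relative transforms (hypothetical names,
# not ng's real entity API). Rotation and scale are omitted.
class Entity:
    def __init__(self, x=0.0, y=0.0, parent=None):
        self.x, self.y = x, y  # position relative to parent (or world)
        self.parent = parent

    def world_position(self):
        # Walk up the graph, accumulating each parent's offset.
        if self.parent is None:
            return (self.x, self.y)
        px, py = self.parent.world_position()
        return (px + self.x, py + self.y)

ship = Entity(100.0, 50.0)
turret = Entity(8.0, -4.0, parent=ship)  # attached to the ship
print(turret.world_position())  # (108.0, 46.0)
```

Moving the ship moves the turret with it, because the turret’s world position is always derived from its parent’s.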


nui Rendering Improvements

During very early prototyping for ng, I made a simple user interface module called nui, which rendered text boxes and bordered windows. All text in the screenshots on this blog is drawn with nui.

nui originally rendered its user interface elements (called “gadgets”) in its own rendering pass, so everything nui drew was rendered after, and therefore on top of, everything ng drew. This became problematic in two ways. First, separate rendering passes meant there was no way to interleave ng and nui entities in layers. For example, placing a spinning cursor that uses ng’s animation system on top of a nui window would have been impossible. Second, nui entities had no way to participate in the simulation of the game world. Since they were drawn statically, they couldn’t easily move, rotate, disappear automatically after a time, and so on.

To improve this, I made some changes in the way nui and ng work together. First, gadgets now inherit from ng entities, so they can take advantage of all the functionality provided by entities. Secondly, nui now renders the graphics for its gadgets to textures instead of to the display. These textures are associated with the gadget entities before display rendering starts. Finally, all entities, including gadgets, are now rendered by ng to the display in a single pass.
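The flow can be sketched roughly like this, with hypothetical names and strings standing in for textures (not ng’s real API):

```python
# Hypothetical sketch of the single-pass nui/ng rendering flow.
class Entity:
    """Anything the engine can draw; position etc. omitted for brevity."""
    def __init__(self, texture=None):
        self.texture = texture

class Gadget(Entity):
    """A nui gadget is now just an entity whose texture nui pre-renders."""
    def render_to_texture(self):
        # Local pass: nui draws borders/text into an offscreen texture.
        self.texture = "gadget texture"

def render_frame(entities):
    # Local pass first: gadgets refresh their own textures.
    for e in entities:
        if isinstance(e, Gadget):
            e.render_to_texture()
    # Then one output pass: ng draws every entity, game or UI, in order.
    return [e.texture for e in entities]

# A game sprite and a UI gadget can now be layered freely.
print(render_frame([Entity("sprite texture"), Gadget()]))
# → ['sprite texture', 'gadget texture']
```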

These changes resolve the issues described above and also remove the burden of maintaining two separate display rendering routines going forward.

Mouse Cursor

There’s now support for getting the position of the mouse cursor within the game window, either in terms of the output resolution or the logical resolution.

The crosshair in the gameplay screenshot above is a normal entity. It follows the mouse cursor location by setting its position to that of the mouse cursor every frame.

mouse_pos = self._ng.mouse_pos()
self._crosshair.position.x = mouse_pos.x
self._crosshair.position.y = mouse_pos.y


“Picking” is now supported. Picking means identifying the entity or entities at a specific display location, usually the cursor’s.

Targeting and shooting the six-pack rings in the screenshot is done by picking. When the player clicks the mouse button, any entities under the cursor location are picked. If this list contains a hostile entity, it is destroyed and the player’s score is increased.

if event.type == Type.MOUSE_BUTTON_DOWN:
    if self._input_enabled:
        if['button'] == MouseButton.LEFT:
            picked = self._ng.entity_pick(self._ng.mouse_pos())
            if picked:
                enemy = self._get_enemy(picked)
                if enemy:
                    self._score += 50

Bounding Box Improvements

Both picking and collision handling routines use an entity’s world axis-aligned bounding box (AABB). Picking works by testing whether the location represented by the mouse cursor is inside an entity’s AABB. Collision detection works by doing an intersection test between two entities’ AABBs; if the boxes intersect, a collision is registered.
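Both tests can be sketched as standalone helpers (invented here for illustration; ng’s actual routines differ in detail):

```python
# Hypothetical AABB helpers; boxes are (min_x, min_y, max_x, max_y).
def point_in_aabb(px, py, box):
    """Picking: is the cursor point inside the entity's world AABB?"""
    x0, y0, x1, y1 = box
    return x0 <= px <= x1 and y0 <= py <= y1

def aabb_intersect(a, b):
    """Collision: two AABBs intersect iff they overlap on both axes."""
    ax0, ay0, ax1, ay1 = a
    bx0, by0, bx1, by1 = b
    return ax0 <= bx1 and bx0 <= ax1 and ay0 <= by1 and by0 <= ay1

print(point_in_aabb(5, 5, (0, 0, 10, 10)))          # True
print(aabb_intersect((0, 0, 4, 4), (3, 3, 8, 8)))   # True
print(aabb_intersect((0, 0, 4, 4), (5, 5, 8, 8)))   # False
```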

ng’s first quick-and-dirty bounding box implementation simply multiplied an entity’s texture size by its scale. This broke when support for rotation was added. For example, the AABB of a rectangular object that is wider than it is tall must grow in height and shrink in width as the object rotates around its origin.

AABB calculation now correctly respects all local transformations, allowing for proper rendering, picking, and collision detection of entities in any transformation state.
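A minimal sketch of transform-aware AABB computation, using a hypothetical standalone function rather than ng’s actual code: scale and rotate the texture’s four corners around the entity’s origin, then take the min/max extents.

```python
import math

# Hypothetical transform-aware AABB: returns (min_x, min_y, max_x, max_y).
def world_aabb(width, height, scale, angle_rad):
    corners = [(0, 0), (width, 0), (width, height), (0, height)]
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    xs, ys = [], []
    for x, y in corners:
        x, y = x * scale, y * scale
        xs.append(x * c - y * s)  # standard 2D rotation
        ys.append(x * s + y * c)
    return (min(xs), min(ys), max(xs), max(ys))

# A 4x2 rectangle rotated 90 degrees: width and height swap
# (up to floating-point error).
print(world_aabb(4, 2, 1.0, math.pi / 2))
```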

When the six-pack ring hits the garbage collector, their AABBs pass the intersection test and a collision event is sent. The game code handles this event and uses it to kill the unfortunate garbage collector.

if event.type == Type.COLLISION_CONTACT:
    if self._collector.life_state != LifeState.DEAD:
        if self._collector.entity in['entities']:
            self._input_enabled = False

Per-module Logging Configuration

ng provides a logger for each of its modules (main, rendering, and so on). Initially, these loggers were initialized with a default log level that could only be changed from inside ng’s source code, not from game code. There’s now a public interface to adjust the log level per module. So, in configuration:

    'ng.collide': 'error',
    'ng.event': 'error',
    '': 'error',
    'ng.render.sdl': 'info',
    'ng.sdl2': 'error',
    'ng.timer': 'error',
    'nui.nui': 'error',
    'nui.render.sdl': 'info'

And then in code:

import config

for name, level in config.LOGGING.items():
    ng.log.set_level(name, level)

In Conclusion

April has been a satisfying month. In addition to creating a fun little game, I made progress on critical engine features and a lot of small improvements that will pay off over time.

May will be another major month for engine features. I’m excited to finish the entity graph. I’ve also become more interested in shader programming while studying graphics over the past few weeks. ng’s current renderer uses SDL, which doesn’t support programmable shaders. For this reason, I’ve decided to add an OpenGL renderer to ng. This will be finished in May and will complement the existing SDL renderer for now, although it will probably ultimately replace it.

Since the OpenGL renderer is a major change and I don’t have prior experience using OpenGL, I’ve decided to focus on it, and not to do a game for May. If things go smoothly, I’ll produce some technical demos using the new renderer as it develops.

Thanks for reading. Next time, we’ll review the progress of the entity graph and the OpenGL renderer.