This documentation is still a work in progress. More media content coming…
Role: Lead Developer
Two FlipDiscs displays, installed in the newly built Climate Pledge Arena, by the Space Needle.
In collaboration with Breakfast, the displays’ designer and manufacturer, the LAB at Rockwell Group designed and programmed the content software system for the two displays.
There are a few components in this system. I will write only about the parts I was responsible for:
the node.js server
The server acts as the master “show control” controller.
- It fetches event data from the arena’s official website with a simple HTTP request.
- It then gathers additional company/city data through Pitchbook’s API to add meat to the bones (the raw event data), and compiles a list of “data stories” that are later sent to the visualizer.
- The server also listens for sports event signals from the arena’s show control room. When an event like a “Goal” or “Penalty” happens, the server sends a message over TCP to the BrightSign system, which has the display play a pre-rendered video.
- The server has a coded “run of show” and plays a sequence of presentations at scheduled times: e.g. five minutes of ambient, pre-rendered video content, then a data story driven largely by real-time data and typographic play, then an interactive particle system, and so on.
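The fetch-and-enrich step could be sketched as below. This is a minimal illustration, not the production code: the endpoint URL, the event fields, and the enrichment map standing in for Pitchbook’s API are all assumptions.

```typescript
// Sketch of the event-ingestion step. The endpoint and data shapes
// are hypothetical; in production the enrichment would come from
// Pitchbook's API rather than a prebuilt Map.
import * as https from "node:https";

interface ArenaEvent { title: string; date: string; company?: string; }
interface DataStory { headline: string; facts: string[]; }

// Fetch the arena's public event feed with a plain HTTPS GET.
function fetchEvents(url: string): Promise<ArenaEvent[]> {
  return new Promise((resolve, reject) => {
    https.get(url, (res) => {
      let body = "";
      res.on("data", (chunk) => (body += chunk));
      res.on("end", () => resolve(JSON.parse(body)));
    }).on("error", reject);
  });
}

// "Add meat to the bones": merge raw events with enrichment data,
// producing the list of data stories sent to the visualizer.
function compileDataStories(
  events: ArenaEvent[],
  enrichment: Map<string, string[]>
): DataStory[] {
  return events.map((ev) => ({
    headline: `${ev.title} | ${ev.date}`,
    facts: enrichment.get(ev.company ?? "") ?? [],
  }));
}
```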
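The sports-event trigger might look something like the following sketch. The host, port, cue names, and the one-line text protocol are assumptions; the real command set depends on how the BrightSign system is configured.

```typescript
// Sketch of the sports-event trigger path, under assumed cue names
// and an assumed newline-terminated text protocol.
import * as net from "node:net";

// Map an incoming show-control signal to a BrightSign cue.
function cueForSignal(signal: string): string | null {
  const cues: Record<string, string> = {
    GOAL: "PLAY goal.mp4",
    PENALTY: "PLAY penalty.mp4",
  };
  return cues[signal.toUpperCase()] ?? null;
}

// Open a short-lived TCP connection and send the cue.
function sendCue(host: string, port: number, cue: string): Promise<void> {
  return new Promise((resolve, reject) => {
    const socket = net.createConnection({ host, port }, () => {
      socket.end(cue + "\n", () => resolve());
    });
    socket.on("error", reject);
  });
}
```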
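The run of show amounts to a looping playlist. A minimal sketch, where the segment names and durations are illustrative rather than the production values:

```typescript
// Sketch of a coded "run of show": each segment names a presentation
// mode and a duration, and the sequence loops indefinitely.
interface Segment { mode: string; minutes: number; }

const runOfShow: Segment[] = [
  { mode: "ambient-video", minutes: 5 },
  { mode: "data-story", minutes: 2 },
  { mode: "interactive-particles", minutes: 3 },
];

// Given elapsed minutes since the show started, pick the active
// segment, wrapping around so the sequence repeats.
function activeSegment(show: Segment[], elapsedMinutes: number): Segment {
  const total = show.reduce((sum, s) => sum + s.minutes, 0);
  let t = elapsedMinutes % total;
  for (const seg of show) {
    if (t < seg.minutes) return seg;
    t -= seg.minutes;
  }
  return show[show.length - 1]; // unreachable; satisfies the type checker
}
```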
the camera data processor
Three depth-sensor cameras are installed on top of each display. They capture real-time depth data of the area in front of the display and send it to the data processor, a Unity-based application. The processor runs the raw depth data through a variety of computer-vision filters and produces a clean, blobby-looking raster, which is then served to the visualizer.
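The production processor is Unity-based; the TypeScript sketch below only illustrates the first filtering idea, thresholding raw depth values into a binary “presence” mask that reads as a clean blob. The near/far bounds are made-up numbers, not the installed values.

```typescript
// Threshold a flat array of depth samples (one per pixel, in mm)
// into a 0/1 mask: 1 where something sits inside the interaction
// zone, 0 everywhere else. Later filters would clean up this mask.
function depthToMask(
  depthMm: number[],
  nearMm: number,
  farMm: number
): number[] {
  return depthMm.map((d) => (d >= nearMm && d <= farMm ? 1 : 0));
}
```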
the visualizer
Another Unity-based application. It gets data from the node server, then compiles and renders content in real time. There are two types of content, Framework and Data Dynamics: motion typography and particle systems, respectively.
Mode A – Framework:
Mode B – Data Dynamics:
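The visualizer itself is Unity-based; this TypeScript sketch only illustrates the kind of message contract the node server could use to tell it which of the two modes to render. The field names are assumptions, not the real protocol.

```typescript
// A discriminated union modeling the two content modes the server
// can request: Framework (motion typography around a data story)
// and Data Dynamics (a particle system). Field names are invented.
type VisualizerMessage =
  | { mode: "framework"; story: { headline: string; facts: string[] } }
  | { mode: "data-dynamics"; particleCount: number };

// Decide what to render from an incoming message.
function describeRender(msg: VisualizerMessage): string {
  switch (msg.mode) {
    case "framework":
      return `motion typography: ${msg.story.headline}`;
    case "data-dynamics":
      return `particle system with ${msg.particleCount} particles`;
  }
}
```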