AudioPixel

AudioPixel.com

Control Tower video with some screenshots and pre-viz

The Rundown

AudioPixel software provides an abstraction layer in which any unit (DMX, MIDI, UDP, etc.) can declare its existence and be handled as a mapped node in the 3D pixel-mapped universe.

We run 'clips' - custom scripts, algorithms, or any other type of content - that centrally drive the behavior of all hardware. Control is mapped in a 3D universe where any type of hardware can be added, intelligently synchronized, and controlled by any type of input.
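
A minimal sketch of how such a node/clip model could be structured, assuming hypothetical class and method names (Node, Clip, run_frame) that are not AudioPixel's actual API:

  class Node:
      """Any hardware unit (a DMX fixture, an LED pixel, etc.) mapped into 3D space."""
      def __init__(self, protocol, address, position):
          self.protocol = protocol   # e.g. "dmx", "midi", "udp"
          self.address = address     # protocol-specific address or channel
          self.position = position   # (x, y, z) in the mapped universe

  class Clip:
      """A clip computes a color for every node on every frame."""
      def render(self, node, t):
          x, y, z = node.position
          # toy example: brightness sweeps upward through the universe over time
          level = int(255 * ((z + t) % 1.0))
          return (level, level, level)   # RGB

  def run_frame(nodes, clip, t, send):
      # 'send' hides the protocol-specific output (DMX channels, UDP packets, ...)
      for node in nodes:
          send(node, clip.render(node, t))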

As one example, during live stage shows we use a MIDI DJ controller to quickly activate and blend effects, generally without touching the GUI. When controlling DMX movers, the animation color data is interpreted as a rotation of the fixture's color wheel, while the same data is converted into proper RGB values for the LED units.
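
A hedged sketch of that conversion, assuming a made-up color-wheel table; a real mover's wheel slots and DMX values would come from its fixture profile:

  import colorsys

  COLOR_WHEEL = [          # (hue 0..1, DMX value) -- hypothetical fixture profile
      (0.00, 0),           # open / white
      (0.00, 16),          # red
      (0.33, 32),          # green
      (0.66, 48),          # blue
  ]

  def rgb_to_wheel_dmx(r, g, b):
      """Map the RGB animation data onto the nearest color-wheel slot."""
      h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
      if s < 0.2:          # nearly white: use the open slot
          return COLOR_WHEEL[0][1]
      # pick the slot whose hue is closest, wrapping around the hue circle
      best = min(COLOR_WHEEL[1:],
                 key=lambda slot: min(abs(h - slot[0]), 1.0 - abs(h - slot[0])))
      return best[1]

  # LED units receive (r, g, b) directly; movers get rgb_to_wheel_dmx(r, g, b)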

We're happy to adapt the software to talk to as much of the tower as possible, through published events or otherwise.

Technology Needs

  • Finalized layout of tower, 3D model with lighting placement
  • A main computer (preferably Windows)
  • A backup computer
  • Qty 2 EntTec DMX interfaces (assuming DMX lighting)
  • DMX / CAT5 Cabling

Other Requests

  • 3D model of tower with LED count / mapping information
  • Finalized packet information structure for Events System
  • List of main colors the lasers can output? A pre-set list of laser effects?
  • List of flame effects? Do the flame effects listen for triggers outside of their own system?

Interaction Scenario

Work in progress... please add any notes or questions.

  • The crew powers up the tower
  • AudioPixel software launches and loads the appropriate clip library for the day/time (see the scheduling sketch after this list)
  • AudioPixel defaults to automode: the "relaxed" effects library used when no one is interacting
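
One possible way the day/time selection could work; the library names and cutoff hours below are placeholders, not an actual schedule:

  from datetime import datetime

  def select_library(now=None):
      """Pick a clip library based on the hour of day (placeholder schedule)."""
      hour = (now or datetime.now()).hour
      if 6 <= hour < 18:
          return "daytime_relaxed"        # automode default while crowds are light
      if 18 <= hour < 23:
          return "evening_interactive"
      return "late_night_lowpower"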

Doppler Examples

  • Crowd forms around tower
  • Doppler data is interpreted by Events System
  • Events System announces ^Doppler|GrowingActivity|7|3| |7$ (a parsing sketch for this packet format follows the list)
  • AudioPixel loads up interactive effects library of clips
  • AudioPixel triggers generative particle effects in the zone(s) with active "dancing" or activity levels 6 or 7
  • AudioPixel announces color data for the tower ("Red", "Blue with Green", or "White")
    • Lasers choose to use supplied color data, maybe during scheduled Name/DNA effects?
  • Events System announces ^Doppler|GrowingActivity|10|ALL| |9$
  • AudioPixel announces that it is triggering "mega-level" effects for activity levels 9 or 10 in all zones
    • Flame Effects choose to use supplied zone effect data to poof as a "finale" to a lighting sequence?
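
A minimal parser for the Events System packets quoted above, assuming the ^...$ framing and pipe-delimited fields; the field names (source, event, level, zone) are one reading of the two examples and would need to be checked against the finalized packet structure requested earlier:

  def parse_event(packet):
      """Parse a framed event like ^Doppler|GrowingActivity|7|3| |7$ into a dict."""
      if not (packet.startswith("^") and packet.endswith("$")):
          raise ValueError("not a framed event packet: %r" % packet)
      fields = packet[1:-1].split("|")
      return {
          "source": fields[0],    # e.g. "Doppler"
          "event": fields[1],     # e.g. "GrowingActivity"
          "level": fields[2],     # e.g. "7" or "10"
          "zone": fields[3],      # e.g. "3" or "ALL"
          "extra": fields[4:],    # trailing fields, meaning still to be confirmed
      }

  # parse_event("^Doppler|GrowingActivity|10|ALL| |9$")
  # -> {'source': 'Doppler', 'event': 'GrowingActivity', 'level': '10',
  #     'zone': 'ALL', 'extra': [' ', '9']}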

Operator Examples

  • Operator has access to one or more master knobs for lighting brightness (automode could ignore the setting if it is untouched for some time)
  • Crowd grows unruly because the tower is dark and starts acting up
  • A MIDI controller hooked up to the computer running AudioPixel can trigger sparkle, strobe, and other effects
    • Hold down buttons to trigger count-down for crowd pleasing "mega-level" effects
  • Operator sets up an audio cable input or enables a microphone for audio-reactive effects (see the sketch after this list)
    • Can you elaborate on how audio-reactive control might work?
    • One or many audio inputs?
    • Would you prefer audio amplitude or frequency changes to modulate tower effects?
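
To make the audio-reactive questions above concrete, here is a hedged sketch based on amplitude: the RMS level of each incoming audio buffer is smoothed and used to scale effect brightness. A frequency-based variant would FFT the buffer and track per-band energy instead; the capture side (one input or many) is left open.

  import math

  class AmplitudeFollower:
      """Tracks a smoothed loudness level from incoming audio buffers."""
      def __init__(self, smoothing=0.8):
          self.smoothing = smoothing
          self.level = 0.0

      def feed(self, samples):
          """samples: list of floats in -1.0..1.0 from the audio input."""
          rms = math.sqrt(sum(s * s for s in samples) / max(len(samples), 1))
          # exponential smoothing so the lights don't flicker on every transient
          self.level = self.smoothing * self.level + (1.0 - self.smoothing) * rms
          return self.level

  def modulate_brightness(base_rgb, level):
      """Scale an effect's RGB output by the followed audio level (0..1)."""
      gain = min(level * 4.0, 1.0)   # arbitrary gain; in practice this would be a knob
      return tuple(int(c * gain) for c in base_rgb)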