Draft:Realtime Motion Graphics Software

From Wikipedia, the free encyclopedia

Realtime motion graphics software is a relatively recent category of computer program that combines aspects of digital signage, motion graphics, and video game engine tools. Like digital signage, these tools are designed to present, or otherwise react to, live and dynamically changing data, while also providing the creative design flexibility of motion graphics tools. They are often used for theatrical productions, live concerts, and art installations, among other settings, where the visuals may need to be timed to human performers or react to visitors in a museum exhibit. The user interface typically resembles a visual programming environment that lets the user manipulate and transform data of many types, such as pixel and geometry data, audio waveforms, MIDI, and input from various sensors. In some cases, these tools translate the visual program into a fast native code executable with very little overhead, which can be distributed and run without any dependency on the original software.
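
The following is a minimal sketch, not based on any particular product, of the dataflow model described above: operators are connected into a graph and re-evaluated every frame against live input data. All names and values here are hypothetical.

import math

class Node:
    """A node computes one value per frame from the values of its inputs."""
    def __init__(self, fn, *inputs):
        self.fn = fn
        self.inputs = inputs

    def evaluate(self, frame):
        return self.fn(frame, *(n.evaluate(frame) for n in self.inputs))

# Stand-in for a live input such as an audio level or a sensor reading.
audio_level = Node(lambda frame: abs(math.sin(frame * 0.1)))

# Transform node: maps the live value onto a geometry parameter.
sphere_radius = Node(lambda frame, level: 1.0 + 2.0 * level, audio_level)

# In a real tool the output node would drive a renderer; here it is printed.
for frame in range(5):
    print(f"frame {frame}: radius = {sphere_radius.evaluate(frame):.2f}")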

Origins

Realtime visual tools have existed for decades in live television production. Silicon Graphics began producing hardware for mixing live video with real-time composited 3D visuals in the 1980s, starting with the IRIS 4D machines. SGI machines were often used for live sporting events and for live weather infographics.

In the early 2000s, companies like Green Hippo started producing turnkey hardware and software systems, known as media servers, for live events. These systems could provide multiple video outputs to multiple projectors, and with calibration tools the projectors could be aligned to appear as one large seamless image. These tools could also calibrate images projected onto non-flat or even moving surfaces, with the software pre-mapping the visuals in real time as needed, regardless of differences in venue requirements such as stage size and the positioning of projectors.

TouchDesigner is one of the earliest tools directly in this category. It originated as a side project of Houdini, which has roots going back to the PRISMS software developed by Omnibus Computer Graphics in the mid 1980s.

Use cases

While this class of software shares some features with game engines such as Unreal Engine and Unity, these tools tend to occupy a unique space somewhere between a real-time graphics engine and software like Adobe After Effects: they may offer a timeline interface to sequence events at key moments, like After Effects, while also providing visual programming to define dynamic interactions such as physical simulations. An example might be a keyframed animation of a box with physically simulated fluid inside, or an aquarium exhibit with a pre-programmed presentation of different species of fish moving about in a glass tank, with the motion of the fish procedurally defined and also reactive to tapping on the glass.
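
As a rough sketch of the aquarium example, the following combines a keyframed timeline value with a procedural, event-reactive term. The keyframes, constants, and names are hypothetical and not drawn from any specific tool.

import math

KEYFRAMES = [(0.0, 0.0), (2.0, 1.0), (4.0, 0.0)]  # (time in seconds, position)

def keyframed(t):
    """Linear interpolation between keyframes, clamped at the ends."""
    if t <= KEYFRAMES[0][0]:
        return KEYFRAMES[0][1]
    for (t0, v0), (t1, v1) in zip(KEYFRAMES, KEYFRAMES[1:]):
        if t0 <= t <= t1:
            return v0 + (t - t0) / (t1 - t0) * (v1 - v0)
    return KEYFRAMES[-1][1]

def tap_response(t, tap_time):
    """Procedural reaction: a decaying oscillation after an interaction event."""
    dt = t - tap_time
    return math.exp(-3.0 * dt) * math.sin(20.0 * dt) if dt >= 0 else 0.0

tap_time = 1.5  # the moment a visitor taps the glass
for frame in range(9):
    t = frame * 0.5
    pos = keyframed(t) + 0.2 * tap_response(t, tap_time)
    print(f"t={t:.1f}s  position={pos:+.3f}")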

The demoscene, which evolved from the 1980s home computer world into worldwide art expo events, has much in common with this class of software, and some of the powerful tools in this category initially started out as aids for producing demoscene productions, Tooll3 and Notch among them.

Examples

  • TouchDesigner
  • Tooll3

See also

References
