AirPixel is a camera tracking solution for AR in broadcast TV and live events, for in-camera VFX in LED-volume virtual productions, and for Simulcam in green-screen studios.
It provides precise position along with tilt/pan/roll values and directly integrates FIZ data and genlock.
AirPixel seamlessly integrates with Unreal Engine and other render platforms such as Disguise, Pixotope and Zero Density.
Based on ultra-wideband (UWB) and inertial indoor tracking technology originally developed for the demanding automotive industry, AirPixel has a fast set-up time and is easy to use. Because ambient light has no effect on it, the system scales to very large indoor and outdoor venues such as studio hangars and open-air stadiums.
AirPixel provides accurate pan, tilt and roll plus X, Y, Z position up to 100 times a second. Advanced filtering is used, combining inertial sensor and UWB data to significantly reduce drift.
AirPixel works in all weather conditions, including bright sunlight, rain, mist or total darkness. With an IP67 rating, the beacons can be safely placed outside without risking weather damage.
AirPixel covers stages of virtually unlimited size without any deterioration in position quality. Up to 250 beacons and 5 rovers can be deployed in a single setup. Beacons can be placed up to 50 meters apart.
Using radio (UWB) rather than optical tracking, AirPixel beacons can be placed behind a green/blue screen.
AirPixel can read the focus, iris and zoom values from a variety of cameras and controllers. AirPixel integrates FIZ data and genlock to provide a complete data set to the render engine.
AirPixel seamlessly integrates with Unreal Engine and other popular rendering solutions such as Pixotope, Disguise d3, Zero Density and MotionBuilder. Data is output using AirPixel or FreeD protocol.
The initial setup takes several hours depending on stage complexity, and re-initialisation after power-up takes only 2 minutes.
AirPixel is a low latency system suitable for all real-time applications. All data is processed on camera and transmitted with as little as 20 ms latency.
AirPixel uses a small receiver placed on the camera, plus a compact control unit, weighing a combined 600 g (1.3 lb). No further processing is required.
AirPixel has demonstrated its effectiveness in AR production for large LED studios, stadiums and concert venues, including the NCAA men's basketball Final Four in April 2022, when large virtual billboards with team logos and advertisements were displayed above the playing field, visible to TV viewers during breaks.
To add this non-static AR content to the camera footage from SkyCam, WarnerMedia (Turner Sports) needed a camera tracking solution that could cover the whole stadium without excessive cost.
Director Robert Zemeckis’ Pinocchio is a live-action adaptation of Walt Disney's 1940 animated film, packed with CGI and visual effects.
DNEG Virtual Production, a partnership between DNEG and Dimension Studio, utilised our AirPixel ultra-wideband camera tracking solution for wide area shots for the first time.
AirPixel delivered many benefits, including the option to view the final scene (a real-time composite of the camera footage and the Unreal graphics render) without having to wait for post-production.
AirPixel successfully tracked the in-stadium Skycam at State Farm Stadium during Super Bowl LVII to deliver accurate position and orientation as well as the camera’s focus, iris and zoom values. The integration of this data enabled flawless, dynamic matching of live with virtual imagery throughout the show.
AirPixel is highly scalable and easy to deploy, making it ideal for temporary installations within stadiums. It is unaffected by lighting changes or weather conditions, ensuring reliable data is available at any point in the production.
Julian Thomas, Managing Director at RACELOGIC, the company behind AirPixel, explains how AirPixel (called VIPS in this video) is being used as a tracking system to capture the position and orientation of the camera in Virtual Productions and Motion Capture.
AirPixel supports popular rendering solutions such as Unreal Engine, MotionBuilder, and Disguise d3, with support for new platforms being added all the time. Talk to us about how we can integrate with your workflow today.
This is the latest demo video of the UWB/IMU-based camera tracking solution, generated in real time with no post-processing.
In this setup, the position and orientation are being calculated on the receiver on top of the camera. The AirPixel system updates at 100 Hz, but the output to the camera and computer is genlocked at 30 fps.
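The step down from the 100 Hz tracking rate to a genlocked output rate can be illustrated with a simple zero-order hold: for each video frame, take the most recent pose sample at or before the frame time. This is a hypothetical sketch, not AirPixel code; the real system performs this alignment in hardware against the genlock reference.

```python
def genlock_resample(samples, frame_rate, duration):
    """Pick, for each genlocked video frame, the most recent tracking
    sample at or before the frame time (zero-order hold).
    samples: list of (timestamp_s, pose) sorted by timestamp.
    Illustrative only; function name and interface are assumptions."""
    frames = []
    idx = 0
    n_frames = int(duration * frame_rate)
    for f in range(n_frames):
        t_frame = f / frame_rate
        # advance to the last sample not later than this frame time
        while idx + 1 < len(samples) and samples[idx + 1][0] <= t_frame:
            idx += 1
        frames.append((t_frame, samples[idx][1]))
    return frames
```

A production system would typically interpolate between the two bracketing samples rather than hold the last one, but the hold makes the timing relationship easy to see.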
The tracking data is fed into Unreal, which generates the virtual studio, and the green-screen video is composited with the graphics using an Ultimatte 12. AirPixel can also read the focus, iris and zoom values from a variety of cameras and controllers and feed these into Unreal.
This video shows the ARRI stage in Uxbridge, where 12 UWB beacons have been placed around the top of the walls and on various locations around the sides.
The position and rotation of the camera is being measured by the AirPixel system on the camera and sent to the Unreal Engine, which then generates the graphics being presented on the walls. The perspective of the trees changes as the camera tracks in front of the car to give the impression of a real 3D environment from the camera’s perspective.
The in-camera effects are immediate and very realistic, saving a lot of time in post-production, with many obvious benefits to cost and workflow.
AirPixel works by utilising a collection of stand-alone beacons, which can be battery or permanently powered. Using ultra-wideband radio (UWB) they communicate with an on-camera receiver (rover). The rover calculates its position and orientation using the UWB data and an internal inertial measurement unit (IMU). This data is then processed through an advanced filter algorithm to give an accurate output of X, Y, Z, Pan, Tilt, and Roll.
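The fusion idea can be sketched with a minimal complementary filter: the IMU is integrated for smooth, high-rate motion, and each UWB fix pulls the estimate back toward an absolute position, which bounds the drift that pure inertial integration accumulates. This is a simplified illustration of the general technique only; AirPixel's actual filter algorithm is proprietary, and every name below is a hypothetical.

```python
import numpy as np

def fuse_pose(uwb_xyz, imu_accel, dt, state, alpha=0.02):
    """One step of a simple complementary filter (illustrative only).
    state: dict with 'pos' and 'vel' (numpy arrays, metres and m/s).
    imu_accel: world-frame acceleration with gravity removed (m/s^2).
    uwb_xyz: latest UWB position fix (metres), or None if none arrived.
    alpha: blend weight pulling the dead-reckoned estimate toward the
           UWB fix, trading UWB noise against IMU drift."""
    # Dead-reckon from the IMU: smooth and fast, but drifts over time.
    state['vel'] = state['vel'] + imu_accel * dt
    state['pos'] = state['pos'] + state['vel'] * dt
    # Correct with the absolute (but noisier) UWB fix when one arrives.
    if uwb_xyz is not None:
        state['pos'] = (1 - alpha) * state['pos'] + alpha * np.asarray(uwb_xyz)
    return state['pos']
```

In practice a Kalman-style filter with full covariance tracking would replace the fixed blend weight, but the structure is the same: fast relative sensing corrected by slower absolute fixes.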
The rover connects to a lightweight control unit which combines the position data with lens FIZ data, genlocks the output and transmits to the render engine via Ethernet or serial connection. Data is also formatted at this point in either AirPixel proprietary format ideal for use with our custom Live Link plugin, or FreeD format for extended software compatibility.
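To illustrate the FreeD side of this formatting step, here is a sketch of an encoder for the commonly documented FreeD Type D1 packet: 29 bytes comprising a 0xD1 header, camera ID, three angles, three positions, zoom, focus, two spare bytes and a checksum. The field scaling and byte order follow the widely published layout but should be verified against the FreeD specification; the function itself is hypothetical and not part of AirPixel's software.

```python
def _i24(value: int) -> bytes:
    """Pack a signed integer into 3 big-endian bytes (two's complement)."""
    return (value & 0xFFFFFF).to_bytes(3, 'big')

def encode_freed_d1(camera_id, pan, tilt, roll, x_mm, y_mm, z_mm,
                    zoom=0, focus=0):
    """Build a 29-byte FreeD Type D1 packet (layout per the commonly
    published description; verify against the actual specification).
    Angles in degrees (15 fractional bits), positions in millimetres
    (6 fractional bits), zoom/focus as raw 24-bit lens values."""
    body = bytes([0xD1, camera_id & 0xFF])
    for angle in (pan, tilt, roll):
        body += _i24(round(angle * 32768))   # degrees -> 1/32768 deg
    for pos in (x_mm, y_mm, z_mm):
        body += _i24(round(pos * 64))        # mm -> 1/64 mm
    body += _i24(zoom) + _i24(focus)
    body += b'\x00\x00'                      # spare / user bytes
    checksum = (0x40 - sum(body)) & 0xFF     # FreeD checksum rule
    return body + bytes([checksum])
```

A receiver validates the packet by summing all 29 bytes modulo 256 and checking that the result equals 0x40.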
To set up the system, beacons are placed around the perimeter of the stage and/or above it at varying heights and locations. The beacons are then surveyed using a high-speed robotic total station and bespoke in-house software. Using this method, it is possible to set the beacon locations with millimetre accuracy and still complete setup in less than 2 hours for most configurations.
AirPixel works with many camera rigging systems, having been used on dollies, Steadicams, jibs, cranes and cable cams. It also works in any lighting conditions, including total darkness. In addition, we integrate with popular third-party products to provide the best solution for your needs. This means that accurate data can often be provided on shots where tracking would traditionally be difficult or impossible.
|Update rate||max. 100 Hz|
|Output rate (genlocked)||23.98, 24, 25, 29.97, 29.97 Drop, 30, 47.95, 48, 50, 59.94, 59.94 Drop, 60 fps|
|Position accuracy||X: ±2 cm, Y: ±2 cm, Z: ±5 cm|
|Angular accuracy||Tilt: ±0.2°, Pan: ±0.5°|
|Positional resolution||1 mm|
|Protocol support||AirPixel or FreeD|
|Max. tracking speed||270 km/h (75 m/s)|
|Max. number of receivers||5 (per UWB channel)|
|Max. number of beacons||200 (per UWB channel)|
|Max. coverage area||390 m x 390 m (1.6 million ft²), assuming 30 m beacon spacing and a square volume|
|UWB channels||Channel 4: 3993.6 MHz ±450 MHz; Channel 7: 6489.6 MHz ±450 MHz|
|UWB transmit power||-41.3 dBm/MHz|
|Rover dimensions||7 x 4 cm|
|Beacon dimensions||13 x 7.5 cm|
|Power requirements||7 - 30 V DC, 100 mA|
|IP rating||Beacon: IP67|
|Operating temperature||-20 °C to +60 °C|
|Storage temperature||-40 °C to +85 °C|