Igloo Camera Packages Explained

Private & Confidential


What are they used for

The Igloo Camera Packages, also known as Igloo Toolkits, enable a user to experience a 3D world within their Igloo structure. This 3D world may have been created in the Unity or Unreal game engines, and may include interactive elements, audio systems, post-processing effects, and, in some cases, 4D Experience elements.

Unity has been the most widely used of the game engines we support, mostly due to its low barrier to entry, simple programming language, and lightweight file sizes. This has driven us to develop the Unity Toolkit further and faster than its Unreal counterpart, which by contrast has larger file sizes, C++ as its main language, and a more complex user interface.

Since the release of Unreal Engine 5 we have seen an uptick in the number of clients requesting the corresponding toolkit, and also expecting more of it. To that end, we are currently developing the Unreal Toolkit not just to work with Unreal Engine 5, but to exceed the capability of its Unity counterpart.

What technology is used

Spout

Spout has been the main method for viewing the output of both Unreal and Unity toolkits since their inception. It is also the main method used by the Igloo Software Suite for sharing video streams.

Spout works by sharing a texture on the GPU, which other programs on the same system can then read. This is extremely fast, and can cope with exceedingly large textures with little overhead, because minimal encoding is required when reading and writing the texture.

OSC

Open Sound Control (OSC) is a method of sending simple data messages between programs on a Local Area Network. It is the primary method by which either game engine toolkit communicates with the Igloo software, and it has its roots in audio production systems (it was designed as a successor to the widely used MIDI interface).

Unreal has a built-in OSC plugin which is utilised by the toolkit, whereas the Unity version uses an open-source implementation which we have extended for our own purposes.

For achieving 4D effects, we do recommend OSC as the communication method between the various required systems and devices. It’s also widely documented and used in the events industry, so ample support is available.
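As a concrete illustration, an OSC message is just a small binary packet, usually sent over UDP: a null-padded address pattern, a type tag string, and big-endian arguments. The sketch below encodes and sends one using only the Python standard library. The address /igloo/4d/wind and port 9001 are hypothetical examples for a 4D effect trigger, not documented Igloo endpoints.

```python
import socket
import struct

def _pad(b: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, as OSC requires."""
    b += b"\x00"
    while len(b) % 4:
        b += b"\x00"
    return b

def osc_message(address: str, *args) -> bytes:
    """Encode a minimal OSC message (int, float and string arguments only)."""
    tags = ","
    payload = b""
    for a in args:
        if isinstance(a, float):
            tags += "f"
            payload += struct.pack(">f", a)   # 32-bit big-endian float
        elif isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)   # 32-bit big-endian int
        elif isinstance(a, str):
            tags += "s"
            payload += _pad(a.encode())
        else:
            raise TypeError(f"unsupported OSC argument type: {type(a)}")
    return _pad(address.encode()) + _pad(tags.encode()) + payload

# Send a hypothetical 4D wind trigger over UDP (address and port are
# illustrative only).
msg = osc_message("/igloo/4d/wind", 0.75)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(msg, ("127.0.0.1", 9001))
```

Because the packet layout is this simple, the same message can be produced or consumed by lighting desks, show controllers, and DAWs, which is why OSC travels well between the systems involved in a 4D show.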

NDI

NewTek's Network Device Interface (NDI) is very similar to Spout in that it is a stream-sharing protocol designed to share large images between programs. However, instead of sending the stream between programs on a single computer, it sends it over the Local Area Network. This brings limitations due to the compression and decompression of each frame, network speed, and existing network traffic. It is widely used in the film and TV industry as a way of sharing live camera feeds with production and post-production systems within the same building. There are methods of streaming NDI feeds over the internet, but none of them are free, and they come with further limitations on quality and speed.
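To see why per-frame compression is unavoidable once frames leave the machine, a back-of-the-envelope calculation (assumed figures: a 4K RGBA stream at 60 fps) shows that raw frames far exceed what gigabit Ethernet can carry:

```python
# Assumed figures: 4K RGBA frames at 60 fps, uncompressed.
width, height, bytes_per_pixel, fps = 3840, 2160, 4, 60

raw_gbps = width * height * bytes_per_pixel * fps * 8 / 1e9
print(f"Uncompressed: {raw_gbps:.1f} Gbit/s")  # prints "Uncompressed: 15.9 Gbit/s"
```

At roughly 16 Gbit/s uncompressed, even a single 4K feed saturates a typical office network many times over, which is why Spout (same-machine, no network) carries raw textures while NDI must encode each frame.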

Currently NDI is only included in the Unity Toolkit; however, we provide instructions on how to implement the NDI SDK within Unreal. This is because the NDI SDK for Unreal is extremely large to package and upload to clients, while only around 1 in 10 will eventually require the protocol in their use case.

True Perspective

The newest addition to our toolkits is Igloo True Perspective. This is a custom shader system that turns an equirectangular image into a formatted cubemap, or an adjusted spheremap, to fit our wide range of Igloo structures.

It can also take a head position and adjust the displayed image to the viewer's perspective in real time, much as large, expensive clustered CAVE systems do. Unlike CAVE systems, however, Igloo True Perspective is not limited to game engine content: it can be used on all 360 media that can be viewed in an Igloo, leading to extremely immersive experiences for the user.
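At the heart of any equirectangular-to-cubemap conversion is a per-pixel mapping from a 3D view direction to equirectangular image coordinates. The sketch below shows that mapping in plain Python; the axis and image conventions are assumptions for illustration, not the actual Igloo True Perspective shader.

```python
import math

def equirect_uv(direction):
    """Map a unit 3D view direction to (u, v) in an equirectangular image.

    Conventions assumed here: +Z is forward, +Y is up; u wraps longitude
    (0..1 across -pi..pi) and v spans latitude (0 at the top of the image,
    1 at the bottom).
    """
    x, y, z = direction
    lon = math.atan2(x, z)                      # angle around the vertical axis
    lat = math.asin(max(-1.0, min(1.0, y)))     # elevation above the horizon
    u = lon / (2 * math.pi) + 0.5
    v = 0.5 - lat / math.pi
    return u, v

# The centre of the forward (+Z) cubemap face looks straight ahead, which
# lands in the middle of the equirectangular image.
print(equirect_uv((0.0, 0.0, 1.0)))  # prints "(0.5, 0.5)"
```

A real shader runs this mapping (or its inverse) for every pixel of every cubemap face on the GPU; head tracking then amounts to recomputing the view directions from the tracked position each frame.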

What isn’t yet possible

Unity

Clustering

Currently there is no clustering system within our Unity Toolkit to distribute rendering across multiple machines. We have attempted this in the past, and possible solutions exist on the Unity Asset Store, but we have not yet tested them or required them within our Igloo systems.

Unreal

Post Build customisation of camera system

We have developed a sophisticated XML-based settings system for our Unity Toolkit that allows advanced customisation of the Igloo Toolkit outside of the game engine editor. This has allowed us to customise any Unity-based application to fit any of our Igloo systems.
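For illustration, settings of this kind can be read back at runtime with a standard XML parser, so a built application can be reconfigured without reopening the editor. The element and attribute names below are hypothetical examples, not the actual Igloo Toolkit schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical settings fragment -- the element and attribute names are
# illustrative only, not the real Igloo Toolkit settings schema.
SETTINGS_XML = """
<IglooSettings>
  <Camera fov="70" cubemapSize="2048" />
  <Output mode="equirectangular" width="8192" height="4096" />
</IglooSettings>
"""

root = ET.fromstring(SETTINGS_XML)
camera = root.find("Camera")
output = root.find("Output")
print(camera.get("fov"), output.get("mode"))  # prints "70 equirectangular"
```

The value of this approach is that the file lives next to the built executable, so the same build can be retargeted to a different Igloo structure by editing a text file rather than rebuilding the project.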

Whilst this isn't a feature in Unreal, it has moved from being on the roadmap for a future release to being redundant. Thanks to the Igloo True Perspective system, Unreal and Unity now only need to output an equirectangular image, which the Igloo software can process directly; this removes the need for any adjustment of the output post-build.

The Unreal system can, however, rely on command line arguments for minimal post-build customisation (for instance, -NDI to enable NDI output instead of Spout).

Warped output system

We have found that a significant amount of performance is lost when sharing a large image between two graphics cards. In Unity we have developed a method of using the warps generated by the Igloo Warper to output directly to each individual monitor, without sharing a large texture or using the Igloo warping software to display the image. This massively increases performance, and is a breakthrough step towards larger, more complex Igloo display systems.

However, this technology has not yet made its way into Unreal. We imagine it will arrive there as part of the nDisplay system, using our Igloo warps to generate the individual nDisplay screens.

UI System

Unlike Unity's Canvas → Igloo UI system, the Unreal method of creating UIs is very different and heavily limited to the screen overlay. This makes it difficult to project the user interface in world space so that it is viewable in the Igloo. However, Unreal has many tools for creating diegetic user interfaces in world space, and these work well with the Igloo Toolkit.

Clustering

Unreal has already solved this issue for us with its well-regarded nDisplay system. This comes with its own drawbacks and limitations, and requires a high-level understanding of network protocols, cluster systems, and machine hierarchy to implement successfully. However, its main advantage of distributed processing power is worth the effort: incredibly detailed 3D worlds, with billions of polygons and life-like textures, can be projected on a seemingly limitless number of projectors or LED walls.

It has also been used by the VFX industry to replace green screen production, as it is faster to shoot live on a small set with a projected backdrop than on a green screen set that requires lengthy post-production work afterwards.


(c) Igloo Vision 2020