LIVE: real-time, multiplayer experiences

FAQ overview
  • Live: connecting a custom video feed

    You can connect a custom video stream into RADiCAL’s cloud.

    This feature allows you, among other things, to develop your own applications around RADiCAL’s video capture feature and to split the video signal for simultaneous use across other interfaces and applications.

    The custom video stream feature is currently available for web-based applications (i.e., for browsers), but it will also become available for other software clients, such as Unity, Unreal, and others.

    Get in touch with the RADiCAL team to get access to this feature.

  • LIVE: is it available now?

    We intend to release RADiCAL LIVE to our community soon. It will be a platform that is entirely self-service, easy to use, and massively scalable. Until then, the LIVE platform is available through a developer account that we grant to customers and partners upon request.

    Those who don’t have a RADiCAL developer account can get in touch with us. We’re happy to offer it to as many users as we can, as fast as possible, but we need to coordinate cloud resources to ensure a smooth and seamless experience for everyone. We’re therefore sequencing the rollout according to use case, expected engagement, and a few other metrics.

  • LIVE: what equipment do I need?

    RADiCAL Live runs on any consumer device that:

    (1) comes with a conventional camera,

    (2) supports conventional web browsers (Chrome, Firefox, Safari), and

    (3) has an internet connection.


  • LIVE: webcams and lighting conditions – what should I know?

    If you’re using a regular webcam, a good lighting source will let you get the most out of RADiCAL Live.


    That’s because bad lighting exposes a key limitation of the conventional low-cost webcams used in PCs / laptops, whether fully integrated or connected externally. If you’re using a conventional webcam, you should provide bright, frontal lighting of the scene, and especially of the actor. Otherwise, you run the risk that your webcam will drop its frame rate to unacceptably low levels, and the quality of your results from RADiCAL Live will suffer accordingly. There is nothing RADiCAL can do about that – our software cannot change how your camera’s hardware operates.


    Why are webcams so sensitive? 

    While pretty much all webcams are capable of capturing 30 frames per second (FPS), conventional low-cost webcams are designed to capture footage of the user in bright lighting. The essence is that the darker your setting, the longer the exposure time and the wider the aperture your camera needs to apply. Here’s more about why cameras behave this way.


    Does this apply to smartphone cameras or premium cameras? 

    The sensitivity to bad lighting described above is not found in higher-quality consumer cameras, including those built into modern smartphones. However, it will apply to most conventional low-cost webcams, whether built into your PC / laptop or connected externally. Although some manufacturers include superior hardware and software, you should assume that insufficient lighting will affect your results.


  • LIVE: what does the “simulated data” feature do?

    The simulated data feature allows you to see what a real-time stream of RADiCAL Live animation data looks like, without having to capture video of a live actor through a Live cloud SDK. It is made available primarily to support developers, but it is useful to anyone looking to understand how RADiCAL Live works.

    While the simulated data stream comes from a pre-recorded video and a completed 3D animation cycle, it might as well come from a Live session. The data’s format (JSON), its structural organization around skeletal joints, and its FPS rate are identical to what you would see from a Live scene. We therefore recommend that you test and develop against the simulated stream, especially to complete downstream integrations into 3D software clients.
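    As a sketch of how you might consume such a stream, consider the following Python snippet. The field names used here (`frame`, `joints`, `position`, `rotation`) are hypothetical illustrations of a JSON animation frame organized around skeletal joints, not RADiCAL’s documented schema:

```python
import json

# Hypothetical example of one animation frame. The field names are
# illustrative assumptions only, not RADiCAL's documented schema.
raw = '''
{
  "frame": 120,
  "timestamp": 4.0,
  "joints": {
    "hips":  {"position": [0.0, 0.9, 0.0], "rotation": [0.0, 0.0, 0.0, 1.0]},
    "spine": {"position": [0.0, 1.1, 0.0], "rotation": [0.0, 0.0, 0.0, 1.0]}
  }
}
'''

frame = json.loads(raw)

def joint_positions(frame):
    """Map each skeletal joint name to its 3D position."""
    return {name: data["position"] for name, data in frame["joints"].items()}

print(joint_positions(frame)["hips"])  # -> [0.0, 0.9, 0.0]
```

    Because the simulated stream and a real Live session share the same format, a parser like this written against the simulation should carry over unchanged to live data.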

  • LIVE: what is the “audience” feature for?

    The RADiCAL Live audience feature on our website allows you to enter and passively observe a virtual room, and all characters (single player or multiplayer) and their animations within it, in real time.  Your participation as a passive audience member doesn’t affect anything that’s happening in the room. The data you see through the audience feature is equivalent to the RADiCAL Live data visualized in third party software clients (such as editor sessions or packaged games in Unreal Engine, Unity or WebGL).

  • LIVE: how much bandwidth does it require?

    You need enough bandwidth to support a regular video conference call between two participants.  In other words, not a lot.  The data loads we send back and forth between your device and our cloud are rather small.

    What arguably matters just as much (and can be tricky in some places) is that this bandwidth, albeit small, remains consistent.
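    As a rough, back-of-envelope illustration of why the load is small, assume (purely for the sake of example, this is not a measured RADiCAL figure) a ~2 KB JSON animation frame delivered at LIVE’s 30 FPS:

```python
# Back-of-envelope bandwidth estimate. The 2 KB per-frame size is an
# illustrative assumption, not a measured RADiCAL figure.
frame_size_bytes = 2 * 1024   # assumed size of one JSON animation frame
fps = 30                      # RADiCAL Live's stated frame rate

bytes_per_second = frame_size_bytes * fps
print(f"{bytes_per_second / 1024:.0f} KiB/s")  # -> 60 KiB/s
```

    Even with generous assumptions, the animation data stream stays far below what a typical video call consumes.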


  • LIVE: does RADiCAL run on my device or in the cloud?


    • LIVE will use the camera on your device to capture and stream video up into the cloud.
    • LIVE processes the video feed from your local device in the cloud to produce 3D animation data.
    • Finally, LIVE sends a tiny stream of 3D animation data back to your local device, where it is visualized using 3D assets.
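    The three steps above can be sketched as a simple loop. Every function here is a stand-in for illustration (RADiCAL’s actual client and cloud APIs are not shown, and the data shapes are assumptions):

```python
# A minimal sketch of the device <-> cloud loop described above.
# Function names and data shapes are illustrative assumptions,
# not RADiCAL's actual API.

def capture_video_frame(frame_index):
    """Stand-in for grabbing a frame from the local camera."""
    return {"frame": frame_index, "pixels": b"..."}

def cloud_process(video_frame):
    """Stand-in for the cloud inference step: video in, 3D pose out."""
    return {"frame": video_frame["frame"],
            "joints": {"hips": {"position": [0.0, 0.9, 0.0]}}}

def visualize(animation_frame):
    """Stand-in for driving a 3D character with the returned data."""
    return animation_frame["joints"]["hips"]["position"]

# At 30 FPS, each round trip has a budget of roughly 33 ms.
for i in range(3):
    video = capture_video_frame(i)      # 1. capture on the device
    animation = cloud_process(video)    # 2. process in the cloud
    position = visualize(animation)     # 3. visualize locally
```

    The key point is that only the heavy lifting (step 2) happens in the cloud; the device merely streams video up and receives a lightweight stream of 3D animation data back.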

  • LIVE: how does “multiplayer” mode work in “virtual rooms”?

    LIVE’s multiplayer layer supports a practically unlimited number of remote participants (actors).  Our cloud-based multiplayer solution enables shared “virtual rooms” out of the box, such that every participant can see themselves and every other participating actor in the shared 3D space, in real time.

    Note that each actor requires their own video stream (and each actor should still be alone in their video stream).

  • LIVE: what FPS does it deliver?

    RADiCAL Live delivers 3D animation data at 30 FPS.

  • LIVE: NVIDIA Omniverse


    We are partnering with NVIDIA to support the Omniverse platform.  Towards that end, we have released the RADiCAL Live Connector for Omniverse.  It enables real-time, multiplayer 3D motion capture inside Omniverse Machinima, for everyone, everywhere, from any device.

    You can see our announcement here.

    Using a Simulated Live Data Stream in Omniverse

    You can see what LIVE looks and feels like in Omniverse, even if you don’t have a LIVE account, using a simulated live data stream.  Here’s how it works:

    • Install / launch Omniverse / Omniverse Create: If you’re not using Omniverse yet, head over to NVIDIA to install Omniverse and get familiar with it. Then launch Omniverse Create.
    • Download the Connector: The free Connector is now available from our downloads page.
    • Integrate the Connector: Follow the instructions over here on how to use the Connector to visualize the simulated LIVE data stream in Omniverse.