FAQ overview
  • What’s right for me? Real-time or Step-by-step

    There are two ways you can use STUDIO.

    You can record a scene by itself without the AI running in the background, and then process the full video with the AI after you are done recording.

    This will give you results that match the quality of our CORE product in the cloud.

    With our real-time beta product, you can view your motion on our stock character in real time and stream it to your Unreal Engine 4 project.

    We want to stress that our real-time feature is in beta and might not yet give you the results you’ve come to expect. Keep an eye out for improvements in the very near future.

  • Calibration: how does it work in Studio?


    • Do this: make sure the actor starts the scene, from the first frame onwards, in a T-pose at the center of the stage, facing the camera and in full view of it, with the entire body visible and feet firmly planted on the floor, for 1 – 2 seconds.
    • Don’t do this: we mean “starting” your scene with a T-pose literally. Avoid any footage prior to the T-pose in which the actor prepares for the scene, including the actor “transitioning” into the T-pose: walking into the frame, standing in profile, turned away from the camera, or anything less than a solid T-pose.


    RADiCAL Studio calibration:

    • Step-by-step (SbS) processing: The Studio app already comes with a mandatory five-second countdown to assist you in finding the best position before the Studio app starts recording your video for SbS processing. Make sure the actor stands in a solid T-pose by the time the Studio app has completed the countdown and starts recording.
    • Real-time (RT) processing: Because there is no five-second countdown, you‘ll need a person other than the actor to hit the START button. You can also experiment with external camera integrations (for a better angle and distance) or keyboard/mouse configurations (to allow remote operation of the Studio app).
  • Studio FBX export

    To download an FBX of your animation data, click the Export FBX icon at the top right of your screen after visualizing your scene; it will take you to the Studio exporter page.
    In the Studio app, your results are saved to a standard destination that you can set as a path through Settings. Choose the file with the *output.rad extension and upload it through the exporter page.
    Once the file is uploaded, we’ll run a quick check that the file was generated by the RADiCAL Studio app.
    Once we’re done processing, we’ll send you an email with a link to a new scene page we’ll create for you inside your Projects channel, where you can download the FBX file.
  • Known Issues

    During our pre-release testing, we uncovered a few things that we are still working through. If you run into any of them, please don’t hesitate to reach out; we are always trying to improve our product.

    • Performance settings: we have locked in the best-performing AI configuration (a low resolution coupled with our full deep learning model) for both real-time and step-by-step processing. Soon, however, we’ll allow users to adjust these settings to better reflect the hardware and software environments of their workstations.
    • LiveLink may interfere with the AI: we have observed rare situations in which a Live Link stream into Unreal adversely affects the AI during real-time scene processing.
    • Step-by-step scenes don’t produce results: we have seen some situations in which scenes processed in step-by-step mode produce no animation data. This is likely due to a poor initial analysis of the video, especially if the actor is not clearly visible or the lighting or other conditions work against the AI. If this happens to you, don’t delete the affected folders: we’ll soon release an update that lets you re-process these videos, even though no results are available right now.
    • Trouble connecting the camera to Studio: a number of users initially had this problem but got their camera to work after closing competing applications and disconnecting competing devices, such as OBS, Zoom, or an HTC Vive. These apps and devices may be blocking Studio’s access to the camera. If this issue affects you, try these steps: (1) shut down any other apps that require access to your camera (OBS, Slack, Zoom, HTC Vive, Oculus, etc.), (2) restart your machine, (3) start RADiCAL Studio.
    • Outside camera feeds to your desktop: the Studio app currently assumes that you will be using your built-in camera. We understand some creators like to use other cameras, but if you feed an outside camera stream into the Studio app, our AI may struggle to pick up your body and might not display your motion. At this time, we recommend using your built-in camera.
  • Unreal Engine 4 LiveLink

    With the real-time beta feature, you can live-stream your motion from RADiCAL Studio into a scene in Unreal Engine 4. To do this, you need a Professional Studio account (or monthly LiveLink access). See our pricing page for more information.

    For indies and students, we have special pricing for LiveLink access; get in touch here.


    Live Link Setup

    After starting the Live Link stream from Radical Studio, open the UE4 editor and go to Window -> Live Link. Go to Source -> Message Bus Source -> select the RadLiveLink source. There will be a random alphanumeric string appended to the name, to differentiate between multiple Live Link sources on the network (e.g. other instances of Radical Studio). You can now close this window.


    Live Link Preview

    The Live Link data can be previewed inside the Skeleton, Skeletal Mesh, Blueprint or Animation Blueprint windows for a given skeletal asset. On the right side, go to Preview Scene Settings and, under Animation, change the Preview Controller to Live Link Preview Controller. Then, change the Live Link Subject Name to RadicalPose. For Retarget Asset, select the corresponding blueprint Remap Asset file for that skeleton. For example, for the Radical skeleton, choose the RadToRadRemap asset; for the Epic skeleton, choose the BP_RadToEpicRemap asset.


    Remap Assets

    In order to use skeletons other than the Radical skeleton, we need to remap the Radical skeleton’s bone names to their counterparts on other skeletons. For example, if you open the BP_RadToEpicRemap asset and go to the GetRemappedBoneName function, you can see a big switch statement that performs the remapping. For Mixamo, the skeleton bone names match the RADiCAL skeleton bone names, so this step is unnecessary.
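    Outside of Unreal, the remapping described above amounts to a simple name lookup. Here is a minimal standalone sketch of the idea; the bone-name pairs are illustrative examples, not the full or authoritative mapping used by the BP_RadToEpicRemap asset:

    ```cpp
    #include <string>
    #include <unordered_map>

    // Maps a source-skeleton bone name to its Epic-skeleton counterpart.
    // Unmapped names pass through unchanged, which is also why Mixamo
    // needs no remapping: its bone names already match RADiCAL's.
    std::string GetRemappedBoneName(const std::string& radicalName) {
        static const std::unordered_map<std::string, std::string> kEpicMap = {
            {"Hips",     "pelvis"},      // illustrative entries only;
            {"Spine",    "spine_01"},    // the real asset covers every bone
            {"LeftArm",  "upperarm_l"},
            {"RightArm", "upperarm_r"},
        };
        auto it = kEpicMap.find(radicalName);
        return it != kEpicMap.end() ? it->second : radicalName;
    }
    ```

    In the actual remap asset this lookup lives in a Blueprint switch statement inside GetRemappedBoneName, but the effect is the same.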


    Rotation conversions

    To account for some of the differences between the Radical Studio coordinate frame and Unreal, we have flipped the incoming LiveLink data’s rotation and position axes. You can inspect the conversions at the RadicalLiveLinkRemapAssetBase class and its child classes. We expect that other skeletons will require different rotation adjustments, including swapping axes. We exposed three overridable methods to implement the root bone position, root bone rotation, and non-root bone rotation conversions. Please note that the AI output in Radical Studio uses the hip bone as a root, so position data should be mapped to the hips (or pelvis) in the target skeleton.
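    To illustrate the kind of conversion involved, here is a standalone sketch of a coordinate-frame swap for position data. The specific axis mapping and sign below (a generic right-handed, Y-up frame into Unreal’s left-handed, Z-up frame) are placeholders; the actual conversions live in RadicalLiveLinkRemapAssetBase and its child classes:

    ```cpp
    struct Vec3 { double x, y, z; };

    // Example frame conversion: swap the Y and Z axes and negate one
    // of them to account for the change in handedness. The real axes
    // and signs used by Studio may differ; this only shows the shape
    // of the transform applied to incoming root-bone positions.
    Vec3 ToUnrealPosition(const Vec3& p) {
        return Vec3{p.x, -p.z, p.y};
    }
    ```

    In the same spirit, the three overridable methods mentioned above would each apply a transform like this to the root position, root rotation, and non-root rotations respectively.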


    Real-time Remapping/Retargeting

    To remap the LiveLink data (which matches the RADiCAL 3.1 skeleton structure) to other skeletons, modifications to the bone rotations may be required. The incoming LiveLink data contains rotation offsets from RADiCAL’s base T-pose, plus the hip position. For the Epic skeleton, we tweaked the arm rotations in the AnimBP, as the Epic skeleton uses an A-pose by default. For the Mixamo skeleton, there are more complex differences, such as an inverted rotation for the LeftUpLeg bone’s Y axis (among others).
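    Because the streamed rotations are offsets from a T-pose, retargeting to a skeleton with a different rest pose means composing a fixed correction with each incoming rotation. A minimal quaternion sketch of that idea follows; the axis and angle of the correction are illustrative, not the values Studio actually uses for the Epic arms:

    ```cpp
    #include <cmath>

    constexpr double kPi = 3.14159265358979323846;

    struct Quat {
        double w, x, y, z;
        Quat operator*(const Quat& q) const {  // Hamilton product
            return {
                w*q.w - x*q.x - y*q.y - z*q.z,
                w*q.x + x*q.w + y*q.z - z*q.y,
                w*q.y - x*q.z + y*q.w + z*q.x,
                w*q.z + x*q.y - y*q.x + z*q.w
            };
        }
    };

    // Unit quaternion for a rotation of `deg` degrees about the Z axis.
    Quat AboutZ(double deg) {
        double half = deg * kPi / 180.0 / 2.0;
        return {std::cos(half), 0.0, 0.0, std::sin(half)};
    }

    // Compose a fixed rest-pose correction with the streamed T-pose
    // offset, e.g. rotating an upper arm to bridge a T-pose and an
    // A-pose rest pose. The correction angle is a hypothetical value.
    Quat RetargetBoneRotation(const Quat& streamed, double correctionDeg) {
        return AboutZ(correctionDeg) * streamed;
    }
    ```

    Inverting a single rotation axis, as described for the Mixamo LeftUpLeg, corresponds to negating that component of the streamed quaternion before composing.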

    *      *      *

    An Unreal sample scene is available for download here.

    Credit for instructions to Xuelong Mu.

  • The AI doesn’t initialize after successfully installing

    Make sure all other apps are closed before trying to record a new scene or playing back an existing scene.

    If that doesn’t work, try restarting your machine.

    Also make sure that you are the only person in the frame, that your entire body is visible within it, and that your background is as uncluttered as possible: the AI may register certain background artifacts as another humanoid figure.

    If that still doesn’t work please get in touch.

  • Why didn’t the AI compile on my machine?

    This is rare, but it could happen for a number of reasons.

    Check that your machine meets at least the following system requirements:

    • Windows 10
    • NVIDIA GPU: we recommend at least a GTX 1060 (or stronger). Ensure your GPU is compatible with TensorRT

    Beyond system requirements, always make sure to close as many other apps and operations as you can.

    If your machine meets all of the minimum specs, you’ve closed other apps and restarted your machine, and the AI still does not compile, please get in touch.