Unreal sample project
How can I test RADiCAL’s output?
If you want to see how RADiCAL’s output integrates into your workflow, you can download free FBX files from videos on our explore page.
You can also download FBX files from the Gen 3.1 samples folder in your account dashboard.
Where can I download Studio?
Is there a tutorial for Studio?
We have a quick, basic tutorial on how to use Studio available here.
What NVIDIA GPU do I need for the Studio App?
RADiCAL Studio requires an NVIDIA GPU to run. At this time, we don’t support AMD, Intel or other cards.
The smallest GPU you can use with RADiCAL Studio is a GTX 1060, which will allow you to do step-by-step (non-real-time) processing. For real-time results, the minimum requirement is an RTX 2060, although we recommend a 2080 Ti or better for best results. Below is a complete (as far as we can tell) list of compatible GPUs, as of January 27, 2021:
- Quadro: 8000, 6000, 5000, 4000
- RTX: 2080 Ti, 2080 Super, 2080, 2070 Super, 2070, 2060 Super, 2060
- GTX: 1660 Super, 1660 Ti, 1660, 1650 Ti, 1650
- GTX: 1080 Ti, 1080, 1070, 1060
- RTX 3 series: We’re getting ready for the latest RTX series (3090, 3080, 3070). Until then, proceed with caution. There’s a chance the AI won’t compile.
- MaxQ design: GPUs listed above as being compatible will also work if designated as coming with “Max Q Design.” However, they’re likely to significantly underperform their conventional counterparts.
- Titan: RTX, V, CEO Edition, Xp Collector’s Edition
- Tesla: V100, P100, T10
We also have a list of GPUs the app definitely won’t work with.
What GPUs will not support the Studio App?
RADiCAL Studio requires an NVIDIA GPU to run. At this time, we don’t support AMD, Intel or other cards.
With respect to NVIDIA cards, below is a list of non-compatible GPUs, as of January 27, 2021:
- RTX 3 series: we’re getting ready for the 3090, 3080 and 3070. Until then, proceed with caution: there’s a chance the AI won’t compile.
- GTX: 1050 Ti, 1050, 980 Ti, 980, 970, 780 Ti, 960
NB: Max-Q variants of the GPUs listed here won’t work either.
I get a warning before installing Studio
This is normal. Click “READ MORE” to see that the code is verified as safely published by RADiCAL.
Once you do that you’ll be able to install Studio.
TIP: Make sure you’re logged in as the administrator of the machine you’re installing Studio on.
What’s right for me: real-time or step-by-step?
There are two ways you can use Studio.
You can record a scene by itself without the AI running in the background, and then process the full video with the AI after you are done recording.
This will give you results that match the quality of our CORE product in the cloud.
With our real-time beta product, you can view your motion on our stock character in real-time, which you can also stream to your Unreal Engine 4 project.
We want to stress that our real-time feature is in beta, and might not yet give you the results you’ve come to expect. Keep an eye out for improvements in the very near future.
Camera selection: connecting to Studio
Automatic detection of your built-in camera:
For the vast majority of users, the Studio app will automatically detect the camera you’re using with your Windows machine. It will also automatically acquire the video that comes out of that camera. Conventionally, that will be the camera that comes built-in with your PC.
We recommend sticking to your built-in camera. See why in this post about what camera to choose.
What to do if the Studio app can’t find the camera / video stream:
Here’s what you do if the Studio app can’t find a video feed automatically:
- Physically disconnect (i.e., unplug) all cameras (except a built-in camera, which can’t be disconnected physically)
- Shut down any other software app that may want or require access to your camera (OBS, Slack, Zoom, Discord, etc.)
- Physically disconnect (i.e., unplug) devices that may access your camera, e.g., VR headsets (such as the HTC Vive, Oculus Quest, etc.).
Once you’ve done that:
- restart your PC
- connect only the camera you want to use
- fire up the Studio app
You may have to use the Windows 10 control panels and settings to find your device and make it work.
Can I use NDI tools to bypass the webcam?
Several users have reached out about bypassing the webcam so they can use their preferred cameras for capture.
While we haven’t optimized or developed for this, we have had users tell us that NDI Tools will let them hook up a custom camera to RADiCAL Studio.
We can’t guarantee this will work for everybody, but this is how our users bypassed the webcam:
If you’d like to bypass the webcam for use in Studio:
- To choose another camera to stream into Studio, you will need to download NDI Tools.
- Within NDI Tools, select the Virtual Input feature, which serves to convert your preferred camera feed into an NDI Stream.
- Once you’ve converted your camera feed into an NDI Stream, Virtual Input will display the NDI Stream as the standard video source in Windows 10.
Where can I export my animation data/download my results?
Unreal LiveLink Tutorial!
For a tutorial on how to set up livestreaming into Unreal Engine from RADiCAL Studio, please check out Xuelong Mu’s tutorial here.
Camera: do you recommend a specific device?
We recommend that, at this time, you use your built-in camera. The Studio app currently assumes that you will do that. Why? Because our AI operates largely independently of camera and video quality. The app will pick a low resolution and frame rate available to practically all cameras on the market.
However, there may be circumstances where an external camera is the only or better way to go:
- if your machine is custom-built, it may not come with a pre-integrated camera
- if you’re by yourself and want to ensure the best calibration for real time processing, you may want to try to ensure a better angle and distance to yourself as the actor (as discussed in this FAQ)
If you want to connect an external camera into the Studio App, check out this post on the topic.
Calibration: how does it work in Studio?
- Do this: make sure the actor starts the scene, from the first frame onwards, in a T-pose facing, and in full view of, the camera, at the center of the scene (stage), with the entirety of the body visible and feet firmly planted on the floor, for 1 – 2 seconds.
- Don’t do this: we mean it literally when we say “start” your scene with a T-pose. Avoid any footage prior to the T-pose in which the actor prepares for the scene. This includes the actor “transitioning” into the T-pose, such as walking into the frame, standing in profile, turned away from the camera, or otherwise anything less than a solid T-pose.
RADiCAL Studio calibration:
- Step-by-step (SbS) processing: The Studio app already comes with a mandatory five-second countdown to assist you in finding the best position before the Studio app starts recording your video for SbS processing. Make sure the actor stands in a solid T-pose by the time the Studio app has completed the countdown and starts recording.
- Real-time (RT) processing: Because there is no five-second countdown, you’ll need a person other than the actor to hit the START button. You can also experiment with external camera integrations (to ensure a better angle and distance) or keyboard/mouse configurations (to allow for remote operation of the Studio app).
Studio FBX export
To download an FBX of your animation data, click the Export FBX icon at the top right of your screen after visualizing your scene; it will take you to the Studio exporter page.
In the Studio app, your results are saved to a standard destination that you can set as a path through Settings.
Choose the file with the *output.rad extension and upload it through the exporter page.
Once the file is uploaded, we’ll run a quick check that the file was generated by the RADiCAL Studio app.
Once we’re done processing, we’ll send you an email with a link to the scene page where you can download the FBX file. The FBX file will be downloadable from a new scene page we’ll create for you inside your Projects channel.
During our pre-release testing, we uncovered a few things that we are still working through. If you run into any issues, please don’t hesitate to reach out; we are always trying to improve our product.
- Performance Settings: we have locked in the best performing AI configuration (a low resolution coupled with our full deep learning model) for both real time and step-by-step processing. However, soon, we’ll allow users to make certain adjustments to these settings to more granularly reflect the hardware and software environments of the workstations they’re working with.
- LiveLink may interfere with AI: we have observed rare situations where using a Live Link stream into Unreal adversely affects the AI in real time scene processing.
- Step-by-step scenes don’t produce results: we have seen some situations in which scenes processed in step-by-step mode produce no animation data. This is likely due to a poor initial analysis of the video, especially if the user is not clearly visible or the lighting or other conditions are not supportive of the AI. If this happens to you, we’ll soon release an update to allow you to re-process these videos, so there’s no need to delete these folders, even if no results are available right now.
The AI doesn’t initialize after successfully installing
Make sure all other apps are closed before trying to record a new scene or playing back an existing scene.
If that doesn’t work, try restarting your machine.
Also make sure you’re the only person in the frame and that your entire body is visible within it, and keep your background as uncluttered as possible, as certain artifacts may register to the AI as another humanoid figure.
If that still doesn’t work please get in touch.
Unreal Engine 4 LiveLink
With the real-time beta feature, you can live-stream your motion from RADiCAL Studio into a scene in Unreal Engine 4. To do this, you need a Professional Studio account (or monthly LiveLink access). See our pricing page for more information.
For indies and students, we have special pricing for LiveLink access; get in touch here.
Live Link Setup
After starting the Live Link stream from Radical Studio, open the UE4 editor and go to Window -> Live Link. Go to Source -> Message Bus Source -> select the RadLiveLink source. There will be a random alphanumeric string appended to the name, to differentiate between multiple Live Link sources on the network (e.g. other instances of Radical Studio). You can now close this window.
Live Link Preview
The Live Link data can be previewed inside the Skeleton, Skeletal Mesh, Blueprint or Animation Blueprint windows for a given skeletal asset. On the right side, go to Preview Scene Settings and, under Animation, change the Preview Controller to Live Link Preview Controller. Then change the Live Link Subject Name to RadicalPose. For Retarget Asset, select the corresponding Remap Asset blueprint file for that skeleton. For example, for the Radical skeleton, choose the RadToRadRemap asset. For the Epic skeleton, choose the BP_RadToEpicRemap asset.
In order to use skeletons other than the Radical skeleton, we need to remap the Radical skeleton’s bone names to their counterparts on other skeletons. For example, if you open the BP_RadToEpicRemap asset and go to the GetRemappedBoneName function, you can see a big switch statement that performs the remapping. For Mixamo, the skeleton bone names match the RADiCAL skeleton bone names, so this step is unnecessary.
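The switch-statement approach in GetRemappedBoneName can be sketched in plain C++ outside Unreal, using std::string in place of FName. Note that the bone-name pairs below are illustrative examples only, not the full mapping table from the actual remap asset:

```cpp
#include <string>
#include <unordered_map>

// Plain C++ sketch of a bone-name remapping function in the spirit of
// GetRemappedBoneName. The RADiCAL-to-Epic pairs below are hypothetical
// examples for illustration, not the complete mapping.
std::string GetRemappedBoneName(const std::string& RadicalBone) {
    static const std::unordered_map<std::string, std::string> BoneMap = {
        {"Hips",      "pelvis"},
        {"Spine",     "spine_01"},
        {"LeftUpLeg", "thigh_l"},
    };
    const auto It = BoneMap.find(RadicalBone);
    // Unmapped names pass through unchanged -- the Mixamo case, where the
    // RADiCAL and target bone names already match.
    return It != BoneMap.end() ? It->second : RadicalBone;
}
```

In the actual remap asset this logic lives in a Blueprint function; the pass-through default mirrors why no remapping step is needed for Mixamo.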
To account for some of the differences between the Radical Studio coordinate frame and Unreal, we have flipped the incoming LiveLink data’s rotation and position axes. You can inspect the conversions in the RadicalLiveLinkRemapAssetBase class and its child classes. We expect that other skeletons will require different rotation adjustments, including swapping axes. We have exposed three overridable methods to implement the root bone position, root bone rotation, and non-root bone rotation conversions. Please note that the AI output in Radical Studio uses the hip bone as the root, so position data should be mapped to the hips (or pelvis) in the target skeleton.
To remap the LiveLink data (which matches the RADiCAL 3.1 skeleton structure) to other skeletons, modifications to the bone rotations may be required. The incoming LiveLink data contains rotation offsets from RADiCAL’s base T-pose, plus the hip position. For the Epic skeleton, we tweaked the arm rotations in the AnimBP, as the Epic skeleton uses an A-pose by default. For the Mixamo skeleton, there are more complex differences, such as an inverted rotation on the LeftUpLeg bone’s Y axis (among others).
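As a plain C++ illustration of the kind of axis conversion described above: the struct and the specific swap/negation below are assumptions for demonstration only, not the exact mapping used by RadicalLiveLinkRemapAssetBase, which you should inspect directly.

```cpp
// Illustrative sketch of a coordinate-frame conversion of the kind
// performed on incoming LiveLink data. The specific axis swap and sign
// flip below are assumptions for demonstration; they are NOT the exact
// mapping RadicalLiveLinkRemapAssetBase applies.
struct Vec3 {
    double X, Y, Z;
};

// Example: carry a position from a right-handed Y-up frame into a
// left-handed Z-up frame by swapping the Y/Z axes and negating X.
Vec3 ConvertPosition(const Vec3& In) {
    return Vec3{-In.X, In.Z, In.Y};
}
```

The root-bone rotation and non-root-bone rotation overrides would follow the same pattern, operating on quaternion components rather than a position vector.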
* * *
Credit for instructions to Xuelong Mu.
Why didn’t the AI compile on my machine?
This is rare, but could be happening for a number of reasons.
Check that your machine meets at least the following system requirements:
- Windows 10
- NVIDIA GPU: we recommend at least a GTX 1060 (or stronger). Ensure your GPU is compatible with TensorRT
Beyond system requirements, always make sure to close as many other apps and operations as you can.
If your machine meets all of the minimum specs, you’ve closed other apps and restarted your machine, and the AI still doesn’t compile, please get in touch.