How can I test RADiCAL’s output?
If you want to see how RADiCAL’s output integrates into your workflow, you can download free FBX files from videos on our explore page.
You can also download FBX files from the Gen 3.1 samples folder in your account dashboard.
Where can I download Studio?
Is there a tutorial for Studio?
We have a quick, basic tutorial on how to use Studio available here.
What NVIDIA GPU do I need for the Studio App?
RADiCAL Studio requires an NVIDIA GPU to run. At this time, we don’t support AMD, Intel or other cards.
The smallest GPU you can use with RADiCAL Studio is a GTX 1060, which will allow you to do step-by-step (non real-time) processing. For real-time results, the minimum requirement is an RTX 2060, although we recommend a 2080 Ti for best results. Below is a complete (as far as we can tell) list of compatible GPUs, as of January 27, 2021:
- Quadro: 8000, 6000, 5000, 4000
- RTX: 2080 Ti, 2080 Super, 2080, 2070 Super, 2070, 2060 Super, 2060
- GTX: 1660 Super, 1660 Ti, 1660, 1650 Ti, 1650
- GTX: 1080 Ti, 1080, 1070, 1060
- RTX 3 series: We’re getting ready for the latest RTX series (3090, 3080, 3070). Until then, proceed with caution. There’s a chance the AI won’t compile.
- Max-Q design: GPUs listed above as compatible will also work in their “Max-Q Design” variants. However, they’re likely to significantly underperform their conventional counterparts.
- Titan: RTX, V, CEO Edition, Xp Collector’s Edition
- Tesla: V100, P100, T10
We also have a list of GPUs the app definitely won’t work with.
What GPUs will not support the Studio App?
RADiCAL Studio requires an NVIDIA GPU to run. At this time, we don’t support AMD, Intel or other cards.
With respect to NVIDIA cards, below is a list of incompatible GPUs, as of January 27, 2021:
- RTX 30 series: we’re getting ready for the 3090, 3080 and 3070. Until then, proceed with caution: there’s a chance the AI won’t compile.
- GTX: 1050 Ti, 1050, 980 Ti, 980, 970, 780 Ti and 960 won’t work.
NB: Max-Q designation: the Max-Q variants of the GPUs listed above won’t work either.
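If you’d like to check a GPU model against the two lists above programmatically, here is a minimal sketch. The lists are transcribed from this FAQ (as of January 27, 2021); the function name and data structure are our own illustration, not an official RADiCAL API.

```python
# GPU compatibility lookup based on the lists in this FAQ (Jan 27, 2021).
# The structure and function name are illustrative, not a RADiCAL API.

SUPPORTED = {
    "Quadro": ["8000", "6000", "5000", "4000"],
    "RTX": ["2080 Ti", "2080 Super", "2080", "2070 Super", "2070",
            "2060 Super", "2060"],
    "GTX": ["1660 Super", "1660 Ti", "1660", "1650 Ti", "1650",
            "1080 Ti", "1080", "1070", "1060"],
    "Titan": ["RTX", "V", "CEO Edition", "Xp Collector's Edition"],
    "Tesla": ["V100", "P100", "T10"],
}

# Models the FAQ explicitly lists as non-compatible.
UNSUPPORTED_GTX = ["1050 Ti", "1050", "980 Ti", "980", "970", "780 Ti", "960"]

def is_supported(family: str, model: str) -> bool:
    """Return True if the family/model pair appears on the supported list.

    Exact match only -- "2080" and "2080 Ti" are distinct entries.
    """
    return model in SUPPORTED.get(family, [])
```

Remember that Max-Q variants of supported GPUs work (with reduced performance), while Max-Q variants of unsupported GPUs do not.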
I get a warning before installing Studio
This is normal. Click “READ MORE” to see that the code is verified as safely published by RADiCAL.
Once you do that, you’ll be able to install Studio.
TIP: Make sure you’re logged in as the administrator of the machine you’re installing Studio on.
What’s right for me: real-time or step-by-step?
There are two ways you can use Studio.
You can record a scene by itself without the AI running in the background, and then process the full video with the AI after you are done recording.
This will give you results that match the quality of our CORE product in the cloud.
With our real-time beta product, you can view your motion on our stock character in real-time, which you can also stream to your Unreal Engine 4 project.
We want to stress that our real-time feature is in beta and might not yet give you the results you’ve come to expect. Keep an eye out for improvements in the very near future.
Why won’t the AI compile on my machine?
This could be happening for a number of reasons.
- Check that your machine meets at least the system requirements described in these FAQs and on the product page.
- It is possible that the app can’t find your NVIDIA GPU – you can check whether that’s the case by opening the NVIDIA Control Panel > Manage 3D settings > Program settings. If the RADiCAL Studio app doesn’t appear in the list, you should manually add it there. Here’s advice on how to do that (ext.).
- Make sure to close as many other apps and operations as you can.
If your machine meets all of the minimum specs, you’ve closed other apps and restarted your machine, and the AI still won’t compile, please get in touch.
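One quick way to rule out a driver-level detection problem is to run NVIDIA’s `nvidia-smi` command-line tool and check that your GPU appears in its output. A small sketch, assuming a standard driver install (the helper names are our own; the sample string in the test mirrors the tool’s CSV output format):

```python
import subprocess

def detected_gpus(smi_output: str) -> list:
    """Parse the output of `nvidia-smi --query-gpu=name --format=csv,noheader`
    into a list of GPU names, one per non-empty line."""
    return [line.strip() for line in smi_output.splitlines() if line.strip()]

def query_gpus() -> list:
    """Run nvidia-smi and return the detected GPU names.

    Requires the NVIDIA driver to be installed; raises FileNotFoundError
    if the tool isn't on the PATH (a hint the driver may be missing).
    """
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=name", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout
    return detected_gpus(out)
```

If `query_gpus()` returns an empty list or the command isn’t found, the problem is likely with your driver install rather than with Studio.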
Camera selection: connecting to Studio
Automatic detection of your built-in camera:
For the vast majority of users, the Studio app will automatically detect the camera you’re using with your Windows machine. It will also automatically acquire the video that comes out of that camera. Conventionally, that will be the camera that comes built-in with your PC.
We recommend sticking to your built-in camera. See why in this post about what camera to choose.
What to do if the Studio app can’t find the camera / video stream:
Here’s what you do if the Studio app can’t find a video feed automatically:
- Physically disconnect (i.e., unplug) all cameras (except a built-in camera, which can’t be disconnected physically)
- Shut down any other software that may want or require access to your camera, such as OBS, Slack, Zoom or Discord
- Physically disconnect (i.e., unplug) devices that may access your camera, e.g., VR headsets (such as the HTC Vive, Oculus Quest, etc.).
Once you’ve done that:
- restart your PC
- connect only the camera you want to use
- fire up the Studio app
You may have to use the Windows 10 control panels and settings to find your device and make it work.
Camera: do you recommend a specific device?
We recommend that, at this time, you use your built-in camera. The Studio app currently assumes that you will do that. Why? Because our AI operates largely independently of camera and video quality. The app will pick a low resolution and frame rate available on practically all cameras on the market.
However, there may be circumstances where an external camera is the only or better way to go:
- if your machine is custom-built, it may not come with a pre-integrated camera
- if you’re by yourself and want to ensure the best calibration for real time processing, you may want to try to ensure a better angle and distance to yourself as the actor (as discussed in this FAQ)
If you want to connect an external camera to the Studio app, check out this post on the topic.
Can I use NDI tools to bypass the webcam?
Several users have reached out about bypassing the webcam so they can use their preferred cameras for capture.
While we haven’t optimized or developed for this, some users have told us that NDI Tools will let them hook up a custom camera to RADiCAL Studio.
We can’t guarantee this will work for everybody, but here’s how our users bypassed the webcam:
- To choose another camera to stream into Studio, you will need to download NDI Tools.
- Within NDI Tools, select the Virtual Input feature, which serves to convert your preferred camera feed into an NDI Stream.
- Once you’ve converted your camera feed into an NDI Stream, Virtual Input will display the NDI Stream as the standard video source in Windows 10.
Calibration: how does it work in Studio?
- Do this: make sure the actor starts the scene, from the first frame onwards, in a T-pose, facing and in full view of the camera, at the center of the scene (stage), with the entire body visible and feet firmly planted on the floor, for 1 to 2 seconds.
- Don’t do this: we mean it literally when we say to “start” your scene with a T-pose. Avoid footage prior to the T-pose in which the actor prepares for the scene. This includes the actor “transitioning” into the T-pose, such as walking into the frame, standing in profile, turning away from the camera, or anything else less than a solid T-pose.
RADiCAL Studio calibration:
- Step-by-step (SbS) processing: The Studio app already comes with a mandatory five-second countdown to assist you in finding the best position before the Studio app starts recording your video for SbS processing. Make sure the actor stands in a solid T-pose by the time the Studio app has completed the countdown and starts recording.
- Real-time (RT) processing: Because there is no five-second countdown, you‘ll need a person other than the actor to hit the START button. You can also experiment with external camera integrations (to ensure a better angle and distance) or keyboard/mouse configurations (to allow for remote operation of the Studio app).
How do I export an FBX from Studio results?
Here’s how it works:
- Make sure you’re logged in with an active Studio subscription on your account.
- Find the animation file (output.rad) on the device, inside the scene folder, where you’ve been saving your work from the RADiCAL Studio app.
- Upload the output.rad file to your RADiCAL account.
- Once the file is uploaded, we’ll run a quick check that the file was generated by the RADiCAL Studio app.
- Once we’re done processing, we’ll send you an email with a link to the scene page where you can download the FBX file.
- The FBX file will be downloadable from a new scene page we’ll create for you inside your Projects channel.
We also have a basic tutorial of how to use Studio here.
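If you’re not sure where your scenes ended up on disk, here is a minimal sketch that searches a folder tree for output.rad files. The root path is an assumption; point it at wherever you told Studio to save your scenes.

```python
from pathlib import Path

def find_scene_files(root: str) -> list:
    """Recursively collect every output.rad animation file under `root`.

    Each file sits inside its scene folder, so the parent directory
    name tells you which scene it belongs to.
    """
    return sorted(Path(root).rglob("output.rad"))

# Example usage (the path is illustrative):
# for f in find_scene_files(r"C:\Users\me\Documents\RADiCAL"):
#     print(f.parent.name, "->", f)
```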
Unreal Engine 4 LiveLink: how do I set it up?
With Studio’s real-time feature (in public beta), you can live-stream your motion from RADiCAL Studio into a scene in Unreal Engine 4.
What you need
To get started, download the RADiCAL UE4 LiveLink plugin and follow the instructions in our Github repo here. You need a Professional Studio account (or monthly LiveLink access); see our pricing page for more information. For indies and students, we have special pricing for LiveLink access; get in touch here.
Live Link Setup
After starting the Live Link stream from RADiCAL Studio, open the UE4 editor and go to Window -> Live Link. Go to Source -> Message Bus Source -> select the RadLiveLink source. There will be a random alphanumeric string appended to the name to differentiate between multiple Live Link sources on the network (e.g. other instances of RADiCAL Studio). You can now close this window.
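Because each Studio instance appends a random suffix to its source name, picking out the RADiCAL sources from a longer list comes down to a prefix match. A small illustration of that logic (the helper and the example names are our own, not part of the plugin API):

```python
def radical_sources(source_names: list) -> list:
    """Keep only Message Bus sources published by RADiCAL Studio.

    Each one is named 'RadLiveLink' plus a random alphanumeric suffix,
    so a simple prefix match identifies them.
    """
    return [name for name in source_names if name.startswith("RadLiveLink")]
```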
Retargeting to a character
The plugin contains retargeting assets for converting animation output from RADiCAL Studio to an Unreal Engine editor session.
It will receive data from a local RADiCAL Studio session, either on the same computer or over the local network. We have provided two mannequins: one with the Epic skeleton and the other for the RADiCAL skeleton. The latter is provided principally to serve as ground-truth for verifying animation data.
The Live Link data can be previewed inside the Skeleton, Skeletal Mesh, or Animation Blueprint windows for a given skeletal asset.
Check out the video tutorial here.
Credit for the plugin, tutorial and instructions to Xuelong Mu.
During our pre-release testing, we uncovered a few things that we are still working through. If you run into any of them, please don’t hesitate to reach out; we are always trying to improve our product.
- Performance Settings: we have locked in the best performing AI configuration (a low resolution coupled with our full deep learning model) for both real time and step-by-step processing. However, soon, we’ll allow users to make certain adjustments to these settings to more granularly reflect the hardware and software environments of the workstations they’re working with.
- LiveLink may interfere with AI: we have observed rare situations where using a Live Link stream into Unreal adversely affects the AI in real time scene processing.
- Step-by-step scenes don’t produce results: we have seen some situations in which scenes processed in step-by-step mode produce no animation data. This is likely due to a poor initial analysis of the video, especially if the user is not clearly visible or the lighting or other conditions are not supportive of the AI. If this happens to you, we’ll soon release an update to allow you to re-process these videos, so there’s no need to delete these folders, even if no results are available right now.
The AI doesn’t initialize after successfully installing
Make sure all other apps are closed before trying to record a new scene or playing back an existing scene.
If that doesn’t work, try restarting your machine. If that still doesn’t work, please get in touch.