DS & RED



DS Version 10.1.1 does RED natively


There is an all-Avid solution, though MetaFuze does not yet support audio.

This is covered in depth in Avid's RED workflow PDF. It is also covered on Avid's website.
You will need DS version 10.1.1 and Media Composer v.3 or v.3.5.

Please see Avid's site to understand how to set up your DS to work with RED files natively. There are two files that are necessary. You will have to reinstall your DS Com Server:
Locate the DS Com Server Setup.exe here: DS_v10_1_1\DS_v10.1.1\setup\DsComServer

The first time setup.exe is run, it will uninstall the older version of this service. The second time it is run, the service will be properly installed and RedDecoder.exe and RedWrapper.dll will be placed in the proper folder.

Relaunch DS and begin working with RED files.

If you have a DS 10.1.1 system you can begin working with RED files. There is a specific workflow on our A-Z "A" page (you must have a Dual Link system and an AJA card).

If you are doing proxies in 4:2:2 for RED, dust bust and edit in 4:2:2, then switch to 4:4:4. Expect to tweak the color correction in 4:4:4, as there is a slight shift between 4:2:2 and 4:4:4.

"Sexy" Bob has done a little web-app to convert ALEs generated out of REDrushes. It rebuilds the reel number to match that of Metafuze and DS so if you need to use REDrushes to make Quicktimes for MC (rather than using Metafuze until it gets audio support) you can convert the ALE it generates.

You can find this at http://www.idolum.com/redrushes-ale.php - just upload your ALE and it will give you back one with fudged reel numbers. It also lets you mark the clips for audio channels, so you don't have to remember to modify the clips in MC before you capture.
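For illustration, here is a minimal Python sketch of the same kind of rewrite: it walks a tab-delimited ALE and truncates the Tape (reel) field of each clip entry to four characters. The "Tape" column name and the simple truncation rule are assumptions for the example; the web app's actual rebuilding logic may differ.

```python
# Minimal sketch (not the web app's actual code): truncate the Tape (reel)
# field of every clip entry in a tab-delimited ALE so it matches the
# 4-character reel that MetaFuze/DS expect. Column name "Tape" and the
# simple truncation rule are assumptions.
import sys

def rewrite_ale(in_path, out_path, reel_len=4):
    with open(in_path, "r", encoding="latin-1") as f:
        lines = f.read().splitlines()

    out, columns, section = [], [], None
    for line in lines:
        token = line.strip()
        if token in ("Heading", "Column", "Data"):
            section = token                     # track which ALE section we are in
            out.append(line)
            continue
        if section == "Column" and token:
            columns = line.split("\t")          # the line after "Column" names the fields
            out.append(line)
            continue
        if section == "Data" and token and "Tape" in columns:
            fields = line.split("\t")
            i = columns.index("Tape")
            if i < len(fields):
                fields[i] = fields[i][:reel_len]   # e.g. A001C002... -> A001
            out.append("\t".join(fields))
            continue
        out.append(line)

    with open(out_path, "w", encoding="latin-1") as f:
        f.write("\n".join(out) + "\n")

if __name__ == "__main__":
    rewrite_ale(sys.argv[1], sys.argv[2])
```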

The DS import does not support audio yet. A future MetaFuze may incorporate audio support; the current one uses the original SDK, which does not support audio. Most productions record sound separately, so this may not be a problem.

Recapture R3D files in lower compression & higher quality


(case study by Dermot Shane)

After some push'n and pull'n and a huge hint from some smarter folks than I in Montreal, this is what I came up with.

Unfortunately the key step (#7) is only available in 444 projects, AND DS seems to reset the "resolution" setting to "Full Premium" once back in YUV, meaning that just modifying the clip's metadata and returning to YUV does not work either. This makes it a bit more of a challenge, but I do have this working cleanly; try these steps to test it out:

  1. Start a YUV HD project "test YUV"
  2. Capture some R3D files as DNxHD 36 using Half Good
  3. Edit a seq, save as "test_1"
  4. Open a 444/HD project / 709 LUT "test RGB"
  5. Import "test_1" timeline, do not drop it on the RGB timeline (or DS will change the seq settings back to YUV).
  6. Open media tool, drop "test_1" timeline into it
  7. Click on a clip, delete media and select "recapture with options"
  8. De-select "use master clip lut on recapture"
  9. Click on "capture"
  10. Click on "yes" when asked "would you like to modify the clip settings before performing this operation?"
  11. Adjust the RED SDK settings to taste: white balance, Half Premium, ISO, whatever...
  12. Click "OK" and re-capture the clip with modified settings


- OR-

The first option is much better, as you can keep modifying the R3Ds without the shuffle, but I know not everyone has 444 monitoring or realtime YUV output from an RGB project, so the B option works with a bit more of a hassle... everything up to step 6 takes no more than 5 minutes, maybe 3... so not a big deal; it took me a lot longer to write this than to do it...

(With thanks to JM & Steve.)

RED ONE to DS Nitris 2k /4k Workflow:


This workflow is tested and valid as of 5/20/2008. Things are changing rapidly on the RED side so there may be improvements soon. Check the RED page for software updates.

The DS side of the workflow is very straightforward. Once you generate DPX files and an EDL you can conform the whole thing easily. However, there is a known issue with DS 8.4 DPX conform when source clips' timecode exceeds 23 frames; this results in unlinked clips on the timeline. You can request a patch from support or download QFE 2 when it becomes available.

1. Offline Edit


RED records 12 bit linear RGB RAW files in proprietary R3D format. In addition to R3D the camera also generates three sizes of Quicktime proxy files. These files must reside in the same directory as the R3D file they reference or else they will not be able to link to it.

The proxy files are suitable for short form editing. They contain timecode that points back to the master R3D files. For long form editing it is better to transcode the proxies to something else (ProRes in FCPRO for example).

Final Cut Pro can also import R3D files via Log and Transfer plugin.

Media Composer cannot read timecode from Quicktime files. While it is possible to edit with proxies in MC, it is not easy to reference the original timecode for conform purposes. One workaround is a third-party utility called Metacheater. The rest of this workflow assumes you're using FCPRO for the offline edit.

Any missing or deleted Quicktime proxies can be recreated using RED ALERT. The reference proxy creation is instantaneous.

2. Create DPX files


Another free companion application is REDCINE. It is very similar in functionality to RED ALERT but it can load multiple R3D clips and export sequential DPX files. You can either export all of your footage to DPX, manually pick and choose the takes you need, or use a smarter approach with a $190 USD program called Crimson, which will select just the clips you need. There is also a free trial version of Crimson.

Export an EDL and XML from FCPRO. Follow the software instructions and process this XML with Crimson. It will create a new XML and pointers to the R3D clips you need for the cut.

Open REDCINE and load the pointers that Crimson has generated. Then load the XML. REDCINE will organize the sequence in the proper cut order. Set up your basic project parameters such as 2k or 4k, aspect ratio, etc.

Now you can do a one light correction and repos. For DS use Rec. 709 color space. Remember that your color decisions at this stage may clip color information and the only way to retrieve it will be to redo select clips.

Prior to rendering DPX files, ensure that the display quality and the render output quality are at the same setting or you will experience a decrease in processing speed. On an 8-core Mac it takes about 24x the running time to create 2k full aperture files.

3. Conform in DS


Load all DPX files in your local VideoStorage as per DS manual. I have loaded files outside VideoStorage to avoid DS indexing them and did not notice any performance issues.

Create a sequence that matches the DPX files in terms of size and bit depth (10-bit). For final renders you will need 10-bit or 32-bit float processing precision but for quick preview quality renders 8-bit is sufficient. Just remember to delete all caches and re-render at higher precision in the end.

Load the EDL in a text editor or EDL Tool and truncate all the source reels to 4 characters. For example, if the original source name was something like A001C002… rename it to A001. Save the EDL, load it in DS and follow the standard DS DPX conform procedure. DS will link to the DPX files.
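If you would rather script the reel truncation than edit the EDL by hand, here is a minimal Python sketch. It assumes a CMX3600-style EDL in which the reel is the second whitespace-delimited field of each numbered event line; adjust it if your EDL flavour differs.

```python
# Truncate source reel names in a CMX3600-style EDL to 4 characters,
# e.g. A001C002XXXX -> A001, as described in the conform step above.
import re
import sys

EVENT = re.compile(r"^(\d+\s+)(\S+)(\s+.*)$")   # event number, reel, rest of line

def truncate_reels(in_path, out_path, reel_len=4):
    with open(in_path) as src, open(out_path, "w") as dst:
        for line in src:
            m = EVENT.match(line.rstrip("\n"))
            if m:
                prefix, reel, rest = m.groups()
                line = f"{prefix}{reel[:reel_len]}{rest}\n"
            dst.write(line)

if __name__ == "__main__":
    truncate_reels(sys.argv[1], sys.argv[2])
```

Title, FCM and comment lines are left untouched because they do not start with an event number.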

Import files as linear without clamp.

If your computer is sufficiently fast you will be able to play back 2k files in real time within DS. You can also view the files on the video monitor if you switch on HD proxy. It is necessary to render HD proxy cache for real time playback.

4. Output


The way you deliver files depends on the filmout facility.

5. Issues


I ran into issues with off-speed clips in Crimson. This may be fixed in the future.

REDCINE DPX files have only 4 characters for the reel name, which is shorter than the original R3D names.
It is possible to run into duplicate timecodes with such a naming convention. The A001C002… name syntax comes from the RED camera, where "A" is the "A" camera, "001" is reel 001, "C002" is clip 002, and so forth. The full name also includes the date.
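As an illustration of that naming convention (and of why the 4-character reel can produce duplicates), here is a small Python sketch that splits a RED-style source name into camera, reel and clip. The example names and the treatment of everything after the clip number as an opaque suffix are assumptions.

```python
# Split a RED-style source name into the parts described above: camera letter,
# reel, clip, and a trailing suffix that includes the date. The example names
# and the exact suffix layout are assumptions for illustration only.
import re

RED_NAME = re.compile(r"^([A-Z])(\d{3})C(\d{3})(.*)$")

def parse_red_name(name):
    m = RED_NAME.match(name)
    if not m:
        raise ValueError(f"not a RED-style clip name: {name}")
    camera, reel, clip, suffix = m.groups()
    return {
        "camera": camera,               # "A" camera, "B" camera, ...
        "reel": int(reel),              # reel 001, 002, ...
        "clip": int(clip),              # clip 002, ...
        "suffix": suffix,               # date and other fields
        "edl_reel": camera + reel,      # the 4-character reel the DPX/EDL ends up with
    }

if __name__ == "__main__":
    # Two different clips collapse to the same 4-character reel "A001",
    # which is how duplicate timecodes can sneak in.
    print(parse_red_name("A001C002_080520_A1B2"))
    print(parse_red_name("A001C007_080521_X9Y8"))
```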

Igor Ridanovic's (basic) tutorial on import to DS

Capture gives you real time playback immediately. Linked clips give you full source resolution, but clips must be processed in order to be played.

The same goes for RED file imports: capture the clips at low res so you can play them in real time for the "rough" edit, then link the clips at high res for the conform and process just what is needed.

DS cannot play back RED 4K realtime without some really special gear -- and there is no monitor that will display it anyway. Here's your 4K workflow (for going back to film):


But if your output is not going to need 4K, you can work in an HD seq with RED, capture at a low res, edit away, and then recapture for your final output.

Igor's tutorial on basic RED import to DS

Media Composer RED Workflow

(Thanks to Tony Jover)

  1. Use MetaFuze to create MXF media and an ALE
  2. Use the ALE to link to the MXF media in MC and edit
  3. Export an AFE from MC
  4. Conform the AFE in DS, linking to the RED files at that time.

(Thanks to Shannon Dunn, Tom Phillips and Dermot Shane).

The difference between R3D and DPX imported to DS


(Case Study)

Not much of a surprise really. Firstly I have to say that I did all this at HD (1920 x 1080) resolution in a 4:2:2 sequence, so that could be part of the reason.

The moiré pattern, or rather its absence, is a result of the scaling algorithm, as someone already mentioned. It wasn't visible in the DPX files scaled with the Mitchell algorithm in RedCine, because it's a softer-looking algorithm, which results in a slightly softer overall image. When using Sinc (the default in Shake) or Lanczos (both slightly sharper algorithms) the moiré reappears, but I have to say it's something that only appeared in a very specific shot during a focus pull, so it's not like these algorithms always introduce artifacts.

Other than that, there is a slight difference between RedCine-converted DPX files and R3Ds imported directly into DS, no matter which algorithm, OLPF, DeNoise or DeBayer setting is selected, although I don't think it's a quality difference. When layering the footage with a subtract node / composite mode, certain edges showed a difference, though with most algorithms it's impossible (at least for me) to see any difference with the naked eye. There is no color banding, noise or other artefacts with either method, and the only other difference, which I already mentioned a few times, is a slight shift in saturation and gamma. My guess is that the difference is due to the import settings (Scale to Fit or Half Premium in the R3D dialogue box), the scaling algorithm and the colour space.

Next someone would have to do a test at RGB 4:4:4 4K or 2K, but I'll leave that to someone else. I wouldn't expect a difference in quality anyway.

Ultimately this means for us that we can use the direct import of R3Ds in DS, though it would be nice to have the same amount of control as in RedCine, i.e. a wider range of scaling algorithms, which should be included in the capture settings (Scale to Fit...) and the basic DVE.

(Thanks to Peter Mirecki)





Important update


Some important things to note.

R3D files have two timecodes: EDGE timecode and Time of Day (TOD) timecode.
It is very important which of these is selected when filming with the RED camera, as it will affect the rest of your workflow.

My tests were with R3D files captured with EDGE timecode selected during filming.

The Quicktime proxies (with the REDCODE codec) will reference BOTH timecodes. But there are three different ways to bring media into FCP, and depending on how you bring in media, FCP will look at different timecode. IMPORT and LOG & TRANSFER will bring in Time of Day timecode ONLY. DRAG & DROP, from the Finder to FCP, will bring in EDGE timecode. (It is possible that DRAG & DROP brings in whichever timecode was selected during filming, but in my case this was EDGE timecode, so I have no way to verify that at this stage.)

REDCINE will read either timecode and reference back to the correct R3D files. However, the DPX files that REDCINE generates will have ONLY the timecode that matches the one displayed by the RED camera during filming.

So here is an example of problems that can arise:


RED captures displaying EDGE timecode.
FCP captures the Quicktime proxies using Log & Transfer. All clips now reference TOD timecode.
The EDL and XML out of FCP both reference TOD timecode.
The XML goes into Crimson. Crimson reads what it is given, in this case TOD timecode. The intermediaries and XML from Crimson reference TOD timecode.
The intermediaries and XML from Crimson go into REDCINE. REDCINE reads all the TOD timecode and creates the correct DPX files. HOWEVER, the timecode in the DPX headers will ONLY be EDGE timecode.
Conform the EDL on the online machine, fail to link to the DPX files, and bang your head off the wall.

Arguably the problem is introduced by FCP, but poor documentation and a lack of "on-street knowledge" compound the issue. An option in REDCINE to export DPX files with TOD timecode would also solve this.


I am unaware of a way to know which timecode was used in the RED camera other than to create DPX files in REDCINE and open them in a program which allows you to see the timecode.
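If it helps, here is a minimal Python sketch for peeking at the timecode written into a DPX header. It assumes a standard DPX v2.0 layout, in which the television information header starts at byte offset 1920 and its first 32-bit field is the SMPTE timecode packed as BCD; if your files deviate from that layout the offsets will need adjusting.

```python
# Read the SMPTE timecode from a DPX header so you can see which timecode
# (EDGE or TOD) ended up in the files. Assumes a standard DPX v2.0 layout in
# which the television information header starts at byte 1920 and its first
# 32-bit field is the timecode packed as BCD (hh mm ss ff).
import struct
import sys

TV_HEADER_OFFSET = 1920

def dpx_timecode(path):
    with open(path, "rb") as f:
        magic = f.read(4)
        if magic == b"SDPX":
            endian = ">"                 # big-endian file
        elif magic == b"XPDS":
            endian = "<"                 # little-endian file
        else:
            raise ValueError("not a DPX file")
        f.seek(TV_HEADER_OFFSET)
        (bcd,) = struct.unpack(endian + "I", f.read(4))
    digits = [(bcd >> shift) & 0xF for shift in range(28, -4, -4)]
    hh, mm, ss, ff = (digits[i] * 10 + digits[i + 1] for i in (0, 2, 4, 6))
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

if __name__ == "__main__":
    print(dpx_timecode(sys.argv[1]))
```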

If TOD timecode was captured by RED it shouldn't be an issue using Log & Transfer. If EDGE timecode was used during capture, make sure the offline editor captures the Quicktime proxies using the DRAG&DROP method, to avoid these issues.

Also: For Crimson to work effectively, the timeline in FCP should be flattened to a single layer.

When bringing the Crimson XML into REDCINE there may be an error message such as "Failed to load construct". To avoid this, in Crimson, UNCHECK the option to add VOLUME/ to the intermediaries. This is the case when the FCP edit was done using proxies.


Declan MacErlane








A RED ONE workflow has been posted to Editors' Lounge
Editors' Lounge RED Event Booklet



Special thanks to Michael Radeck for this case study:

12 February, 2008:
Two weeks ago I had the chance to test many things with 5 REDs and the possible post-production workflows. Testing has not finished yet, but our testing indicates there's a better AVID workflow than Final Cut!

The Quicktime "reference proxies" are at most 2K, 1K or 0.5K resolution of the original 4K. With Final Cut there is no Full HD preview with these files; if you bring this format over HD-SDI you will lose a lot of resolution. They are also camera RGB RAW, so you do not reproduce the original colors; everything is desaturated and color-shifted. If you want real color you have to preprocess with REDCINE or RED ALERT, and preprocessing needs render time. So if you want a good real-time HD post-production offline, you have to render Quicktime ProRes or Quicktime DNxHD. Then you are able to make an offline with real-time preview in Full HD, on big HD screens if you want. And if you render Quicktimes, there is no reason anymore to do the offline in FCP.

If you are going to broadcast, then HD is good enough; I have also printed lots of DNxHD material with an Arrilaser to 35mm. If you print in 2K resolution there is no need to post in more than HD resolution. If you print out in 4K then you should finish in 4K, but the resolution of the RED is not much more than 2K (4K RAW is not the same as 4K RGB, so you are far away from 4K resolution!!).

There is also a big issue with focus pulling, much more than with 35mm (35mm is not more than 1K in cinema): problems with backfocus, problems with finding the focus (the preview on the RED is only 1K!); use the focus assist function if possible. If you want to bring 2K resolution to digital projection, you have to produce sharp pictures that are in focus! If you print out you will lose lots of resolution!

Final Cut cannot deal with 4K. Neither can Apple's Color, so only AVID DS Nitris can do 4K finishing. Scratch can only deal with the raw files but cannot apply a correct color matrix to reproduce correct colors!! We were not able to grade some pictures with highly saturated colors in the original until we applied the 709 color matrix with REDCINE, rendered to DPX and color corrected. Please note that all 4K and 2K renders are Scratch. (Comment: Lucas Wilson says, "SCRATCH can reproduce "correct colors" just fine.... in whatever color space you need. If you have the NVidia SDI card, you can accurately reproduce whatever color space you want.") See the new comment on using SCRATCH below.




So the AVID workflow is much easier in HD; color correction can also be done in HD, and then you connect the timeline to DPX files in 2K or 4K.

Make DPX and Quicktime DNxHD in REDCINE; both can be done on a Windows PC, so you don't need a Macintosh. Alternatively, you can make Avid MXF files with third-party tools, but you have to pay for those tools.

You will only need a little freeware tool that extracts the timecode information from the DNxHD Quicktimes to an ALE file (only a mouse click). Bring this into Xpress Pro or Media Composer and then batch import the Quicktimes; very easy (all Quicktimes should be in one folder). You should use the raw filename for the Quicktimes and DPX files; this is an option in REDCINE.

Rendering the Quicktimes is faster on a PC than on a Mac!

So forget Final Cut; it is only quick and dirty.

Also, for everyone: Final Cut's onboard effects are all 8 bit!!! High precision rendering will not work with onboard effects; only some third-party effects are able to render at a higher bit depth. Final Cut also cannot deal with RGB: all RGB is transformed to legal TV levels, so you are not able to preserve superblack levels. So forget Final Cut!

We had 5 REDs; Cooke, Zeiss, Arri and RED lenses; a 4K projector; a 2K projector; and a 2K Arrilaser printout.

The RED TESTING TEAM - Kameraverleih Ludwig - Germany

Cheers Michael Radeck - DS - Artist - Postproduction Supervisor

PS: If you print out to Arrilaser and calculate the format resolution for 1:1.85, you would calculate 2048 x 1107 as the correct resolution, but there's a bug in DS with custom resolutions: if you bring your HD timeline to this format, DS will scale the images even if you link to the HD files with no scaling, and you will lose 50% of the vertical resolution (v8.4 QFE1). You have to choose the preset 2k widescreen 2048 x 1108!! Crazy. (When you bring 1920 x 1080 into 2048 x 1108, 64 px left and right and 14 px top and bottom are black, but in cinema you have a much larger safe area anyway; you should calculate about 12% for that.) You have to do this to bring the HD material in pixel-native format to the laser, because with any digital scaling you will lose resolution.
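As a quick check of the padding figures above (assuming the HD frame is simply centered in the 2K preset with no scaling):

```python
# Padding when a 1920 x 1080 frame is centered, unscaled, in the 2048 x 1108 preset.
hd_w, hd_h = 1920, 1080
preset_w, preset_h = 2048, 1108

pad_x = (preset_w - hd_w) // 2   # 64 px of black left and right
pad_y = (preset_h - hd_h) // 2   # 14 px of black top and bottom
print(pad_x, pad_y)              # -> 64 14
```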



Using SCRATCH
(Added 03.04.2008)
Special thanks to Lucas Wilson, Assimilate, Inc.

REDCINE was a collaboration between RED and Assimilate. Assimilate, Inc. helped write the application and there are many parts of REDCINE that function very similarly to SCRATCH.

  1. Download the free "RedNode for SCRATCH" file. You must do this from a Windows PC; the option will be grayed out for other operating systems.
  2. In that file is a folder called RED LUTs
  3. Load the Rec709 LUT into the LUT section in SCRATCH.
  4. In Matrix, you can choose what you want to do with the color metadata in the R3D file. In SCRATCH, there is much more color control than in REDCINE, so you can choose pretty effectively what you want to do.

The color spaces that are built into REDCINE are a download for SCRATCH. Unless that Rec.709 LUT is downloaded and applied, it is difficult to get an accurate Rec.709 out of SCRATCH with R3D files.

Also, SCRATCH can output to SDI. REDCINE cannot. So by hooking up SCRATCH to a WFM/Vectorscope, you can probably get a more accurate frame-to-frame determination of Rec.709 than you can by looking at an LCD screen with REDCINE and no scope.


Other RED Links
RedUser discussion forum

New comment on using SCRATCH (added 14.05.2008), from Michael Radeck
The tutorial above is incomplete; it does not solve the colorspace problem:

After talking with a German SCRATCH reseller, talking with a SCRATCH customer, and trying things out on their system, we didn't find the solution. The guide above from Lucas didn't help either. After further discussion with Lucas and more testing on the SCRATCH system here in Munich, that system seems to be misconfigured; the colorspace option is missing. Now, with the help of fxphd.com (as a paying student), I have the newest SCRATCH software and license, and I can state that both SCRATCH and REDCINE can reproduce correct colors if you change two settings: gamma and colorspace.
What you have to configure in SCRATCH is:

1. Load the Rec709 LUT for correct gamma in the LUT section for the display output.
2. Load the Rec709 colorspace in the PROCESS section under fxcontrol, where you will find the default colorspace "CAMERA-RGB" selected. Click on CameraRGB and select REC709 from the dropdown. This is much more significant for correct colors than the gamma, because gamma can be corrected with any simple color correction (the REC709 LUT is only a 1D LUT). The colorspace conversion needs a much more complex 3D LUT; you cannot reproduce it with the MATRIX section in SCRATCH or with normal primary color correction methods (see the toy sketch below).
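As a toy illustration (not SCRATCH code) of that last point, the sketch below shows how a 1D LUT only remaps each channel on its own, while a colorspace conversion mixes channels through a matrix (or 3D LUT). The matrix values are made up for the example.

```python
# Toy illustration, not SCRATCH code: a 1D LUT remaps each channel on its own
# (gamma-style), while a colorspace conversion mixes channels, which is why it
# cannot be reproduced with a 1D LUT or simple primary corrections.
# The matrix values are invented for the example.

def apply_1d_lut(rgb, lut):
    """Per-channel remap: R depends only on R, G only on G, B only on B."""
    return tuple(lut(c) for c in rgb)

def apply_matrix(rgb, m):
    """Channel-mixing conversion: each output channel depends on all inputs."""
    r, g, b = rgb
    return tuple(m[i][0] * r + m[i][1] * g + m[i][2] * b for i in range(3))

gamma_lut = lambda c: c ** (1 / 2.2)      # the kind of correction a 1D LUT can express

example_matrix = [                        # illustrative only, not a real camera matrix
    [ 1.20, -0.15, -0.05],
    [-0.10,  1.15, -0.05],
    [-0.02, -0.10,  1.12],
]

pixel = (0.30, 0.50, 0.20)
print(apply_1d_lut(pixel, gamma_lut))       # channels changed independently
print(apply_matrix(pixel, example_matrix))  # channels mixed together
```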

Revised: Apr 7, 2010 4:25 pm