3D HDTV Stereo Interlacing

In this project we are going to render 3D stereo images on a 3D HDTV.

We are going to use OpenGL for that purpose, so the interested reader should be familiar with rendering 3D scenes in stereo with OpenGL. In particular, setting up the proper left and right viewing matrices for stereo rendering is considered a prerequisite. A good primer on the topic is “Implementing Stereoscopic 3D in Your Applications” by Samuel Gateau and Steve Nash from NVIDIA.

Quad Stereo Buffering

In principle, 3D stereo rendering is no big deal if we have a graphics card driver that supports Quad Stereo Buffering.

Then we have two rendering passes:

  1. Render the scene with the left eye’s view point into the left back buffer
  2. Render the scene with the right eye’s view point into the right back buffer

In OpenGL:


for (each frame)
   glDrawBuffer(GL_BACK_LEFT);
   renderScene(from left eye);
   glDrawBuffer(GL_BACK_RIGHT);
   renderScene(from right eye);


Quad Stereo Buffering Support

Now the interesting bit is: which drivers do in fact support Quad Buffered Stereo?

In my humble experience the following drivers do not support quad buffering with OpenGL:

  • NVIDIA consumer drivers for GeForce
  • ATI consumer drivers for Radeon

The following drivers support quad buffering with OpenGL:

  • SGI Reality Engine (SGI has been out of business since 2004)
  • NVIDIA professional drivers for Quadro boards
  • ATI professional drivers for FireGL Pro

The following drivers are untested with OpenGL:

  • Intel drivers may work starting with GMA X3100 cards on Windows.

So if we do not want to spend a whole lot of bucks on an expensive professional board, what options do we have?

For active stereo techniques we in fact have no option. But with 3D HDTVs, which use a passive stereo technique, we can construct the stereo images ourselves without going through the driver.

3D HDTV Stereo Interlacing

A 3D HDTV with passive stereo generates the 3D image by having pixels with alternating polarization. This is known as stereo interlacing. Depending on the model, the pixels are organized so that either the rows or the columns of the LCD matrix have alternating polarization. The most common case is that even and odd rows have different polarization. Let’s assume we have a monitor of that type, e.g. the 84” LG UHDTV 84LM960V.

With the polarization glasses on, the left eye permanently sees the odd rows and the right eye permanently sees the even rows (in OpenGL, rows are numbered from bottom to top starting with row index #0).

If we now change our OpenGL rendering setup so that the left image is projected onto the odd rows and the right image onto the even rows, we see a stereo image!

Stereo Interlacing Pixel Shader

In order to achieve stereo interlacing with OpenGL, we use a pixel shader that kills the even rows for the left image and the odd rows for the right image:


TEMP tmp;

# stereo interlacing
MAD tmp.xy, fragment.position, program.env[2], program.env[2].zwxy;
FRC tmp.xy, tmp;
SUB tmp.xy, tmp, 0.5;
KIL tmp.xyxy;

# write primary color to output register
MOV result.color, fragment.color.primary;


The above pixel shader is parametrized via program.env[2]: the x and y components scale the fragment position, and the z and w components (applied through the zwxy swizzle) offset it before the fractional part is taken. For horizontal interlacing the y scale is 0.5, and the w offset selects whether the even or the odd rows are killed.


OpenGL Stereo Interlacing

Here is the according stereo interlacing setup with OpenGL:


for (each frame)
   set program.env[2] to kill the even rows;
   renderScene(from left eye);
   set program.env[2] to kill the odd rows;
   renderScene(from right eye);


OpenGL Volume Rendering With Stereo Interlacing

The stereo interlacing approach is implemented in the $V^3$ volume renderer. We enable the stereo mode by adding the following option to the command line:

qtv3 --interlace-vertical

Here is a screen shot of the volume renderer showing a CT scan of the ubiquitous Stanford Bunny, the Bonsai #3 and an artichoke. If you view the screen shot on a 3D HDTV with passive stereo and put your stereo glasses on, you see it in 3D:


Implementation notes:

  • Each graphics primitive needs to be drawn with a shader that is prefixed with the stereo interlacing snippet. Otherwise the primitives will not show up in stereo, of course.
  • The interlacing pixel shader needs to run in full-screen mode at the native resolution of the screen. For example, stereo interlacing at Full-HD resolution on a screen with a native 4K resolution does not work directly; it requires an additional rendering pass that scales each Full-HD stereo image up to 4K and interlaces the pair accordingly.
  • Graphics primitives that are less than two pixels wide will look odd, since one eye may miss them entirely. Therefore, thin lines or point clouds are not well suited for the stereo interlacing approach; better use 2-pixel-wide anti-aliased lines.