Frequently Asked Questions by 320 and 540 Video Users


When my graphics screen is set to 256 colors, why is video sometimes displayed with incorrect colors?

We recommend that you use the Display Properties control panel to set the graphics screen to 32768 colors or TrueColor when running any video applications.
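
For reference, the same change can also be made programmatically with the standard Win32 display-settings call; the sketch below is only an illustration (the helper name and the choice of 16 vs. 32 bits per pixel are ours, not anything the 320/540 drivers require). Note that on NT a depth change may require a reboot, in which case ChangeDisplaySettings returns DISP_CHANGE_RESTART.

    #include <windows.h>

    // Switch the desktop color depth, e.g. to 16 bpp (32768/65536 colors)
    // or 32 bpp (TrueColor). Returns TRUE if the change took effect.
    BOOL SetColorDepth(DWORD bitsPerPel)
    {
        DEVMODE dm;
        ZeroMemory(&dm, sizeof(dm));
        dm.dmSize       = sizeof(dm);
        dm.dmBitsPerPel = bitsPerPel;      // requested color depth
        dm.dmFields     = DM_BITSPERPEL;   // change only the color depth
        return ChangeDisplaySettings(&dm, CDS_UPDATEREGISTRY) == DISP_CHANGE_SUCCESSFUL;
    }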

Does Adobe Premiere work on a 320/540?

Yes. Please consult our Adobe Premiere FAQ page.

Do I need special software/applications to capture uncompressed video?

No. The majority of Video for Windows or QuickTime video capture applications can capture uncompressed video on the 320 and 540 without dropping frames. Consult the capture page for more details.

Do I need special software/applications to play back uncompressed video?

No. Most Video for Windows or QuickTime video out applications can play back uncompressed video without dropping frames. Consult the playback page for more details.

Can I do screen capture on a Silicon Graphics 320/540?

There is no built-in hardware to perform screen capture, but the functionality can be emulated in software. A program is available for download; please consult the screen capture page for more details.

Why can I only capture up to a 2 GB movie file with QuickTime or Video for Windows?

This is a limitation of the AVI and QuickTime file formats. The problem is addressed in the 4.0 version of QuickTime. For Video for Windows, it is solved under DirectShow; for instance, AmCap, a utility program that ships with the DirectX Media package, can capture over 2 GB. Go to the DirectX page to download it.

How do I get video out to my monitor under VFW?

Video for Windows (VFW) does not specify a video output architecture the way QuickTime does. However, a de facto standard for video output has evolved in which output is done through video decompression modules.

Windows recognizes some RGB formats as uncompressed and most other formats as "compressed". VFW applications that play compressed AVI files to the computer screen call the video decompression modules to decompress the data to RGB. Since most PC video boards capture in compressed formats, they often implement their modules to send the data to the video output jack while decompressing.
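
As an illustration of that call path, here is a minimal sketch of a VFW application handing one compressed frame to whatever decompression module handles its format, using the standard Video Compression Manager calls. The function name and formats are placeholders of ours, not SGI code.

    #include <windows.h>
    #include <vfw.h>            // Video Compression Manager (IC*) functions

    // Decompress one frame (described by bihIn) into the RGB format requested
    // by bihOut, using whichever decompression module can handle bihIn.
    BOOL DecompressFrameToRGB(BITMAPINFOHEADER *bihIn,  void *srcBits,
                              BITMAPINFOHEADER *bihOut, void *rgbBits)
    {
        // Ask the Video Compression Manager for a module that can do the conversion.
        HIC hic = ICDecompressOpen(ICTYPE_VIDEO, 0, bihIn, bihOut);
        if (hic == NULL)
            return FALSE;

        BOOL ok = ICDecompressBegin(hic, bihIn, bihOut) == ICERR_OK &&
                  ICDecompress(hic, 0, bihIn, srcBits, bihOut, rgbBits) == ICERR_OK;

        ICDecompressEnd(hic);
        ICClose(hic);
        return ok;
    }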

  • SGI Modules
SGI has several decompression modules for decompressing AVI files in Motion JPEG, Photo JPEG, and 16-bit UYVY formats.  Currently, only the UYVY module plays back to video output.

Note that SGI's QuickTime output component supports playing back AVI files as well as QuickTime MOV files, so if you need to play back Photo JPEG AVI files to video output, you can do so using QuickTime applications.

I have a series of images that I would like to play back to video out. How do I do that?

SGI does not yet provide a utility to do that. However, if you can convert these images to a movie file, such as an AVI or QuickTime file, you can play it back using our video out playback programs. See the playback page for details.
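
If you want to do the conversion yourself, the VFW AVIFile API is one way. The sketch below is only an illustration, not an SGI utility; the function name, the fixed 30 fps rate, and the frame callback are our own placeholders, and error checking is omitted for brevity.

    #include <windows.h>
    #include <vfw.h>
    // Link with vfw32.lib.

    // Write 'numFrames' uncompressed frames (all sharing the format in 'bih',
    // each 'frameBytes' long, fetched by GetFrameBits) into an AVI file.
    void WriteImagesToAvi(const char *path, BITMAPINFOHEADER *bih,
                          LONG frameBytes, LONG numFrames,
                          void *(*GetFrameBits)(LONG frameIndex))
    {
        AVIFileInit();

        PAVIFILE pfile = NULL;
        AVIFileOpen(&pfile, path, OF_WRITE | OF_CREATE, NULL);

        AVISTREAMINFO si;
        ZeroMemory(&si, sizeof(si));
        si.fccType = streamtypeVIDEO;
        si.dwScale = 1;
        si.dwRate  = 30;                       // frames per second (placeholder)
        si.dwSuggestedBufferSize = frameBytes;
        SetRect(&si.rcFrame, 0, 0, bih->biWidth, bih->biHeight);

        PAVISTREAM pstream = NULL;
        AVIFileCreateStream(pfile, &pstream, &si);
        AVIStreamSetFormat(pstream, 0, bih, bih->biSize);

        for (LONG i = 0; i < numFrames; i++)
            AVIStreamWrite(pstream, i, 1, GetFrameBits(i), frameBytes,
                           AVIIF_KEYFRAME, NULL, NULL);

        AVIStreamRelease(pstream);
        AVIFileRelease(pfile);
        AVIFileExit();
    }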

Can a Silicon Graphics 320 do dual-stream video capture?

No. The 320 has two video-in channels, but both channels have to be connected to the same physical jack. An example application that uses this feature could bring S-Video in at full size and full rate to disk while previewing the same stream at quarter size on the screen using the second video channel.

Can I capture compressed video?

The 320 and 540 do not have hardware compression support built into the base system. However, there is an SGI MMX-optimized software JPEG compression module, available via QuickTime and Video for Windows, that allows real-time capture up to a data rate of 2 MB/s. Also, you will soon be able to purchase and install an MJ1100 dual-stream motion JPEG option card that performs real-time JPEG compression and multi-stream decompression in hardware.

Can I play back reduced-size video and have it zoomed up to the video monitor's size?

No. You have to perform the zoom using the graphics pipe or the CPU before sending it to video out.

Do the 320 and 540 support NTSC and PAL resolutions?

Yes. The 320/540 video implementation is compliant with the SMPTE 259M and CCIR 601 standards. Video users commonly call CCIR 601 625-line "PAL" and CCIR 601 525-line "NTSC".

Do I need special hardware to capture/playback uncompressed video?

If you wish to capture full-rate, full-size uncompressed video, you will need a set of striped SCSI drives or a RAID array. Even if you plan on doing compressed or non-full-rate video, we would still recommend a fast SCSI drive. We recommend the ql1080 controllers from SGI, which we have been using to test our capture/playback capabilities.

Can I do Direct I/O on NT?

Yes. When opening the file, pass FILE_FLAG_NO_BUFFERING in the sixth argument (dwFlagsAndAttributes) of the CreateFile function.
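
For example (the path and the 1 MB transfer size below are placeholders only). Keep in mind that with FILE_FLAG_NO_BUFFERING, file offsets, transfer sizes, and buffer addresses must be multiples of the volume's sector size; VirtualAlloc returns page-aligned memory, which satisfies the address requirement on typical volumes.

    #include <windows.h>

    // Example only: open a capture file for unbuffered ("direct") I/O.
    HANDLE OpenCaptureFileDirect(void)
    {
        return CreateFile("D:\\capture\\clip.yuv",
                          GENERIC_READ | GENERIC_WRITE,
                          0,                      // no sharing
                          NULL,                   // default security attributes
                          CREATE_ALWAYS,
                          FILE_FLAG_NO_BUFFERING, // 6th argument: bypass the file cache
                          NULL);
    }

    // Allocate a page-aligned buffer to use for the sector-aligned transfers.
    void *AllocIoBuffer(DWORD transferSize)
    {
        return VirtualAlloc(NULL, transferSize, MEM_COMMIT, PAGE_READWRITE);
    }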

How do I stripe disks on NT?

Bring up Disk Administrator (windisk; Start->Administrative Tools->Disk Administrator). Ctrl-click the disks that you want to stripe together, then right-click one of the selected disks and choose "Create Stripe Set" from the menu. Right-click again and choose "Commit Changes". At this point, if this is the first stripe set on your system, you will be asked to reboot so that a new driver can be loaded. When your machine comes back up, bring up Disk Administrator again and format the drives. The stripe set appears as a single disk to the file system, and you are now set to use it.

How many disks do I need to do uncompressed real-time video capture?

Uncompressed video runs about 20 MB/s for 4:2:2 YUV and 40 MB/s for 4:4:4:4 RGBA. Each 10k RPM Cheetah SCSI drive can sustain about 17-18 MB/s using direct I/O. So to capture 4:2:2 YUV you need at least two Cheetah drives striped together, and to capture 4:4:4:4 RGBA you should have at least four disks striped together.
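
For reference, the arithmetic behind those figures, assuming a 720x486 active raster at roughly 30 frames per second (the exact raster and frame rate shift the totals slightly):

    // Back-of-the-envelope data rates for uncompressed CCIR 601 525-line video.
    const double pixelsPerFrame = 720.0 * 486.0;   // assumed active raster
    const double framesPerSec   = 29.97;

    const double yuv422MBps   = pixelsPerFrame * 2 * framesPerSec / 1e6;  // ~21 MB/s (2 bytes/pixel)
    const double rgba4444MBps = pixelsPerFrame * 4 * framesPerSec / 1e6;  // ~42 MB/s (4 bytes/pixel)

    const double perDriveMBps = 17.0;  // sustained rate of one 10k RPM drive with direct I/O
    // Divide the stream rate by the per-drive rate and round up (and leave
    // some headroom) to size the stripe set.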

Is there an option for Component Analog Video input/output?

The short answer is no. The long answer is that we were not able to satisfy our customers with our custom Component Analog Video I/O options in the past, so we decided that customers who do not want to compromise quality should use the SD1100 digital video option board with their own analog encoder/decoder.

How do I set Genlock sources?

Click on the SGI icon in the tray at the bottom right of your task bar and open the video control panel. Select the output video jack that you are interested in and click on the Timing tab. Choose the genlock source that you wish the video output to genlock to.

Can I capture ancillary data (closed captions, embedded audio) on a 320/540?

Yes. The 320/540 allows you to capture any data within the video raster except for the EAV code, pixels 720 and 721 of a video line in the CCIR 601 specification. However, the hardware does not parse the data automatically; the data has to be transferred to memory and then extracted by software. These features are currently only available via the Media SDK; please see http://www.sgi.com/developers/nt/sdk/dmsdk.html for more information.

Can I capture SDTI using the optional SD1100 on the 540?

Yes. SDTI transfers data in the 8 least significant bits of the 10-bit word for every pixel; the most significant 2 bits of the 10-bit word are parity bits. A user can choose to capture the 10-bit data in a left-justified 16-bit word and extract the 8 data bits from it in memory. However, neither Video for Windows nor QuickTime currently exposes 10-bit data via its API. Please see http://www.sgi.com/developers/nt/sdk/dmsdk.html for more information on the Digital Media SDK, which exposes these features.
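
As a sketch of the extraction step, assuming the left-justified 16-bit layout described above (the function name is ours):

    #include <cstdint>

    // One captured sample: a 10-bit word left-justified in 16 bits.
    // Bits 15..6 hold the 10-bit word; bits 5..0 are padding.
    uint8_t SdtiPayloadByte(uint16_t leftJustified)
    {
        uint16_t word10 = leftJustified >> 6;   // recover the 10-bit word
        return (uint8_t)(word10 & 0xFF);        // keep the 8 LSBs (data);
                                                // the top 2 bits are parity
    }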

What is the color conversion precision on a Silicon Graphics 320 and 540?

Our color conversion formulae are derived from the CCIR 601 specification. The multiplication uses 10-bit fixed-point precision, rounding to 8 bits.
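
To make the 10-bit fixed-point idea concrete, here is an illustrative luma calculation with the familiar CCIR 601 weights scaled by 1024 and rounded back to 8 bits. The exact coefficients, offsets, and ranges used by the 320/540 hardware are not spelled out here, so treat this purely as a sketch of the precision, not of the hardware.

    #include <cstdint>

    // CCIR 601 luma weights 0.299 / 0.587 / 0.114 scaled by 1024
    // (10-bit fixed point): 306 + 601 + 117 = 1024.
    uint8_t RgbToLuma(uint8_t r, uint8_t g, uint8_t b)
    {
        uint32_t y = 306u * r + 601u * g + 117u * b;   // 10-bit fixed-point multiply-accumulate
        return (uint8_t)((y + 512u) >> 10);            // add half an LSB and round to 8 bits
    }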

What is the algorithm used for up and down chroma sampling on a Silicon Graphics 320 and 540?

On chroma upsampling at video in, each odd pixel's chroma is averaged from its two nearest even pixels, except for the last odd pixel, which is replicated from its nearest even pixel. On chroma downsampling at video out, each even pixel's chroma consists of 1/4 of each of its nearest odd pixels and 1/2 of its own chroma value.
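
In code form, for a single line of chroma samples where the even pixels carry the chroma, this is roughly what the description above amounts to (our sketch; the integer truncation shown is illustrative, not a statement about the hardware's rounding):

    #include <cstdint>
    #include <vector>

    // Upsample at video in: each odd pixel's chroma is the average of its two
    // neighboring even pixels; the last odd pixel replicates its nearest even pixel.
    void ChromaUpsampleLine(std::vector<uint8_t> &c)
    {
        const size_t n = c.size();
        for (size_t i = 1; i < n; i += 2)
            c[i] = (i + 1 < n) ? (uint8_t)((c[i - 1] + c[i + 1]) / 2)
                               : c[i - 1];
    }

    // Downsample at video out: each even pixel keeps 1/2 of its own chroma and
    // takes 1/4 from each of its neighboring odd pixels.
    uint8_t ChromaDownsamplePixel(uint8_t prevOdd, uint8_t self, uint8_t nextOdd)
    {
        return (uint8_t)((prevOdd + 2 * self + nextOdd) / 4);
    }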

How can I get accurate timestamp information on the video streams, like the UST/MSC available on IRIX?

Currently, accurate timestamp information on the video streams is not available via Video for Windows or QuickTime. However, we do have a Media SDK available for developers interested in the UST/MSC information available on the system. Please see http://www.sgi.com/developers/nt/sdk/dmsdk.html for more information.

When I capture video images scaled to half size or less, why don't they look as good as the video captured at full size?

The short answer is that we use a different scaling algorithm than inexpensive video cards do. We believe our algorithm gives better image quality for video that is captured and displayed as fields, not as frames. Unfortunately, both Video for Windows and QuickTime capture video images as frames (with MJPEG the exception). We are working on a software patch that will allow you to choose between the two scaling algorithms.

The long answer is that to scale video to half size (or less), our hardware and software scales every F1 and F2 field image by half (or more) both horizontally and vertically, then combines these two fields to make a frame. Because the single frame is generated from two fields representing two different points in time, the resulting frame shows interleave artifacts where there is motion in the images. If the resulting frame is scaled back up to normal size, the interleave artifacts are further exaggerated. We believe that some other video cards scale to half size (or less) by first throwing out half of the incoming field images (either F1 or F2), then scaling the image by half (or more) horizontally, and then a lesser amount vertically, because each captured field is only half-height already. This method causes temporal information to be lost because half of the fields are tossed out, but avoids any interleave artifacts because each frame is based on information from only 1 field image at a single point in time. The resulting image quality of each algorithm will depend on your application.


Last Updated Tuesday, 01-Jun-99 02:48:44 GMT