chipKIT® Development Platform

Inspired by Arduino™

Composite Video -> USB frame grabber

Created Tue, 26 Jul 2011 23:53:17 +0000 by svofski


Tue, 26 Jul 2011 23:53:17 +0000

Work in progress: a composite video frame grabber. There are plenty of projects that generate video; this one grabs video generated by something else, for a change. Technically it's a bit like the "Video Experimenter" shield, except that it captures at higher resolution and doesn't need any shield for video experimenting.

It's specifically designed to capture video generated by a stone-age home computer that I have. The computer outputs a composite signal with 1-bit pixels at a 12 MHz pixel clock. The complete image consists of 512x256 1-bit pixels. Technically it also has a 256x256 4-colour mode, but the colour is encoded by the bit pattern, so the RGB image can be reconstructed from the monochrome capture.

Here's how it's built, in rough detail. Wavy lines indicate external connections between internal modules of PIC32. They are a nuisance, but I can't think of a way of doing this without them. Other than the wires and connectors for USB and composite video, there are no external components so far.

The system runs at 72 MHz, because that's a multiple of the 12 MHz pixel clock. Comparator 1 detects the HSYNC and VSYNC pulses in the composite video and winds up the timer; I use the core timer for this. The timer fires an interrupt at the specified time, which starts both SPI units, latching and storing the pixels word by word. Then the micro idles until the next Comparator 1 interrupt fires. Comparator 2 is a "pixel detector"; it's needed because composite video levels are too low to register at the digital inputs. It isn't connected to anything internally: technically it's used as if it were a discrete chip. When a frame is acquired, it gets dumped down the USB pipe, and when the upload finishes, the whole thing starts over.

Here's a little video that shows the proof of concept:

The problems to overcome:

  1. Input bandwidth. I guess I need a decent impedance-matched input circuit. As it is now, 12 MHz is unachievable; the input acts as a low-pass filter. In the 256x256 (6 MHz) mode the picture looks pretty nice.
  2. Dancing pixels. Not sure if this is due to the CPU pipeline, interrupt latency, or something else. As of now the pixels drift subtly to the left and right, which is especially easy to see in the 3x oversampling mode. I need them to be stone solid, not stoned.

P.S. I'm afraid none of this, except for the pretty board and its bootloader, really has anything to do with «Arduino». It's all plain C++.


Wed, 27 Jul 2011 06:17:18 +0000

Fairly low res, what do you need it for?

What are the frame rate and the horizontal and vertical rates? Can you just capture it as composite video?

Why not go direct to the data stream being sent to the board?

Always fun!

Alan KM6VV


Wed, 27 Jul 2011 07:45:53 +0000

Thanks for the comments!

It's a retro computer toy, so I don't need it, but I want it. If I'm not fed up with this too soon, it may become part of an installation later.

512x256 isn't really low-res; it's not far from the maximum of what composite video is capable of.

I can capture every field; this is standard 50 fields/sec PAL composite video. But I can't upload it at the same time. I'm not sure why, but frames take longer to upload than I expected, and I'm getting only about 6 fps so far.

I'm not sure if I understand about the data stream. Which data stream?


Wed, 27 Jul 2011 18:19:40 +0000

The "data stream" question assumed it was an old "PC" or similar uP system where you could get at the commands being sent to a video card over the bus. Basically an "ICE" (In-Circuit Emulator) method. Probably not practical unless you are intimately familiar with the uP and hardware...

Yes, I guess it is a fair amount of resolution. Maybe you need more buffering?

Can you just record the PAL (?) data on a VCR?

Alan KM6VV


Thu, 28 Jul 2011 11:14:30 +0000

I see. I'm fairly familiar with the architecture. In fact, I made a "clone" of it in an FPGA :)

You can't really tap into a stream of commands sent to a video card because there isn't any video card or commands. It's a frame buffer implemented in DRAM: a 512x256 bit field, 16 KB, plain and simple. All drawing is done by the CPU setting bits in memory.

There are existing projects that do similar stuff (scan doublers for VGA), but they rely on tapping into the system pixel clock. My goal here is to experiment and see whether quality capture is possible without tapping into the circuits, using only what's provided on the video connectors.

Yes, I could just record the "PAL" data on a VCR and plug it into the composite input of my monitor or a TV card. But then it wouldn't be my project :)

I'm not really that much concerned about frame rate as of now. I'm more worried about temporal stability.

A question for the MIPS gurus: I'm detecting the sync, and I need to perform a few things to reliably shift in the bits. Here's how I do it now:

  1. idle sleep
  2. comparator interrupt -> detect HSYNC pulse -> set timer to beginning of line
  3. idle sleep
  4. timer interrupt, SPI1 on, SPI2 on, shift in the bits
  5. idle sleep
  6. again

How repeatable is this from the point of view of CPU pipelines, interrupt latencies, etc.? I'm getting some drifting of the pixels to the left or right. Maybe there are some tricks to flush the pipeline, or something, to make the latency 100% identical every time? Any general advice?


Sun, 31 Jul 2011 20:40:34 +0000

Ok, it seems that latency isn't really the problem, as things are very stable and repeatable now. In fact, I guess most of my problems come from the electrical side of the circuit. Here's the latest video: everything is perfectly stable at the start, but when the picture gets whiter, the makeshift video amp starts playing tricks on me. But I'm pretty happy with this so far. The frame rate tends to be around 21 fps, which could still be a little faster, but it's ok.