Beating the Bits out of Camera Link

This article is as much the story of a quest as it is a technical article about verifying Camera Link cable performance.

December 2006: Our Vice President, Steve Mott, received a phone call from a distributor complaining that a small percentage of cables were causing noise in their systems. A return authorization was issued, and we waited to receive the cables in order to perform a root-cause analysis. The raw cable was identified as the source of the noise. The problem was that the noise was not consistent: only about 2 percent of the cables exhibited it. We needed an economical way to assure our customers that every cable we produced would work in their systems.

We first looked to the current Camera Link standard, which referred us to an eye diagram. We ruled this out immediately: there was no way we could perform a signal integrity test on every cable we built, and furthermore, who was to say an eye diagram would ensure the cables would work in the customer's system? An eye diagram tests only one tap at a time, but a full Camera Link system can use as many as eight taps: three on the base cable, and five more on the full, for a total of eight.

Fortunately, I find the Machine Vision industry is full of knowledgeable, friendly people who are happy to lend their experience and advice on technical issues. We looked to the frame grabber manufacturers. They were glad to lend a hand, as they seem to be first in line when customers have an issue with noise; perhaps this is because on an analog system, the frame grabber was the typical culprit in a noisy system. Overwhelmingly, the frame grabber manufacturers showed us test patterns and suggested a bit error rate test (BERT).

We needed to build a system to emulate a camera and a frame grabber. BitFlow lent a hand by offering us software to perform a bit comparison. We then purchased a Camera Link simulator from Vivid Engineering, which allowed us to change test patterns and frequencies easily with serial commands. Finally, we developed some logic and a GUI to build a tester we could use on the production floor.

We initially worked with three frequencies: 40, 66, and 85 MHz. We performed a bit error comparison on 1,000 images at each of the three frequencies. The margin for error was zero: if a cable dropped one bit, it failed. To provide some headroom for our customers, and to allow for the fact that not all cameras and frame grabbers are equal, we added 3 MHz to the test. Therefore, for a cable to be certified at 85 MHz, it must pass an 88 MHz test.
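The pass/fail logic can be sketched as follows. This is a minimal illustration, not our production tester: `grab_image` is a hypothetical stand-in for whatever frame grabber SDK call returns a captured frame, and the constants mirror the figures described above.

```python
# Sketch of the zero-tolerance bit error comparison: capture 1,000 images
# of a known pattern at (certified frequency + 3 MHz headroom) and fail
# the cable on the first mismatched bit.

NUM_IMAGES = 1000
HEADROOM_MHZ = 3  # certify at 85 MHz -> test at 88 MHz

def count_bit_errors(expected: bytes, received: bytes) -> int:
    """XOR corresponding bytes and count the set bits (mismatched bits)."""
    return sum(bin(e ^ r).count("1") for e, r in zip(expected, received))

def cable_passes(expected: bytes, grab_image, cert_freq_mhz: int) -> bool:
    """grab_image(freq) is a hypothetical capture call returning one frame."""
    test_freq = cert_freq_mhz + HEADROOM_MHZ
    for _ in range(NUM_IMAGES):
        if count_bit_errors(expected, grab_image(test_freq)) > 0:
            return False  # one dropped bit fails the cable
    return True
```

The key design point is the zero margin: the loop does not tally an error rate, it aborts on the first bad bit.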

Good but not great:
At this point we had a test we were fairly confident in. But, as fate would have it, one of our customers, who was operating a line scan camera for flat-screen inspection, found noise on some 8-meter assemblies. We issued an RMA for the cables and put them on the test bed ... bad news ... the cables passed; we could not see the noise. The only good news was that we had a test that could be adjusted; we simply needed to raise the bar. The customer got on a plane and flew thousands of miles to help us resolve the problem. They suggested that the image we were using in our test might not be harsh enough, so again we turned to our friends in the frame grabber industry for advice.

When we analyzed the test image we had been using, we found that it output a full 8 bits on tap A, but as you moved through the taps, far less data was transmitted, so that by the time you reached tap H there was almost no data on the tap.

The solution required a firmware update to our camera emulator that would allow us to apply a full 8 bits per tap. We also wanted to stagger the bits across the taps so that every tap carried its own share of bit activity.
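As an illustration of the idea only (the actual firmware pattern is not described here, so the counter ramp and the per-tap offset below are assumptions), a staggered pattern that exercises all 8 bits on every tap might look like this:

```python
# Build per-tap pixel streams where every tap toggles all 8 bits.
# An 8-bit counter ramp exercises every bit position; offsetting the
# ramp per tap staggers the transitions across taps A..H.

NUM_TAPS = 8       # taps A..H on a Full Camera Link configuration
TAP_OFFSET = 32    # assumed stagger amount, for illustration

def staggered_pattern(length: int) -> list:
    """Return NUM_TAPS pixel streams, each a full-8-bit ramp, staggered."""
    base = [i % 256 for i in range(length)]  # hits all 256 pixel values
    return [[(p + TAP_OFFSET * tap) % 256 for p in base]
            for tap in range(NUM_TAPS)]
```

Unlike the old image, where tap H carried almost no data, every tap here cycles through all 256 pixel values, so no tap gets an easy ride.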

The results:
Shocking! The new test was far harder to pass. In a direct comparison, a 12-meter cable that had passed the first test at 71 MHz now passed at only 57 MHz.

Back to the drawing board:
Our raw cable had to be re-engineered, and we had to find a way to provide 85 MHz performance beyond 10 meters; we found that with typical cable construction this is not possible, nor was it claimed by any manufacturer. We improved the cable design and got rid of the noise, but we were still learning.

Can all the cable manufacturers be lying?
The answer is no. We found that by performing a bit error test on only one tap, 10-meter cables could actually pass our test; but when a second tap was added ... forget 8 taps ... the bit errors started to appear. They aren't lying; they're only looking at 1/8 of the picture. The eye mask test is myopic!

The Farce:
One of our competitors, when challenged on test methods, said, "If you buy expensive cables, they'll pass, and if you buy cheap cables, they'll fail." I could barely contain myself after this comment, because I regularly sample cables from all over the industry, and cost has absolutely no bearing. To really understand the problems with cables and the Camera Link standard, you need to emulate the worst-case working scenario a cable could encounter. In my experience, simply certifying a percentage of the cables produced is not enough: 100 percent of the cables must be performance tested by a method that is relevant to our industry.

Is the test ever finished?
Never. We're always looking for ways to improve our test methods. We're currently working on a 10-tap test, and we are certain that as soon as we find a solution for 85 MHz cameras, someone will want to operate over 100 MHz. The test can only be improved. As of today, I am very proud of the fact that CEI is the only manufacturer that performance tests every Camera Link cable it builds.

Why is it so important to provide 100 percent performance testing:
There is no way to recover a lost bit in Camera Link. If a bit is lost, it is lost forever, unlike Ethernet, where the packet would simply be re-transmitted.
As a cable manufacturer who sells strictly through our distribution channel, I can never be certain where our products will end up. I have to assume that a cable will be used on a mission-critical device, which begs the question: if your wife or your child were about to have surgery performed by a robotic, vision-guided system, wouldn't you want to be sure the cable works?

Future solutions:
Transmit pre-emphasis and receive equalization: we can condition the bits on the transmit and receive ends of the cable with built-in ICs.
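As a rough model of transmit pre-emphasis (not any particular IC's implementation; the two-tap filter and its coefficient are assumptions for the sketch), the transmitter can be thought of as boosting transitions relative to the flat stretches of the waveform, pre-compensating for the cable's high-frequency loss:

```python
# Illustrative model of transmit pre-emphasis: a 2-tap FIR,
# y[n] = x[n] - coeff * x[n-1]. A transition (x changes sign) comes
# out larger than nominal; a repeated bit comes out slightly smaller,
# so the high-frequency content is emphasized before it hits the cable.
# The 0.25 coefficient is an arbitrary value chosen for illustration.

def pre_emphasize(samples, coeff=0.25):
    """Apply y[n] = x[n] - coeff * x[n-1] to a list of signal levels."""
    out = []
    prev = 0.0
    for x in samples:
        out.append(x - coeff * prev)
        prev = x
    return out
```

Feeding it bit levels of +1/-1: the second of two identical bits is attenuated to 0.75, while the bit after a transition is boosted to 1.25 in magnitude. Receive equalization does the complementary job at the far end, flattening the cable's frequency response.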

CEI now has a line of cables that utilize pre-emphasis and equalization to extend the working distance of Camera Link cable assemblies. As of today, base cable lengths can be extended up to 25 meters, and full up to 15 meters ... Remember when I said there are more errors as you add taps? This becomes even more dramatic when pre-emphasis and equalization are applied. It is feasible that these lengths will be extended even further this year; more importantly, a solution now exists for camera and frame grabber manufacturers who wish to go beyond the current 85 MHz threshold, and perhaps move toward the 112 MHz available to them today.