
Embedded Image Processing
Increasingly, modern image processing no longer runs only on bulky computers, but directly on slim, affordable embedded systems. Machines and devices can see! However, a few habits and reflexes that hold elsewhere need to be reconsidered in the field of embedded systems.
Microprocessors and small PC modules have become so powerful that digital image processing has now reached the embedded field. This is not self-evident: because of the enormous amount of data and the complexity of vision algorithms, the numeric processing of camera images in real time usually demands a lot of computing power. Fortunately, the power consumption and price of the new computer modules have not grown in proportion to their performance. Unlike desktop PCs, modern dual- or quad-core computers in credit-card format make do with a few watts and cost only 100 to 200 francs.
A similar development has taken place with cameras: cheap, compact CMOS sensors have largely displaced the elaborate CCD technology. Even so, the highest-resolution camera is not automatically chosen; preference goes to the one that images the relevant object adequately. This saves a great deal of unnecessary computing power.
Picture 1: O-3000 Embedded Camera.
Challenges for the Developer
More is not always better, nor is bigger more beautiful. Instead, the developer of an embedded image processing system is confronted with a set of elementary questions:
1. How is the image processing realized (standard software)?
2. Which computer should be used (standard form factor)?
3. How is the camera connected to the computer (standard interface)?
4. How do the images get into the application software (standard driver)?
5. And more…
To give the answer right away: standards are in short supply, at least in the embedded field. And – as we want to show here – that is quite fitting. Embedded systems are as diverse as their applications. What matters in the end are unit price, physical size, cabling and so on. Any standard would suit only a small subset of possible solutions and would ultimately see little use. Moreover, compared with a tailored solution, every standard incurs additional costs: training, extra hardware, a larger footprint and, in some cases, license-bound software. All of this runs counter to the original goals – provided, of course, that suitably skilled personnel are available to develop specific solutions.
Picture 2: Raspberry Pi 3 with O-3000 Camera.
Alternatives to Common Standards
Let’s look at the processors: usually, neither classic embedded microprocessors nor computer modules are x86-compatible; very often they are ARM-based. This typically rules out the resource-hungry Windows variants, and with them many heavyweight, expensive standard image processing packages. Instead, slim, application-tailored Linux systems have captured the field. These can easily be scaled for the application at hand, supported by free tools such as OpenEmbedded, BitBake and Buildroot. And instead of buying a standard image processing program, one of the large, mature open-source vision libraries for C/C++ is often used. A prominent example is OpenCV, which was developed with real-time applications in mind and, thanks to its OpenCL and CUDA support, exploits hardware-specific capabilities such as graphics acceleration and multiprocessing. The result is a powerful and highly versatile image processing system for little money.
LVDS and MIPI CSI are common standards for connecting cameras to computers, but for lack of standardized cables and connectors, many embedded PCs have no matching ports. Almost every computer, big or small, does have USB and Ethernet ports, and even small microprocessors usually provide these interfaces. They are therefore the primary candidates for attaching cameras to embedded systems. Of the two, USB is the cheaper and more compact option, and the transfer rate of USB 2.0 (480 Mbit/s) is higher than that of 100 Mbit/s Ethernet. (Where USB 3.0 or gigabit Ethernet is available on embedded platforms, the platform usually cannot exploit the full transfer rate anyway.)
Finally, there remains the question of the camera driver for the software integration. UVC is a popular USB device class in the consumer world, but its sheer complexity makes this standard unsuitable for embedded systems; it mostly generates overhead. Countless vendors sidestep the problem by shipping their own precompiled drivers with their camera products. This has its pitfalls: which hardware/software combinations are actually supported? Usually not the ones you want in your embedded system. Stettbacher Signal Processing AG offers an alternative with its O-3000 cameras (see picture 1): instead of bending to half-fitting standards, the camera interface is simply disclosed. The user has direct access to all camera functions and to the image data via a simple XML protocol. A driver available in source code, together with application examples, makes it much easier for the developer to get started and integrate the camera into the system – whether that system is a microcontroller, an FPGA or an embedded PC. There are no barriers regarding hardware, software, programming language and so on.
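The article does not reproduce the protocol itself; purely as an invented illustration of the idea – this is not the actual O-3000 schema – a configuration message in such an XML protocol might look like this:

```xml
<!-- hypothetical example only, not the real O-3000 message format -->
<camera>
  <set_exposure unit="us">10000</set_exposure>
  <set_resolution width="640" height="480"/>
</camera>
```

Because plain XML over USB or Ethernet can be parsed on any platform, no precompiled vendor driver is needed.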
Picture 3: Instructions for Raspberry Pi 3.
Application Examples
First, we show how, with little effort, a Raspberry Pi 3 can be made to capture images from an O-3000 camera and save them in TIFF format. The O-3000 camera is connected to the Raspberry Pi via USB; since the camera is also powered over USB, a single cable suffices. Picture 2 shows two versions of the camera, but only one of them is needed for this application.
Now a shell is opened on the Raspberry Pi and the command sequence from picture 3 is entered. Note that an Internet connection is required. The lines marked <1> bring the system up to date and install all software packages needed for the application. At <2>, the O-3000 software is downloaded from GitHub. The commands at <3> build and install the O-3000 driver, and at <4> the demo application is compiled. The command on line <5> runs the demo program: the Raspberry Pi now fetches the images from the camera and saves them as TIFF files. The process can be terminated with CTRL-C, and the recorded images can be viewed with any suitable viewer.
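The exact commands are given in picture 3. Purely as an illustration of the described sequence – the repository URL, package names and build commands below are assumptions, not copied from the original – such a session typically looks like this:

```shell
# <1> bring the system up to date and install build prerequisites
sudo apt-get update && sudo apt-get upgrade -y
sudo apt-get install -y build-essential git libusb-1.0-0-dev

# <2> fetch the O-3000 software from GitHub (URL is a placeholder)
git clone https://github.com/example/o3000-software.git

# <3> build and install the O-3000 driver
cd o3000-software/driver && make && sudo make install && cd ../..

# <4> compile the demo application
cd o3000-software/demo && make && cd ../..

# <5> run the demo; images are saved as TIFF, stop with CTRL-C
./o3000-software/demo/o3000_demo
```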
Picture 4: O-3000 Camera in a Machine Control System.
The second application example is shown in picture 4. It concerns the controller of a textile testing machine, to which an O-3000 camera (front left in the picture) is connected via USB. The electronics provide various inputs and outputs for the sensors and actuators in the machine (connectors on the right of the picture). The machine control algorithms run on an ARM Cortex-A9-based computing module that is plugged onto the main board (front of the picture), with a tailored embedded Linux as the software platform. The machine’s hard real-time functions are implemented in an FPGA. The computing module receives the images from the camera and processes them; the goal is to detect changes in the fabric while a textile is being tested. The task is tricky here because, firstly, the fabric may carry an arbitrary print and, secondly, it is stretched during the test.