Abstract
The objective of the project is to design a real-time computer application that controls a SONY FireWire digital video camera over an IEEE 1394 interface. The chosen application is automatic motion tracking of people in the VISL lab hallway. The goal of the tracking is to display an optimal image of a figure, in the sense of appropriate zoom and focus values.
Each pair of consecutive images from the camera's stream is compared to obtain a difference image of the hallway. The system then uses the difference image to recognize figures and to calculate the appropriate zoom value.
The whole tracking process is displayed to the user, who can choose between watching the tracking stream, in which the tracked figure is marked, and viewing the real-time difference image.
The system has been developed under NI's LabVIEW environment.
The Problem
Tracking a figure is practically a trivial action for the human eye, whereas for a computer it is a very complicated procedure. First, the computer must recognize that there is a figure to track and follow (a fairly complicated task in itself). Second, the computer must calculate the location of the figure. Finally, the computer must decide which zoom value to apply in order to achieve the optimal image of the tracked figure.
These three actions can be implemented using various methods of image processing.
The Solution
The tracking process in the system is performed in three steps:
1. Recognizing a figure: the recognition of a figure in the hallway is done using a reference image of the hallway. A reference image is created by comparing each two consecutive images from the camera stream, as shown in the picture below:
1st source image 2nd source image Reference image
The white pixels of the reference image indicate that there has been a change between the two source images. If there is a large number of white pixels (and under the assumption that the hallway itself didn't move), the system assumes that a figure has entered the hallway and moves to the second step.
2. Calculating the location of the figure: each reference image is processed by two filters: first, a median filter to remove shot noise, and second, a low-pass (LP) filter to merge all the white pixels of the figure (minus the noise) into one large object. The system will track that object.

Reference image before filters Reference image after filters
Once the object is formed, the system calculates its distance from three edges of the picture: the left, right and upper edges.

Calculating distance from right, left and upper edges
3. Calculating the new zoom value: the new zoom value is then chosen according to the following conditions:
If the object is too close to one of the edges -> reduce the zoom value.
If the object is too distant from all of the edges -> enlarge the zoom value.
If the object is neither too close nor too far -> keep the current zoom value.
The values "too close" and "too far", and how much to reduce or enlarge the zoom, were calibrated empirically, by performing a large number of experiments in the lab's hallway.
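The three steps above can be sketched in ordinary code. The project itself was built in LabVIEW, so the following Python/NumPy sketch is only an illustration of the pipeline, assuming 8-bit grayscale frames; the thresholds (DIFF_THRESHOLD, FIGURE_PIXELS, NEAR, FAR) and the filter sizes are hypothetical placeholders, not the empirically calibrated values from the project.

```python
# Illustrative sketch of the three tracking steps (not the LabVIEW code).
# All numeric thresholds below are assumed for demonstration only.
import numpy as np
from scipy import ndimage

DIFF_THRESHOLD = 25    # per-pixel change that counts as motion (assumed)
FIGURE_PIXELS = 500    # white-pixel count that signals a figure (assumed)
NEAR, FAR = 20, 150    # "too close" / "too far" edge distances in pixels (assumed)

def reference_image(frame_a, frame_b):
    """Step 1: binary difference (reference) image of two consecutive frames."""
    diff = np.abs(frame_a.astype(int) - frame_b.astype(int))
    return (diff > DIFF_THRESHOLD).astype(np.uint8)

def figure_present(ref):
    """Step 1 decision: enough white pixels means a figure entered the hallway."""
    return int(ref.sum()) > FIGURE_PIXELS

def locate_figure(ref):
    """Step 2: median filter removes shot noise, a low-pass filter merges the
    white pixels into one object; return the object's top/left/right bounds."""
    clean = ndimage.median_filter(ref, size=3)
    merged = ndimage.uniform_filter(clean.astype(float), size=15) > 0.1
    rows, cols = np.nonzero(merged)
    if rows.size == 0:
        return None
    return rows.min(), cols.min(), cols.max()   # top, left, right

def zoom_command(box, shape):
    """Step 3: compare distances to the upper, left and right image edges."""
    top, left, right = box
    h, w = shape
    dists = (top, left, w - 1 - right)          # upper, left, right distances
    if min(dists) < NEAR:
        return "reduce zoom"
    if min(dists) > FAR:
        return "enlarge zoom"
    return "keep zoom"
```

For example, feeding two frames that differ by a bright square near the image center produces a reference image whose object is far from all three edges, so the sketch returns "enlarge zoom".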
Project Tools
1. SONY DFW-VL500 digital camera: The Sony DFW-VL500 is a digital camera that adopts the IEEE 1394 (FireWire) interface. The camera incorporates a single-color 1/3-inch CCD with square pixels and outputs a 640x480x8-bit VGA-format image directly to workstations, PCs, or interface cards that support the 1394 digital serial bus. Additional features include built-in 12X Canon zoom optics with auto-iris control, variable shutter speed, gamma, white balance, gain, and an external trigger input for asynchronous operation. The frame rate may be varied from 3.75 to 30 frames per second.
2. LabVIEW by National Instruments: LabVIEW is a graphical programming language that uses icons instead of lines of text to create applications. In contrast to text-based programming languages, where instructions determine program execution, LabVIEW uses dataflow programming, where the flow of data determines execution. In LabVIEW, you build a user interface by using a set of tools and objects. The user interface is known as the front panel. You then add code using graphical representations of functions to control the front panel objects. The block diagram contains this code. In some ways, the block diagram resembles a flowchart. You can purchase several add-on software toolsets for developing specialized applications. All the toolsets integrate seamlessly in LabVIEW.
Refer to the National Instruments web site at ni.com for more information about these toolsets.
Example
The following set of pictures shows an example of how the system follows a figure in the hallway:
Acknowledgments
We would like to thank our supervisors Mrs. Ina Krinski and Mr. Johanan Erez for their support and guidance throughout this project. We are also grateful to the Ollendorf Minerva Center Fund for supporting this project.