WO2011047508A1 - Embedded vision tracker and mobile guiding method for tracking sequential double color beacons array with extremely wide-angle lens - Google Patents
Embedded vision tracker and mobile guiding method for tracking sequential double color beacons array with extremely wide-angle lens
- Publication number
- WO2011047508A1 (PCT/CN2009/074564)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- beacons
- tracking
- tracker
- extremely wide
- image
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S3/00—Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received
- G01S3/78—Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received using electromagnetic waves other than radio waves
- G01S3/782—Systems for determining direction or deviation from predetermined direction
- G01S3/785—Systems for determining direction or deviation from predetermined direction using adjustment of orientation of directivity characteristics of a detector or detector system to give a desired condition of signal derived from that detector or detector system
- G01S3/786—Systems for determining direction or deviation from predetermined direction using adjustment of orientation of directivity characteristics of a detector or detector system to give a desired condition of signal derived from that detector or detector system the desired condition being maintained automatically
- G01S3/7864—T.V. type tracking systems
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S5/00—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
- G01S5/16—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using electromagnetic waves other than radio waves
Landscapes
- Physics & Mathematics (AREA)
- Electromagnetism (AREA)
- Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Image Analysis (AREA)
Abstract
An embedded vision tracker and a mobile guiding method (MGM) are provided for tracking a sequential double color beacons array by accurately processing extremely wide-angle lens images. The embedded vision tracker includes a Digital Signal Processor (DSP 214), a Field-Programmable Gate Array (FPGA 210), a CMOS image sensor (213), a FLASH memory (215), a Synchronous Dynamic Random Access Memory (SDRAM 216) and an Ethernet interface (217). A vision tracking method based on a probability distribution approximated by a set of prediction samples of the targets is provided. The embedded vision tracker can be applied to vehicle guidance, on-board mobile robots, mobile monitors and other related areas.
Description
EMBEDDED VISION TRACKER AND MOBILE GUIDING METHOD OF TRACKING SEQUENTIAL DOUBLE COLOR BEACONS ARRAY WITH AN EXTREMELY WIDE-ANGLE LENS
FIELD OF THE INVENTION
The present invention relates to the field of vision trackers and, more particularly, to an embedded vision tracker with an extremely wide-angle vision system for a sequential beacons array.
BACKGROUND OF RELATED ART
An extremely wide-angle lens system is one in which an entire hemispherical field of view is seen simultaneously. Such a lens system can capture the whole field of view in a single image, so camera scanning is unnecessary. Extremely wide-angle lens vision is a new computer vision technology. With its spherical field of view of an environment, extremely wide-angle lens vision is an image sensing method for extensive fields and is suitable for particular applications such as stereoscopic viewing, global monitoring, security surveillance, multi-target detection, maneuverable object tracking and so on. It is easier for an extremely wide-angle lens to find and track targets, since the targets stay longer in its 180° field of view of the environment.
Efficient vision tracking with an extremely wide-angle lens in complex environments is a challenging task, especially because of the serious distortion of the lens. Moreover, in order to track a moving object in real time without delay or loss of image data, a processor capable of effective computation is required. For on-board applications, an embedded system with low power consumption and small volume is suitable.
SUMMARY OF THE INVENTION
In view of the above described problems, it is one objective of the present invention to provide an embedded vision tracker that processes the images of an extremely wide-angle lens system to track the sequential beacons array and meets the requirements of an on-board system.
It is another objective of the present invention to provide a mobile guiding method (MGM) of tracking a sequential beacons array, which accurately processes extremely wide-angle lens images for a sequential beacons array tracker in vehicle guidance, on-board mobile robots, mobile monitors and other related areas.
To achieve the above objectives, in accordance with one embodiment of the invention, provided is an embedded vision tracker using an extremely wide-angle lens. Said tracker is placed on the top of a vehicle and comprises: a Digital Signal Processor (DSP), a Field-Programmable Gate Array (FPGA), a CMOS image sensor, a FLASH memory, a Synchronous Dynamic Random Access Memory (SDRAM), and an Ethernet interface, wherein the CMOS image sensor is connected to the extremely wide-angle lens and adapted to output digital image data of a field of view; the DSP is adapted to process the main functions; the FPGA is connected to said CMOS image sensor and adapted to adjust the capturing time sequence of the CMOS image sensor and write the image data to the SDRAM; the FLASH memory is adapted to store software for directing the operation of the embedded targets tracker; the SDRAM is adapted to permit reading and writing of the data cache and image data to execute software programs; and the Ethernet interface is adapted to communicate information and data with other external devices.
In accordance with another embodiment of the invention, provided is a mobile guiding method (MGM) of tracking a sequential beacons array, comprising: vision system modeling and calibration, tracker initializing, double beacons recognizing, alternate beacons tracking, beacons rechecking and pairing, beacons localizing, beacons position rectifying and beacons coordinates outputting, wherein vision system modeling and calibration obtains the necessary parameters of the embedded vision tracker; tracker initializing sets the parameters of the tracker, which enter via the Ethernet interface; double beacons recognizing chooses the relevant group of beacons; alternate beacons tracking tracks the double beacons and generates the tracking-gates of the beacons; beacons rechecking checks the validity of the beacons in the tracking-gates; beacons localizing is used to switch the sequential beacons; beacons position rectifying transforms the coordinates of the beacons into space coordinates; and, finally, the space coordinates of the beacons are output via the Ethernet interface.
In the MGM, the processing of alternate beacons tracking comprises: initializing of alternate beacons tracking, beacons sampling, beacons weighting, beacons resampling, and tracking-gates generating, wherein initializing of alternate beacons tracking builds the color histogram of the beacons and generates the samples; beacons sampling generates the samples by randomly sampling around the beacon; beacons weighting computes the weight (confidence score) of each sample, which is adaptively determined by measuring the degree of similarity between the samples and the beacons; beacons resampling generates new samples to replace the degraded samples; and tracking-gates generating calculates the coordinates of the samples to predict the coordinates of the beacons.
The present invention is used to track the sequential double color beacons array with an extremely wide-angle lens. The embedded vision tracker is placed on the top of a vehicle (mobile monitor, robot, etc.) with the extremely wide-angle lens kept plumb; the navigation landmarks are a sequential double color beacons array on the ceiling. The embedded vision tracker can capture multiple groups of beacons at one time, choose the best available group to track, and finally translate the image coordinates of the beacons into space coordinates and output them.
An extremely wide-angle lens has a 180° view of the environment; with this kind of lens, it is easier to find and track targets, since they stay longer in the field of view. The structure of an extremely wide-angle lens is relatively compact and robust, while the structure of reflector lenses, which consist of two parts, is fragile. The extremely wide-angle lens is therefore suitable for application in the field of on-board systems and mobile monitors.
The scope of the invention is defined by the claims, which are incorporated into this section by reference. A more complete understanding of embodiments of the present invention will be afforded to those skilled in the art, as well as a realization of additional advantages thereof, by a consideration of the following detailed description of one or more embodiments. Reference will be made to the appended sheets of drawings that will first be described briefly.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a block diagram illustrating an embedded vision tracker for processing extremely wide-angle lens images in accordance with an embodiment of the present invention;
FIG. 2 is a flowchart illustrating the mobile guiding method of beacons tracking by processing extremely wide-angle lens images in accordance with another embodiment of the present invention;
FIG. 3 depicts an environment representation of a vehicle utilizing the present invention to track the sequential double color beacons array, and
FIG. 4 is a diagram illustrating an extremely wide-angle lens model.
Embodiments of the present invention and their advantages are best understood by referring to the detailed description that follows. It should be appreciated that like reference numerals are used to identify like elements illustrated in one or more of the figures. It is also noted that the figures are not necessarily drawn to scale.
DESCRIPTION OF THE PREFERRED EMBODIMENT
FIG. 1 shows a block diagram illustrating the embedded vision tracker 200 that processes extremely wide-angle images in accordance with an embodiment of the present invention. The embedded vision tracker 200 includes a Digital Signal Processor (DSP) 214, Field-Programmable Gate Array (FPGA) 210, FLASH memory 215, and Synchronous Dynamic Random Access Memory (SDRAM) 216, each element operably coupled to data bus 211 and address bus 212. A CMOS image sensor 213 is operably coupled to FPGA 210. Embedded vision tracker 200 also includes an Ethernet interface 217 operably coupled to data bus 211 and address bus 212.
Extremely wide-angle lens 201, connected to said embedded vision tracker 200, provides a 180° view of the environment. The extremely wide-angle lens produces a spherical optical field of view of the environment. In the application of the present invention, the extremely wide-angle lens 201 is pointed vertically upward to capture the image of the overhead sequential beacons array.
The CMOS image sensor 213, connected to extremely wide-angle lens 201, provides image data to FPGA 210 for storage in SDRAM 216. The CMOS image sensor 213 converts the optical view into a digital image.
The Field-Programmable Gate Array (FPGA) 210, connected to the CMOS image sensor 213, adjusts the capturing time sequence of the CMOS image sensor and writes the image data to the SDRAM 216.
Image data stored in SDRAM 216 is transmitted to DSP 214, which may perform multiple functions, including calibration, image processing, sequential beacons tracking, and device control.
Software for directing the operation of embedded vision tracker 200 may be stored in FLASH memory 215 for execution by DSP 214. Synchronous Dynamic Random Access Memory (SDRAM) 216 permits reading and writing of the data cache and image data to execute the software programs.
One of the input and output means of the embedded vision tracker 200 is Ethernet interface 217. The user may input data, such as target aimpoints and features, through interface 217. Ethernet interface 217 can also receive data processed by DSP 214 and may transmit beacon tracking information to a PC and/or other external devices.
The mobile guiding method (MGM) of tracking the sequential double color beacons array in accordance with an embodiment of the present invention will now be discussed in greater detail with reference to FIG. 2, which shows a flowchart illustrating the mobile guiding method of the embedded vision tracker 200 by processing extremely wide-angle lens images. The MGM includes six key technical points: vision system modeling and calibration in step 301, double beacons recognizing in step 303, alternate beacons tracking in step 305, beacons rechecking in step 313, beacons localizing in step 315, and beacons position rectifying in step 318.
The present invention is used to track the sequential double color beacons array with an extremely wide-angle lens. The embedded vision tracker is placed on the top of the vehicle (mobile monitor, robot, etc.) with the extremely wide-angle lens kept plumb; as shown in FIG. 3, the navigation landmarks are a sequential double color beacons array overhead. The embedded vision tracker can capture multiple groups of beacons at one time, choose the best available group to track, and finally translate the image coordinates of the beacons into space coordinates and output the tracking results.
The vision system model is presented and calibrated in step 301 to obtain the necessary parameters. The vision system model is built as shown in FIG. 4. There are two coordinate systems:
1. The lens coordinate system: This coordinate system is established from a virtual semi-spherical lens model, which is converted from the ordinary extremely wide-angle lens 201 composed of multi-layer optical lenses. From the optical axis and the semi-spherical lens, a theoretical refracting optical center O1 can be established and set as the origin of the lens coordinate system. This coordinate system is the space coordinate system of the extremely wide-angle lens, and can also represent the space coordinate system of the tracker.
2. The image coordinate system: This coordinate system is established on the plane of the CMOS image sensor 213 and is a two-dimensional (2-D) system. The relation between the CMOS sensor coordinate system and the image coordinate system is a one-to-one correspondence, so the image coordinates can be used instead of the CMOS sensor coordinates. The origin of the image coordinate system is at the top-left of the image and the unit is the pixel.
The vision system model satisfies the Equidistance Projection Model of the extremely wide-angle lens, as shown in FIG. 4. In FIG. 4, P(x, y, z) is a point in the lens coordinate system; the projection of P(x, y, z) is P_uv in the image coordinate system; O2(u0, v0), which is called the image center, is the intersection of the optical axis with the plane of the image coordinate system; and there is a relation given by equation 1:

r = Kω, (1)

where r is the distance from the projection point P_uv to the image center O2(u0, v0); ω is the angle of the incident ray; and K is the radial distortion factor of the extremely wide-angle lens.
The image center O2(u0, v0) and the radial distortion factor K need to be calibrated. The image center O2(u0, v0) can be calibrated by a laser calibration method. The radial distortion factor K can be calibrated by a model projection calibration method based on equation 1.
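As a concrete illustration of this model (a minimal sketch under stated assumptions, not code from the patent), the following function projects a space point P(x, y, z) in the lens coordinate system onto the image plane using the equidistance relation r = Kω of equation 1; the image center (u0, v0) and distortion factor K are assumed already calibrated, and the numeric values are purely illustrative:

```python
import math

def project_equidistance(x, y, z, u0, v0, K):
    """Project a point P(x, y, z) in the lens coordinate system onto the
    image plane using the equidistance model r = K * omega (equation 1)."""
    omega = math.atan2(math.hypot(x, y), z)  # angle of the incident ray
    r = K * omega                            # radial distance from the image center
    phi = math.atan2(y, x)                   # azimuth is preserved by the projection
    return u0 + r * math.cos(phi), v0 + r * math.sin(phi)

# Illustrative values: a beacon 1 m off-axis on a 3 m ceiling,
# K = 300 px/rad, image center (640, 480)
print(project_equidistance(1.0, 0.0, 3.0, 640.0, 480.0, 300.0))  # ~(736.5, 480.0)
```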
The primary task of tracker initializing in step 302 is to set the parameters of the tracker, which include the feature of the beacons, the number of tracking samples, and the size of the tracking-gates. These parameters enter via Ethernet interface 217. The preset feature of the targets can be RGB color, oriented gradients, intensity, template matching, shape or other features. Extremely wide-angle lens images suffer serious shape distortion, so the tracker in the present invention uses RGB color as the main feature of the targets.
Double color beacons recognizing means automatically recognizing the beacons in step 303, according to the preset feature of the targets entered in step 302. Automatic target recognition is achieved by color threshold segmentation. The RGB values are used to extract the target group, which is composed of two beacons with different colors.
After the color threshold segmentation in step 303, there will be several groups of beacons and some other noise. The tracker therefore performs beacons pairing and labeling in step 304 to choose the relevant group of beacons. A connected region algorithm is used for noise suppression. Typically, the relevant beacons form a connected set of pixels under connected area segmentation. The eight-connected region method is used to count the connected components; every pixel belonging to the same connected component is labeled with the same value. After this step, the beacons can be distinguished by their different values. The two beacons of the relevant group should simultaneously satisfy the following three conditions: different colors between the two beacons, the shortest distance between their centers in the image, and the biggest connected components.
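As an illustration of this pairing step, below is a minimal sketch (an assumption-laden reimplementation, not the patent's own code) that labels 8-connected components in two binary color masks and picks the pair of different-colored components trading off large area against short center-to-center distance; the combined score is an illustrative choice, since the three conditions are stated here without a rule for weighing them against each other:

```python
import numpy as np
from scipy import ndimage

def largest_components(mask, keep=3):
    """Label 8-connected components of a binary mask and return the
    (area, (cx, cy)) of the `keep` largest ones."""
    structure = np.ones((3, 3), dtype=int)        # 8-connectivity
    labels, n = ndimage.label(mask, structure=structure)
    comps = []
    for lab in range(1, n + 1):
        ys, xs = np.nonzero(labels == lab)
        comps.append((xs.size, (xs.mean(), ys.mean())))
    return sorted(comps, key=lambda c: c[0], reverse=True)[:keep]

def pair_beacons(mask_color_a, mask_color_b):
    """Pick one component per color, favoring big components whose
    centers are close together (illustrative combined criterion)."""
    best, best_score = None, -np.inf
    for area_a, ca in largest_components(mask_color_a):
        for area_b, cb in largest_components(mask_color_b):
            dist = np.hypot(ca[0] - cb[0], ca[1] - cb[1])
            score = (area_a + area_b) - dist
            if score > best_score:
                best, best_score = (ca, cb), score
    return best  # ((cx_a, cy_a), (cx_b, cy_b)), or None if a mask is empty
```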
The relevant group of beacons recognized from the extremely wide-angle lens image in step 304 is tracked in step 305. The tracking method is based on a probability distribution approximated by a set of prediction samples of the beacons. In order to run with high performance on an embedded system, the tracker uses an alternate implementation of the tracking technology: the core processing of tracking is divided into three steps, sampling, weighting and resampling, and each step is implemented alternately for the two beacons. The processing of alternate beacons tracking includes: initializing of alternate beacons tracking in step 305, beacons sampling in steps 306 and 307, beacons weighting in steps 308 and 309, beacons resampling in steps 310 and 311, and tracking-gates generating in step 312.
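The alternation can be pictured as interleaving each step between the two beacons rather than finishing one beacon before starting the other, which matches the paired step numbering in FIG. 2 (306/307, 308/309, 310/311). A schematic sketch, with the step functions as placeholders for the operations detailed below:

```python
def track_pair(beacon_a, beacon_b, sample, weight, resample_step):
    """Alternate each core tracking step between the two beacons (schematic)."""
    for step in (sample, weight, resample_step):  # steps 306-311 in FIG. 2
        step(beacon_a)                            # e.g. steps 306, 308, 310
        step(beacon_b)                            # e.g. steps 307, 309, 311
```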
The initializing of alternate beacons tracking in step 305 includes building the color histogram of the beacons and generating the samples. The color histogram of a beacon describes its color feature distribution. The color histogram is obtained by splitting the range of the color feature into equal-sized bins (called classes) and then counting, for each bin, the number of image points that fall into it. The color histogram provides a strong indication of the proper distributional model for the color feature of the beacons.
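A minimal sketch of the histogram construction just described, assuming an RGB image patch and equal-sized bins per channel (the bin count of 8 per channel is an illustrative choice, not a value from the patent):

```python
import numpy as np

def color_histogram(patch, bins_per_channel=8):
    """Normalized RGB color histogram of an image patch.
    patch: H x W x 3 uint8 array; returns a flat vector of length bins**3."""
    width = 256 // bins_per_channel
    idx = (patch // width).astype(int)   # per-channel bin index
    flat = (idx[..., 0] * bins_per_channel + idx[..., 1]) * bins_per_channel + idx[..., 2]
    hist = np.bincount(flat.ravel(), minlength=bins_per_channel ** 3).astype(float)
    return hist / hist.sum()             # normalize to a distribution
```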
The samples of a beacon are probability predictions of the beacon's location, generated by random sampling around the beacon. The number of samples for each target is 50 in the present invention. The structure of a sample includes its weight and its coordinates in the image. The weight (confidence score) of each sample is adaptively determined by measuring the degree of similarity between the sample and the beacon, using the following equation 2:
Weight_n = exp(−C(1 − Σ √(x_i · z_i))), with the sum taken over i = 1...m, (2)

where x_i is the value in bin i of the color histogram of the beacon, z_i is the value in bin i of the color histogram of the sample, m is the length of the color histogram, n = 1...Count indexes the samples (Count is 50 in the present invention), and C is a constant (in the present invention, C is set to 20).
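Under this reading of equation 2 (a Bhattacharyya-style similarity, which is a hedged reconstruction of the garbled original but is consistent with the definitions of x_i, z_i, m and C), the weight of one sample could be computed as follows:

```python
import numpy as np

def sample_weight(beacon_hist, sample_hist, C=20.0):
    """Weight of one sample per the reconstructed equation 2: the more the
    sample's histogram (z_i) resembles the beacon's (x_i), the closer to 1."""
    rho = np.sum(np.sqrt(beacon_hist * sample_hist))  # similarity in [0, 1]
    return np.exp(-C * (1.0 - rho))
```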
The beacons sampling in steps 306 and 307: beacons sampling means generating new samples by random sampling around the existing samples (the samples from the previous frame). Since the samples are randomly generated, they need to converge 1~2 times toward the location of the beacon to ensure that they better reflect the state of the beacon. The convergence is described by the following equation 3:

y = (Σ w_j x_j) / (Σ w_j), with the sums taken over j = 1...n, (3)

where y is the center location of the converged sample, x_j is the location of pixel j of the old sample window propagated from the previous frame, n is the number of pixels of the sample (the height of the sample-window multiplied by the width of the sample-window), and w_j is the weight of pixel x_j of the sample, computed by equation 4:

w_j = Σ √(h_i^t / h_i^s) · δ[b(x_j) − i], with the sum taken over i = 1...m, (4)

where δ(x) is the Delta function, b(x) is a function b: R² → {1...m} mapping the color value of pixel x_j to its bin index in the color histogram, m is the length of the color histogram, h_i^t is the value of the color histogram of the beacon in bin i, and h_i^s is the value of the color histogram of the sample in bin i.
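Read together, equations 3 and 4 amount to one mean-shift step: each pixel in the sample window gets a weight measuring how over- or under-represented its color bin is in the beacon model relative to the sample, and the sample center moves to the weighted mean of the pixel locations. A minimal sketch under that reading:

```python
import numpy as np

def converge_sample(pixel_xy, pixel_bins, beacon_hist, sample_hist):
    """One convergence iteration (equations 3 and 4).
    pixel_xy:   (n, 2) array of pixel locations in the sample window,
    pixel_bins: (n,) array of histogram bin indices b(x_j) for each pixel."""
    # Equation 4: w_j = sqrt(h_t[b(x_j)] / h_s[b(x_j)])
    w = np.sqrt(beacon_hist[pixel_bins] / np.maximum(sample_hist[pixel_bins], 1e-12))
    # Equation 3: weighted mean of the pixel locations
    return (w[:, None] * pixel_xy).sum(axis=0) / w.sum()
```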
The beacons weighting in steps 308 and 309: compute the weight of the new samples generated in steps 306 and 307 using equation 2, and then normalize the weights by equation 5:

w_i = w_i / (Σ w_j), with the sum taken over j = 1...Count, i = 1...Count. (5)
The beacons resampling in steps 310 and 311: after a few iterations, the samples may degenerate and no longer accurately reflect the state of the beacons, that is, the weights of these samples become too small. The degeneracy can be reduced by resampling. We set a threshold α = Avg_w × K_w, where Avg_w is the average of the weights and K_w is a threshold factor (K_w is proportional to the number of resampled samples; in the present invention, K_w = 0.2). When the weight of a sample falls below the threshold α, the sample has degenerated, and a new sample around the target is resampled to replace it. After resampling, the weights of the new samples are computed as in steps 308 and 309.
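A sketch of the degeneracy test and replacement just described; drawing the replacement samples uniformly in a small window around the current beacon estimate is an illustrative assumption (the text only says "around the target"):

```python
import numpy as np

def resample(samples, weights, beacon_xy, Kw=0.2, radius=10.0, rng=None):
    """Replace degenerate samples whose weight falls below alpha = mean(w) * Kw."""
    rng = rng or np.random.default_rng()
    alpha = weights.mean() * Kw
    degenerate = weights < alpha
    n = int(degenerate.sum())
    # Draw replacements near the current beacon estimate (illustrative choice)
    samples[degenerate] = beacon_xy + rng.uniform(-radius, radius, size=(n, 2))
    return samples, degenerate   # caller re-weights the replaced samples
```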
The tracking-gates generating in step 312: calculate the predicted location of the beacon from all the samples as x = Σ w_i x_i, with the sum taken over i = 1...Count, where w_i is the weight of sample i and x_i is its location; x is the predicted location of the beacon. Output x as the tracking-gate of the beacon.
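The prediction is simply the weight-averaged sample location; a one-function sketch:

```python
import numpy as np

def predict_beacon(samples, weights):
    """Tracking-gate center: x = sum_i w_i * x_i over normalized weights."""
    w = weights / weights.sum()
    return (w[:, None] * samples).sum(axis=0)   # predicted (x, y) of the beacon
```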
After generating the tracking-gates in step 312, the embedded vision tracker rechecks the validity of the beacons in the tracking-gates in step 313. The weight of each beacon in its tracking-gate is computed and compared with a threshold, which can be the threshold α of step 310 or a new threshold set by the user via Ethernet interface 217. If the weights fall below the threshold, the beacons in the tracking-gates are invalid and the method returns to step 303 to recognize a new group of beacons.
In the process of tracking the sequential beacons array, some beacons continually appear and others vanish in the image of the extremely wide-angle lens. Therefore, the embedded vision tracker uses beacon localizing to switch the beacons in step 315. A tracking area is set to determine the validity of the beacons. If either tracked beacon's position is outside the tracking area, the tracker returns to double color beacons recognizing in step 303 to switch to the next group of valid beacons.
Conversely, if the beacons are within the tracking area, the coordinates of the beacons in the image coordinate system are rectified in step 318 to transform them into coordinates in the lens coordinate system (space coordinate system). The beacons are deformed in the image because of the inherent distortion of the extremely wide-angle lens, and so are their coordinates. In order to obtain the space positions of the beacons, the image should be rectified to get undistorted information. Only the target information is useful to the tracker, so it is sufficient to rectify the coordinates of the tracked beacons through the image distortion rectification in unit 317. The rectification is computed by equation 6:
x = (H · tan(r/K) / r) · (u − u0),
y = (H · tan(r/K) / r) · (v − v0), (6)
where (u0, v0) is the image center calibrated in step 301, (u, v) is the coordinate of the beacon in the image coordinate system, (x, y) is the coordinate of the beacon in the lens coordinate system (space coordinate system), r is the distance from the projection point P_uv to the image center O2(u0, v0) as shown in FIG. 4, K is the radial distortion factor of the extremely wide-angle lens, also calibrated in step 301, and H is the height of the sequential beacons array.
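A sketch of the reconstructed equation 6, which inverts the equidistance model of equation 1 to recover the beacon's space coordinates on the beacon plane at height H (the numeric check reuses the illustrative values from the projection sketch given for equation 1):

```python
import math

def rectify(u, v, u0, v0, K, H):
    """Map a beacon's image coordinate (u, v) back to lens-frame (x, y)
    on the beacon plane at height H (reconstructed equation 6)."""
    du, dv = u - u0, v - v0
    r = math.hypot(du, dv)            # distance to the image center
    if r == 0.0:
        return 0.0, 0.0               # beacon exactly on the optical axis
    scale = H * math.tan(r / K) / r   # omega = r / K from equation 1
    return scale * du, scale * dv

# Round-trip check: the point projected to ~(736.5, 480) earlier maps back
print(rectify(736.5, 480.0, 640.0, 480.0, 300.0, 3.0))  # ~(1.0, 0.0)
```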
Finally, the coordinates of the target in the lens coordinate system are output in step 319, and the next frame of the extremely wide-angle lens image is input for continuous tracking.
Embodiments described above illustrate but do not limit the invention. It should also be understood that lots of modifications and variations are possible in accordance with the principles of the present invention. Accordingly, the scope of the invention is defined only by the following claims.
Claims
1. An embedded targets tracker using an extremely wide-angle lens, said tracker being placed on the top of a vehicle for tracking landmarks and comprising:
a CMOS image sensor connected to the extremely wide-angle lens and adapted to output digital image data of a field of view;
a Field-Programmable Gate Array (FPGA) connected to said CMOS image sensor and adapted to adjust the capturing time sequence of the CMOS image sensor and write the image data to the SDRAM;
a Digital Signal Processor (DSP) adapted to process main functions;
a FLASH memory adapted to store software for directing the operation of the embedded targets tracker;
a Synchronous Dynamic Random Access Memory (SDRAM) adapted to permit reading and writing of the data cache and image data to execute software programs; and
an Ethernet interface adapted to communicate information and data to other external devices.
2. The tracker of claim 1, wherein said landmarks are the sequential double color beacons array on the ceiling of the navigation path.
3. A mobile guiding method (MGM) for tracking a sequential double color beacons array, comprising: vision system modeling and calibration, for obtaining the necessary parameters of the embedded vision tracker;
tracker initializing, for setting the parameters of the tracker, which enter via Ethernet interface;
double beacons recognizing and pairing, for choosing the relevant group of the beacons;
alternate beacons tracking, for tracking the double beacons and generating the tracking-gates of the beacons;
beacons rechecking, for checking the validity of the beacons in the tracking-gates;
beacons localizing, for switching the sequential beacons;
beacons position rectifying, for transforming the beacon coordinates into the space coordinates;
beacons coordinates outputting, for outputting the space coordinates of the beacons via Ethernet interface.
4. The method of claim 3, wherein said vision system modeling and calibration builds the model of the extremely wide-angle lens, the model including the lens coordinate system and the image coordinate system.
5. The method of claim 4, wherein said lens coordinate system converts the ordinary extremely wide-angle lens, which is composed of multi-layer optical lenses, into one virtual lens model composed of a semi-spherical lens, and establishes a theoretical refracting optical center O1, which can be set as the origin of the lens coordinate system.
6. The method of claim 4, wherein said image coordinate system is established on the plane of the CMOS image sensor and is a two-dimensional (2-D) system, the image coordinates being used instead of the CMOS sensor coordinates.
7. The method of claim 3, wherein said double beacons recognizing uses connected domain segmentation to recognize and pair the effective beacons from the image captured by the CMOS sensor.
8. The method of claim 3, wherein said alternate beacons tracking includes:
initializing of alternate beacons tracking, for building the color histogram of the beacons and generating the samples;
sampling the beacons, for generating the samples by randomly sampling around the beacon;
weighting the samples, for computing the weight of each sample, which is adaptively determined by measuring the degree of similarity between the samples and the beacons;
resampling the beacons, for generating new samples to replace the degraded samples;
generating the tracking-gates of the beacons, for calculating the coordinates of the samples to predict the coordinates of the beacons.
9. The method of claim 8, wherein said alternate beacons tracking uses an alternate implementation of the tracking technology, which divides the main processing of tracking into three steps, sampling, weighting and resampling, and implements each step alternately for tracking the double beacons.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2009/074564 WO2011047508A1 (en) | 2009-10-22 | 2009-10-22 | Embedded vision tracker and mobile guiding method for tracking sequential double color beacons array with extremely wide-angle lens |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2009/074564 WO2011047508A1 (en) | 2009-10-22 | 2009-10-22 | Embedded vision tracker and mobile guiding method for tracking sequential double color beacons array with extremely wide-angle lens |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011047508A1 (en) | 2011-04-28 |
Family
ID=43899776
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2009/074564 WO2011047508A1 (en) | 2009-10-22 | 2009-10-22 | Embedded vision tracker and mobile guiding method for tracking sequential double color beacons array with extremely wide-angle lens |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2011047508A1 (en) |
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4670648A (en) * | 1985-03-06 | 1987-06-02 | University Of Cincinnati | Omnidirectional vision system for controllng mobile machines |
US5155684A (en) * | 1988-10-25 | 1992-10-13 | Tennant Company | Guiding an unmanned vehicle by reference to overhead features |
JPH11272328A (en) * | 1998-03-25 | 1999-10-08 | Nippon Signal Co Ltd:The | Color mark, moving robot and method for guiding moving robot |
CN101451849A (en) * | 2008-12-26 | 2009-06-10 | 天津理工大学 | Multifunction marking for vision navigation of mobile object and synthesis navigation method |
CN101452292A (en) * | 2008-12-29 | 2009-06-10 | 天津理工大学 | Fish glasses head omnidirectional vision aiming method based on sequence dual-color dot matrix type navigation mark |
CN101447075A (en) * | 2008-12-31 | 2009-06-03 | 天津理工大学 | Wide-angle lens-based FPGA & DSP embedded multi-valued targets threshold categorization tracking device |
CN101561270A (en) * | 2009-05-27 | 2009-10-21 | 天津理工大学 | Embedded omnidirectional ball vision object detection and mobile monitoring system and embedded omnidirectional ball vision object detection and mobile monitoring method |
Non-Patent Citations (1)
Title |
---|
LIU, SHIYU: "Dynamic Localization for AGV Based on Object Tracking Using Omni-directional Vision", CHINESE MASTER'S THESES FULL-TEXT DATABASE INFORMATION SCIENCE AND TECHNOLOGY, 15 March 2009 (2009-03-15), pages I140 - I212 * |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102902642A (en) * | 2012-09-24 | 2013-01-30 | 电子科技大学 | Field programmable gate array-digital signal processor (FPGA-DSP) high speed data exchange method based on data monitoring |
CN102880180A (en) * | 2012-10-13 | 2013-01-16 | 北京工业大学 | LabVIEW Robotics-based visual remote robot |
CN103198320A (en) * | 2013-04-24 | 2013-07-10 | 厦门大学 | Self-adaptive vision-aided driving device |
CN108594818A (en) * | 2018-04-27 | 2018-09-28 | 深圳市商汤科技有限公司 | Intelligent driving control method, intelligent vehicle-carried equipment and system |
CN110920886A (en) * | 2019-11-22 | 2020-03-27 | 浙江工业大学 | Many rotor unmanned aerial vehicle remove power supply unit based on vision |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9646212B2 (en) | Methods, devices and systems for detecting objects in a video | |
WO2016106954A1 (en) | Low-orbit satellite-borne spectrogram correlation detection method and load | |
KR100941418B1 (en) | A localization method of moving robot | |
CN107111598B (en) | Optical flow imaging system and method using ultrasound depth sensing | |
JP5872818B2 (en) | Positioning processing device, positioning processing method, and image processing device | |
US8103055B2 (en) | Detection of blobs in images | |
CN111426388A (en) | Personnel body temperature measuring method, system, computer storage medium and electronic equipment | |
CN111144207B (en) | Human body detection and tracking method based on multi-mode information perception | |
CN110889829A (en) | Monocular distance measurement method based on fisheye lens | |
Sogo et al. | Real-time target localization and tracking by n-ocular stereo | |
CN109341668A (en) | Polyphaser measurement method based on refraction projection model and beam ray tracing method | |
CN105526906A (en) | Wide-angle dynamic high-precision laser angle measurement method | |
WO2011047508A1 (en) | Embedded vision tracker and mobile guiding method for tracking sequential double color beacons array with extremely wide-angle lens | |
CN117113284B (en) | Multi-sensor fusion data processing method and device and multi-sensor fusion method | |
Qiu et al. | The image stitching algorithm based on aggregated star groups | |
CN115767424A (en) | Video positioning method based on RSS and CSI fusion | |
Shen et al. | YCANet: Target Detection for Complex Traffic Scenes Based on Camera-LiDAR Fusion | |
KR101300166B1 (en) | Apparatus and method for detecting iris | |
KR102106890B1 (en) | Mini Integrated-control device | |
Su | Vanishing points in road recognition: A review | |
KR102106889B1 (en) | Mini Integrated-control device | |
CN117523428B (en) | Ground target detection method and device based on aircraft platform | |
TWI819613B (en) | Dual sensing method of object and computing apparatus for object sensing | |
Zhang et al. | Research on binocular real-time ranging method in window area | |
Zhi et al. | Research on a Miss Distance Measurement Method Based on UAV |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 09850504; Country of ref document: EP; Kind code of ref document: A1 |
 | NENP | Non-entry into the national phase | Ref country code: DE |
 | 122 | Ep: pct application non-entry in european phase | Ref document number: 09850504; Country of ref document: EP; Kind code of ref document: A1 |