CN108961342B - Calibration method and system of optical flow sensor - Google Patents


Info

Publication number
CN108961342B
Authority
CN
China
Prior art keywords
marker
optical flow
image
robot
calibration
Prior art date
Legal status
Active
Application number
CN201810409130.9A
Other languages
Chinese (zh)
Other versions
CN108961342A (en)
Inventor
肖刚军
邓文拔
Current Assignee
Zhuhai Amicro Semiconductor Co Ltd
Original Assignee
Zhuhai Amicro Semiconductor Co Ltd
Priority date
Filing date
Publication date
Application filed by Zhuhai Amicro Semiconductor Co Ltd filed Critical Zhuhai Amicro Semiconductor Co Ltd
Priority to CN201810409130.9A
Publication of CN108961342A
Application granted
Publication of CN108961342B

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10052Images from lightfield camera


Abstract

The invention provides a calibration method and a calibration system for an optical flow sensor. A robot acquires images of a marker, together with the coordinates of the calibration symbols marked on it, through a camera module. An image analysis module completes the optical flow operation on the images between two adjacent calibration symbols, acquiring the optical flow from the image of the marker in the previous frame to the image of the marker in the current frame, to obtain displacement reference data of the robot. A calculation module calculates the actual displacement data of the robot from the relative positions of the marker, the optical flow sensor and the calibration symbols. A calibration module fuses the displacement reference data with the actual displacement data, eliminates the accumulated drift of the optical flow, and calibrates the sensing data of the optical flow sensor, so that the optical flow sensor is accurately positioned.

Description

Calibration method and system of optical flow sensor
Technical Field
The invention relates to the field of robot detection and control, in particular to a method and a system for calibrating an optical flow sensor.
Background
The camera on a robot acquires a series of continuously changing images as the robot moves through a scene; the goal of visual SLAM is to estimate the robot's motion from images of adjacent frames and to perform localization and map construction. Conventionally, visual SLAM uses the optical flow sensor to estimate the motion of its camera between adjacent images, and computes the position of the spatial point corresponding to each pixel from the camera pose at each moment. The common method for estimating this motion is the optical flow method, which predicts the motion of pixel points from the correlation of pixel intensity data in an image sequence; that is, it studies the change of image brightness over time to establish a motion field for a set of target pixel points. Optical flow (also called image flow) is the projection onto the two-dimensional image plane of the motion of an object moving in three-dimensional space; because it appears as a "flow" of image intensity, it is called optical flow. The optical flow field is studied in order to approximate the three-dimensional motion field, which cannot be obtained directly from the image sequence.
Gibson, Wallach and others first proposed, in the 1950s, the hypothesis that three-dimensional motion and structural parameters of a scene can be recovered from the two-dimensional optical flow field. In 1981, Horn and Schunck proposed the first practical and effective optical flow computation method, which became the foundation for the development of optical flow algorithms. Since then, research on optical flow has been a hotspot in computer vision and has produced a large body of work. The resulting methods can be classified into differential, matching, energy-based, phase-based and neurodynamic methods. The differential method offers the best overall performance, with relatively small computational cost and good accuracy, and is the most widely used in practice. It computes the velocity vector of each pixel from the temporal and spatial derivatives (gradients) of the gray values of a time-varying image (video). The main differential methods are: 1. the Horn-Schunck global smoothing method; 2. the Lucas-Kanade local smoothing method; 3. specific algorithms such as the Nagel oriented-smoothness method.
In the prior art, when machine vision is used to estimate the motion of a sweeping robot, a current frame and a reference frame are defined, the reference frame is taken as the coordinate system, and feature matching is performed between the two. Because only the reference-frame/current-frame mode is considered, pose estimation depends heavily on the reference frame: if the quality of the reference frame is too poor, for example under severe occlusion or illumination change, tracking is easily lost, the pose estimate becomes inaccurate, and accumulated drift inevitably appears. Every estimate carries some error, the error at one moment is passed on to the next, and after some time the estimated trajectory is no longer accurate. For example, suppose the robot first turns 90 degrees to the left and then 90 degrees to the right. If, because of error, the first turn is estimated as 89 degrees, the estimated position after the right turn is no longer the origin; even if every later estimate is exact, an error of -1 degree remains relative to the true value, and a right angle of 90 degrees in the built map becomes skewed.
Chinese patent CN103438904B discloses an inertial positioning method and system with visual auxiliary correction, which uses a separate camera module and a positioning mark: either the camera module is placed at the calibration end and the positioning mark at the inertial sensor end, or the positioning mark is placed at the calibration end and the camera module at the inertial sensor end. The camera module then photographs the positioning mark, and correction data are computed from the shooting result. That scheme requires several cooperating sensors for correction, and the placement of the positioning mark introduces a large measurement error, so the accuracy of the correction result is low.
Disclosure of Invention
In order to overcome the above technical problems and eliminate the accumulated-drift error that an optical flow sensor exhibits when sensing an object or positioning, the invention provides a calibration method and system in which the optical flow sensor acquires calibration symbols while the robot walks along the edge of one and the same marker at a preset distance. The technical scheme is as follows:
a method of calibrating an optical flow sensor, comprising the steps of:
step 1, a robot walks along the edge of one and the same marker while keeping a preset distance from that edge; the optical flow sensor collects images of the marker, and the coordinates of a calibration symbol are located in the image of the marker;
step 2, starting when one calibration symbol coordinate on the marker is located, acquiring the optical flow from the image of the marker in the previous frame to the image of the marker in the current frame and performing the optical flow operation; the operation ends when the adjacent calibration symbol coordinate is located in the marker image; this yields the relative speed of the robot with respect to the marker, and the displacement reference data of the robot are calculated by integration over time;
step 3, calculating the actual displacement data of the robot from the preset positional relation between the center of the marker and the calibration symbols, the preset distance kept between the robot and the marker, and the fixed distance between two adjacent calibration symbols;
and step 4, fusing the actual displacement data of the robot with the displacement reference data of the robot, and calibrating the sensing data of the optical flow sensor.
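The four steps above can be sketched in code. The sketch below is illustrative only: the function and parameter names (calibrate_scale, symbol_spacing, H, L) are not from the patent, the per-frame flow velocities are assumed to be already converted to length units, and the fusion of step 4 is reduced to a single multiplicative correction factor rather than the filtering methods described later.

```python
import numpy as np

def calibrate_scale(flow_uv, dts, symbol_spacing, H, L):
    """Steps 2-4 in miniature.
    flow_uv: per-frame optical flow (u, v) between two adjacent calibration
             symbol sightings, already in length units per second.
    dts:     matching time steps in seconds.
    symbol_spacing: fixed distance BC between the two calibration symbols.
    H, L:    distances of the robot path and of the calibration symbols
             from the horizontal line through the marker center.
    """
    # Step 2: integrate the relative speed over time -> displacement reference data
    displacement_ref = sum(np.hypot(u, v) * dt for (u, v), dt in zip(flow_uv, dts))
    # Step 3: actual displacement from the similar-triangle proportionality
    displacement_act = symbol_spacing * H / L
    # Step 4: fuse the two estimates; here reduced to a multiplicative
    # correction factor applied to future optical-flow readings
    return displacement_act / displacement_ref
```

For instance, ten frames of constant flow (0.3, 0.4) over 0.1 s each integrate to a reference displacement of 0.5; if the geometry says the true displacement was 1.0, the correction factor is 2.0.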
Further, in step 1, while the robot walks along the edge of the same marker, images of the marker are continuously acquired, and gray information I (x, y, t) of each frame of image is acquired, wherein the gray information I (x, y, t) represents a function of position and time.
Further, in step 1, the method for locating the coordinates of the calibration symbol from the image of the marker comprises: finding two characteristic straight lines passing through the calibration symbol in the image information of the marker acquired by the optical flow sensor; fitting each characteristic straight line with a linear equation to obtain its line equation; and solving the intersection point of the two characteristic straight lines, which is recorded as the coordinate of the calibration symbol.
Further, the optical flow operation in step 2 includes:
acquiring optical flow from the image of the marker in the previous frame to the image of the marker in the current frame, and calculating the time gradient of the gray level at the pixel point (x, y):
$I_t = \partial I / \partial t$
spatial gradient:
$I_x = \partial I / \partial x$, $I_y = \partial I / \partial y$
wherein I is the gray value of the pixel point, x and y are the positions of the pixel point in the image of the marker, and t is time;
obtaining an optical flow field equation of the pixel point according to the principle that the gray value of the same pixel point collected in the process that the robot walks along the edge of the marker keeps unchanged:
$I_x\,\mathrm{d}x + I_y\,\mathrm{d}y + I_t\,\mathrm{d}t = 0$

$I_x\dfrac{\mathrm{d}x}{\mathrm{d}t} + I_y\dfrac{\mathrm{d}y}{\mathrm{d}t} + I_t = 0$

$I_x u + I_y v + I_t = 0$

where $u = \mathrm{d}x/\mathrm{d}t$ denotes the optical flow in the X-axis direction and $v = \mathrm{d}y/\mathrm{d}t$ denotes the optical flow in the Y-axis direction;
the robot walks along the edge of the same marker, and under the assumption that the pixel points in the image region of the marker centered on pixel (x, y) have the same velocity components u and v, the optical flow equations of n pixel points are combined and solved for the optical flow of the marker relative to the robot:

$\begin{bmatrix} I_{x1} & I_{y1} \\ \vdots & \vdots \\ I_{xn} & I_{yn} \end{bmatrix}\begin{bmatrix} u \\ v \end{bmatrix} = -\begin{bmatrix} I_{t1} \\ \vdots \\ I_{tn} \end{bmatrix}$
and obtaining the relative speed of the robot relative to the marker according to the optical flow operation result, and then calculating the displacement reference data of the robot through time dimension integration.
Further, the relative speed of the robot with respect to the marker is obtained by converting the pixel units of the velocities u and v into length units and transforming them from the image coordinate system into the geodetic coordinate system through a rotation and translation transformation.
Further, in step 3, according to the principle that light propagates along straight lines, the actual displacement data of the robot are calculated by the proportionality of the side lengths of similar triangles, combining the preset positional relation between the center of the marker and the calibration symbols, the preset distance kept between the robot and the marker, and the fixed distance between two adjacent calibration symbols.
Further, in step 4, the sensing data of the optical flow sensor includes pixel gray values of the image of the marker and coordinates of the marker in an image coordinate system.
A calibration system for an optical flow sensor, the system using the calibration method, the system comprising: the system comprises a marker, a camera module, an image analysis module, a calculation module and a calibration module;
the marker carries calibration symbols marked in a preset positional relation with the marker center, where the marker center is the center point of the circumscribed geometric figure of the marker's horizontal projection plane;
the camera module is arranged on the robot and used for shooting the marker in real time to obtain an image of the marker and transmitting the image to the image analysis module for processing, wherein the marker is in a static state;
the image analysis module is used for identifying the calibration symbols in the image of the marker; then, starting when one calibration symbol coordinate on the marker is located, it acquires the optical flow from the image of the marker in the previous frame to the image of the marker in the current frame and performs the optical flow operation, ending when the adjacent calibration symbol coordinate is located in the marker image; it thus obtains the relative speed of the robot with respect to the marker and calculates the displacement reference data of the robot by integration over time;
the calculation module is used for calculating the actual displacement data of the robot by using a method that the side lengths of similar triangles are proportional by combining the preset position relation between the center of the marker and the calibration symbol, the preset distance kept between the robot and the marker and the fixed distance between two adjacent calibration symbols;
and the calibration module is used for carrying out data fusion on the actual displacement data of the robot in the calculation module and the displacement reference data of the robot in the image analysis module, and calibrating the sensing data of the optical flow sensor, wherein the sensing data of the optical flow sensor comprises pixel gray values of the image of the marker and coordinates of the marker in an image coordinate system.
Further, the two calibration symbols arranged at a fixed distance on the marker are both infrared LED lamps.
Further, the preset positional relation is that the connecting line, on the horizontal projection plane, between a calibration symbol and the center of the marker lies in the same straight-line direction as the light emitted by that calibration symbol.
The invention adopts an optical flow computation based on the optical flow sensor that samples only the light-intensity changes of the calibration symbols; the output therefore contains no static background, the redundant information and the output data volume are greatly reduced, and the computing time and resource requirements of the back-end processing algorithm are markedly reduced.
the invention combines an optical flow calculation method, calculates the reference displacement of the robot by utilizing the identifier shot by the optical flow sensor, and constructs a geometric similarity graph to obtain the actual displacement data of the robot, thereby correcting the sensing data of the optical flow sensor, effectively avoiding the influence of image pixel accumulation drift and improving the precision of the optical flow sensor.
Drawings
FIG. 1 is a schematic block diagram of a calibration system for an optical flow sensor provided by the present invention;
FIG. 2 is a flow chart of a method of calibrating an optical flow sensor for use in the present invention;
fig. 3 is a mathematical model for obtaining actual displacement data of the robot according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be described in detail below with reference to the accompanying drawings in the embodiments of the present invention. It should be understood that the following specific examples are illustrative only and are not intended to limit the invention.
When the robot moves along an edge, the content of the image feature blocks that the optical flow sensor perceives at the edge is basically unchanged and is easily treated as the same place by the program. Feature points that lie on neither corners nor edges jump frequently and their positions are unstable, which is unfavorable for calibrating the optical flow sensor. In the embodiment of the invention, the robot therefore walks along the edge of one and the same marker while keeping a preset distance from it, and the optical flow sensor acquires images of the marker.
The invention provides a calibration method of an optical flow sensor, which is suitable for calibration in a scene that a robot moves along the edge of an object. FIG. 2 is a flow chart of a calibration method of an optical flow sensor provided by the present invention.
Step 1: the robot walks along the edge of the same marker while keeping a preset distance from it; the optical flow sensor collects images of the marker, and the coordinates of a calibration symbol are located in the image of the marker. The marker is in a static state within the room. In the practice of the invention the robot is a sweeping robot, and infrared LED lamps are chosen as calibration symbols: their color contrast with the surrounding area is large, so they can be identified and marked in the images captured by the optical flow sensor, which benefits the subsequent optical flow computation.
Step 2: when the optical flow sensor locates one calibration symbol coordinate on the marker, optical flow computation on the images begins and continues until the adjacent calibration symbol coordinate is located in the marker image. During this period the optical flow from the image of the marker in the previous frame to the image in the current frame is acquired and the optical flow operation completed; the relative speed of the robot with respect to the marker in the geodetic coordinate system is obtained through coordinate transformation, and the displacement of the robot is calculated by integration over time and used as the displacement reference data.
Step 3, according to the preset position relation between the center of the marker and the calibration symbol, the preset distance kept between the robot and the marker and the fixed distance between two adjacent calibration symbols, constructing a geometric relation to calculate the actual displacement data of the robot;
Step 4: the displacement reference data obtained in step 2 and the actual displacement data calculated in step 3 are fused using methods such as Kalman filtering, extended Kalman filtering or least squares; the fusion yields calibration parameters, the sensing data of the optical flow sensor are updated, the accumulated drift of the optical flow is eliminated, and the sensing data of the optical flow sensor are corrected, so that the optical flow sensor achieves accurate positioning.
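As a minimal illustration of the fusion in step 4, the sketch below implements a one-dimensional Kalman-style update; the function name and the assumed variances are hypothetical, and a real implementation would track state and covariance over time.

```python
def fuse_displacement(ref, act, var_ref, var_act):
    """One-dimensional Kalman-style fusion: treat the optical-flow
    displacement reference `ref` as the prediction and the geometrically
    derived actual displacement `act` as the measurement, each with an
    assumed variance."""
    k = var_ref / (var_ref + var_act)   # gain: how much to trust the measurement
    fused = ref + k * (act - ref)       # corrected displacement estimate
    fused_var = (1.0 - k) * var_ref     # reduced uncertainty after fusion
    return fused, fused_var
```

With equal variances the fused estimate is simply the midpoint of the two displacements, and the uncertainty halves.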
According to one embodiment of the present invention, in step 1, during the robot walking along the edge of the same marker, M frames of images of the marker are continuously collected, and gray scale information I (x, y, t) of each frame of image is obtained, wherein the gray scale information I (x, y, t) represents a function of position and time, and the variable M is set to be equal to or greater than 10 in units of frames.
According to an embodiment of the invention, step 1 further specifies the method of locating the coordinates of the calibration symbol from the image of the marker: two characteristic straight lines passing through the calibration symbol are found in the image gray information collected by the optical flow sensor; the pixel points of the image gray information are distributed in the region near each characteristic straight line, and two fitted line equations are obtained, for example by the least squares method; the intersection point of the two characteristic straight lines is then solved and recorded as the coordinate of the calibration symbol in the image coordinate system, and its detection starts the subsequent optical flow computation. In this embodiment, the marker is made of a thin plate with a flat outer surface, such as a wood plate, plastic plate, metal plate or cardboard; ink lines, or hair lines, silk threads or adhesive tapes straightened and attached to the flat outer surface, can be resolved by the optical flow sensor, i.e. the characteristic straight lines are distinguished from the background by brightness information.
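A minimal sketch of this localization step, assuming each characteristic straight line can be written as y = a*x + b (a near-vertical line would need a different parameterization); all function and variable names are illustrative, not from the patent.

```python
import numpy as np

def fit_line(points):
    """Least-squares fit of y = a*x + b through the pixels lying near one
    characteristic straight line (assumes the line is not near-vertical)."""
    pts = np.asarray(points, dtype=float)
    A = np.column_stack([pts[:, 0], np.ones(len(pts))])
    (a, b), *_ = np.linalg.lstsq(A, pts[:, 1], rcond=None)
    return a, b

def symbol_coordinate(points_on_line1, points_on_line2):
    """Fit both characteristic straight lines and return their intersection,
    recorded as the calibration symbol coordinate in the image frame."""
    a1, b1 = fit_line(points_on_line1)
    a2, b2 = fit_line(points_on_line2)
    x = (b2 - b1) / (a1 - a2)   # solve a1*x + b1 = a2*x + b2
    return x, a1 * x + b1
```

For two lines through (1, 1), e.g. y = x and y = -x + 2, the recovered symbol coordinate is (1, 1).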
According to an embodiment of the present invention, the optical flow calculation in step 2 includes obtaining an optical flow from an image of the marker in a previous frame to an image of the marker in a current frame, and calculating a time gradient of a gray level at the pixel point (x, y):
$I_t = \partial I / \partial t$
spatial gradient:
$I_x = \partial I / \partial x$, $I_y = \partial I / \partial y$
wherein I is the gray value of the pixel point, x and y are the positions of the pixel point in the image of the marker, and t is time;
When the robot walks along the edge of the marker, the images collected between two adjacent calibration symbols satisfy the gray invariance principle: the pixel gray value of the same spatial point is fixed and unchanged in every image. For example, a pixel located at $(x, y)$ at time $t$ moves to $(x + \mathrm{d}x,\, y + \mathrm{d}y)$ at time $t + \mathrm{d}t$. Since the gray value does not change:

$I(x + \mathrm{d}x,\, y + \mathrm{d}y,\, t + \mathrm{d}t) = I(x, y, t)$
Carrying out a Taylor expansion of the left-hand side, retaining the first-order terms, and using the principle that the gray value of the same pixel point collected while the robot walks along the edge of the marker remains unchanged, the optical flow field equation of the pixel point is obtained:

$I_x\,\mathrm{d}x + I_y\,\mathrm{d}y + I_t\,\mathrm{d}t = 0$

$I_x\dfrac{\mathrm{d}x}{\mathrm{d}t} + I_y\dfrac{\mathrm{d}y}{\mathrm{d}t} + I_t = 0$

$I_x u + I_y v + I_t = 0$

where $u = \mathrm{d}x/\mathrm{d}t$ denotes the optical flow in the X-axis direction and $v = \mathrm{d}y/\mathrm{d}t$ denotes the optical flow in the Y-axis direction;
The robot walks along the edge of the same marker, satisfying the local smoothness assumption: within the acquired optical flow, neighboring pixels of the same image patch share the same motion. The pixel points in the image region of the marker centered on pixel (x, y) therefore have the same velocity components u and v, and the optical flow equations of n such pixel points are stacked and solved for the optical flow of the marker relative to the robot:

$\begin{bmatrix} I_{x1} & I_{y1} \\ \vdots & \vdots \\ I_{xn} & I_{yn} \end{bmatrix}\begin{bmatrix} u \\ v \end{bmatrix} = -\begin{bmatrix} I_{t1} \\ \vdots \\ I_{tn} \end{bmatrix}$
The relative speed of the robot with respect to the marker is obtained from the result of the optical flow operation, and the displacement reference data of the robot are then calculated by integration over time. According to one embodiment of the invention, the relative velocity of the robot with respect to the marker is the linear velocity $(v_x, v_y)$ in the geodetic coordinate system, obtained by converting the pixel units of the velocities u and v into length units and transforming from the image coordinate system into the geodetic coordinate system through a rotation and translation transformation:

$\begin{bmatrix} v_x \\ v_y \end{bmatrix} = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix} \begin{bmatrix} s\,u \\ s\,v \end{bmatrix}$

where $\theta$ represents the rotation angle of the robot and $s$ the metric length of one pixel.
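The least-squares solution of the stacked optical flow equations and the subsequent conversion to the geodetic coordinate system can be sketched as follows; the scale factor metres_per_pixel and the function names are assumptions for illustration, not terms from the patent.

```python
import numpy as np

def lucas_kanade_uv(I_x, I_y, I_t):
    """Stack the constraints I_x*u + I_y*v + I_t = 0 for every pixel of the
    window and solve for the shared (u, v) in the least-squares sense."""
    A = np.column_stack([I_x.ravel(), I_y.ravel()])
    b = -I_t.ravel()
    (u, v), *_ = np.linalg.lstsq(A, b, rcond=None)
    return u, v

def to_geodetic(u, v, metres_per_pixel, theta):
    """Convert the pixel-unit flow to length units and rotate by the robot's
    heading theta into the geodetic (world) frame."""
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    return R @ (metres_per_pixel * np.array([u, v]))
```

On synthetic gradients generated from a known flow the solver recovers that flow exactly, which is a convenient sanity check before feeding in real sensor images.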
According to an embodiment of the present invention, in step 3, according to a principle that light propagates along a straight line, by combining a preset position relationship between the center of the marker and the calibration symbol, the preset distance maintained between the robot and the marker, and a fixed distance between two adjacent calibration symbols, the actual displacement data of the robot is calculated by a method in which side lengths of similar triangles are proportional.
For example, as shown in fig. 3, the center of the marker is point O, points C and B are two adjacent calibration symbols, point A is the robot position when the optical flow sensor locates the calibration symbol coordinate at point C, and point A1 is the robot position when it locates the calibration symbol coordinate at point B; the robot moves from position A to position A1. The variable L denotes the distance between the calibration symbols and the horizontal line through the center of the marker, on the premise that the calibration symbols and the marker center keep the preset positional relation; it is preset according to the shape of the marker. The variable H denotes the distance between the center of the robot and the horizontal line through the center of the marker, on the premise that the robot walks along the edge at the preset distance from the marker; it is determined by the marker center and the light emitted by the calibration symbols. The line segment BC represents the fixed distance between the two adjacent calibration symbols, which can be obtained by prior measurement.
According to the principle that light travels along straight lines, the proportionality of the side lengths of similar triangles gives the actual displacement data of the robot, i.e. the length of segment AA1 in fig. 3:

$\dfrac{AA_1}{BC} = \dfrac{H}{L}, \qquad AA_1 = \dfrac{H}{L}\cdot BC$
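The similar-triangle relation can be checked numerically with a few lines; the function name is illustrative and the geometry follows the reading of fig. 3 given above.

```python
def actual_displacement(BC, H, L):
    """AA1 / BC = H / L, so AA1 = BC * H / L (all lengths in the same unit).
    BC: fixed distance between the two adjacent calibration symbols;
    H:  distance from the robot center to the line through the marker center;
    L:  distance from the calibration symbols to that same line."""
    return BC * H / L
```

For example, calibration symbols 0.2 m apart with H = 0.5 m and L = 0.25 m give an actual displacement of 0.4 m.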
It will be appreciated that pairs of adjacent calibration symbols may be placed at several positions along the edge of the marker, and the measurement and calculation repeated to obtain second displacement reference data and second actual displacement data of the robot: two adjacent calibration symbols at a stored fixed distance and the robot are arranged in a second relative position, giving a second preset distance and a second preset positional relation; the optical flow method then yields the second displacement reference data and the second actual displacement data; the second displacement reference data are fused with the first displacement reference data to obtain the reference data, and the second actual displacement data with the first actual displacement data to obtain the actual displacement data. For example, the robot is corrected again at the next position along the edge, using the same method as the first time, to obtain more accurate displacement reference data and actual displacement data.
According to an embodiment of the present invention, in step 4, the sensing data of the optical flow sensor include the pixel gray values of the image of the marker and the coordinates of the calibration symbols in the image coordinate system. The sensing data of the optical flow sensor affect the displacement reference data of the robot, and hence the robot's localization and map building; therefore the displacement reference data are fused with the actual displacement data of the robot, the accumulated drift is eliminated, and the sensing data of the optical flow sensor are calibrated, so that the robot is accurately positioned.
Another embodiment of the present invention relates to a calibration system for an optical flow sensor, and the calibration method according to the above embodiment is used. As shown in fig. 1, the calibration system includes: the device comprises a marker, a camera module, an image analysis module, a calculation module and a calibration module.
The marker is used for marking a calibration symbol with a preset position relation with the marker center to assist in calibrating the optical flow sensor, wherein the marker center is the central point of a circumscribed geometric figure of the marker horizontal projection plane.
The camera module is mounted on the robot and remains stationary relative to it; the lens faces the side of the marker on which the calibration symbols are marked. It photographs the marker in real time to obtain image information of the marker, extracts the gray values and coordinate information of the calibration symbols, and transmits them to the image analysis module for processing; the marker itself is in a static state.
The image analysis module finds the two characteristic straight lines passing through the calibration symbol in the image of the marker, fits their line equations, and solves their intersection point, thereby identifying the calibration symbol and locating its coordinates. Then, starting from one located calibration symbol coordinate on the marker, it acquires the optical flow from the image of the marker in the previous frame to the image in the current frame and performs the optical flow operation, ending when the adjacent calibration symbol coordinate is located in the marker image; using the gray invariance assumption and the local smoothness principle, and after coordinate conversion, it obtains the speed of the robot in the geodetic (indoor) coordinate system and calculates the displacement reference data of the robot in that system by integration over time.
The calculation module calculates the actual displacement data of the robot using the proportionality of the side lengths of similar triangles, combining the preset positional relation between the marker center and the calibration symbols, the preset distance maintained between the robot and the marker, and the fixed distance between two adjacent calibration symbols.
The calibration module performs data fusion on the actual displacement data from the calculation module and the displacement reference data from the image analysis module, and calibrates the sensing data of the optical flow sensor, wherein the sensing data comprises the pixel gray values of the image of the marker and the coordinates of the calibration symbols in the image coordinate system.
The two calibration symbols arranged at a fixed distance on the marker are both infrared LED lamps. Their strong light output makes them easier for the optical flow sensor to distinguish, which facilitates the subsequent optical flow operation and improves the accuracy of the displacement reference data of the robot.
The preset positional relation is that, on the horizontal projection plane, the line connecting a calibration symbol to the marker center lies along the same straight line as the light emitted by that calibration symbol. The calculation module can therefore, by the principle that light propagates in straight lines, construct similar triangles from this positional relation, the preset distance maintained between the robot and the marker, and the fixed distance between two adjacent calibration symbols, and so calculate the actual displacement data of the robot.
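The similar-triangle reasoning above can be sketched numerically. Because light travels in straight lines, the known real-world spacing of two adjacent calibration symbols and their measured pixel spacing fix a metres-per-pixel scale, and any pixel displacement at the same marker distance converts with the same ratio. The function and parameter names here are illustrative assumptions, not taken from the patent.

```python
def actual_displacement(symbol_spacing_m, spacing_px, displacement_px):
    """Recover a real displacement from a pixel displacement via the
    proportionality of side lengths of similar triangles.

    symbol_spacing_m : fixed real distance between two adjacent
                       calibration symbols, in metres
    spacing_px       : that same distance as measured in the image, in pixels
    displacement_px  : pixel displacement of the robot to be converted
    """
    scale = symbol_spacing_m / spacing_px   # metres per pixel at the marker
    return displacement_px * scale
```

For example, if symbols 0.2 m apart project 100 px apart, a 250 px displacement corresponds to 0.5 m.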
Since the above embodiment describing the calibration method corresponds to this embodiment, the two can be implemented in cooperation. The technical details mentioned in the method embodiment remain valid here, and the technical effects achievable there can likewise be achieved in this embodiment; to reduce repetition they are not restated. Correspondingly, the technical details mentioned in this embodiment also apply to the embodiment describing the calibration method.
The above embodiments are intended only to illustrate the technical solution of the present invention, not to limit it. Although the invention has been described in detail with reference to the foregoing embodiments, those skilled in the art will understand that the technical solutions described therein may still be modified, or some or all of their technical features equivalently replaced, without such modifications or substitutions causing the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (9)

1. A method for calibrating an optical flow sensor, comprising the steps of:
step 1, a robot walks along the edge of a marker while keeping a preset distance from that edge, an optical flow sensor collects an image of the marker, and the coordinates of the calibration symbols are located in the image of the marker;
step 2, starting from the located coordinates of one of the calibration symbols, acquiring the optical flow from the image of the marker in the previous frame to the image of the marker in the current frame and performing the optical flow operation; ending the optical flow operation when the coordinates of the calibration symbol adjacent to the starting one are located in the marker image; obtaining the relative velocity of the robot with respect to the marker, and calculating displacement reference data of the robot by integration over the time dimension;
step 3, calculating actual displacement data of the robot according to the preset positional relation between the center of the marker and the calibration symbols, the preset distance maintained between the robot and the marker, and the fixed distance between two adjacent calibration symbols; the preset positional relation between the center of the marker and a calibration symbol is that, on the horizontal projection plane, the line connecting the calibration symbol to the marker center lies in the same straight-line direction as the light emitted by the calibration symbol;
step 4, performing data fusion on the actual displacement data of the robot and the displacement reference data of the robot, and calibrating the sensing data of the optical flow sensor.
2. The calibration method according to claim 1, wherein in step 1 the robot continuously acquires images of the same marker while walking along its edge and obtains the gray-scale information I(x, y, t) of each image, where I(x, y, t) is a function of pixel position and time.
3. The calibration method according to claim 2, wherein step 1 further comprises locating the coordinates of the calibration symbol from the image of the marker: finding two characteristic straight lines passing through the calibration symbol in the image information of the marker acquired by the optical flow sensor; fitting each characteristic straight line with a linear equation to obtain its line equation; and solving for the intersection point of the two characteristic straight lines, which is recorded as the coordinates of the calibration symbol.
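The line-fitting-and-intersection procedure of claim 3 can be sketched as follows (an illustrative sketch, not part of the claims; it assumes each characteristic line is given as edge-point samples and that neither line is vertical):

```python
import numpy as np

def locate_symbol(line1_pts, line2_pts):
    """Fit y = a*x + b to each set of edge points by least squares and
    return the intersection of the two fitted lines as the calibration
    symbol's image coordinates."""
    a1, b1 = np.polyfit([p[0] for p in line1_pts],
                        [p[1] for p in line1_pts], 1)
    a2, b2 = np.polyfit([p[0] for p in line2_pts],
                        [p[1] for p in line2_pts], 1)
    x = (b2 - b1) / (a1 - a2)        # solve a1*x + b1 = a2*x + b2
    return x, a1 * x + b1
```

For points sampled from y = x and y = -x + 2, the fitted lines intersect at (1, 1).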
4. The calibration method according to claim 2, wherein said optical flow operation in said step 2 comprises:
acquiring the optical flow from the image of the marker in the previous frame to the image of the marker in the current frame, and calculating the time gradient It and the spatial gradients Ix, Iy of the gray level at the pixel point (x, y):

Ix = ∂I/∂x, Iy = ∂I/∂y, It = ∂I/∂t,

wherein I is the gray value of the pixel point, x and y are the positions of the pixel point in the image of the marker, and t is time;

obtaining the optical flow field equation of the pixel point according to the principle that the gray value of the same pixel point remains unchanged while the robot walks along the edge of the marker:

Ixu + Iyv = -It,

wherein u = dx/dt is the optical flow in the X-axis direction and v = dy/dt is the optical flow in the Y-axis direction;

as the robot walks along the edge of the same marker, according to the local smoothness assumption that the pixel points in an image region of the marker centered on the pixel point (x, y) share the same velocities u and v, combining the optical flow equations of m pixel points to solve for the optical flow of the marker relative to the robot:

[Ix1 Iy1; Ix2 Iy2; …; Ixm Iym] [u; v] = -[It1; It2; …; Itm],

where m is the number of pixel points,
and obtaining the relative speed of the robot relative to the marker according to the optical flow operation result, and then calculating the displacement reference data of the robot through time dimension integration.
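The stacked system of m optical-flow equations in claim 4 is an overdetermined linear system, conventionally solved by least squares (as in the Lucas-Kanade method). A minimal sketch, assuming the gradients are supplied as arrays (the function name is illustrative):

```python
import numpy as np

def neighborhood_flow(Ix, Iy, It):
    """Solve the stacked equations Ix*u + Iy*v = -It for the shared
    velocity (u, v) of an m-pixel neighbourhood by least squares,
    following the local-smoothness assumption."""
    A = np.column_stack([Ix, Iy])          # m x 2 matrix of spatial gradients
    b = -np.asarray(It, dtype=float)       # m-vector of negated time gradients
    (u, v), *_ = np.linalg.lstsq(A, b, rcond=None)
    return u, v
```

For gradients consistent with a true flow of u = 2, v = -1, the least-squares solution recovers exactly that flow.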
5. The calibration method according to claim 4, wherein the relative velocity of the robot with respect to the marker is obtained by converting the pixel units of the velocities u, v into length units and transforming them from the image coordinate system into the geodetic coordinate system through a rotation and translation transformation.
6. The calibration method according to claim 1, wherein in step 3, according to the principle that light propagates along a straight line, the actual displacement data of the robot is calculated using the proportionality of the side lengths of similar triangles, combining the preset positional relation between the center of the marker and the calibration symbols, the preset distance maintained between the robot and the marker, and the fixed distance between two adjacent calibration symbols.
7. The calibration method according to any one of claims 1, 3 and 4, wherein in step 4 the sensed data of the optical flow sensor comprises the pixel gray values of the image of the marker and the coordinates of the calibration symbols in the image coordinate system.
8. A system for calibrating an optical flow sensor, the system using the calibration method of claim 1 and comprising: a marker, a camera module, an image analysis module, a calculation module and a calibration module;
the marker is marked with a calibration symbol having a preset positional relation with the marker center, wherein the marker center is the center point of the circumscribed geometric figure of the marker's horizontal projection plane;
the camera module is arranged on the robot and is used for photographing the marker, which is in a stationary state, in real time to obtain an image of the marker and transmit it to the image analysis module for processing;
the image analysis module is used for identifying the calibration symbols from the image of the marker, locating the coordinates of one calibration symbol, acquiring the optical flow from the image of the marker in the previous frame to the image of the marker in the current frame and performing the optical flow operation, ending the optical flow operation when the coordinates of the calibration symbol adjacent to the starting one are located in the image of the marker, obtaining the relative velocity of the robot with respect to the marker, and calculating the displacement reference data of the robot by integration over the time dimension;
the calculation module is used for calculating the actual displacement data of the robot using the proportionality of the side lengths of similar triangles, combining the preset positional relation between the center of the marker and the calibration symbols, the preset distance maintained between the robot and the marker, and the fixed distance between two adjacent calibration symbols;
and the calibration module is used for performing data fusion on the actual displacement data of the robot from the calculation module and the displacement reference data of the robot from the image analysis module, and calibrating the sensing data of the optical flow sensor, wherein the sensing data of the optical flow sensor comprises the pixel gray values of the image of the marker and the coordinates of the calibration symbols in the image coordinate system.
9. The calibration system according to claim 8, wherein the two calibration symbols arranged at a fixed distance are provided as infrared LED lamps.
CN201810409130.9A 2018-05-02 2018-05-02 Calibration method and system of optical flow sensor Active CN108961342B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810409130.9A CN108961342B (en) 2018-05-02 2018-05-02 Calibration method and system of optical flow sensor

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810409130.9A CN108961342B (en) 2018-05-02 2018-05-02 Calibration method and system of optical flow sensor

Publications (2)

Publication Number Publication Date
CN108961342A CN108961342A (en) 2018-12-07
CN108961342B true CN108961342B (en) 2020-12-15

Family

ID=64498851

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810409130.9A Active CN108961342B (en) 2018-05-02 2018-05-02 Calibration method and system of optical flow sensor

Country Status (1)

Country Link
CN (1) CN108961342B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112438113B (en) * 2019-08-27 2022-07-05 南京泉峰科技有限公司 Self-walking mower
CN113034552B (en) * 2019-12-25 2024-02-20 Tcl科技集团股份有限公司 Optical flow correction method and computer equipment
CN111044080B (en) * 2019-12-30 2023-10-27 珠海一微半导体股份有限公司 Calibration parameter acquisition device and method based on optical flow sensor
CN111486819B (en) * 2020-04-10 2022-03-15 桂林电子科技大学 Method for measuring three-dimensional angular motion by adopting optical flow
CN112783170B (en) * 2020-12-30 2022-11-29 速感科技(北京)有限公司 Method and device for determining motion parameters of autonomous mobile equipment
CN113238555A (en) * 2021-05-12 2021-08-10 珠海市一微半导体有限公司 Mobile robot having optical flow sensor and control method thereof
CN114347064B (en) * 2022-01-31 2022-09-20 深圳市云鼠科技开发有限公司 Robot collision detection method and device based on optical flow, computer equipment and storage medium

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101312524A (en) * 2007-05-23 2008-11-26 财团法人工业技术研究院 Moving object detecting apparatus and method using light track analysis
CN103838437A (en) * 2014-03-14 2014-06-04 重庆大学 Touch positioning control method based on projection image
CN105606092A (en) * 2016-02-04 2016-05-25 中国科学院电子学研究所 Method and system for locating indoor robot
CN105698726A (en) * 2016-02-02 2016-06-22 深圳中科光子科技有限公司 Method and device for measuring and calibrating motion platform positioning precision
CN106845552A (en) * 2017-01-31 2017-06-13 东南大学 The low dynamic carrier speed calculation method of fusion light stream and SIFT feature Point matching under the uneven environment of light distribution
WO2017121926A1 (en) * 2016-01-15 2017-07-20 Nokia Technologies Oy Method and apparatus for calibration of a multi-camera system
CN106989744A (en) * 2017-02-24 2017-07-28 中山大学 A kind of rotor wing unmanned aerial vehicle autonomic positioning method for merging onboard multi-sensor
CN107924573A (en) * 2015-08-31 2018-04-17 三菱电机株式会社 Image processing apparatus, image processing method and program

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9363427B2 (en) * 2013-08-28 2016-06-07 Disney Enterprises, Inc. Device and method for calibrating a temporal contrast sensor with a frame-based camera sensor


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
An Open Source and Open Hardware Embedded Metric Optical Flow CMOS Camera for Indoor and Outdoor Applications;Dominik Honegger et al;《2013 IEEE International Conference on Robotics and Automation (ICRA)》;20130510;pp. 1736-1741 *
Research on pose estimation and hovering correction of a quadrotor aircraft based on optical flow sensors;Li Yabin;《China Master's Theses Full-text Database, Engineering Science and Technology II》;20180215;C031-154 *

Also Published As

Publication number Publication date
CN108961342A (en) 2018-12-07

Similar Documents

Publication Publication Date Title
CN108961342B (en) Calibration method and system of optical flow sensor
CN112197770B (en) Robot positioning method and positioning device thereof
KR100776215B1 (en) Apparatus and method for estimating location and generating map of mobile body, using upper image, computer-readable recording media storing computer program controlling the apparatus
CN110044354A (en) A kind of binocular vision indoor positioning and build drawing method and device
CN110361027A (en) Robot path planning method based on single line laser radar Yu binocular camera data fusion
CN107909614B (en) Positioning method of inspection robot in GPS failure environment
WO2012043045A1 (en) Image processing device and image capturing device using same
Chien et al. Visual odometry driven online calibration for monocular lidar-camera systems
CN112815939B (en) Pose estimation method of mobile robot and computer readable storage medium
CN108090921A (en) Monocular vision and the adaptive indoor orientation method of IMU fusions
CN112254729A (en) Mobile robot positioning method based on multi-sensor fusion
CN111164648B (en) Position estimating device and position estimating method for mobile body
CN208323361U (en) A kind of positioning device and robot based on deep vision
CN115731268A (en) Unmanned aerial vehicle multi-target tracking method based on visual/millimeter wave radar information fusion
CN112819711B (en) Monocular vision-based vehicle reverse positioning method utilizing road lane line
CN111998862A (en) Dense binocular SLAM method based on BNN
CN115205339A (en) Mobile robot target following method based on improved YOLOX
Li et al. A binocular MSCKF-based visual inertial odometry system using LK optical flow
CN112580683A (en) Multi-sensor data time alignment system and method based on cross correlation
CN117367427A (en) Multi-mode slam method applicable to vision-assisted laser fusion IMU in indoor environment
JP6410231B2 (en) Alignment apparatus, alignment method, and computer program for alignment
CN112432653B (en) Monocular vision inertial odometer method based on dotted line characteristics
CN112731503A (en) Pose estimation method and system based on front-end tight coupling
CN116804553A (en) Odometer system and method based on event camera/IMU/natural road sign
CN113379911A (en) SLAM method, SLAM system and intelligent robot

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant