CN109506652A - Optical flow data fusion method based on carpet drift, and cleaning robot - Google Patents
- Publication number
- CN109506652A (publication number); application CN201811238969.7A
- Authority
- CN
- China
- Prior art keywords
- optical flow
- robot
- coordinate
- optical flow sensor
- carpet
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/165—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
Abstract
The present invention discloses an optical flow data fusion method based on carpet drift. When the sensed data of the optical flow sensor are reliable, the image displacement the optical flow sensor acquires in each preset time is converted into a displacement of the same dimension as the wheel encoder (code disc), and the sensor's data are then accumulated (integrated) over the time dimension to obtain the optical flow offset position coordinate of the sensor relative to its initial position; that coordinate is then translated, according to the rigid connection between the optical flow sensor and the robot center, into the machine center coordinate, which corresponds to the actual distance the robot's driving wheels have travelled on the carpet. When the sensed data of the optical flow sensor are unreliable, the pulse data the encoder senses in each preset time are integrated over the time dimension and the result is used to update the machine center coordinate, again yielding the actual distance the driving wheels have travelled on the carpet. This improves the accuracy of the carpet coordinate data sensed by the optical flow sensor.
Description
Technical field
The present invention relates to the field of robot detection and control, and in particular to an optical flow data fusion method based on carpet drift, a chip, and a cleaning robot.
Background art
Robots based on inertial navigation are increasingly common; a representative example is the household cleaning (sweeping) robot, which combines gyroscope and accelerometer data with wheel-odometer data to perform simultaneous localization and mapping of the indoor environment, and then navigates on the map it has built. Living conditions differ, however: in hot climates house floors are mostly hard boards, while in colder climates they are usually carpeted, and in some homes hard floors and soft carpets coexist. A hard floor generally does not disturb the robot's motion, but a carpet, owing to its material, exerts a directional force on the robot that makes the wheels slip, so the map being built drifts and the robot's navigation often accumulates a large error.
Specifically, when a robot navigates in a carpeted environment, its motion is driven not only by friction but also by the force the carpet applies to it. Depending on the robot's motion relative to the carpet grain, the driving wheels make the carpet fibers stand up or lie down; in particular, when the fibers lie down along the grain, the carpet pushes or guides the robot along the grain direction. As shown in Fig. 3, while the robot 1 on the left moves toward the desired motion direction C, its driving wheel A is driven by the friction force f11 while the carpet fibers apply an inward force F11 to it, so that under the resultant F12 of f11 and F11 the robot 1 deviates from the desired direction C during its motion. Likewise, while the robot 2 on the right moves toward the desired direction C, its driving wheel B is driven by the friction force f21 while the carpet fibers apply an outward force F21 to it, so that under the resultant F22 of f21 and F21 the robot 2 also deviates from the desired direction C. In the prior art, the distance computed from the wheel encoder only accounts for the friction case and ignores the drift that occurs on carpet, so the industry generally considers using an optical flow sensor to eliminate the carpet's influence. An optical flow sensor does ensure the positional accuracy of the robot, but by itself it cannot guarantee the regularity of the robot's motion or eliminate the carpet's directional anisotropy.
Summary of the invention
To overcome the above drawbacks, the present invention provides an optical flow data fusion method based on carpet drift. The method fuses the data of the optical flow sensor with those of the wheel encoder and integrates the robot's current position coordinate from the optical flow's relative-distance data. The technical solution is as follows:
An optical flow data fusion method based on carpet drift, applied to processing the sensed data while the robot drifts on a carpet surface. The reliability of the optical flow sensor's data is obtained from the interrupt signal the sensor outputs: when the interrupt signal is high, the sensed data of the optical flow sensor are reliable; when it is low, they are unreliable. The fusion method comprises the following. When the optical flow sensor's data are reliable, the image displacement the sensor acquires in each preset time is first converted into a displacement of the same dimension as the wheel encoder, and the sensor's data are then accumulated (integrated) over the time dimension to obtain the optical flow offset position coordinate of the sensor relative to its initial position; that coordinate is then translated, according to the rigid connection between the optical flow sensor and the robot center, into the machine center coordinate at the current position, i.e., the robot's current position coordinate, which corresponds to the actual distance the driving wheels have travelled on the carpet. When the optical flow sensor's data are unreliable, the pulse data the encoder senses in each preset time are integrated over the time dimension and the result is used to update the machine center coordinate, again obtaining the robot's current position coordinate and thus the actual distance the driving wheels have travelled on the carpet; at the same time, the machine center coordinate is translated according to the same rigid connection, and the translated coordinate is used to update the optical flow offset position coordinate. Here the preset time is the period of each fusion calculation, and the current position coordinates are all world coordinates.
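The reliability gate described above can be sketched minimally as follows. This is not the patent's firmware, only an illustration under the stated convention: the optical flow sensor's interrupt line is read as a digital level, high meaning the sensed data may be trusted for the current fusion period.

```python
def optical_flow_reliable(interrupt_level: int) -> bool:
    """Return True when the optical-flow interrupt signal is high (logic 1),
    i.e. the sensor's data are reliable for this fusion period."""
    return interrupt_level == 1

# Each fusion period branches on this flag: flow-driven update when True,
# encoder-driven update when False.
```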
Further, the rigid connection relationship is the relative positional relationship between the optical flow sensor's optical flow coordinate system and the machine coordinate system at the robot center, including the distance between the optical flow sensor's position and the robot center, and the angle between the line joining them and a preset coordinate axis of the machine coordinate system. The positive direction of that preset axis is the robot's current motion direction; the angle between it and the positive preset axis of the world coordinate system is computed from the gyroscope reading and is the deviation angle of the robot's current heading from the preset direction.
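As a hedged illustration of the deviation angle just described (the difference between the gyroscope heading and the preset direction), the sketch below wraps the difference into (-π, π]; the function name and the wrapping choice are assumptions for illustration, not taken from the patent.

```python
import math

def deviation_angle(gyro_heading: float, preset_direction: float) -> float:
    """Angle of the robot's current heading relative to the preset
    (desired) direction, wrapped into (-pi, pi]."""
    d = gyro_heading - preset_direction
    # atan2 of (sin, cos) wraps any angle back into the principal range
    return math.atan2(math.sin(d), math.cos(d))
```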
A chip for storing a program, the program being used to control a robot to execute the optical flow data fusion method.
A cleaning robot, being a robot for cleaning carpet surfaces, built around the above chip.
In the optical flow data fusion method provided by the embodiments of the present invention, the coordinate of the relative offset position is derived from the fused data of the wheel encoder and the optical flow sensor, and the current sensed data of the encoder and of the optical flow sensor are each updated according to the reliability of the optical flow sensor's data, which improves the accuracy with which the robot records the carpet offset position coordinate.
Description of the drawings
Fig. 1 is a schematic diagram of the robot's structural model in an embodiment of the present invention;
Fig. 2 is a schematic diagram of the layout of the robot coordinate system, the optical flow coordinate system and the world coordinate system at the current position in an embodiment of the present invention;
Fig. 3 is a schematic top view of the force analysis of the robot's wheels on a carpet in an embodiment of the present invention;
Fig. 4 is a diagram of the conversion between the robot coordinate system and the optical flow coordinate system in an embodiment of the present invention;
Fig. 5 is a flowchart of the fusion calculation method for the sensed data of the optical flow sensor and the wheel encoder provided by an embodiment of the present invention.
Specific embodiment
The technical solutions in the embodiments of the present invention will now be described in detail with reference to the drawings. It should be understood that the specific embodiments described below only explain the present invention and are not intended to limit it. In the description of the invention, terms such as "center", "longitudinal", "transverse", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner" and "outer" indicate orientations or positional relationships based on those shown in the drawings; they are used only for convenience and simplicity of description, and do not indicate or imply that the referenced device or element must have a particular orientation or be constructed and operated in a particular orientation, so they should not be understood as limiting the invention.
It should also be noted that relational terms such as "first" and "second" are used herein only to distinguish one entity or operation from another, and do not necessarily require or imply any actual relationship or order between those entities or operations.
The robot carrier provided by the embodiments of the present invention is equipped with a gyroscope for detecting the rotation angle and an odometer for detecting the travelled distance, and carries sensors capable of detecting the distance to a wall; these may be ultrasonic ranging sensors, infrared-intensity sensors, infrared ranging sensors, physical collision-switch sensors, or capacitance- or resistance-change sensors. An optical flow sensor for detecting the robot's relative displacement coordinate is also mounted on the carrier. The mobile robot of the invention is shown in Fig. 1; Fig. 1 does not depict the robot's real structure and appearance, but is only a schematic of the invention, with the optical flow sensor placed on the robot base (it may be anywhere on the base). In Fig. 1, the base 4 of the mobile robot carries, fixedly placed, the left driving wheel 11 and right driving wheel 12 that control the robot's advancing direction; the gyroscope 3 may be placed anywhere on the robot's control mainboard 2, and the mainboard 2 may contain one or more gyroscopes for sensing the robot's rotation. The control mainboard 2 processes the parameters of the related sensors and can output control signals to the robot's actuators. The optical flow module 7 may likewise be mounted anywhere on the base 4, and the mobile robot also has a universal (caster) wheel 6. The left driving wheel 11 and right driving wheel 12 are each fitted with a code disc (wheel encoder) for detecting the rotation speed of the respective wheel. The lens of the optical flow sensor mounted in the optical flow module 7 is kept parallel to the ground, and the module 7 also carries an illumination LED that is switched automatically according to the ambient brightness: when the ground is relatively dark the LED is turned on, and when the ambient light is relatively bright the LED is turned off.
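The ambient-brightness rule for the illumination LED can be sketched as below. The threshold value is an illustrative assumption; the patent specifies no number, only the dark-on/bright-off behavior.

```python
def led_should_be_on(ambient_brightness: float, threshold: float = 0.3) -> bool:
    """Turn the illumination LED on when the ground is too dark for the
    optical-flow camera, and off when ambient light suffices.
    Brightness is assumed normalized to [0, 1]; 0.3 is an arbitrary cutoff."""
    return ambient_brightness < threshold
```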
It should be understood that when the robot moves along the direction of the carpet grain, it may advance farther than the distance determined from the rotation of the encoders fitted to the driving wheels. Conversely, when the robot travels against the grain, over upright fibers, it may advance a shorter distance than the encoders indicate. In both cases, the actual distance the robot travels can differ from the distance measured by the encoders. Because the driving wheels are strongly affected by slip when moving on carpet, the encoder alone is not necessarily sufficient, and an inertial sensor may optionally be used; as the robot crosses the carpet, the position estimation error can therefore accumulate over time, so the robot may fail to build an accurate map of the environment, or fail to navigate effectively, accurately and safely in it, and thus cannot perform its task, such as vacuum cleaning.
It should be noted that if the robot's initial pose, the environment and the target are known, the navigation problem reduces to a global path-planning problem; therefore the coordinates sensed by the robot's encoder and optical flow sensor must all be transformed into the world coordinate system before the fusion calculation, and the robot's final current position coordinate is a position coordinate in the world coordinate system.
In the embodiments of the present invention, the layout of the robot coordinate system, the optical flow coordinate system and the world coordinate system is shown in Fig. 2. The robot coordinate system at the current position takes the robot center RO as origin and the robot's advancing direction at that position as the positive R_X axis, with the R_Y axis perpendicular to R_X; its origin RO corresponds to the gyroscope 3 placed at the center of the control mainboard 2, i.e., at the robot center. The world coordinate system takes the robot's initial position as origin, the robot's advancing direction from that initial position as the positive X axis (this advancing direction being the robot's desired motion direction), and the direction perpendicular to X as the Y axis. The optical flow coordinate system is a pixel coordinate system, with units different from those of the robot and world coordinate systems; it takes the center PO of the optical flow module 7 as origin and the mutually perpendicular P_X and P_Y axes as its coordinate axes. All three coordinate systems follow the right-hand rule; the robot coordinate system and the optical flow coordinate system are relative coordinate systems whose origins move with the robot's current position. In the world coordinate system the quadrants are numbered counterclockwise, from the first through the fourth; the absolute value of the angle by which the robot's current motion direction deviates from the desired motion direction remains fixed at a constant value (the value appears in the source only as an image and is not reproduced here).
Specifically, the optical flow sensor in the optical flow module 7 continuously captures images of the surface at a given rate, and the robot's control mainboard 2 then analyzes the pixels of the generated images. Since two adjacent images always share common features, the average motion of the surface features can be judged by comparing the position changes of these feature points. Then, based on the principles that a pixel's gray level is invariant between frames and that pixel velocities are identical within the same image region, an optical flow field equation is established and solved to obtain the velocity of the pixels, which is then integrated; the image displacement the robot acquires within the preset time is thus computed by integrating the image feature information obtained by the optical flow sensor. This image displacement is a value in the optical flow coordinate system, whose unit must be converted to the encoder's distance unit, so the image displacement is converted into a displacement of the same dimension as the encoder.
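The dimension conversion just described (and detailed again in step S503 below) can be sketched as follows: the pixel displacement from the optical flow sensor is scaled by the ratio of the encoder's measured distance to the flow's measured offset over the same pulse period. The calibration numbers in the usage comment are illustrative assumptions.

```python
def flow_pixels_to_distance(pixel_shift: float,
                            encoder_dist_per_period: float,
                            flow_shift_per_period: float) -> float:
    """Scale a raw optical-flow pixel displacement into encoder length
    units, using the ratio measured over one encoder pulse period."""
    unit_factor = encoder_dist_per_period / flow_shift_per_period
    return pixel_shift * unit_factor

# Example: if the encoder measured 2.0 mm while the flow sensor reported a
# 5.0-pixel offset in the same period, 10 pixels correspond to 4.0 mm.
```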
As an embodiment of the present invention, the rigid connection relationship between the optical flow sensor and the robot center is the relative positional relationship between the sensor's optical flow coordinate system and the machine coordinate system at the robot center, including the distance between the sensor's position and the robot center, and the angle between the line joining them and the preset coordinate axis of the machine coordinate system; the positive direction of that preset axis is the robot's current motion direction, and the angle between it and the positive preset axis of the world coordinate system, computed from the gyroscope reading, is the deviation angle of the robot's current position from the preset direction. As shown in Fig. 2 and Fig. 4, the relative positional relationship between the origin RO of the robot coordinate system and the origin PO of the optical flow coordinate system constitutes the rigid-body connection between the optical flow sensor and the inertial sensor; it comprises the distance L between RO and PO and the absolute value of the angle between the segment PORO and the line carrying the R_X axis of the robot coordinate system (the value appears in the source only as an image). This relative relationship remains unchanged while the robot moves, which is what makes it a rigid-body connection; the physical location of the origin RO corresponds to the gyroscope 3 placed at the robot center, and the physical location of the origin PO corresponds to the optical flow module 7.
As shown in Fig. 4, the coordinate-system conversion method based on the above rigid connection is as follows when the robot center RO lies in the fourth quadrant of the world coordinate system. The coordinate sensed in the optical flow coordinate system by the optical flow sensor in module 7 is transformed into the world coordinate system to obtain a first predicted position coordinate PO(xp4, yp4) in the fourth quadrant, i.e., the robot's current motion direction deviates from the positive X axis toward the negative Y axis by an angle; this angle is the constant offset angle caused by the carpet's force on the robot, and can be sensed by the gyroscope 3 as the robot's rotation angle. Using trigonometric relations, the first predicted position coordinate is translated by the rigid-body connection to obtain a second predicted position coordinate at the robot center, i.e., the current position coordinate RO(xr4, yr4) of the robot center in the world coordinate system, which can be approximated by the fourth-quadrant example formula (the formula appears in the source only as an image and is not reproduced here).
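The fourth-quadrant example formula survives in the source only as an image, so the following is only a plausible reconstruction of the translation it describes, not the patent's actual formula: the first predicted position PO(xp4, yp4) is shifted by the fixed sensor-to-center offset (distance L at mounting angle alpha in the robot frame) rotated by the gyroscope heading theta. L, alpha and theta are symbols introduced here for illustration.

```python
import math

def second_predicted_position(xp4: float, yp4: float,
                              theta: float, L: float, alpha: float):
    """Approximate RO(xr4, yr4) from PO(xp4, yp4) by translating along the
    rigid-body offset, rotated into the world frame by heading theta."""
    phi = theta + alpha  # direction of the PO->RO segment in world frame
    xr4 = xp4 + L * math.cos(phi)
    yr4 = yp4 + L * math.sin(phi)
    return xr4, yr4
```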
The specific embodiment to which the above formula applies is as follows: the gyroscope 3 is located at the robot center, and the optical flow module 7 at the lower right of the robot center. Through the above coordinate-system conversion, the optical flow coordinate offset measured by the optical flow sensor yields the current position coordinate RO(xr4, yr4) of the robot center in the world coordinate system, and the body's center deviates from the desired position (xr4, 0) by an angle whose expression appears in the source only as an image.
It should be noted that there are also embodiments in which the robot center lies in the first, second or third quadrant of the world coordinate system. In these embodiments the gyroscope 3 is located at the robot center, the optical flow module 7 at the lower right of the robot center, and the robot's desired displacement direction is the positive X axis, i.e., the preset direction is the positive X axis.
In the embodiment in which the robot center R1 lies in the first quadrant of the world coordinate system, the first predicted position coordinate is P1(xp1, yp1); using trigonometric relations, it is translated by the rigid-body connection to obtain the second predicted position coordinate at the robot center, i.e., the current position coordinate R1(xr1, yr1) of the robot center in the first quadrant of the world coordinate system, approximated by trigonometric relations on the basis of the fourth-quadrant formula (the formula appears in the source only as an image).
In the embodiment in which the robot center R2 lies in the second quadrant of the world coordinate system, the first predicted position coordinate is P2(xp2, yp2); using trigonometric relations, it is translated by the rigid-body connection to obtain the second predicted position coordinate at the robot center, i.e., the current position coordinate R2(xr2, yr2) of the robot center in the second quadrant of the world coordinate system, approximated by an example formula (which appears in the source only as an image).
In the embodiment in which the robot center R3 lies in the third quadrant of the world coordinate system, the first predicted position coordinate is P3(xp3, yp3); using trigonometric relations, it is translated by the rigid-body connection to obtain the second predicted position coordinate at the robot center, i.e., the current position coordinate R3(xr3, yr3) of the robot center in the third quadrant of the world coordinate system, approximated by an example formula (which appears in the source only as an image).
In addition, if the robot's desired displacement direction is not the positive X axis, i.e., the preset direction is not the positive X axis, or if the optical flow module 7 is not at the lower right of the robot center, then the robot's center position coordinate is still computed following the reasoning of the fourth-quadrant example formula combined with the corresponding trigonometric relations; since the inventive concept of the coordinate-system conversion method is the same, the other embodiments of the desired displacement direction and of the position of the optical flow module 7 are not described further here.
An embodiment of the present invention provides an optical flow data fusion method. The optical flow sensor improves the robot's positional accuracy, but its sensed data are not always reliable, so a fusion calculation with the encoder data is needed. The reliability of the optical flow sensor's data is judged from the interrupt signal produced by the sensor's built-in algorithm. Specifically, when the interrupt signal output by the sensor is high, the sensed data of the optical flow sensor are reliable; when it is low, the sensed data are unreliable. The interrupt signal is the result of the sensor's built-in algorithm processing the sensed data; that algorithm is a common prior-art algorithm for processing image data of a carpet surface and is not described further.
As shown in Fig. 5, the optical flow data fusion method comprises the following steps.
Step S501: the encoder senses pulse data while the optical flow sensor senses optical flow data; proceed to step S502.
Step S502: judge whether the optical flow sensor's data are reliable; if so, proceed to step S503, otherwise proceed to step S506.
Step S503: convert the image displacement acquired by the optical flow sensor in each preset time into a displacement of the same dimension as the encoder. Specifically, when updating the map coordinate with the optical flow data, the ratio of the distance measured by the encoder within one pulse period to the relative-coordinate offset measured by the optical flow sensor within the same pulse period is taken as the unit conversion factor, and the optical flow data are multiplied by this factor to obtain values with a unified unit. The optical flow sensor's data in each preset time are then accumulated (integrated) over the time dimension to obtain the optical flow offset position coordinate of the sensor relative to its initial position, i.e., the measurement the optical flow sensor currently outputs. Proceed to step S504.
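Step S503's time-dimension accumulation can be sketched as below: each fusion period's converted flow displacement (already in encoder units) is summed into the running offset from the initial position. Names and values are illustrative.

```python
def accumulate_flow_offset(offset, dx, dy):
    """Add one fusion period's (dx, dy) displacement, in encoder units,
    to the running optical-flow offset coordinate."""
    return offset[0] + dx, offset[1] + dy

pos = (0.0, 0.0)                       # optical-flow offset at start
for dx, dy in [(1.0, 0.5), (0.5, -0.25)]:
    pos = accumulate_flow_offset(pos, dx, dy)
# pos is now the offset relative to the initial position: (1.5, 0.25)
```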
Step S504: according to the example formulas disclosed by the rigid connection relationship between the optical flow sensor and the robot center, i.e., the triangle geometry constructed from the distance and angle between the robot coordinate system and the optical flow coordinate system, translate the optical flow offset position coordinate by the above coordinate-system conversion method to obtain the robot position coordinate, which corresponds to the actual distance the driving wheels have travelled on the carpet; proceed to step S505.
Step S505: use the robot position coordinate obtained in step S504 to update the coordinate data currently output by the encoder, then return to step S501. Compared with the measurement the encoder outputs without fusion, the result of this fusion step is more reliable and stable.
Step S506: integrate the pulse data sensed by the encoder over the time dimension to obtain the machine center coordinate; this coordinate can be updated as the robot position coordinate the next time step S505 is entered. Proceed to step S507. Since the encoder records the robot's motion speed through the number of pulses generated per second, integrating over the time dimension the pulse data sensed in each preset time yields the robot's current position coordinate, corresponding to the actual distance the driving wheels have travelled on the carpet.
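Step S506's pulse integration can be sketched as below. Projecting the per-period distance along the gyroscope heading is an assumption added for illustration (the patent only states that pulse counts are integrated over time); the per-pulse distance is likewise an illustrative calibration value.

```python
import math

def integrate_encoder(x: float, y: float, pulses: int,
                      dist_per_pulse: float, heading: float):
    """Advance the machine-center estimate by one preset period's encoder
    reading: pulses * dist_per_pulse, projected along the heading."""
    d = pulses * dist_per_pulse
    return x + d * math.cos(heading), y + d * math.sin(heading)
```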
Step S507: use the result of the integration in step S506 to update the coordinate data currently output by the encoder, then proceed to step S508. Before this update, the machine center coordinate of step S504 may be the integrated and converted result of the optical flow sensor's data from its reliable stage (the measurement the sensor output while its data were reliable), so the update operation guarantees the accuracy of the measured robot position coordinate. At the same time, according to the inverse of the example formulas disclosed by the rigid connection relationship between the optical flow sensor and the robot center, i.e., the triangle geometry constructed from the distance and angle between the robot coordinate system and the optical flow coordinate system, the machine center coordinate is inversely converted by the above coordinate-system conversion method to obtain the optical flow sensor's current offset position coordinate.
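The inverse translation in step S507 can be sketched as the mirror of the forward conversion: the updated machine-center coordinate is pushed back through the rigid-connection offset to recover the optical flow sensor's position. L (sensor-to-center distance), alpha (mounting angle) and theta (heading) are illustrative symbols, as in the forward sketch.

```python
import math

def center_to_flow(xr: float, yr: float,
                   theta: float, L: float, alpha: float):
    """Inverse of the center translation: recover the optical-flow sensor
    position from the machine-center coordinate."""
    phi = theta + alpha  # direction of the sensor->center segment
    return xr - L * math.cos(phi), yr - L * math.sin(phi)
```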
Step S508: use the offset coordinate of the optical flow sensor obtained in step S507 to update the coordinate data currently output by the optical flow sensor, then return to step S501. The optical flow offset position coordinate may be the result of accumulating the optical flow sensor's data over the time dimension, but because the sensor's data can be unreliable, the machine center coordinate integrated from the encoder pulses in step S506 must be translated, and the translated result used to update the optical flow offset position coordinate computed in step S503; this improves the accuracy of the integration of the optical flow sensor's data once they are reliable again.
In the embodiment of the present invention, a reliability judgement is made in real time on the data sensed by the built-in optical flow sensor and the code disc; according to the result of that judgement, the sensing data of one of the sensors are selected, transformed into the optical flow coordinate system, and integrated, so as to obtain a more accurate actual distance travelled by the driving wheels of the robot on the carpet, reducing the error brought about by the force effects of carpet offset.
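The reliability-gated selection described above can be sketched per fusion cycle as follows. This is a simplified sketch under stated assumptions, not the patented method: both displacement deltas are assumed to already share the same units and world frame, and the sensor-to-center rigid offset is held as a fixed world-frame vector for brevity (in the actual method it rotates with the robot's heading). All names are hypothetical.

```python
# Hypothetical fixed world-frame offset from optical flow sensor to
# robot center (a real implementation would rotate this with heading).
OFFSET = (0.05, 0.03)

def fuse_step(flow_reliable, flow_delta, encoder_delta, state):
    """One fusion cycle per preset time: accumulate the reliable source
    and translate the result so both coordinates stay consistent.
    `state` holds 'flow' (sensor offset coordinate) and 'center'
    (machine center coordinate)."""
    if flow_reliable:  # interrupt line high: integrate optical flow
        state['flow'] = (state['flow'][0] + flow_delta[0],
                         state['flow'][1] + flow_delta[1])
        state['center'] = (state['flow'][0] - OFFSET[0],
                           state['flow'][1] - OFFSET[1])
    else:              # interrupt line low: integrate code-disc pulses
        state['center'] = (state['center'][0] + encoder_delta[0],
                           state['center'][1] + encoder_delta[1])
        state['flow'] = (state['center'][0] + OFFSET[0],
                         state['center'][1] + OFFSET[1])
    return state
```

Whichever branch runs, the other coordinate is refreshed by translation, mirroring the mutual updates of steps S504 and S507/S508.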
A chip is provided for storing a program, the program being used to control a robot to execute the optical flow data fusion method, so that the coordinate data acquired by the robot on a carpeted surface are more accurate and missed-sweep areas are avoided. Through the optical flow sensor, the gyroscope and the code disc, the chip determines the initial position information (X1, Y1, θ1) of the straight line to be walked and the current position information (X2, Y2, θ2) of the robot body during walking, then executes the optical flow data fusion method, fusing the data of the optical flow sensor and the wheel code disc to realize mutual conversion between the optical flow coordinates and the wheel code disc coordinates, so that the two work in coordination.
A robot assembled with the chip as its control chip is also provided; the robot is a cleaning robot for cleaning carpet surfaces. It only needs to fuse and calculate the data of the optical flow sensor and the code disc to obtain the coordinates of the relative offset position of the robot while walking. The data fusion method is comparatively simple and does not require a high-performance processor, which further reduces the computing resources consumed by the system and the hardware cost of the robot.
The above embodiments are only used to illustrate the technical solutions of the present invention, not to limit them. Although the invention has been explained in detail with reference to the foregoing embodiments, those skilled in the art should understand that the technical solutions recorded in the foregoing embodiments may still be modified, or some or all of their technical features may be equivalently substituted; such modifications or substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the various embodiments of the present invention.
Claims (4)
1. An optical flow data fusion method based on carpet offset, applied to the processing of sensing data when a robot shifts on a carpet surface, wherein the reliability of the sensing data of the optical flow sensor is obtained from the interrupt signal output by the optical flow sensor: when the interrupt signal output by the optical flow sensor is at a high level, the sensing data of the optical flow sensor are reliable; when the interrupt signal output by the optical flow sensor is at a low level, the sensing data of the optical flow sensor are unreliable; characterized in that the optical flow data fusion method comprises:
when the sensing data of the optical flow sensor are reliable, first converting the image displacement obtained by the optical flow sensor within each preset time into a displacement of the same dimension as the code disc, then cumulatively integrating the sensing data of the optical flow sensor over time to obtain the optical flow offset position coordinate of the optical flow sensor relative to its initial position;
then, according to the rigid connection relationship between the optical flow sensor and the robot center, converting the optical flow offset position coordinate by translation into the machine center coordinate at the current position, i.e. the current position coordinate of the robot, which corresponds to the actual distance advanced by the driving wheels of the robot on the carpet;
when the sensing data of the optical flow sensor are unreliable, integrating over time the pulse data sensed by the code disc within each preset time and updating the calculated result into the machine center coordinate, so as to obtain the current position coordinate of the robot, which corresponds to the actual distance advanced by the driving wheels of the robot on the carpet; and at the same time, according to the rigid connection relationship between the optical flow sensor and the robot center, converting the machine center coordinate by translation and updating the coordinate of the translation conversion into the optical flow offset position coordinate;
wherein the preset time is the time of each fusion calculation, and the current position coordinates are all world coordinates.
2. The optical flow data fusion method according to claim 1, characterized in that the rigid connection relationship is the relative positional relationship between the optical flow coordinate system of the optical flow sensor and the machine coordinate system of the robot center, including the distance between the position of the optical flow sensor and the robot center, and the angle between the line connecting the position of the optical flow sensor with the robot center and a preset coordinate axis of the machine coordinate system; wherein the positive direction of the preset coordinate axis of the machine coordinate system is the current motion direction of the robot; the angle between the positive direction of the preset coordinate axis of the machine coordinate system and that of the global coordinate system is calculated from the values detected by the gyroscope, as the deviation angle of the robot's current position relative to the preset direction.
3. A chip for storing a program, characterized in that the program is used to control a robot to execute the optical flow data fusion method according to any one of claims 1 to 2.
4. A cleaning robot, the robot being a robot for cleaning a carpet surface, characterized in that the cleaning robot has the chip according to claim 3 built in.
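The machine-to-global coordinate relationship in claim 2 amounts to a planar rotation by the gyroscope-derived deviation angle. The following is a minimal sketch of that conversion, with a hypothetical function name; it is an illustration of the geometry, not the patented implementation.

```python
import math

def robot_to_world(dx, dy, theta):
    """Rotate a displacement from the machine coordinate system (whose
    preset axis points along the robot's current motion direction) into
    the global coordinate system, using the gyroscope-derived deviation
    angle theta between the two preset axes."""
    return (dx * math.cos(theta) - dy * math.sin(theta),
            dx * math.sin(theta) + dy * math.cos(theta))
```

For example, a purely forward displacement with the robot heading 90 degrees from the global axis maps onto the global y-axis.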
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811238969.7A CN109506652B (en) | 2018-10-23 | 2018-10-23 | Optical flow data fusion method based on carpet migration and cleaning robot |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109506652A true CN109506652A (en) | 2019-03-22 |
CN109506652B CN109506652B (en) | 2022-11-15 |
Family
ID=65746038
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811238969.7A Active CN109506652B (en) | 2018-10-23 | 2018-10-23 | Optical flow data fusion method based on carpet migration and cleaning robot |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109506652B (en) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111044080A (en) * | 2019-12-30 | 2020-04-21 | 珠海市一微半导体有限公司 | Calibration parameter acquisition device and method based on optical flow sensor |
CN111089595A (en) * | 2019-12-30 | 2020-05-01 | 珠海市一微半导体有限公司 | Detection data fusion method of robot, main control chip and robot |
CN112336258A (en) * | 2019-08-09 | 2021-02-09 | 松下知识产权经营株式会社 | Mobile robot, control method, and storage medium |
CN113238555A (en) * | 2021-05-12 | 2021-08-10 | 珠海市一微半导体有限公司 | Mobile robot having optical flow sensor and control method thereof |
CN114001656A (en) * | 2021-11-12 | 2022-02-01 | 天津希格玛微电子技术有限公司 | Detection error correction method and device for optical displacement detection device |
CN114440874A (en) * | 2021-12-31 | 2022-05-06 | 深圳市云鼠科技开发有限公司 | Fusion positioning method and device based on optical flow and grating |
WO2022142782A1 (en) * | 2020-12-30 | 2022-07-07 | 速感科技(北京)有限公司 | Method and device for determining motion parameter of autonomous mobile device |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH1063317A (en) * | 1996-08-13 | 1998-03-06 | Fanuc Ltd | Method for combining coordinate system in robot and visual sensor system |
US20130056032A1 (en) * | 2011-09-07 | 2013-03-07 | Suuk Choe | Robot cleaner, and system and method for remotely controlling the same |
CN103411621A (en) * | 2013-08-09 | 2013-11-27 | 东南大学 | Indoor-mobile-robot-oriented optical flow field vision/inertial navigation system (INS) combined navigation method |
CN105717924A (en) * | 2012-06-08 | 2016-06-29 | 艾罗伯特公司 | Carpet drift estimation using differential sensors or visual measurements |
CN105973240A (en) * | 2016-07-15 | 2016-09-28 | 哈尔滨工大服务机器人有限公司 | Conversion method of navigation module coordinate system and robot coordinate system |
CN106489104A (en) * | 2014-11-26 | 2017-03-08 | 艾罗伯特公司 | System and method for the use of the optics range sensorses in mobile robot |
CN108638053A (en) * | 2018-04-03 | 2018-10-12 | 珠海市微半导体有限公司 | A kind of detection method and its antidote of robot skidding |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109506652A (en) | A kind of optical flow data fusion method and clean robot based on carpet offset | |
CN109394095A (en) | A kind of control method, chip and the clean robot of the offset of robot motion's carpet | |
Campbell et al. | A robust visual odometry and precipice detection system using consumer-grade monocular vision | |
US9329598B2 (en) | Simultaneous localization and mapping for a mobile robot | |
CN104302453B (en) | Use the carpet bias estimation of differential pick-up or vision measurement | |
CN109358623A (en) | A kind of recognition methods, chip and the clean robot of the offset of robot motion's carpet | |
Burschka et al. | Vision-based control of mobile robots | |
US9400501B2 (en) | Simultaneous localization and mapping for a mobile robot | |
KR100772912B1 (en) | Robot using absolute azimuth and method for mapping by the robot | |
CN109540126A (en) | A kind of inertia visual combination air navigation aid based on optical flow method | |
CN108638053A (en) | A kind of detection method and its antidote of robot skidding | |
CN109959377A (en) | A kind of robot navigation's positioning system and method | |
US20160271795A1 (en) | Localization and Mapping Using Physical Features | |
CN106489104A (en) | System and method for the use of the optics range sensorses in mobile robot | |
WO2016077703A1 (en) | Gyroscope assisted scalable visual simultaneous localization and mapping | |
CN108052103A (en) | The crusing robot underground space based on depth inertia odometer positions simultaneously and map constructing method | |
CN109933056A (en) | A kind of robot navigation method and robot based on SLAM | |
CN105606092A (en) | Method and system for locating indoor robot | |
Karam et al. | Integrating a low-cost mems imu into a laser-based slam for indoor mobile mapping | |
CN115436955A (en) | Indoor and outdoor environment positioning method | |
WO2022188333A1 (en) | Walking method and apparatus, and computer storage medium | |
Fernandes et al. | A low-cost localization system based on Artificial Landmarks | |
CN110857861B (en) | Track planning method and system | |
CN114789439A (en) | Slope positioning correction method and device, robot and readable storage medium | |
Zhu et al. | Indoor Robot Localization Based on Visual Perception and on Particle Filter Algorithm of Increasing Priority Particles |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
CB02 | Change of applicant information | ||

Address after: 519000 2706, No. 3000, Huandao East Road, Hengqin New Area, Zhuhai, Guangdong
Applicant after: Zhuhai Yiwei Semiconductor Co.,Ltd.
Address before: 519000 Room 105-514, No. 6, Baohua Road, Hengqin New Area, Zhuhai, Guangdong
Applicant before: AMICRO SEMICONDUCTOR Co.,Ltd.

GR01 | Patent grant | ||