CN105717928A - Vision-based robot navigation door-passing method - Google Patents
Vision-based robot navigation door-passing method
- Publication number
- CN105717928A CN105717928A CN201610266610.5A CN201610266610A CN105717928A CN 105717928 A CN105717928 A CN 105717928A CN 201610266610 A CN201610266610 A CN 201610266610A CN 105717928 A CN105717928 A CN 105717928A
- Authority
- CN
- China
- Prior art keywords
- robot
- door
- vertical stripe
- coordinate
- angle
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
- G01C21/206—Instruments for performing navigational calculations specially adapted for indoor navigation
Abstract
The invention discloses a vision-based robot navigation door-passing method, comprising the following steps: the position and attitude parameters of a robot in a global coordinate system are obtained from a sensor; the door the robot is to pass through is determined; the region in which the robot activates its cameras is determined; a navigation endpoint is calculated; the robot moves to the navigation endpoint; the robot activates its binocular cameras to recognize the door; a Cartesian coordinate system is established with the center of the door as the origin; the robot is controlled to reach 3 correction points in turn; and after the door is passed, the door-passing operation is finished. According to the technical scheme, the relative position of the door and the robot is calculated from the information returned by vision, and the robot's global position is calculated from the odometer information; the robot's position and attitude parameters are thus effectively acquired, so that the robot can be controlled to pass through the door accurately.
Description
Technical field
The present invention relates to the technical field of automation, and in particular to a vision-based robot navigation method for passing through doors.
Background technology
With the development of robotics, indoor autonomous mobile robots will be widely used in many areas of daily life, such as offices, hospitals and factories. To achieve autonomous robot navigation in these settings, one important problem is getting the robot through doors. The robot therefore needs to acquire environmental information and recognize the doors in its environment, so that through path planning and navigation it can move autonomously between rooms.
For the problem of robots passing through doors, existing research has proposed several methods, but each has its limitations and shortcomings. A traditional approach uses ultrasonic or infrared sensors to measure the distance between the robot body and the two door frames near the door, so that the robot can better avoid both frames while passing through. However, this kind of method requires a fairly accurate environment map and initial robot position; otherwise, because of the limited performance of ultrasonic and infrared detection, it is quite likely to guide the robot through the door unsuccessfully. In addition, recognizing a door with ultrasonic or infrared sensors requires considerable prior knowledge, which in turn requires the robot to gather as much information as possible near the door and determine the positions of the two sides of the door through a fusion algorithm.
In recent years, with the spread of vision sensors and advances in the related technology, using vision sensors to recognize objects in the environment has become feasible, with the advantages of low cost and fast computation. For recognizing doors and other objects with vision sensors, reinforcement-learning methods have been proposed that obtain the relative pose of the door and the robot and then navigate the robot to complete the door-passing task. However, such learning-based algorithms require a large number of samples and long training times, and the recognition rate depends heavily on the richness of the training samples; if the samples are not sufficiently diverse, the method cannot adapt to recognizing various kinds of doors. Others use the Hough transform to extract lines from images and estimate the relative pose of the door and the robot with a model-based method; although this does not depend on samples, its error is relatively large and the algorithm is complex. Still other algorithms from the image-processing literature are inefficient with limited success rates. This patent proposes a binocular door recognition method that cleverly exploits the identical mounting heights of the two cameras, which facilitates matching feature points along the horizontal axes of the two binocular images and thus better determines the pose of the door in the robot coordinate system.
Summary of the invention
The present invention proposes a vision-based navigation and door-passing method for indoor mobile robots. The sensors used by the method are an odometer and vision sensors: the relative position of the door and the robot is computed from the information returned by vision, and the robot's global position is computed from the information returned by the odometer, so that the robot can be controlled to pass through the door accurately.
The technical scheme of the present invention provides a vision-based robot navigation door-passing method, comprising the following steps:
S101. Obtain the robot's current pose (x, y, θ) in the global coordinate system from the odometer or a global localization sensor, where x is the robot's abscissa relative to the origin, y is its ordinate relative to the origin, and θ is its heading angle relative to the x-axis; counterclockwise rotations are positive angles and clockwise rotations negative angles;
S102. Determine from the user's instruction the door the robot is to pass through;
S103. Determine the region in which the robot activates its cameras before passing through the door; the camera-activation region of each door is Area_i = (x_j, y_k) (1 ≤ j ≤ N, 1 ≤ k ≤ M), where i is the index of the door and N and M give the extent of a region in front of the door;
S104. Compute the endpoint of this navigation;
S105. Once the start and end points are determined, compute the robot's walking path with the A* algorithm and drive the robot to the navigation endpoint;
S106. From the dead-reckoning information (x_r, y_r, θ_r) returned by the odometer, compute the robot's position and attitude relative to the door, obtain the angle Δθ between the robot's heading and the line from the robot to the door's midpoint, and turn the robot by Δθ so that it faces the door;
S107. With the robot stationary, the lower-level controller sends the upper-level computer a request to activate the cameras; the upper-level computer activates the binocular cameras at the front of the robot and begins capturing images for door recognition;
S108. From the coordinates x_l, y_l, x_r, y_r (x_l < x_r, y_l < y_r) of the door's left and right frames returned by image processing, compute the door's center coordinates (x_0, y_0), where x_0 = (x_l + x_r)/2 and y_0 = (y_l + y_r)/2;
S109. Establish a Cartesian coordinate system XOY with (x_0, y_0) as the origin, the robot's current heading as reference, and the direction from the door's left frame to its right frame as the positive x-axis;
S110. Obtain the robot's coordinates (x'_r1, y'_r1, θ'_r1) in the XOY system, and obtain the distance Δd_1 and angle Δθ_1 between the robot and the first correction point (x_l1, y_l1) of the door-passing process, where
-200 mm < x_l1 < 200 mm,
-1400 mm < y_l1 < -1000 mm;
S111. Drive the robot straight by Δd_1 and then turn it by Δθ_1; from the dead-reckoning information obtain the angle Δθ'_1 between the robot's current heading and the door's perpendicular bisector, and turn the robot by Δθ'_1 to align it with the target door;
S112. Obtain the robot's coordinates (x'_r2, y'_r2, θ'_r2) in the XOY system, and obtain the distance Δd_2 and angle Δθ_2 between the robot and the second correction point (x_l2, y_l2), where
-200 mm < x_l2 < 200 mm,
-1000 mm < y_l2 < -600 mm;
S113. Drive the robot straight by Δd_2 and then turn it by Δθ_2; from the dead-reckoning information compute the angle Δθ'_2 between the robot's current heading and the door's perpendicular bisector, and turn the robot by Δθ'_2 to face the door;
S114. Obtain the robot's coordinates (x'_r3, y'_r3, θ'_r3) in the XOY system, and obtain the distance Δd_3 and angle Δθ_3 between the robot and the third correction point (x_l3, y_l3), where
-200 mm < x_l3 < 200 mm,
-600 mm < y_l3 < -300 mm;
S115. Drive the robot straight by Δd_3 and then turn it by Δθ_3; from the dead-reckoning information obtain the angle Δθ'_3 between the robot's current heading and the door's perpendicular bisector, and turn the robot by Δθ'_3 to face the door;
S116. Drive the robot forward by Δd, where 800 mm < Δd < 1200 mm, completing the door-passing operation.
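The geometry of steps S106 and S108 can be sketched numerically. The following is a minimal illustration under our own naming and conventions (the function name, radian angles and tuple representation are not from the patent), computing the door center as the midpoint of the two frame points and the turn that points the robot at it:

```python
import math

def face_door(robot_pose, left_frame, right_frame):
    """Compute the door center (x0, y0) as the midpoint of the left/right
    frame points (step S108) and the turn angle between the robot's heading
    and the robot-to-door-midpoint line (step S106).
    Angles are in radians, counterclockwise positive (our convention)."""
    x, y, theta = robot_pose
    (xl, yl), (xr, yr) = left_frame, right_frame
    x0, y0 = (xl + xr) / 2.0, (yl + yr) / 2.0
    bearing = math.atan2(y0 - y, x0 - x)            # direction to door midpoint
    dtheta = (bearing - theta + math.pi) % (2 * math.pi) - math.pi  # wrap to (-pi, pi]
    return (x0, y0), dtheta
```

For example, a robot at the origin facing along the x-axis, with door frames at (1000, 500) and (1000, -500) mm, sees the door center at (1000, 0) and needs no turn.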
Further, in step S104, computing the endpoint of this navigation further comprises:
searching the reference region and obtaining the Euclidean distance from each reference point in the region to the robot's current position, d_i = sqrt((x - x_r)² + (y - y_r)²), where x and y denote the coordinates of a grid cell in the grid map;
sorting all the computed d_i and finding the minimum d_i;
the grid position corresponding to the minimum d_i is the endpoint of this navigation.
Further, in step S107, the door recognition further comprises:
S201. The upper-level computer converts the left and right RGB images I_l and I_r collected by the binocular cameras to grayscale, obtaining gray images Gray_l and Gray_r;
S202. A global image binarization method (such as simple thresholding, Otsu's method or the Shanbhag method) is applied to the gray images Gray_l and Gray_r to binarize them, obtaining binary images Binary_l and Binary_r;
S203. Using a rectangular a × b structuring element (a < b, where a is 1 or 2 and b satisfies h/2 < b < h, h being the height of the captured image), apply morphological erosion and dilation to obtain the vertical stripe regions R_l and R_r in the binary images Binary'_l and Binary'_r, i.e. the regions where a door frame or door leaf may be present;
S204. With the vertical stripe regions obtained in step S203 as the matching units, match the vertical stripe regions in R_l and R_r;
S205. For the matched vertical stripe regions obtained in step S204, apply the binocular ranging principle to the corresponding left/right points to obtain each region's position (x and y) in the robot coordinate system, and at the same time obtain the real width of each vertical stripe region in the image;
S206. Given that the real width of doors in domestic environments is 850 mm-1200 mm, the vertical stripe regions consistent with an actual door width can be obtained (either a single stripe region, i.e. the door leaf plus the left and right frames, or two stripe regions: frame and frame, or frame and door leaf); the resulting stripe regions are taken to be a door, and the coordinates x_l, y_l, x_r, y_r of the door's left and right frames in the robot coordinate system are returned to the lower-level controller (for a single stripe region (door leaf), the coordinates of the two sides of the stripe are returned; for two stripe regions (frame and frame), the coordinates of the outer sides of the two stripes; for two stripe regions (frame and door leaf), the coordinates of the inner sides of the two stripes).
Further, in step S204, matching the vertical stripe regions in R_l and R_r further comprises:
S301. According to the binocular stereo matching principle, the same object images with a disparity in the left and right cameras; in the parallel-optical-axis binocular ranging model, the x value of a point's image in the left camera's image coordinate system is greater than the x value of its image in the right camera's image coordinate system. That is, a match R_r^j in R_r for an element R_l^i in R_l must satisfy x(R_l^i) > x(R_r^j), where x(R_l^i) and x(R_r^j) are the x values of the center points of the two regions; candidate imaging points satisfying this constraint are selected.
S302. According to the geometric similarity constraint of binocular stereo matching, an element R_l^i in R_l and its match R_r^j in R_r must have similar geometric features, including but not limited to region width w, color (RGB, HSV, YUV color spaces, etc.) and texture (LBP, etc.); the similarity of these features is measured as follows:
for the width feature, a width similarity S_width is computed from the widths of the two regions, where n is the total number of regions in R_r satisfying the constraint;
for the other features, histogram statistics are used and a similarity measure (such as Euclidean distance, Mahalanobis distance or Bhattacharyya distance) gives the corresponding feature similarity S_feature;
the overall similarity is finally obtained by combining these feature similarities.
The candidate with the maximum similarity is selected as the match for R_l^i.
S303. Return to S301 and repeat S301 and S302 to obtain all matching regions in the left and right images.
The technical scheme of the present invention uses an odometer and vision sensors to measure the robot's position and the surrounding environment: the relative position of the door and the robot is computed from the information returned by vision, and the robot's global position from the information returned by the odometer, effectively acquiring the robot's position and pose parameters, so that the robot can be controlled to pass through the door accurately.
Other features and advantages of the present invention will be set forth in the following description, will partly become apparent from the description, or will be understood by practicing the present invention. The objects and other advantages of the present invention can be realized and obtained through the structures particularly pointed out in the written description, the claims and the accompanying drawings.
The technical scheme of the present invention is described in further detail below through the accompanying drawings and embodiments.
Brief description of the drawings
The accompanying drawings provide a further understanding of the present invention and constitute a part of the description; together with the embodiments of the present invention they serve to explain the present invention and do not limit it. In the drawings:
Fig. 1 is a flowchart of the vision-based robot navigation door-passing method in Embodiment 1 of the present invention;
Fig. 2 is a flowchart of the door recognition in the vision-based robot navigation door-passing method in Embodiment 1 of the present invention;
Fig. 3 is a flowchart of the vertical stripe region matching in the vision-based robot navigation door-passing method in Embodiment 1 of the present invention.
Detailed description of the invention
The preferred embodiments of the present invention are described below with reference to the accompanying drawings; it should be understood that the preferred embodiments described here are intended only to illustrate and explain the present invention, not to limit it.
Fig. 1 is the flowchart of the vision-based robot navigation door-passing method in Embodiment 1 of the present invention. As shown in Fig. 1, the flow comprises the following steps:
S101. Obtain the robot's current pose (x, y, θ) in the global coordinate system from the odometer or a global localization sensor, where x is the robot's abscissa relative to the origin, y is its ordinate relative to the origin, and θ is its heading angle relative to the x-axis; counterclockwise rotations are positive angles and clockwise rotations negative angles;
S102. Determine from the user's instruction the door the robot is to pass through;
S103. Determine the region in which the robot activates its cameras before passing through the door; the camera-activation region of each door is Area_i = (x_j, y_k) (1 ≤ j ≤ N, 1 ≤ k ≤ M), where i is the index of the door and N and M give the extent of a region in front of the door;
S104. Compute the endpoint of this navigation;
S105. Once the start and end points are determined, compute the robot's walking path with the A* algorithm and drive the robot to the navigation endpoint;
S106. From the dead-reckoning information (x_r, y_r, θ_r) returned by the odometer, compute the robot's position and attitude relative to the door, obtain the angle Δθ between the robot's heading and the line from the robot to the door's midpoint, and turn the robot by Δθ so that it faces the door;
S107. With the robot stationary, the lower-level controller sends the upper-level computer a request to activate the cameras; the upper-level computer activates the binocular cameras at the front of the robot and begins capturing images for door recognition;
S108. From the coordinates x_l, y_l, x_r, y_r (x_l < x_r, y_l < y_r) of the door's left and right frames returned by image processing, compute the door's center coordinates (x_0, y_0), where x_0 = (x_l + x_r)/2 and y_0 = (y_l + y_r)/2;
S109. Establish a Cartesian coordinate system XOY with (x_0, y_0) as the origin, the robot's current heading as reference, and the direction from the door's left frame to its right frame as the positive x-axis;
S110. Obtain the robot's coordinates (x'_r1, y'_r1, θ'_r1) in the XOY system, and obtain the distance Δd_1 and angle Δθ_1 between the robot and the first correction point (0, -1200) of the door-passing process;
S111. Drive the robot straight by Δd_1 and then turn it by Δθ_1; from the dead-reckoning information obtain the angle Δθ'_1 between the robot's current heading and the door's perpendicular bisector, and turn the robot by Δθ'_1 to align it with the target door;
S112. Obtain the robot's coordinates (x'_r2, y'_r2, θ'_r2) in the XOY system, and obtain the distance Δd_2 and angle Δθ_2 between the robot and the second correction point (0, -800);
S113. Drive the robot straight by Δd_2 and then turn it by Δθ_2; from the dead-reckoning information compute the angle Δθ'_2 between the robot's current heading and the door's perpendicular bisector, and turn the robot by Δθ'_2 to face the door;
S114. Obtain the robot's coordinates (x'_r3, y'_r3, θ'_r3) in the XOY system, and obtain the distance Δd_3 and angle Δθ_3 between the robot and the third correction point (0, -400);
S115. Drive the robot straight by Δd_3 and then turn it by Δθ_3; from the dead-reckoning information obtain the angle Δθ'_3 between the robot's current heading and the door's perpendicular bisector, and turn the robot by Δθ'_3 to face the door;
S116. Drive the robot forward by 1000 mm, completing the door-passing operation.
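The correction-point maneuvers of steps S110-S115 repeat the same geometry: compute a turn and a straight-line distance toward a point on the door's perpendicular bisector, then realign with that bisector. A rough sketch under our own conventions (radians, the XOY frame of step S109 with its +y axis along the perpendicular bisector; the exact command ordering and sign conventions are our assumptions):

```python
import math

def correction_command(pose, point):
    """Turn angle and straight-line distance taking the robot from its pose
    (x, y, theta) in the door frame XOY to a correction point (steps
    S110/S112/S114). Radians, counterclockwise positive."""
    x, y, theta = pose
    px, py = point
    dist = math.hypot(px - x, py - y)
    turn = (math.atan2(py - y, px - x) - theta + math.pi) % (2 * math.pi) - math.pi
    return turn, dist

def realign(theta):
    """Angle to turn so the heading lies along the door's perpendicular
    bisector, taken here as the +y direction of XOY (heading pi/2)."""
    return (math.pi / 2 - theta + math.pi) % (2 * math.pi) - math.pi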
Further, in step S104, computing the endpoint of this navigation further comprises the following steps:
searching the reference region and obtaining the Euclidean distance from each reference point in the region to the robot's current position, d_i = sqrt((x - x_r)² + (y - y_r)²), where x and y denote the coordinates of a grid cell in the grid map;
sorting all the computed d_i and finding the minimum d_i;
the grid position corresponding to the minimum d_i is the endpoint of this navigation.
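The endpoint search just described reduces to a nearest-cell query over the reference region of the grid map. A minimal sketch, with grid cells represented as (x, y) tuples (our representation, not the patent's):

```python
import math

def navigation_endpoint(reference_cells, robot_xy):
    """Return the grid cell of the reference region closest (Euclidean
    distance) to the robot's current position, i.e. the endpoint of this
    navigation (step S104)."""
    rx, ry = robot_xy
    return min(reference_cells, key=lambda c: math.hypot(c[0] - rx, c[1] - ry))
```

Scanning every cell and keeping the minimum distance is equivalent to the sort-then-take-minimum wording of the text, but avoids the full sort.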
Fig. 2 is the flowchart of the door recognition in the vision-based robot navigation door-passing method in Embodiment 1, i.e. the door recognition performed in step S107. As shown in Fig. 2, the flow comprises the following steps:
S201. The upper-level computer converts the left and right RGB images I_l and I_r collected by the binocular cameras to grayscale, obtaining gray images Gray_l and Gray_r;
S202. A global image binarization method (such as simple thresholding, Otsu's method or the Shanbhag method) is applied to the gray images Gray_l and Gray_r to binarize them, obtaining binary images Binary_l and Binary_r;
S203. Using a rectangular a × b structuring element (a < b, where a is 1 or 2 and b satisfies h/2 < b < h, h being the height of the captured image), apply morphological erosion and dilation to obtain the vertical stripe regions R_l and R_r in the binary images Binary'_l and Binary'_r, i.e. the regions where a door frame or door leaf may be present;
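Steps S202-S203 can be illustrated with a small NumPy sketch: a global Otsu threshold, then a vertical morphological opening with a 1 × b structuring element (using an opening to keep only tall foreground runs is our simplification of the erosion-and-dilation step; all names are ours):

```python
import numpy as np

def otsu_threshold(gray):
    """Global Otsu threshold for a uint8 grayscale image (step S202):
    picks the cut point maximizing the between-class variance."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    total = hist.sum()
    cum = np.cumsum(hist)
    cum_mean = np.cumsum(hist * np.arange(256))
    best_t, best_var = 0, -1.0
    for t in range(255):
        w0, w1 = cum[t], total - cum[t]
        if w0 == 0 or w1 == 0:
            continue
        m0 = cum_mean[t] / w0                       # mean of class <= t
        m1 = (cum_mean[-1] - cum_mean[t]) / w1      # mean of class > t
        var_between = w0 * w1 * (m0 - m1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

def vertical_open(binary, b):
    """Opening with a 1 x b vertical structuring element (step S203): keeps
    only foreground runs at least b pixels tall in each column, so tall
    door-frame / door-leaf stripes survive while short blobs vanish."""
    h, w = binary.shape
    out = np.zeros_like(binary)
    for col in range(w):
        run = 0
        for row in range(h + 1):
            if row < h and binary[row, col]:
                run += 1
            else:
                if run >= b:
                    out[row - run:row, col] = 1
                run = 0
    return out
```

On a synthetic image with a full-height bright stripe and a 2-pixel blob, only the stripe survives an opening with b just above half the image height, matching the h/2 < b < h choice in the text.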
S204. With the vertical stripe regions obtained in step S203 as the matching units, match the vertical stripe regions in R_l and R_r;
S205. For the matched vertical stripe regions obtained in step S204, apply the binocular ranging principle to the corresponding left/right points to obtain each region's position (x and y) in the robot coordinate system, and at the same time obtain the real width of each vertical stripe region in the image;
S206. Given that the real width of doors in domestic environments is 850 mm-1200 mm, the vertical stripe regions consistent with an actual door width can be obtained (either a single stripe region, i.e. the door leaf plus the left and right frames, or two stripe regions: frame and frame, or frame and door leaf); the resulting stripe regions are taken to be a door, and the coordinates x_l, y_l, x_r, y_r of the door's left and right frames in the robot coordinate system are returned to the lower-level controller (for a single stripe region (door leaf), the coordinates of the two sides of the stripe are returned; for two stripe regions (frame and frame), the coordinates of the outer sides of the two stripes; for two stripe regions (frame and door leaf), the coordinates of the inner sides of the two stripes).
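The width test of step S206 can be sketched as follows. Stripe regions are represented here as (center_x_mm, width_mm) pairs in the robot frame, sorted left to right; this representation, and the reduction to a single-stripe and a frame-pair case, are our simplifications:

```python
def door_candidate(stripes, min_w=850.0, max_w=1200.0):
    """Return (x_left, x_right) of a stripe configuration consistent with a
    real door width of 850-1200 mm, or None if nothing qualifies.
    - a single wide stripe (door leaf plus frames): its own two sides;
    - a pair of narrower stripes (frame and frame): their outer sides."""
    for x, w in stripes:                        # single-stripe case
        if min_w <= w <= max_w:
            return (x - w / 2, x + w / 2)
    for i in range(len(stripes)):               # two-stripe case
        for j in range(i + 1, len(stripes)):
            (x1, w1), (x2, w2) = stripes[i], stripes[j]
            left, right = x1 - w1 / 2, x2 + w2 / 2
            if min_w <= right - left <= max_w:
                return (left, right)
    return None
```

Two 60 mm frame stripes centered at ±500 mm span 1060 mm outer edge to outer edge and therefore qualify, while the same frames 4 m apart do not.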
Fig. 3 is the flowchart of the vertical stripe region matching in the vision-based robot navigation door-passing method in Embodiment 1, i.e. the step of matching the vertical stripe regions in R_l and R_r in step S204. As shown in Fig. 3, the flow comprises the following steps:
S301. According to the binocular stereo matching principle, the same object images with a disparity in the left and right cameras; in the parallel-optical-axis binocular ranging model, the x value of a point's image in the left camera's image coordinate system is greater than the x value of its image in the right camera's image coordinate system. That is, a match R_r^j in R_r for an element R_l^i in R_l must satisfy x(R_l^i) > x(R_r^j), where x(R_l^i) and x(R_r^j) are the x values of the center points of the two regions; candidate imaging points satisfying this constraint are selected.
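The ordering constraint of S301 follows directly from the parallel-axis binocular model, in which depth and lateral position are recovered from the column disparity. A minimal pinhole-model sketch (f_px, baseline_mm and cx are assumed calibration values, and the result is expressed in the left camera's frame; all names are ours):

```python
def stereo_xz(u_left, u_right, f_px, baseline_mm, cx):
    """Depth z and lateral offset x of a matched point in the left-camera
    frame. The model implies u_left > u_right for any point in front of the
    rig, which is exactly the ordering constraint used to prune candidates."""
    disparity = u_left - u_right
    if disparity <= 0:
        raise ValueError("u_left must exceed u_right for a valid match")
    z = f_px * baseline_mm / disparity   # depth from disparity
    x = (u_left - cx) * z / f_px         # lateral offset via similar triangles
    return x, z
```

With f = 500 px, a 100 mm baseline and principal column 320, a match at columns (370, 320) has disparity 50 px, so it lies 1000 mm ahead and 100 mm to the side.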
S302. According to the geometric similarity constraint of binocular stereo matching, an element R_l^i in R_l and its match R_r^j in R_r must have similar geometric features, including but not limited to region width w, color (RGB, HSV, YUV color spaces, etc.) and texture (LBP, etc.); the similarity of these features is measured as follows:
for the width feature, a width similarity S_width is computed from the widths of the two regions, where n is the total number of regions in R_r satisfying the constraint;
for the other features, histogram statistics are used and a similarity measure (such as Euclidean distance, Mahalanobis distance or Bhattacharyya distance) gives the corresponding feature similarity S_feature;
the overall similarity is finally obtained by combining these feature similarities.
The candidate with the maximum similarity is selected as the match for R_l^i.
S303. Return to S301 and repeat S301 and S302 to obtain all matching regions in the left and right images.
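Steps S301-S302 combine the ordering constraint with feature similarities. The patent's exact width-similarity formula and feature weighting are not legible in this text, so the sketch below substitutes a min/max width ratio and a Bhattacharyya coefficient on normalized histograms, mixed with equal weights; all three choices, and the region representation, are our assumptions:

```python
import math

def bhattacharyya(h1, h2):
    """Bhattacharyya coefficient of two normalized histograms (1 = identical)."""
    return sum(math.sqrt(a * b) for a, b in zip(h1, h2))

def match_regions(regions_left, regions_right):
    """Match each left-image stripe region to its best right-image candidate.
    A region is a dict with center column 'cx', width 'w' and histogram 'hist'.
    Candidates must satisfy the disparity ordering cx_left > cx_right (S301);
    the score mixes width and histogram similarity (S302)."""
    matches = {}
    for i, rl in enumerate(regions_left):
        best_j, best_s = None, -1.0
        for j, rr in enumerate(regions_right):
            if rl['cx'] <= rr['cx']:            # ordering constraint
                continue
            s_width = min(rl['w'], rr['w']) / max(rl['w'], rr['w'])
            s_hist = bhattacharyya(rl['hist'], rr['hist'])
            score = 0.5 * s_width + 0.5 * s_hist
            if score > best_s:
                best_j, best_s = j, score
        if best_j is not None:
            matches[i] = best_j
    return matches
```

In the example below, the right-image region at column 250 is rejected outright by the ordering constraint, so the region at column 150 wins even before the feature scores are compared.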
Because the technical scheme in the above embodiment uses an odometer, vision sensors and so on, computing the robot's global position from the information returned by the odometer and the relative position of the door and the robot from the information returned by vision, the robot's position and posture are effectively obtained, so that the robot can be controlled to pass through the door accurately.
Those skilled in the art should appreciate that embodiments of the present invention may be provided as a method, a system or a computer program product. Therefore, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Moreover, the present invention may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage and optical storage) containing computer-usable program code.
The present invention is described with reference to flowcharts and/or block diagrams of methods, devices (systems) and computer program products according to embodiments of the present invention. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks therein, can be implemented by computer program instructions. These computer program instructions can be provided to the processor of a general-purpose computer, a special-purpose computer, an embedded processor or another programmable data processing device to produce a machine, so that the instructions executed by the processor of the computer or other programmable data processing device produce a device for realizing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data processing device to work in a specific manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction device that realizes the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or other programmable data processing device, so that a sequence of operational steps is performed on the computer or other programmable device to produce a computer-implemented process, whereby the instructions executed on the computer or other programmable device provide steps for realizing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
Obviously, those skilled in the art can make various changes and modifications to the present invention without departing from its spirit and scope. Thus, if these modifications and variations of the present invention fall within the scope of the claims of the present invention and their technical equivalents, the present invention is also intended to include them.
Claims (4)
1. the robot navigation of a view-based access control model moves into one's husband's household upon marriage method, it is characterised in that comprise the following steps:
S101, obtain robot current time Position and orientation parameters (x in global coordinate system according to speedometer or Global localization sensor, y, θ), x is the robot abscissa relative to initial point, y is the robot vertical coordinate relative to initial point, θ be robot towards the angle relative to x-axis, rotate to be counterclockwise on the occasion of angle angle, clockwise turn to negative value angle angle;
S102, determine the door that robot to pass through according to user instruction;
S103, determining that robot moves into one's husband's household upon marriage and open the region of photographic head before, each unlatching photographic head region is: Areai=(xj,yk) (1≤j≤N, 1≤k≤M), wherein i represents the numbering of door, and N and M represents the scope in certain region in front of the door;
S104, calculate and this time navigate terminal;
After S105, beginning and end are determined, by A* algorithm calculating robot's walking path, control robot motion to the terminal that navigates;
S106, the dead reckoning information (x returned according to speedometerr,yr,θr), the position of calculating robot's phase opposite house and attitude, obtain robot towards the angle Δ θ with robot Yu door midpoint line, control revolute Δ θ and make robot towards door;
S107, robot are static, and slave computer sends request to host computer and opens photographic head order simultaneously, and host computer opens the binocular camera being positioned at robot front, and starts to gather image, carry out an identification;
S108, the coordinate x of door the right and left frame returned according to image procossingl,yl,xr,yr(xl<xr,yl<yr) calculate door centre coordinate (x0,y0), wherein,
S109, with xo,yoFor initial point, robot is currently oriented benchmark, to set up cartesian coordinate system XOY from door left frame sensing left frame for x-axis forward;
S110, obtain robot coordinate (x ' under XOY coordinate systemr1,y′r1,θ′r1), and obtain robot and relatively move into one's husband's household upon marriage in process first and correct point (xl1,yl1) distance, delta d1With angle delta θ1, wherein,
-200mm<xl1< 200mm,
-1400mm<yl1<-1000mm;
S111, control robot craspedodrome Δ d1, then turning Δ θ1, according to dead reckoning information obtain robot currently towards with door perpendicular bisector angle Δ θ '1, control robot and turn over Δ θ '1Angle is to main entrance;
S112, obtain robot coordinate (x ' under XOY coordinate systemr2,y′r2,θ′r2), obtain robot and relatively move into one's husband's household upon marriage in process second and correct point (xl2,yl2) distance, delta d2With angle delta θ2, wherein,
-200mm<xl2< 200mm,
-1000mm<yl2<-600mm;
S113, control robot craspedodrome Δ d2, then turning Δ θ2, according to dead reckoning information calculate robot currently towards with door perpendicular bisector angle Δ θ '2, make robot turn over Δ θ '2To main entrance;
S114, the robot coordinates (x′r3, y′r3, θ′r3) in the XOY coordinate system are obtained, together with the distance Δd3 and angle Δθ3 from the robot to the third correction point (xl3, yl3) of the door-passing process, where -200 mm &lt; xl3 &lt; 200 mm and -600 mm &lt; yl3 &lt; -300 mm;
S115, the robot is controlled to drive straight for Δd3 and then turn by Δθ3; from the dead-reckoning information, the angle Δθ′3 between the robot's current heading and the door's perpendicular bisector is obtained, and the robot is turned by Δθ′3 to directly face the door;
S116, the robot is controlled to advance by Δd, where 800 mm &lt; Δd &lt; 1200 mm, which completes the door-passing operation.
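One plausible reading of the correction-point maneuver repeated in steps S110 to S115 is sketched below (a minimal illustration, not the patented implementation; the angle conventions, the order of turning and driving, and the helper `normalize_angle` are assumptions):

```python
import math

def normalize_angle(angle):
    """Wrap an angle to the interval (-pi, pi]."""
    while angle <= -math.pi:
        angle += 2 * math.pi
    while angle > math.pi:
        angle -= 2 * math.pi
    return angle

def correction_maneuver(pose, target):
    """Compute the motion commands for one correction point.

    pose   -- (x, y, theta): robot pose in the door-centered XOY frame,
              theta measured from the positive x-axis, in radians
    target -- (xt, yt): a correction point in the same frame

    Returns (delta_d, delta_theta, delta_theta_prime): the straight-line
    distance to the correction point, the turn needed to head toward it,
    and the residual turn at the correction point needed to face the
    door again (the door's perpendicular bisector is the y-axis, so the
    desired final heading is +pi/2).
    """
    x, y, theta = pose
    xt, yt = target
    delta_d = math.hypot(xt - x, yt - y)              # distance to drive
    bearing = math.atan2(yt - y, xt - x)              # direction of the point
    delta_theta = normalize_angle(bearing - theta)    # turn toward the point
    delta_theta_prime = normalize_angle(math.pi / 2 - bearing)  # re-face door
    return delta_d, delta_theta, delta_theta_prime
```

For example, a robot at (0, -2000) already facing the door needs no turning and an 800 mm straight drive to reach a correction point at (0, -1200).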
2. The method according to claim 1, characterised in that in step S104 the calculation of the navigation terminal further comprises:
searching the reference zone and obtaining the Euclidean distance from each reference point to the robot's current position, namely di = √((x − xr)² + (y − yr)²), where (x, y) is a grid coordinate in the grid map and (xr, yr) is the robot's current position;
sorting all obtained di and finding the minimum di;
the grid position corresponding to the minimum di is the terminal of this navigation.
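The search in claim 2 is a nearest-neighbour scan over the reference zone; a minimal sketch, where representing the zone as a list of (x, y) grid tuples is an assumption:

```python
import math

def nearest_reference_cell(reference_cells, robot_xy):
    """Return the grid cell in the reference zone with the smallest
    Euclidean distance d_i to the robot's current position; this cell
    is the terminal of the current navigation (step S104 / claim 2).

    reference_cells -- iterable of (x, y) grid coordinates
    robot_xy        -- (xr, yr) robot position in the same grid frame
    """
    xr, yr = robot_xy
    return min(reference_cells,
               key=lambda cell: math.hypot(cell[0] - xr, cell[1] - yr))
```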
3. The method according to claim 1, characterised in that in step S107 the door recognition further comprises:
S201, the host computer converts the left and right RGB images Il and Ir captured by the binocular camera to grayscale, obtaining the gray images Grayl and Grayr;
S202, a global image binarization method (e.g. fixed thresholding, Otsu's method, the Shanbhag method) is applied to Grayl and Grayr respectively to obtain the binary images Binaryl and Binaryr;
S203, using a rectangular a×b structural element (a &lt; b, with a equal to 1 or 2 and h/2 &lt; b &lt; h, where h is the height of the captured image), morphological erosion and dilation are applied to obtain the vertical stripe regions Rl and Rr in the binary images Binary′l and Binary′r, i.e. the regions likely to contain a doorframe or a door leaf;
S204, taking the vertical stripe regions obtained in step S203 as the matching units, the vertical stripe regions in Rl are matched against those in Rr;
S205, applying the binocular ranging principle to the left/right corresponding points of the vertical stripe regions matched in step S204 yields each region's position (x and y) in the robot coordinate system, and at the same time the actual width of each vertical stripe region in the image is obtained;
S206, given that the actual width of a door in a domestic environment is 850 mm-1200 mm, the vertical stripe configurations consistent with an actual door width are retained: either one vertical stripe region (door leaf plus left and right doorframes) or two vertical stripe regions (doorframe and doorframe, or doorframe and door leaf). The retained vertical stripe regions are taken to be the door, and the coordinates (xl, yl) and (xr, yr) of the door's left and right frames in the robot coordinate system are returned to the slave computer (for one vertical stripe region (door leaf), the coordinates of the region's two sides are returned; for two vertical stripe regions (doorframe and doorframe), the coordinates of the two regions' outer edges are returned; for two vertical stripe regions (doorframe and door leaf), the coordinates of the two regions' inner edges are returned).
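The width filter of step S206 can be illustrated as follows. This simplified sketch checks single stripe regions and stripe-region pairs against the 850-1200 mm domestic door width; it always returns outer edges, ignoring the claim's inner-edge rule for the doorframe-plus-door-leaf case, and encoding each region as an (x_left, x_right) extent in millimetres is an assumption:

```python
def select_door_regions(regions, min_w=850, max_w=1200):
    """Pick vertical-stripe regions whose combined extent matches a
    plausible domestic door width (850-1200 mm), per step S206.

    regions -- list of (x_left, x_right) horizontal extents, in mm,
               in the robot coordinate frame, sorted left to right
    Returns (xl, xr), the door's left and right frame x-coordinates,
    or None if no configuration matches.
    """
    # Case 1: a single region spanning the whole door (leaf + frames)
    for xl, xr in regions:
        if min_w <= xr - xl <= max_w:
            return xl, xr
    # Case 2: a pair of regions whose outer-edge separation matches
    # a door width (frame/frame or frame/leaf, simplified here)
    for i in range(len(regions)):
        for j in range(i + 1, len(regions)):
            xl, xr = regions[i][0], regions[j][1]
            if min_w <= xr - xl <= max_w:
                return xl, xr
    return None
```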
4. The method according to claim 3, characterised in that in step S204 the matching of the vertical stripe regions in Rl and Rr further comprises:
S301, according to the binocular stereo matching principle, the same object imaged by the left and right cameras exhibits parallax; for a parallel-optical-axis binocular ranging model, an imaging point's x value in the left camera image coordinate system is greater than its x value in the right camera image coordinate system. Thus an element ri in Rl and its match rj in Rr must satisfy x(ri) &gt; x(rj), where x(·) denotes the x value of a region's center point; the candidates in Rr satisfying this constraint are selected.
S302, according to the geometric similarity constraint in binocular stereo matching, an element ri in Rl and its match rj in Rr must have similar geometric features, including but not limited to the region width w, color (RGB, HSV, YUV color spaces, etc.) and texture (LBP, etc.); similarity is measured over these features:
for the width feature, a width similarity Swidth is computed from the widths wi and wj of ri and rj, where n is the total number of regions in Rr satisfying the constraint;
for the other features, histogram statistics are used, and a similarity measure (e.g. Euclidean distance, Mahalanobis distance, Bhattacharyya distance) yields the corresponding feature similarity Sfeature;
the overall similarity S is finally obtained by combining Swidth with the feature similarities Sfeature.
The rj with the maximum similarity is selected as the match for ri.
S303, return to S301 and repeat S301-S302 until all matching regions in the left and right images are obtained.
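The disparity constraint of S301 and a width-based similarity for S302 can be sketched together as below; the width score used here is an illustrative choice (the claim's exact Swidth formula is not reproduced), and encoding each region as a dict with center `'cx'` and width `'w'` is an assumption:

```python
def match_stripes(left_regions, right_regions):
    """Match vertical-stripe regions between the left and right images
    (steps S301-S303): a left region's center x must exceed its right-
    image counterpart's (parallel-axis disparity constraint), and among
    the remaining candidates the most similar width wins.

    Each region is a dict with center x 'cx' and width 'w'.
    Returns a list of (left_index, right_index) pairs.
    """
    matches = []
    for i, rl in enumerate(left_regions):
        # Disparity constraint: candidate must lie further left in image x
        candidates = [(j, rr) for j, rr in enumerate(right_regions)
                      if rl["cx"] > rr["cx"]]
        if not candidates:
            continue

        # Width similarity: 1 at equal widths, decreasing with difference
        def s_width(rr):
            return 1.0 - abs(rl["w"] - rr["w"]) / max(rl["w"], rr["w"])

        j, _ = max(candidates, key=lambda c: s_width(c[1]))
        matches.append((i, j))
    return matches
```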
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610266610.5A CN105717928B (en) | 2016-04-26 | 2016-04-26 | Vision-based robot navigation door-passing method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN105717928A true CN105717928A (en) | 2016-06-29 |
CN105717928B CN105717928B (en) | 2018-03-30 |
Family
ID=56161615
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610266610.5A Active CN105717928B (en) | 2016-04-26 | 2016-04-26 | Vision-based robot navigation door-passing method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN105717928B (en) |
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106444777A (en) * | 2016-10-28 | 2017-02-22 | 北京进化者机器人科技有限公司 | Robot automatic return charging method and system |
CN106504288A (en) * | 2016-10-24 | 2017-03-15 | 北京进化者机器人科技有限公司 | Door localization method in domestic environments based on binocular vision target detection |
CN107329476A (en) * | 2017-08-02 | 2017-11-07 | 珊口(上海)智能科技有限公司 | A kind of room topology map construction method, system, device and sweeping robot |
CN108319268A (en) * | 2018-02-08 | 2018-07-24 | 衢州职业技术学院 | Vision-based robot navigation door-passing method |
CN109674404A (en) * | 2019-01-26 | 2019-04-26 | 深圳市云鼠科技开发有限公司 | A kind of sweeping robot avoidance processing mode based on free move technology |
CN110293548A (en) * | 2018-03-21 | 2019-10-01 | 中车株洲电力机车研究所有限公司 | Obstacle avoidance method and control system for a locomotive inspection and repair intelligent robot passing through a narrow door |
CN110456802A (en) * | 2019-08-30 | 2019-11-15 | 上海有个机器人有限公司 | Method for a mobile robot to pass through a gate safely and quickly |
CN107479551B (en) * | 2017-08-22 | 2020-11-10 | 北京小米移动软件有限公司 | Method and device for controlling movement |
CN112000109A (en) * | 2020-09-10 | 2020-11-27 | 广西亚像科技有限责任公司 | Position correction method for power inspection robot, power inspection robot and medium |
CN112650250A (en) * | 2020-12-23 | 2021-04-13 | 深圳市杉川机器人有限公司 | Map construction method and robot |
CN113386138A (en) * | 2021-07-01 | 2021-09-14 | 上海宜硕网络科技有限公司 | Robot door opening control method and device and electronic equipment |
CN113459098A (en) * | 2021-07-01 | 2021-10-01 | 上海宜硕网络科技有限公司 | Robot door closing control method and device and electronic equipment |
CN114018268A (en) * | 2021-11-05 | 2022-02-08 | 上海景吾智能科技有限公司 | Indoor mobile robot navigation method |
CN114296470A (en) * | 2022-01-21 | 2022-04-08 | 河南牧原智能科技有限公司 | Robot navigation method, device and medium |
CN114505857A (en) * | 2022-01-24 | 2022-05-17 | 达闼机器人股份有限公司 | Robot control method, device, system and computer readable storage medium |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108915492A (en) * | 2018-06-26 | 2018-11-30 | 北京云迹科技有限公司 | Control method and system, unmanned device and automatic door |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH02145914A (en) * | 1988-02-24 | 1990-06-05 | United Technol Corp <Utc> | Display device for aircraft |
EP0679903A2 (en) * | 1989-12-11 | 1995-11-02 | Caterpillar Inc. | Integrated vehicle positioning and navigation system, apparatus and method |
US5610815A (en) * | 1989-12-11 | 1997-03-11 | Caterpillar Inc. | Integrated vehicle positioning and navigation system, apparatus and method |
WO2005098475A1 (en) * | 2004-03-29 | 2005-10-20 | Evolution Robotics, Inc. | Sensing device and method for measuring position and orientation relative to multiple light sources |
CN102052923A (en) * | 2010-11-25 | 2011-05-11 | 哈尔滨工程大学 | Small-sized underwater robot combined navigation system and navigation method |
CN102829777A (en) * | 2012-09-10 | 2012-12-19 | 江苏科技大学 | Integrated navigation system for autonomous underwater robot and method |
- 2016-04-26: CN application CN201610266610.5A filed; granted as CN105717928B (status: Active)
Cited By (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106504288A (en) * | 2016-10-24 | 2017-03-15 | 北京进化者机器人科技有限公司 | Door localization method in domestic environments based on binocular vision target detection |
WO2018077164A1 (en) * | 2016-10-28 | 2018-05-03 | 北京进化者机器人科技有限公司 | Method and system for enabling robot to automatically return for charging |
CN106444777B (en) * | 2016-10-28 | 2019-12-17 | 北京进化者机器人科技有限公司 | Automatic returning and charging method and system for robot |
CN106444777A (en) * | 2016-10-28 | 2017-02-22 | 北京进化者机器人科技有限公司 | Robot automatic return charging method and system |
CN107329476A (en) * | 2017-08-02 | 2017-11-07 | 珊口(上海)智能科技有限公司 | A kind of room topology map construction method, system, device and sweeping robot |
CN107479551B (en) * | 2017-08-22 | 2020-11-10 | 北京小米移动软件有限公司 | Method and device for controlling movement |
CN108319268A (en) * | 2018-02-08 | 2018-07-24 | 衢州职业技术学院 | Vision-based robot navigation door-passing method |
CN110293548A (en) * | 2018-03-21 | 2019-10-01 | 中车株洲电力机车研究所有限公司 | Obstacle avoidance method and control system for a locomotive inspection and repair intelligent robot passing through a narrow door |
CN110293548B (en) * | 2018-03-21 | 2022-06-10 | 中车株洲电力机车研究所有限公司 | Obstacle avoidance method and control system for intelligent narrow door crossing of locomotive inspection and repair robot |
CN109674404B (en) * | 2019-01-26 | 2021-08-10 | 深圳市云鼠科技开发有限公司 | Obstacle avoidance processing mode of sweeping robot based on free move technology |
CN109674404A (en) * | 2019-01-26 | 2019-04-26 | 深圳市云鼠科技开发有限公司 | A kind of sweeping robot avoidance processing mode based on free move technology |
CN110456802A (en) * | 2019-08-30 | 2019-11-15 | 上海有个机器人有限公司 | Method for a mobile robot to pass through a gate safely and quickly |
CN112000109A (en) * | 2020-09-10 | 2020-11-27 | 广西亚像科技有限责任公司 | Position correction method for power inspection robot, power inspection robot and medium |
CN112650250A (en) * | 2020-12-23 | 2021-04-13 | 深圳市杉川机器人有限公司 | Map construction method and robot |
CN113386138A (en) * | 2021-07-01 | 2021-09-14 | 上海宜硕网络科技有限公司 | Robot door opening control method and device and electronic equipment |
CN113459098A (en) * | 2021-07-01 | 2021-10-01 | 上海宜硕网络科技有限公司 | Robot door closing control method and device and electronic equipment |
CN113459098B (en) * | 2021-07-01 | 2022-06-03 | 上海宜硕网络科技有限公司 | Robot door closing control method and device and electronic equipment |
CN113386138B (en) * | 2021-07-01 | 2022-06-03 | 上海宜硕网络科技有限公司 | Robot door opening control method and device and electronic equipment |
CN114018268A (en) * | 2021-11-05 | 2022-02-08 | 上海景吾智能科技有限公司 | Indoor mobile robot navigation method |
CN114296470A (en) * | 2022-01-21 | 2022-04-08 | 河南牧原智能科技有限公司 | Robot navigation method, device and medium |
CN114505857A (en) * | 2022-01-24 | 2022-05-17 | 达闼机器人股份有限公司 | Robot control method, device, system and computer readable storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN105717928B (en) | 2018-03-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN105717928A (en) | Vision-based robot navigation door-passing method | |
US10939791B2 (en) | Mobile robot and mobile robot control method | |
Nieto et al. | Recursive scan-matching SLAM | |
Xu et al. | Ceiling-based visual positioning for an indoor mobile robot with monocular vision | |
Maier et al. | Vision-based humanoid navigation using self-supervised obstacle detection | |
Hochdorfer et al. | 6 DoF SLAM using a ToF camera: The challenge of a continuously growing number of landmarks | |
Wei et al. | 3D semantic map-based shared control for smart wheelchair | |
Sáez et al. | Entropy minimization SLAM using stereo vision | |
Li et al. | A mobile robotic arm grasping system with autonomous navigation and object detection | |
Chen et al. | Design and Implementation of AMR Robot Based on RGBD, VSLAM and SLAM | |
Núnez et al. | Fast laser scan matching approach based on adaptive curvature estimation for mobile robots | |
Blanco et al. | Localization by voronoi diagrams correlation | |
CN115307641A (en) | Robot positioning method, device, robot and storage medium | |
Cupec et al. | Fast pose tracking based on ranked 3D planar patch correspondences | |
Chen et al. | Multiple-object tracking based on monocular camera and 3-D lidar fusion for autonomous vehicles | |
Bonin-Font et al. | A monocular mobile robot reactive navigation approach based on the inverse perspective transformation | |
Jia et al. | Homography-based visual predictive control of tracked mobile robot with field-of-view constraints | |
Frintrop et al. | Pay attention when selecting features | |
KR100991194B1 (en) | System and method for transporting object of moving robot | |
Yu et al. | Distance estimation method with snapshot landmark images in the robotic homing navigation | |
Tomono | Building an object map for mobile robots using LRF scan matching and vision-based object recognition | |
Wang et al. | Real-time obstacle detection with a single camera | |
Zeng et al. | An indoor global localization technique for mobile robots in long straight environments | |
Chen | Recognition and localization of target images for robot vision navigation control | |
Ray et al. | Simultaneous Localisation and Image Intensity Based Occupancy Grid Map Building--A New Approach |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant | ||
CP03 | Change of name, title or address | ||
CP03 | Change of name, title or address |
Address after: Building 65, No. 17, Jiujiang Road, Tongji New Economic Zone, Jimo District, Qingdao City, Shandong Province, 266200
Patentee after: Qingdao Evolver xiaopang Robot Technology Co.,Ltd.
Address before: Room 02-A426, 2nd Floor, Block B, No. 22, Information Road, Haidian District, Beijing 100029
Patentee before: BEIJING EVOLVER ROBOTICS Co.,Ltd.