CN105437251B - A kind of method and device of positioning robot position - Google Patents
- Authority
- CN
- China
- Legal status: Active
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
- B25J13/088—Controls for manipulators with position, velocity or acceleration sensors
- B25J13/089—Determining the position of the robot with reference to its environment
Abstract
The invention provides a method and device for locating a robot's position. The method includes: receiving a QR-code picture taken by the robot and obtaining the QR code's centre coordinate point and angle from the picture; cropping pictures of optical marker points from the QR-code picture according to a preset distance value and the QR code's centre coordinate point; obtaining the centre coordinates of an optical marker point from the picture of the optical marker point; and determining the robot's current position and angle from the QR code's centre coordinate point, the optical marker point's centre coordinates, and the shooting angle at which the QR-code picture was taken. The robot's position is thus determined from the QR code, the optical marker point, and the angle at which the robot photographed the QR code, which improves the accuracy of locating the robot's current position and reduces positioning error, so that judgments made from the located position are highly accurate and the robot can be steered quickly to its destination.
Description
Technical field
The present invention relates to the field of robotics, and in particular to a method and device for locating a robot's position.
Background art
At present, with the rapid development of information technology, QR codes are widely used by enterprises and on all kinds of products. A QR code encodes information about an enterprise or product, and that information can be retrieved by scanning the code. For example, an industrial robot can obtain information about its own location by scanning a QR code placed in its working region.
Currently, when an industrial robot obtains its location from a QR code, it first photographs a QR code placed in its working region with a camera mounted on the robot, then parses the code to obtain the position information of the region where the code is located, and takes that position as the robot's current position. A console then controls the robot's movement according to this current position and the robot's target position.
However, there is a deviation between the robot's actual position when it photographs a QR code and the position of the code itself. Taking the photographed code's position information as the robot's current position therefore makes the determined position very inaccurate, and the positioning error is large.
Summary of the invention
In view of this, the purpose of the embodiments of the present invention is to provide a method and device for locating a robot's position that determine the robot's position from a QR code, an optical marker point, and the angle at which the robot photographed the QR code. This improves the accuracy of locating the robot's current position and reduces positioning error, so that judgments made from the located position are highly accurate and the robot can be steered quickly to its destination.
In a first aspect, the embodiments of the invention provide a method of locating a robot's position, the method including:
receiving a QR-code picture taken by the robot, and obtaining the centre coordinate point and angle of the QR code from the picture;
cropping pictures of optical marker points from the QR-code picture according to a preset distance value and the centre coordinate point of the QR code;
obtaining the centre coordinates of an optical marker point from the picture of the optical marker point;
determining the robot's current position and angle from the centre coordinate point of the QR code, the centre coordinates of the optical marker point, and the shooting angle at which the QR-code picture was taken.
With reference to the first aspect, the embodiments of the invention provide a first possible implementation of the first aspect, in which cropping pictures of optical marker points from the QR-code picture according to the preset distance value and the centre coordinate point of the QR code includes:
determining, in the QR-code picture, a circle whose centre is the centre coordinate point of the QR code and whose radius is the preset distance value;
cropping a number of sections of a preset size, each centred on a point on the arc of the determined circle, and taking the cropped sections as the pictures of the optical marker points.
With reference to the first possible implementation of the first aspect, the embodiments of the invention provide a second possible implementation of the first aspect, in which obtaining the centre coordinates of an optical marker point from the picture of the optical marker point includes:
detecting in turn whether each of the pictures of the optical marker points contains an optical marker point;
when the picture of some optical marker point is detected to contain an optical marker point, magnifying that optical marker point;
parsing the magnified optical marker point to obtain its centre coordinates.
With reference to the first aspect, the embodiments of the invention provide a third possible implementation of the first aspect, in which determining the robot's current position and angle from the centre coordinate point of the QR code, the centre coordinates of the optical marker point, and the shooting angle at which the QR-code picture was taken includes:
taking the line between the centre coordinate point of the QR code and the centre coordinates of the optical marker point as a baseline;
calculating the robot's position and angle by trigonometry from the baseline, the centre coordinate point of the QR code, and the shooting angle at which the QR-code picture was taken.
With reference to the first aspect, the embodiments of the invention provide a fourth possible implementation of the first aspect, in which, after the robot's current position and angle are determined, the method includes:
generating a move instruction for the robot according to its determined current position and angle and its corresponding target position, and sending the move instruction to the robot so that the robot moves according to it.
In a second aspect, the embodiments of the invention provide a device for locating a robot's position, the device including:
a first acquisition module, for receiving a QR-code picture taken by the robot and obtaining the centre coordinate point and angle of the QR code from the picture;
a cropping module, for cropping pictures of optical marker points from the QR-code picture according to a preset distance value and the centre coordinate point of the QR code;
a second acquisition module, for obtaining the centre coordinates of an optical marker point from the picture of the optical marker point;
a determining module, for determining the robot's current position and angle from the centre coordinate point of the QR code, the centre coordinates of the optical marker point, and the shooting angle at which the QR-code picture was taken.
With reference to the second aspect, the embodiments of the invention provide a first possible implementation of the second aspect, in which the cropping module includes:
a first determining unit, for determining, in the QR-code picture, a circle whose centre is the centre coordinate point of the QR code and whose radius is the preset distance value;
a first cropping unit, for cropping a number of sections of a preset size, each centred on a point on the arc of the determined circle, and taking the cropped sections as the pictures of the optical marker points.
With reference to the first possible implementation of the second aspect, the embodiments of the invention provide a second possible implementation of the second aspect, in which the first cropping unit includes:
a detection subunit, for detecting in turn whether each of the pictures of the optical marker points contains an optical marker point;
a magnification subunit, for magnifying an optical marker point when the picture of some optical marker point is detected to contain one;
a parsing subunit, for parsing the magnified optical marker point to obtain its centre coordinates.
With reference to the second aspect, the embodiments of the invention provide a third possible implementation of the second aspect, in which the determining module includes:
a second determining unit, for taking the line between the centre coordinate point of the QR code and the centre coordinates of the optical marker point as a baseline;
a calculating unit, for calculating the robot's position and angle by trigonometry from the baseline, the centre coordinate point of the QR code, and the shooting angle at which the QR-code picture was taken.
With reference to the second aspect, the embodiments of the invention provide a fourth possible implementation of the second aspect, in which the device further includes:
a generation module, for generating a move instruction for the robot according to its determined current position and angle and its corresponding target position;
a sending module, for sending the move instruction to the robot so that the robot moves according to it.
In the method and device for locating a robot's position provided by the embodiments of the present invention, the method includes: receiving a QR-code picture taken by the robot and obtaining the centre coordinate point and angle of the QR code from it; cropping pictures of optical marker points from the QR-code picture according to a preset distance value and the centre coordinate point of the QR code; obtaining the centre coordinates of an optical marker point from its picture; and determining the robot's current position and angle from the centre coordinate point of the QR code, the centre coordinates of the optical marker point, and the shooting angle at which the QR-code picture was taken. The device includes a first acquisition module, a cropping module, a second acquisition module, and a determining module that carry out the corresponding steps. The robot's position is thus determined from the QR code, the optical marker point, and the angle at which the robot photographed the QR code, which improves the accuracy of locating the robot's current position and reduces positioning error, so that judgments made from the located position are highly accurate and the robot can be steered quickly to its destination.
To make the above objects, features, and advantages of the present invention clearer and easier to understand, preferred embodiments are described in detail below with reference to the accompanying drawings.
Brief description of the drawings
To illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings needed for the embodiments are briefly described below. It should be understood that the following drawings show only certain embodiments of the invention and are therefore not to be taken as limiting its scope; those of ordinary skill in the art can derive other related drawings from them without creative effort.
Figure 1A shows a flow chart of a method of locating a robot's position provided by embodiment 1 of the present invention;
Figure 1B shows a flow chart, provided by embodiment 1 of the present invention, of obtaining the centre coordinates of an optical marker point from the picture of the optical marker point;
Figure 2A shows a first schematic diagram of a device for locating a robot's position provided by embodiment 2 of the present invention;
Figure 2B shows a second schematic diagram of a device for locating a robot's position provided by embodiment 2 of the present invention;
Figure 2C shows a third schematic diagram of a device for locating a robot's position provided by embodiment 2 of the present invention.
Detailed description of the embodiments
The technical solutions of the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. Plainly, the described embodiments are only some of the embodiments of the invention, not all of them. The components of the embodiments, as generally described and illustrated in the drawings here, could be arranged and designed in a variety of configurations. The following detailed description of the embodiments shown in the drawings is therefore not intended to limit the scope of the claimed invention, but merely represents selected embodiments of it. All other embodiments obtained by those skilled in the art from these embodiments without creative effort fall within the scope of protection of the invention.
In the prior art, there is a deviation between the robot's actual position when it photographs a QR code and the position of the code itself, and taking the photographed code's position information as the robot's current position makes the determined position very inaccurate, with a large positioning error. In view of this, the present invention provides a method and device for locating a robot's position that determine the robot's position from a QR code, an optical marker point, and the angle at which the robot photographed the QR code, improving the accuracy of locating the robot's current position and reducing positioning error, so that judgments made from the located position are highly accurate and the robot can be steered quickly to its destination. This is described below through the embodiments.
Embodiment 1
The embodiments of the invention provide a method of locating a robot's position. A number of QR codes are arranged according to a certain rule in the region where the robot works, and several optical marker points are arranged around each QR code. The robot scans QR-code pictures in the working region and transmits the scanned pictures to a console in real time; the console identifies the position of the QR code in each picture by the method provided in the embodiments, and then works out the robot's position and angle from the angle at which the robot photographed the QR code. Referring to Figure 1A, the method includes the following steps S101 to S104:
Step S101: receive the QR-code picture taken by the robot, and obtain the centre coordinate point and angle of the QR code from the picture.
The robot is fitted with a camera or scanner. While the robot is working, it scans QR-code pictures in its working region in real time through the camera or scanner, and sends each scanned picture, together with its own identifier, to the console over a wireless network.
After the console receives the QR-code picture and the robot's identifier, it parses the picture and obtains the centre coordinate point and angle of the QR code from the parsed picture.
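As a concrete illustration of this parsing step, the sketch below computes a QR code's centre coordinate point and in-plane angle from the four corner points that a typical QR detector returns. The patent does not specify this algorithm; the detector, the corner ordering, and the function name are all assumptions for illustration.

```python
import math

def qr_center_and_angle(corners):
    """Centre coordinate point and in-plane angle (degrees) of a QR code,
    given four corner points assumed ordered top-left, top-right,
    bottom-right, bottom-left in image coordinates."""
    cx = sum(x for x, _ in corners) / 4.0
    cy = sum(y for _, y in corners) / 4.0
    # The code's rotation is read off the direction of its top edge.
    (x0, y0), (x1, y1) = corners[0], corners[1]
    angle = math.degrees(math.atan2(y1 - y0, x1 - x0))
    return (cx, cy), angle

center, angle = qr_center_and_angle([(0, 0), (10, 0), (10, 10), (0, 10)])
```

For the axis-aligned square above, the centre is (5.0, 5.0) and the angle is 0.0; a rotated code yields a non-zero angle.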
Step S102: crop the pictures of the optical marker points from the QR-code picture according to the preset distance value and the centre coordinate point of the QR code.
Because the robot may photograph the optical marker points of the QR code while shooting the QR-code picture, in the embodiments of the invention the pictures of the optical marker points are cropped from the QR-code picture according to the following steps S1021 and S1022.
Step S1021: in the QR-code picture, determine a circle whose centre is the centre coordinate point of the QR code and whose radius is the preset distance value.
The preset distance value may be the distance between the centre coordinate point of the QR code and the centre coordinates of an optical marker point.
Once the circle is determined, the region it covers may contain optical marker points, and step S1022 can further determine their specific locations.
Step S1022: crop a number of sections of a preset size, each centred on a point on the arc of the determined circle, and take the cropped sections as the pictures of the optical marker points.
The preset size may be a preset multiple of the size of an optical marker point. After the sections of the preset size are cropped, the console takes them as the pictures of the optical marker points, some of which contain an optical marker point.
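Steps S1021 and S1022 can be sketched as follows. The patent does not say how the points on the arc are chosen, so the even angular spacing, the box format, and the parameter names are assumptions for illustration.

```python
import math

def marker_crop_boxes(center, radius, size, n=8):
    """Candidate crop boxes (left, top, right, bottom) for the optical
    marker points: n sections of the given size, each centred on a point
    of a circle of the preset radius about the QR code's centre."""
    cx, cy = center
    half = size / 2.0
    boxes = []
    for k in range(n):
        theta = 2.0 * math.pi * k / n
        px = cx + radius * math.cos(theta)  # point on the arc
        py = cy + radius * math.sin(theta)
        boxes.append((px - half, py - half, px + half, py + half))
    return boxes

boxes = marker_crop_boxes((100.0, 100.0), radius=40.0, size=16.0, n=4)
```

Each returned box would then be cropped out of the QR-code picture and checked for a marker in step S103.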
Step S103: obtain the centre coordinates of the optical marker point from the picture of the optical marker point.
After the pictures of the optical marker points are obtained, the console obtains the centre coordinates of the optical marker point from its picture through the following steps S1031 to S1033.
Step S1031: detect in turn whether each of the pictures of the optical marker points contains an optical marker point.
Step S1032: when the picture of some optical marker point is detected to contain an optical marker point, magnify that optical marker point.
Because the optical marker points arranged around a QR code are small, and the QR code in the picture taken by the robot is also small, the optical marker point in the cropped picture is small too. To make the optical marker point easier to identify, the console magnifies the detected optical marker point by a certain multiple.
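A minimal sketch of such a magnification, assuming the cropped marker picture is available as a small binary bitmap and using plain nearest-neighbour upscaling (the patent does not specify the scaling method):

```python
def magnify(bitmap, factor):
    """Nearest-neighbour magnification of a small binary bitmap (a list
    of rows), standing in for whatever upscaling the console applies
    before parsing the optical marker point."""
    out = []
    for row in bitmap:
        # Repeat each pixel horizontally, then each widened row vertically.
        wide = [px for px in row for _ in range(factor)]
        out.extend([wide[:] for _ in range(factor)])
    return out

big = magnify([[0, 1],
               [1, 0]], 2)
```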
Step S1033: parse the magnified optical marker point to obtain its centre coordinates.
Once the console has obtained, from the QR-code picture taken by the robot, the centre coordinate point of the QR code and the coordinates of the optical marker point corresponding to the QR code, it can determine the robot's current position and angle through the following step S104.
Step S104: determine the robot's current position and angle from the centre coordinate point of the QR code, the centre coordinates of the optical marker point, and the shooting angle at which the QR-code picture was taken.
The robot's current position and angle may be determined through the following steps S1041 and S1042.
Step S1041: take the line between the centre coordinate point of the QR code and the centre coordinates of the optical marker point as a baseline.
Step S1042: calculate the robot's position and angle by trigonometry from the baseline, the centre coordinate point of the QR code, and the shooting angle at which the QR-code picture was taken.
When the console obtains the centre coordinate point of the QR code, it also obtains that point's position and angle, from which the angle vector of the QR code's centre coordinate point can be calculated in the plane coordinates of the robot's working region. Since the centre coordinates of the optical marker point and the angle at which the robot shot the QR-code picture are also known, a trigonometric calculation over the position and angle vector of the QR code's centre coordinate point in the plane coordinates of the working region, the baseline, and the robot's shooting angle yields the robot's position and angle.
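The patent leaves the trigonometric details open; the sketch below is one plausible reading, under the assumptions that the baseline's direction in the working region is known from the marker layout and that a workspace-units-per-pixel scale is available. It is not the patent's exact formula, and every parameter name is hypothetical.

```python
import math

def robot_pose(qr_world, baseline_world_deg, baseline_image_deg,
               qr_image_offset, scale):
    """Estimate the robot's workspace position and heading.

    qr_world           -- known workspace coordinates of the QR code centre
    baseline_world_deg -- workspace angle of the QR-centre-to-marker baseline
    baseline_image_deg -- angle of that baseline as seen in the photo
    qr_image_offset    -- (dx, dy) of the QR centre from the image centre, px
    scale              -- workspace units per pixel at the working distance
    """
    # The heading is how far the photographed baseline is rotated away
    # from its known workspace direction.
    heading = baseline_world_deg - baseline_image_deg
    t = math.radians(heading)
    dx, dy = qr_image_offset
    # Rotate the image-plane offset into the workspace frame and step
    # back from the QR code's known position to the camera.
    wx = qr_world[0] - scale * (dx * math.cos(t) - dy * math.sin(t))
    wy = qr_world[1] - scale * (dx * math.sin(t) + dy * math.cos(t))
    return (wx, wy), heading

pose, heading = robot_pose((2.0, 3.0), 90.0, 90.0, (10.0, 0.0), 0.01)
```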
After the console determines the robot's current position and angle, it generates a move instruction for the robot according to the determined current position and angle and the robot's corresponding target position, and sends the move instruction to the robot so that the robot moves according to it.
Specifically, the console first judges, from the robot's current position and angle, whether the robot has reached its corresponding target position. If it has, the console generates a move instruction containing the current position and sends it to the robot, and the robot stops at the current position on receiving it. If it has not, the console judges, from the robot's corresponding target position, the position the robot should move to next, generates a move instruction containing the path from the current position to that next position, and sends it to the robot; on receiving the instruction, the robot moves along the path in it toward its corresponding target position.
In the method provided by the embodiments of the present invention, the method includes: receiving the QR-code picture taken by the robot and obtaining the centre coordinate point and angle of the QR code from it; cropping the pictures of the optical marker points from the QR-code picture according to the preset distance value and the centre coordinate point of the QR code; obtaining the centre coordinates of the optical marker point from its picture; and determining the robot's current position and angle from the centre coordinate point of the QR code, the centre coordinates of the optical marker point, and the shooting angle at which the QR-code picture was taken. The robot's position is thus determined from the QR code, the optical marker point, and the angle at which the robot photographed the QR code, which improves the accuracy of locating the robot's current position and reduces positioning error, so that judgments made from the located position are highly accurate and the robot can be steered quickly to its destination.
Embodiment 2
Referring to Figure 2A, the device for locating a robot's position includes a first acquisition module S1, a cropping module S2, a second acquisition module S3, and a determining module S4.
The first acquisition module S1 receives the QR-code picture taken by the robot and obtains the centre coordinate point and angle of the QR code from the picture.
The cropping module S2 crops the pictures of the optical marker points from the QR-code picture according to the preset distance value and the centre coordinate point of the QR code.
The second acquisition module S3 obtains the centre coordinates of the optical marker point from the picture of the optical marker point.
The determining module S4 determines the robot's current position and angle from the centre coordinate point of the QR code, the centre coordinates of the optical marker point, and the shooting angle at which the QR-code picture was taken.
The robot is fitted with a camera or scanner. While the robot is working, it scans QR-code pictures in its working region in real time through the camera or scanner, and sends each scanned picture, together with its own identifier, to the device for locating the robot's position over a wireless network.
After the device receives the QR-code picture and the robot's identifier, it parses the picture, and the first acquisition module S1 obtains the centre coordinate point and angle of the QR code from the parsed information.
Because the robot may photograph the optical marker points of the QR code while shooting the QR-code picture, in the embodiments of the invention the pictures of the optical marker points are cropped from the QR-code picture by the following cropping module S2.
Referring to Figure 2B, the cropping module S2 includes a first determining unit S21 and a first cropping unit S22.
The first determining unit S21 determines, in the QR-code picture, a circle whose centre is the centre coordinate point of the QR code and whose radius is the preset distance value.
The preset distance value may be the distance between the centre coordinate point of the QR code and the centre coordinates of an optical marker point.
Once the first determining unit S21 has determined the circle, the region it covers may contain optical marker points, and the first cropping unit S22 can further determine their specific locations.
The first cropping unit S22 crops a number of sections of a preset size, each centred on a point on the arc of the determined circle, and takes the cropped sections as the pictures of the optical marker points.
The preset size may be a preset multiple of the size of an optical marker point. After the sections of the preset size are cropped, the first cropping unit S22 takes them as the pictures of the optical marker points, some of which contain an optical marker point.
Referring to Figure 2B, the first cropping unit S22 includes a detection subunit S221, a magnification subunit S222, and a parsing subunit S223.
The detection subunit S221 detects in turn whether each of the pictures of the optical marker points contains an optical marker point.
The magnification subunit S222 magnifies an optical marker point when the picture of some optical marker point is detected to contain one.
Because the optical marker points arranged around a QR code are small, and the QR code in the picture taken by the robot is also small, the optical marker point in the cropped picture is small too. To make the optical marker point easier to identify, the magnification subunit S222 magnifies the optical marker point detected by the detection subunit S221 by a certain multiple.
The parsing subunit S223 parses the magnified optical marker point to obtain its centre coordinates.
Once the first acquisition module S1 has obtained the centre coordinate point and angle of the QR code from the QR-code picture taken by the robot, and the cropping module S2 and the second acquisition module S3 have obtained the centre coordinates of the optical marker point corresponding to the QR code, the determining module S4 can determine the robot's current position and angle from the centre coordinate point of the QR code, the centre coordinates of that optical marker point, and the shooting angle at which the robot took the QR-code picture.
Referring to Figure 2B, the determining module S4 includes a second determining unit S41 and a calculating unit S42.
The second determining unit S41 takes the line between the centre coordinate point of the QR code and the centre coordinates of the optical marker point as a baseline.
The calculating unit S42 calculates the robot's position and angle by trigonometry from the baseline, the centre coordinate point of the QR code, and the shooting angle at which the QR-code picture was taken.
When the first acquisition module S1 obtains the centre coordinate point of the two-dimensional code, it also obtains the position and angle of that point, so the computing unit S42 can calculate the angle vector of the centre coordinate point of the two-dimensional code in the plane coordinates of the robot's working area. The centre coordinates of the optical marking point and the shooting angle of the image are likewise known. Therefore, once the second determining unit S41 has connected the centre coordinate point of the two-dimensional code with the centre coordinates of the optical marking point to determine the datum line, the computing unit S42 applies trigonometric functions to the position and angle vector of the two-dimensional code's centre point in the plane coordinates of the working area, the datum line, and the shooting angle, and thereby calculates the position and angle of the robot.
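The patent does not spell out the trigonometric formulas, so the following is only one plausible formulation of the computation described above, under stated assumptions: the datum line (QR centre to marker centre) has a known fixed angle in the floor plan, the camera's optical axis passes through the image centre, and image pixels map to floor units by a constant scale. The `image_centre` and `mm_per_px` parameters are illustrative assumptions.

```python
import math

def robot_pose(qr_world, baseline_world_deg, qr_img, marker_img,
               image_centre=(320.0, 240.0), mm_per_px=1.0):
    """Estimate the robot's (x, y) position and heading from one image.

    qr_world          -- known (x, y) of the QR-code centre in the floor plan
    baseline_world_deg-- known floor-plan angle of the QR->marker datum line
    qr_img, marker_img-- centres of the QR code and of one optical marker,
                         in image coordinates
    """
    # Observed angle of the datum line in the image.
    obs = math.atan2(marker_img[1] - qr_img[1], marker_img[0] - qr_img[0])
    # Heading: known world angle of the datum line minus the observed angle.
    heading = math.radians(baseline_world_deg) - obs
    # Offset of the QR centre from the optical axis, in floor units.
    dx = (qr_img[0] - image_centre[0]) * mm_per_px
    dy = (qr_img[1] - image_centre[1]) * mm_per_px
    # Rotate that offset into the floor frame and subtract it from the
    # known QR-code position to obtain the camera (robot) position.
    c, s = math.cos(heading), math.sin(heading)
    rx = qr_world[0] - (c * dx - s * dy)
    ry = qr_world[1] - (s * dx + c * dy)
    return (rx, ry), math.degrees(heading)
```

When the QR code sits exactly on the optical axis, the position reduces to the code's known floor-plan coordinates, which matches the intuition that the two-dimensional code anchors the robot to a known point while the datum line fixes the rotation.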
After the determining module S4 has determined the current position and angle of the robot, the generation module S5 and the sending module S6 control the robot's movement.
Referring to Fig. 2C, the device further comprises a generation module S5 and a sending module S6.
The generation module is configured to generate a move instruction for the robot according to the determined current position and angle of the robot and the robot's target position;
the sending module is configured to send the move instruction to the robot, so that the robot moves according to the move instruction.
The above generation module S5 first judges, from the robot's current position and angle, whether the robot has reached its target position. If it has, the generation module S5 generates a move instruction whose route leads from the current position to the current position (that is, the robot stays where it is); if it has not, the generation module S5 determines, from the robot's current position and angle and the robot's target position, the position to which the robot should move next, and generates a move instruction containing the route from the current position to that next position.
After the generation module S5 has generated the corresponding move instruction, the sending module S6 sends it to the robot. On receiving the move instruction, the robot either stops at its current position and angle according to the route in the instruction, or moves along that route towards its target position.
The device provided in this embodiment of the present invention comprises a first acquisition module, an interception module, a second acquisition module, and a determining module. The first acquisition module receives the image of the two-dimensional code captured by the robot and obtains the centre coordinate point and the angle of the two-dimensional code from the image; the interception module intercepts pictures of optical marking points from the image according to a preset distance value and the centre coordinate point of the two-dimensional code; the second acquisition module obtains the centre coordinates of the optical marking points from those pictures; and the determining module determines the current position and angle of the robot according to the centre coordinate point of the two-dimensional code, the centre coordinates of the optical marking points, and the shooting angle of the image. Determining the robot's position from the two-dimensional code, the optical marking points, and the shooting angle improves the accuracy of locating the position the robot currently occupies and reduces the positioning error, so that judgments based on the located position are highly accurate and the robot can be controlled to reach its destination quickly.
The foregoing is only a specific embodiment of the present invention, but the protection scope of the present invention is not limited thereto. Any change or replacement that a person familiar with the art could readily conceive within the technical scope disclosed by the invention shall be covered by the protection scope of the present invention. Therefore, the protection scope of the present invention shall be defined by the scope of the claims.
Claims (10)
1. A method of locating a robot's position, characterised in that the method comprises:
receiving a two-dimensional code image captured by a robot, and obtaining the centre coordinate point and the angle of the two-dimensional code from the image;
intercepting pictures of optical marking points from the image according to a preset distance value and the centre coordinate point of the two-dimensional code;
obtaining the centre coordinates of the optical marking points from the pictures of the optical marking points;
determining the current position and angle of the robot according to the centre coordinate point of the two-dimensional code, the centre coordinates of the optical marking points, and the shooting angle of the image.
2. The method according to claim 1, characterised in that intercepting the pictures of the optical marking points from the image according to the preset distance value and the centre coordinate point of the two-dimensional code comprises:
determining, in the image, a circle whose centre is the centre coordinate point of the two-dimensional code and whose radius is the preset distance value;
intercepting a plurality of screenshots of preset size centred on points on the arc of the determined circle, and taking the plurality of intercepted screenshots as the pictures of the optical marking points.
3. The method according to claim 2, characterised in that obtaining the centre coordinates of the optical marking points from the pictures of the optical marking points comprises:
detecting in turn whether each of the pictures of the plurality of optical marking points contains an optical marking point;
magnifying the optical marking point when one of the pictures is detected to contain it;
parsing the magnified optical marking point to obtain the centre coordinates of the optical marking point.
4. The method according to claim 1, characterised in that determining the current position and angle of the robot according to the centre coordinate point of the two-dimensional code, the centre coordinates of the optical marking points, and the shooting angle of the image comprises:
defining the line between the centre coordinate point of the two-dimensional code and the centre coordinates of an optical marking point as a datum line;
calculating the position and angle of the robot by trigonometric functions according to the datum line, the centre coordinate point of the two-dimensional code, and the shooting angle of the image.
5. The method according to claim 1, characterised in that, after determining the current position and angle of the robot, the method comprises:
generating a move instruction for the robot according to the determined current position and angle of the robot and the robot's target position, and sending the move instruction to the robot, so that the robot moves according to the move instruction.
6. A device for locating a robot's position, characterised in that the device comprises:
a first acquisition module for receiving a two-dimensional code image captured by a robot and obtaining the centre coordinate point and the angle of the two-dimensional code from the image;
an interception module for intercepting pictures of optical marking points from the image according to a preset distance value and the centre coordinate point of the two-dimensional code;
a second acquisition module for obtaining the centre coordinates of the optical marking points from the pictures of the optical marking points;
a determining module for determining the current position and angle of the robot according to the centre coordinate point of the two-dimensional code, the centre coordinates of the optical marking points, and the shooting angle of the image.
7. The device according to claim 6, characterised in that the interception module comprises:
a first determining unit for determining, in the image, a circle whose centre is the centre coordinate point of the two-dimensional code and whose radius is the preset distance value;
a first interception unit for intercepting a plurality of screenshots of preset size centred on points on the arc of the determined circle, and taking the plurality of intercepted screenshots as the pictures of the optical marking points.
8. The device according to claim 7, characterised in that the first interception unit comprises:
a detection sub-unit for detecting in turn whether each of the pictures of the plurality of optical marking points contains an optical marking point;
a magnification sub-unit for magnifying the optical marking point when one of the pictures is detected to contain it;
a parsing sub-unit for parsing the magnified optical marking point to obtain the centre coordinates of the optical marking point.
9. The device according to claim 6, characterised in that the determining module comprises:
a second determining unit for defining the line between the centre coordinate point of the two-dimensional code and the centre coordinates of an optical marking point as a datum line;
a computing unit for calculating the position and angle of the robot by trigonometric functions according to the datum line, the centre coordinate point of the two-dimensional code, and the shooting angle of the image.
10. The device according to claim 6, characterised in that the device further comprises:
a generation module for generating a move instruction for the robot according to the determined current position and angle of the robot and the robot's target position;
a sending module for sending the move instruction to the robot, so that the robot moves according to the move instruction.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610006619.2A CN105437251B (en) | 2016-01-04 | 2016-01-04 | A kind of method and device of positioning robot position |
Publications (2)
Publication Number | Publication Date |
---|---|
CN105437251A CN105437251A (en) | 2016-03-30 |
CN105437251B true CN105437251B (en) | 2017-07-18 |
Family
ID=55548118
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610006619.2A Active CN105437251B (en) | 2016-01-04 | 2016-01-04 | A kind of method and device of positioning robot position |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN105437251B (en) |
Families Citing this family (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017201759A1 (en) * | 2016-05-27 | 2017-11-30 | 深圳市大疆创新科技有限公司 | Optical housing, and photographing apparatus and unmanned aerial vehicle using same |
CN106527432B (en) * | 2016-11-04 | 2019-02-05 | 浙江大学 | The indoor mobile robot cooperative system corrected certainly based on fuzzy algorithmic approach and two dimensional code |
CN106595678A (en) * | 2016-11-10 | 2017-04-26 | 广州市沃希信息科技有限公司 | Passenger positioning method and system |
CN108073163B (en) * | 2016-11-11 | 2020-11-03 | 中国科学院沈阳计算技术研究所有限公司 | Control method for determining accurate position of robot by using two-dimensional code feedback value compensation |
CN108154210B (en) * | 2016-12-02 | 2021-04-16 | 杭州海康机器人技术有限公司 | Two-dimensional code generation and identification method and device |
CN108154209B (en) * | 2016-12-02 | 2021-03-26 | 杭州海康机器人技术有限公司 | Two-dimensional code generation and identification method and device for mobile robot positioning |
CN107687855B (en) * | 2017-08-22 | 2020-07-31 | 广东美的智能机器人有限公司 | Robot positioning method and device and robot |
CN107671863B (en) * | 2017-08-22 | 2020-06-26 | 广东美的智能机器人有限公司 | Robot control method and device based on two-dimensional code and robot |
CN107720237B (en) * | 2017-08-31 | 2020-02-07 | 广州泰行智能科技有限公司 | Stacking layout method and device based on space coordinate system |
CN107782305B (en) * | 2017-09-22 | 2021-05-14 | 郑州郑大智能科技股份有限公司 | Mobile robot positioning method based on digital letter recognition |
CN108225303B (en) * | 2018-01-18 | 2024-06-14 | 港湾智能科技(苏州)有限公司 | Two-dimensional code positioning label, positioning navigation system and method based on two-dimensional code |
WO2019154435A1 (en) * | 2018-05-31 | 2019-08-15 | 上海快仓智能科技有限公司 | Mapping method, image acquisition and processing system, and positioning method |
CN109093621B (en) * | 2018-08-10 | 2021-03-12 | 北京极智嘉科技有限公司 | Robot operation precision monitoring method and device, robot, server and medium |
CN109009871A (en) * | 2018-08-16 | 2018-12-18 | 常州市钱璟康复股份有限公司 | A kind of upper-limbs rehabilitation training robot |
CN109346148A (en) * | 2018-08-16 | 2019-02-15 | 常州市钱璟康复股份有限公司 | The two dimensional code location recognition method and its system of upper-limbs rehabilitation training robot |
CN109397249B (en) * | 2019-01-07 | 2020-11-06 | 重庆大学 | Method for positioning and grabbing robot system by two-dimensional code based on visual identification |
CN110033069A (en) * | 2019-04-18 | 2019-07-19 | 天津工业大学 | A kind of location coding label design method of points based on QR code |
CN111026105A (en) * | 2019-10-18 | 2020-04-17 | 同济大学 | Mobile robot field positioning system and control method thereof |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102944236A (en) * | 2012-11-20 | 2013-02-27 | 无锡普智联科高新技术有限公司 | Mobile robot positioning system and method based on a plurality of two-dimensional code readers |
CN103294059A (en) * | 2013-05-21 | 2013-09-11 | 无锡普智联科高新技术有限公司 | Hybrid navigation belt based mobile robot positioning system and method thereof |
CN104197899A (en) * | 2014-09-24 | 2014-12-10 | 中国科学院宁波材料技术与工程研究所 | Mobile robot location method and system |
CN104933387A (en) * | 2015-06-24 | 2015-09-23 | 上海快仓智能科技有限公司 | Rapid positioning and identifying method based on two-dimensional code decoding |
CN105138946A (en) * | 2015-10-09 | 2015-12-09 | 巴罗克(济南)生物技术有限公司 | Datamatrix two-dimensional code identification method by means of intelligent mobile phone |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007098555A (en) * | 2005-10-07 | 2007-04-19 | Nippon Telegr & Teleph Corp <Ntt> | Position indicating method, indicator and program for achieving the method |
- 2016-01-04: application CN201610006619.2A (CN) granted as patent CN105437251B; status: Active
Non-Patent Citations (1)
Title |
---|
Research on Autonomous Localization and Navigation Mapping of Mobile Robots; Li Guanghui; China Master's Theses Full-text Database, Information Science and Technology; 2013-03-15; I140-303 *
Also Published As
Publication number | Publication date |
---|---|
CN105437251A (en) | 2016-03-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN105437251B (en) | A kind of method and device of positioning robot position | |
CN104848858B (en) | Quick Response Code and be used for robotic vision-inertia combined navigation system and method | |
KR102347015B1 (en) | Vehicle tracking in a warehouse environment | |
CN112258567B (en) | Visual positioning method and device for object grabbing point, storage medium and electronic equipment | |
US10319110B2 (en) | Display control method and system | |
US9796083B2 (en) | Simulation apparatus for robot system | |
CN111442722A (en) | Positioning method, positioning device, storage medium and electronic equipment | |
US11524406B2 (en) | Calibration method and device for robotic arm system | |
CN103535021A (en) | Auto-focus tracking | |
CN105718862A (en) | Method, device and recording-broadcasting system for automatically tracking teacher via single camera | |
CN105184214A (en) | Sound source positioning and human face detection-based human body positioning method and system | |
US10210615B2 (en) | System and method for extrinsic camera parameters calibration by use of a three dimensional (3D) calibration object | |
CN110834327A (en) | Robot control method and device | |
EP3550402A1 (en) | Method for controlling display of screen of mobile terminal, and mobile terminal | |
CN113194253B (en) | Shooting method and device for removing reflection of image and electronic equipment | |
CN106584516A (en) | Intelligent photographing robot for tracing specified object | |
CN105446605B (en) | Information processing method and electronic equipment | |
US11070719B2 (en) | Image capture assist device and image capture assist method | |
CN107783551A (en) | The method and device that control unmanned plane follows | |
CN114092668A (en) | Virtual-real fusion method, device, equipment and storage medium | |
CN111583338B (en) | Positioning method and device for unmanned equipment, medium and unmanned equipment | |
KR20120116202A (en) | Method of location recognition using mobile terminal and thereof apparatus | |
TW201621273A (en) | Mobile positioning apparatus and positioning method thereof | |
CN102082905A (en) | Method for detecting position of camera optic axis | |
CN111457928B (en) | Robot positioning method and device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant
TR01 | Transfer of patent right |
Effective date of registration: 20220210 Address after: 310000 room 603, South Building, No. 2, Chaofeng East Road, Yuhang Economic Development Zone, Yuhang District, Hangzhou, Zhejiang Province Patentee after: ZHEJIANG LIBIAO ROBOT CO.,LTD. Address before: 310000 0614-2 room, 199 business building, Xihu District Wensanlu Road, Hangzhou, Zhejiang. Patentee before: HANGZHOU YAMEILIJIA TECHNOLOGY Co.,Ltd. |
|
TR01 | Transfer of patent right |