CN105437251A - Method and device for locating position of robot - Google Patents


Publication number
CN105437251A
CN105437251A (application CN201610006619.2A)
Authority
CN
China
Prior art keywords
point
robot
optical markings
centre coordinate
quick response
Legal status
Granted
Application number
CN201610006619.2A
Other languages
Chinese (zh)
Other versions
CN105437251B (en)
Inventor
朱建强 (Zhu Jianqiang)
Current Assignee
Zhejiang Libiao Robots Co Ltd
Original Assignee
Hangzhou Yameilijia Technology Co Ltd
Application filed by Hangzhou Yameilijia Technology Co Ltd
Priority to CN201610006619.2A
Publication of CN105437251A
Application granted
Publication of CN105437251B
Status: Active


Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00 Controls for manipulators
    • B25J13/08 Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
    • B25J13/088 Controls for manipulators by means of sensing devices, e.g. viewing or touching devices, with position, velocity or acceleration sensors
    • B25J13/089 Determining the position of the robot with reference to its environment

Abstract

The invention provides a method and a device for locating the position of a robot. The method comprises: receiving a QR code picture taken by the robot, and obtaining the centre coordinate point and the angle of the QR code from the picture; cropping pictures of an optical marking point from the QR code picture according to a preset distance value and the centre coordinate point of the QR code; obtaining the centre coordinate of the optical marking point from the cropped pictures; and determining the current position and angle of the robot from the centre coordinate point of the QR code, the centre coordinate of the optical marking point, and the angle at which the picture was taken. Because the position is determined jointly from the QR code, the optical marking point and the shooting angle, the accuracy of the determined position is improved and the positioning error is reduced, so that judgments made from the located position are reliable and the robot can be controlled to reach its destination quickly.

Description

Method and device for locating the position of a robot
Technical field
The present invention relates to the field of robotics, and in particular to a method and a device for locating the position of a robot.
Background
With the rapid development of information technology, two-dimensional (Quick Response, QR) codes are widely used by enterprises and on products: a QR code encodes information about an enterprise or a product, which can be retrieved by scanning the code. An industrial robot, for example, can obtain its own position by scanning QR codes laid out in its working area.
Currently, when an industrial robot obtains its position from a QR code, it first photographs a QR code placed in the working area with a camera installed on the robot, then decodes the QR code to obtain the position of the region where the code is located. That position is taken as the robot's current position, and a console controls the robot's movement based on this current position and the robot's target position.
However, there is a deviation between the actual position of the robot when it photographs the QR code and the position of the QR code itself. Taking the position encoded in the photographed QR code as the robot's current position therefore yields low accuracy, and the positioning error is large.
Summary of the invention
In view of this, the object of the embodiments of the present invention is to provide a method and a device for locating the position of a robot, in which the position is determined from the QR code, an optical marking point and the angle at which the robot photographed the QR code picture. This improves the accuracy of the determined position and reduces the positioning error, so that judgments made from the located position are reliable and the robot can be controlled to reach its destination quickly.
In a first aspect, an embodiment of the present invention provides a method for locating the position of a robot, the method comprising:
receiving a QR code picture taken by the robot, and obtaining the centre coordinate point and the angle of the QR code from the picture;
cropping pictures of an optical marking point from the QR code picture according to a preset distance value and the centre coordinate point of the QR code;
obtaining the centre coordinate of the optical marking point from the cropped pictures;
determining the current position and angle of the robot from the centre coordinate point of the QR code, the centre coordinate of the optical marking point and the shooting angle of the QR code picture.
With reference to the first aspect, an embodiment of the present invention provides a first possible implementation of the first aspect, wherein cropping the pictures of the optical marking point from the QR code picture according to the preset distance value and the centre coordinate point of the QR code comprises:
determining, in the QR code picture, a circle whose centre is the centre coordinate point of the QR code and whose radius is the preset distance value;
cropping multiple screenshots of a preset size centred on points along the arc of the determined circle, and taking these screenshots as the pictures of the optical marking point.
With reference to the first possible implementation of the first aspect, an embodiment of the present invention provides a second possible implementation of the first aspect, wherein obtaining the centre coordinate of the optical marking point from the cropped pictures comprises:
detecting, in turn, whether each of the multiple cropped pictures contains an optical marking point;
when a picture is detected to contain an optical marking point, magnifying the optical marking point;
parsing the magnified optical marking point to obtain its centre coordinate.
With reference to the first aspect, an embodiment of the present invention provides a third possible implementation of the first aspect, wherein determining the current position and angle of the robot from the centre coordinate point of the QR code, the centre coordinate of the optical marking point and the shooting angle of the QR code picture comprises:
taking the line between the centre coordinate point of the QR code and the centre coordinate of the optical marking point as a datum line;
computing the position and angle of the robot by trigonometry from the datum line, the centre coordinate point of the QR code and the shooting angle of the QR code picture.
With reference to the first aspect, an embodiment of the present invention provides a fourth possible implementation of the first aspect, wherein, after determining the current position and angle of the robot, the method comprises:
generating a move instruction for the robot according to its determined current position and angle and its target position, and sending the move instruction to the robot so that the robot moves according to it.
In a second aspect, an embodiment of the present invention provides a device for locating the position of a robot, the device comprising:
a first acquisition module, configured to receive the QR code picture taken by the robot and obtain the centre coordinate point and the angle of the QR code from the picture;
a cropping module, configured to crop pictures of an optical marking point from the QR code picture according to a preset distance value and the centre coordinate point of the QR code;
a second acquisition module, configured to obtain the centre coordinate of the optical marking point from the cropped pictures;
a determination module, configured to determine the current position and angle of the robot from the centre coordinate point of the QR code, the centre coordinate of the optical marking point and the shooting angle of the QR code picture.
With reference to the second aspect, an embodiment of the present invention provides a first possible implementation of the second aspect, wherein the cropping module comprises:
a first determining unit, configured to determine, in the QR code picture, a circle whose centre is the centre coordinate point of the QR code and whose radius is the preset distance value;
a first cropping unit, configured to crop multiple screenshots of a preset size centred on points along the arc of the determined circle, and to take these screenshots as the pictures of the optical marking point.
With reference to the first possible implementation of the second aspect, an embodiment of the present invention provides a second possible implementation of the second aspect, wherein the first cropping unit comprises:
a detection subunit, configured to detect, in turn, whether each of the multiple cropped pictures contains an optical marking point;
a magnifying subunit, configured to magnify the optical marking point when a picture is detected to contain one;
a parsing subunit, configured to parse the magnified optical marking point to obtain its centre coordinate.
With reference to the second aspect, an embodiment of the present invention provides a third possible implementation of the second aspect, wherein the determination module comprises:
a second determining unit, configured to take the line between the centre coordinate point of the QR code and the centre coordinate of the optical marking point as a datum line;
a computing unit, configured to compute the position and angle of the robot by trigonometry from the datum line, the centre coordinate point of the QR code and the shooting angle of the QR code picture.
With reference to the second aspect, an embodiment of the present invention provides a fourth possible implementation of the second aspect, wherein the device further comprises:
a generation module, configured to generate a move instruction for the robot according to its determined current position and angle and its target position;
a sending module, configured to send the move instruction to the robot so that the robot moves according to it.
In the method and device for locating the position of a robot provided by the embodiments of the present invention, the method comprises: receiving the QR code picture taken by the robot and obtaining the centre coordinate point and the angle of the QR code from it; cropping pictures of an optical marking point from the QR code picture according to a preset distance value and the centre coordinate point of the QR code; obtaining the centre coordinate of the optical marking point from the cropped pictures; and determining the current position and angle of the robot from the centre coordinate point of the QR code, the centre coordinate of the optical marking point and the shooting angle of the QR code picture. The device comprises a first acquisition module, a cropping module, a second acquisition module and a determination module performing the corresponding steps. The position of the robot is thus determined jointly from the QR code, the optical marking point and the shooting angle, which improves the accuracy of the determined position and reduces the positioning error, so that judgments made from the located position are reliable and the robot can be controlled to reach its destination quickly.
To make the above objects, features and advantages of the present invention clearer and easier to understand, preferred embodiments are described in detail below with reference to the accompanying drawings.
Brief description of the drawings
To illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings used in the embodiments are briefly introduced below. It should be understood that the following drawings illustrate only some embodiments of the present invention and should not be regarded as limiting its scope; those of ordinary skill in the art can derive other related drawings from them without creative effort.
Figure 1A shows a flow chart of the method for locating the position of a robot provided by Embodiment 1 of the present invention;
Figure 1B shows a flow chart of obtaining the centre coordinate of the optical marking point from its pictures, as provided by Embodiment 1 of the present invention;
Fig. 2A shows a first schematic diagram of the device for locating the position of a robot provided by Embodiment 2 of the present invention;
Fig. 2B shows a second schematic diagram of the device for locating the position of a robot provided by Embodiment 2 of the present invention;
Fig. 2C shows a third schematic diagram of the device for locating the position of a robot provided by Embodiment 2 of the present invention.
Detailed description of the invention
The technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. The components of the embodiments, as generally described and illustrated in the drawings, can be arranged and designed in a variety of configurations. The following detailed description of the embodiments is therefore not intended to limit the claimed scope of the invention, but merely represents selected embodiments. All other embodiments obtained by those skilled in the art from these embodiments without creative effort fall within the scope of protection of the invention.
In the prior art, there is a deviation between the actual position of the robot when it photographs a QR code and the position of the QR code itself, so taking the position of the photographed QR code as the robot's current position yields low accuracy and a large positioning error. In view of this, the present invention provides a method and a device for locating the position of a robot in which the position is determined from the QR code, an optical marking point and the angle at which the robot photographed the QR code picture. This improves the accuracy of the determined position and reduces the positioning error, so that judgments made from the located position are reliable and the robot can be controlled to reach its destination quickly. The invention is described below by way of embodiments.
Embodiment 1
An embodiment of the present invention provides a method for locating the position of a robot. The robot's working area is provided with multiple QR codes arranged according to a certain rule, and multiple optical marking points are arranged around each QR code. The robot scans QR code pictures in the working area and transmits them to a console in real time; the console identifies the position of the QR code in each picture by the method provided in this embodiment, and then locates the position and angle of the robot using the angle at which the robot shot the picture. Referring to Figure 1A, the method comprises the following steps S101-S104:
Step S101: receive the QR code picture taken by the robot, and obtain the centre coordinate point and the angle of the QR code from the picture.
Each robot is equipped with a camera or a scanner. While working, the robot scans the QR codes in the working area in real time with the installed camera or scanner, and sends the scanned QR code picture, together with its own identifier, to the console over a wireless network.
After receiving the QR code picture and the robot's identifier, the console parses the picture and obtains the centre coordinate point and the angle of the QR code from it.
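As a concrete illustration of step S101, the centre coordinate point and angle of a QR code can be derived from the corner points returned by a detector. The following Python sketch is not from the patent; the corner ordering (top-left, top-right, bottom-right, bottom-left) and the choice of the top edge as the angle reference are assumptions made for illustration only.

```python
import math

def qr_center_and_angle(corners):
    """Estimate the centre coordinate point and rotation angle of a QR code
    from its four corner points, given in image coordinates in the assumed
    order: top-left, top-right, bottom-right, bottom-left."""
    cx = sum(x for x, _ in corners) / 4.0
    cy = sum(y for _, y in corners) / 4.0
    # Angle of the top edge (top-left -> top-right) relative to the x-axis.
    (x0, y0), (x1, y1) = corners[0], corners[1]
    angle = math.degrees(math.atan2(y1 - y0, x1 - x0))
    return (cx, cy), angle

# An axis-aligned square QR code centred at (50, 50):
center, angle = qr_center_and_angle([(40, 40), (60, 40), (60, 60), (40, 60)])
```

In practice the corner points would come from whatever QR detector the console uses; only the averaging and the `atan2` step are shown here.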
Step S102: crop the pictures of the optical marking point from the QR code picture according to the preset distance value and the centre coordinate point of the QR code.
Since the robot may also capture the optical marking points of a QR code while photographing it, in this embodiment the pictures of the optical marking point are cropped from the QR code picture according to the following steps S1021-S1022.
Step S1021: determine, in the QR code picture, a circle whose centre is the centre coordinate point of the QR code and whose radius is the preset distance value.
The preset distance value can be the distance between the centre coordinate point of the QR code and the centre coordinate of an optical marking point.
After the circle has been determined, the optical marking points may lie within its region, and their exact locations are determined further by step S1022.
Step S1022: crop multiple screenshots of a preset size centred on points along the arc of the determined circle, and take these screenshots as the pictures of the optical marking point.
The preset size can be the size of an optical marking point at a preset magnification. After cropping the multiple screenshots, the console takes them as the pictures of the optical marking point; some of these pictures will contain an optical marking point.
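Steps S1021-S1022 can be sketched as follows. This Python fragment is illustrative only: the number of sample points on the arc and the crop-box format (left, top, right, bottom) are assumptions not specified in the patent.

```python
import math

def candidate_patches(qr_center, radius, patch_size, n_samples=8):
    """Return square crop boxes (left, top, right, bottom) centred on points
    sampled along the circle of the given radius around the QR-code centre.
    Each box is one candidate picture of an optical marking point."""
    cx, cy = qr_center
    half = patch_size / 2.0
    boxes = []
    for i in range(n_samples):
        theta = 2 * math.pi * i / n_samples
        px = cx + radius * math.cos(theta)   # point on the arc
        py = cy + radius * math.sin(theta)
        boxes.append((px - half, py - half, px + half, py + half))
    return boxes

boxes = candidate_patches((100, 100), radius=40, patch_size=16)
```

Each returned box would then be cut out of the QR code picture with whatever imaging library the console uses.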
Step S103: obtain the centre coordinate of the optical marking point from its pictures.
After obtaining the pictures of the optical marking point, the console obtains the centre coordinate of the optical marking point from them through the following steps S1031-S1033.
Step S1031: detect, in turn, whether each of the multiple pictures contains an optical marking point.
Step S1032: when a picture is detected to contain an optical marking point, magnify the optical marking point.
Because the optical marking points arranged around a QR code are small, and the QR code in the picture taken by the robot is also small, the optical marking point in its picture is small as well. To make the optical marking point easier to identify, the console magnifies the detected point by a certain factor.
Step S1033: parse the magnified optical marking point to obtain its centre coordinate.
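Steps S1031-S1033 might look like the sketch below, where a patch is a small binary image given as a nested list. Nearest-neighbour magnification and a brightness centroid are assumptions chosen for illustration; the patent does not specify the detection or parsing algorithm.

```python
def magnify(patch, factor):
    """Nearest-neighbour magnification of a small binary patch (step S1032)."""
    return [[patch[r // factor][c // factor]
             for c in range(len(patch[0]) * factor)]
            for r in range(len(patch) * factor)]

def marker_centre(patch):
    """Centroid (x, y) of the bright pixels in a binary patch, or None if
    the patch contains no marking point (steps S1031 and S1033)."""
    pts = [(c, r) for r, row in enumerate(patch)
           for c, v in enumerate(row) if v]
    if not pts:
        return None
    return (sum(x for x, _ in pts) / len(pts),
            sum(y for _, y in pts) / len(pts))

patch = [[0, 0, 0],
         [0, 1, 0],
         [0, 0, 0]]
big = magnify(patch, 4)          # 12x12 magnified patch
centre = marker_centre(big)      # centroid in magnified coordinates
```

The magnified-frame centroid would still have to be divided by the magnification factor to get back to original image coordinates; that scaling is omitted here for brevity.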
After the console has obtained the centre coordinate point of the QR code and the coordinate of the corresponding optical marking point from the picture taken by the robot, it can determine the current position and angle of the robot according to the following step S104.
Step S104: determine the current position and angle of the robot from the centre coordinate point of the QR code, the centre coordinate of the optical marking point and the shooting angle of the QR code picture.
Specifically, the current position and angle of the robot are determined through the following steps S1041-S1042.
Step S1041: take the line between the centre coordinate point of the QR code and the centre coordinate of the optical marking point as a datum line;
Step S1042: compute the position and angle of the robot by trigonometry from the datum line, the centre coordinate point of the QR code and the shooting angle of the QR code picture.
When the console obtains the centre coordinate point of the QR code, it also obtains the position and angle of that point, so it can compute the angle vector of the centre coordinate point in the plane coordinates of the working area. Since the centre coordinate of the optical marking point and the shooting angle of the picture are also known, the console performs a trigonometric computation over the position and angle vector of the QR code centre point in the plane coordinates of the working area, the datum line and the shooting angle, and thereby computes the position and angle of the robot.
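One possible reading of this trigonometric computation is sketched below. It assumes a downward-facing camera, a known world-frame angle for the datum line, and a uniform pixel-to-metre scale; none of these assumptions, nor the function and parameter names, come from the patent.

```python
import math

def robot_pose(qr_world, datum_world_deg, qr_img, marker_img,
               img_center, shot_deg, scale):
    """Recover the robot's position and heading from the datum line.

    qr_world         world coordinates of the QR-code centre (from the map)
    datum_world_deg  known world-frame angle of the datum line
    qr_img           QR-code centre in image coordinates
    marker_img       optical-marking-point centre in image coordinates
    img_center       principal point of the camera image
    shot_deg         shooting angle reported by the robot
    scale            metres per pixel (downward-facing camera assumed)
    """
    # Measured image-frame angle of the datum line (step S1041).
    dx, dy = marker_img[0] - qr_img[0], marker_img[1] - qr_img[1]
    datum_img_deg = math.degrees(math.atan2(dy, dx))
    # Heading: rotation of the image frame against the world frame.
    heading = (datum_world_deg - datum_img_deg + shot_deg) % 360.0
    # Offset of the image centre from the QR centre, rotated into the world.
    ox, oy = img_center[0] - qr_img[0], img_center[1] - qr_img[1]
    t = math.radians(heading)
    wx = qr_world[0] + scale * (ox * math.cos(t) - oy * math.sin(t))
    wy = qr_world[1] + scale * (ox * math.sin(t) + oy * math.cos(t))
    return (wx, wy), heading

# Robot directly above an axis-aligned QR code at the world origin:
pose = robot_pose((0.0, 0.0), 0.0, (50, 50), (60, 50), (50, 50), 0.0, 0.02)
```

The key idea matching step S1042 is that comparing the datum line's measured image-frame angle with its known world-frame angle, together with the shooting angle, fixes the robot's heading, after which the image offset can be rotated into the world frame to fix its position.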
After determining the current position and angle of the robot, the console generates a move instruction for the robot according to its determined position and angle and its target position, sends the instruction to the robot, and the robot moves according to it.
Specifically, the console first judges from the robot's current position and angle whether the robot has reached its target position. If it has, the console generates a move instruction telling the robot to stay at its current position and sends it to the robot, which stops there upon receiving it. If it has not, the console determines from the target position where the robot should move next, generates a move instruction containing the route from the current position to that next position, and sends it to the robot; upon receiving the instruction, the robot moves along the route towards its target position.
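The console's control decision described above can be sketched as follows. The dictionary-based instruction format, the tolerance parameter and the single-waypoint route are illustrative assumptions; the patent does not define the format of a move instruction.

```python
def move_command(current, heading, target, tol=1e-6):
    """Generate a simple move instruction from the located pose: stop when
    the robot is already at its target, otherwise return a route to the
    next position (here just the target itself, for brevity).  The heading
    would steer turning in a fuller implementation."""
    cx, cy = current
    tx, ty = target
    if abs(cx - tx) <= tol and abs(cy - ty) <= tol:
        return {"action": "stop", "position": current}
    return {"action": "move", "route": [current, target]}

cmd = move_command((0.0, 0.0), 90.0, (2.0, 3.0))
```

A real console would typically plan the route through intermediate grid cells of the working area rather than jumping straight to the target.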
In the method provided by this embodiment of the present invention, the console receives the QR code picture taken by the robot and obtains the centre coordinate point and the angle of the QR code from it; crops the pictures of the optical marking point from the QR code picture according to the preset distance value and the centre coordinate point of the QR code; obtains the centre coordinate of the optical marking point from those pictures; and determines the current position and angle of the robot from the centre coordinate point of the QR code, the centre coordinate of the optical marking point and the shooting angle of the picture. The position of the robot is thus determined jointly from the QR code, the optical marking point and the shooting angle, which improves the accuracy of the determined position and reduces the positioning error, so that judgments made from the located position are reliable and the robot can be controlled to reach its destination quickly.
Embodiment 2
Referring to Fig. 2A, the device for locating the position of a robot comprises a first acquisition module S1, a cropping module S2, a second acquisition module S3 and a determination module S4.
The first acquisition module S1 is configured to receive the QR code picture taken by the robot and obtain the centre coordinate point and the angle of the QR code from the picture;
the cropping module S2 is configured to crop the pictures of the optical marking point from the QR code picture according to the preset distance value and the centre coordinate point of the QR code;
the second acquisition module S3 is configured to obtain the centre coordinate of the optical marking point from those pictures;
the determination module S4 is configured to determine the current position and angle of the robot from the centre coordinate point of the QR code, the centre coordinate of the optical marking point and the shooting angle of the QR code picture.
Each robot is equipped with a camera or a scanner. While working, the robot scans the QR codes in the working area in real time with the installed camera or scanner, and sends the scanned QR code picture, together with its own identifier, to the device over a wireless network.
After the device receives the QR code picture and the robot's identifier, it parses the picture, and the first acquisition module S1 obtains the centre coordinate point and the angle of the QR code from the parsed information.
Since the robot may also capture the optical marking points of a QR code while photographing it, in this embodiment the pictures of the optical marking point are cropped from the QR code picture by the cropping module S2 as follows.
Referring to Fig. 2B, the cropping module S2 comprises a first determining unit S21 and a first cropping unit S22.
The first determining unit S21 is configured to determine, in the QR code picture, a circle whose centre is the centre coordinate point of the QR code and whose radius is the preset distance value.
The preset distance value can be the distance between the centre coordinate point of the QR code and the centre coordinate of an optical marking point.
After the first determining unit S21 has determined the circle, the optical marking points may lie within its region, and their exact locations are determined further by the first cropping unit S22.
The first cropping unit S22 is configured to crop multiple screenshots of a preset size centred on points along the arc of the determined circle, and to take these screenshots as the pictures of the optical marking point.
The preset size can be the size of an optical marking point at a preset magnification. After cropping the multiple screenshots, the first cropping unit S22 takes them as the pictures of the optical marking point; some of these pictures will contain an optical marking point.
Referring to Fig. 2B, the first cropping unit S22 comprises a detection subunit S221, a magnifying subunit S222 and a parsing subunit S223.
The detection subunit S221 is configured to detect, in turn, whether each of the multiple pictures contains an optical marking point.
The magnifying subunit S222 is configured to magnify the optical marking point when a picture is detected to contain one.
Because the optical marking points arranged around a QR code are small, and the QR code in the picture taken by the robot is also small, the optical marking point in its picture is small as well. To make the optical marking point easier to identify, the magnifying subunit S222 magnifies the point detected by the detection subunit S221 by a certain factor.
The parsing subunit S223 is configured to parse the magnified optical marking point to obtain its centre coordinate.
When after the centre coordinate being got the centre coordinate point of Quick Response Code and angle by the first acquisition module S1 and got optical markings point corresponding to Quick Response Code by interception module S2 and the second acquisition module S3 from the two-dimension code image that robot is taken, by determination module S4 according to the current location of the centre coordinate point of Quick Response Code, the centre coordinate of optical markings point that this Quick Response Code is corresponding and the shooting angle determination robot of robot shooting two-dimension code image and angle.
Referring to Fig. 2B, the determination module S4 comprises a second determining unit S41 and a computing unit S42.
The second determining unit S41 is configured to define the line between the center coordinate point of the Quick Response Code and the center coordinate of the optical marking point as a reference line;
The computing unit S42 is configured to calculate the position and angle of the robot by trigonometric functions according to the reference line, the center coordinate point of the Quick Response Code and the shooting angle at which the two-dimensional code image was captured.
When the first acquisition module S1 obtains the center coordinate point of the Quick Response Code, it also obtains the position and angle of that point, so the computing unit S42 can compute the angle vector of the center coordinate point of the Quick Response Code in the plane coordinate system of the robot's working area. Since the center coordinate of the optical marking point and the shooting angle at which the robot captured the two-dimensional code image are also known, once the second determining unit S41 has connected the two center coordinates to determine the reference line, the computing unit S42 applies trigonometric functions to the position and angle vector of the Quick Response Code center point in the plane coordinate system of the working area, the reference line, and the shooting angle, thereby calculating the position and angle of the robot.
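The patent leaves the trigonometric computation abstract. The sketch below shows one plausible reading under strong simplifying assumptions that are not stated in the patent: the camera's optical center projects to pixel (0, 0), a metres-per-pixel `scale` is known, and the names `robot_pose` and `cam_angle` are hypothetical. The reference line supplies an absolute direction from which the heading is recovered, and the pixel offset of the Quick Response Code center, rotated into world axes, supplies the position.

```python
import math

def robot_pose(qr_world, qr_world_angle, qr_px, marker_px, cam_angle, scale):
    """Simplified pose recovery: comparing the world angle of the
    QR-to-marker reference line with its angle in the image yields the
    heading; the scaled, rotated pixel offset of the QR center yields
    the position. Illustrative only."""
    # Angle of the reference line as it appears in the image.
    line_img = math.atan2(marker_px[1] - qr_px[1], marker_px[0] - qr_px[0])
    # Heading: world angle of the line minus its image angle, minus the
    # mounting angle of the camera relative to the robot body.
    heading = qr_world_angle - line_img - cam_angle
    # Position: QR world position minus the rotated, scaled pixel offset
    # of the QR center from the image origin.
    dx, dy = qr_px[0] * scale, qr_px[1] * scale
    x = qr_world[0] - (dx * math.cos(heading) - dy * math.sin(heading))
    y = qr_world[1] - (dx * math.sin(heading) + dy * math.cos(heading))
    return (x, y), heading
```

When the Quick Response Code sits exactly at the image origin, the robot's position coincides with the code's known world position, which matches the intuition behind the patented scheme.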
After the determination module S4 determines the current position and angle of the robot, the generation module S5 and the sending module S6 control the robot to move.
Referring to Fig. 2C, the device further comprises a generation module S5 and a sending module S6.
The generation module is configured to generate the move instruction corresponding to the robot according to the determined current position and angle of the robot and the target position corresponding to the robot;
The sending module is configured to send the move instruction to the robot, so that the robot moves according to the move instruction.
According to the current position and angle of the robot, the generation module S5 first judges whether the robot has reached its corresponding target position. If it has, the generation module S5 generates a move instruction telling the robot to stay at its current position; if it has not, the generation module S5 determines, according to the current position and angle of the robot and the target position corresponding to the robot, the next position the robot should move to, and generates a move instruction containing the route from the current position to that next position.
After the generation module S5 generates the corresponding move instruction, the sending module S6 sends the move instruction to the robot. On receiving the move instruction, the robot either stops at its current position and angle according to the route in the instruction, or moves to its corresponding target position according to the route in the instruction.
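The decision logic of the generation and sending modules can be illustrated by the following sketch, assuming a grid-cell workspace and the hypothetical name `make_move_instruction`; the patent itself does not prescribe a route-planning strategy.

```python
def make_move_instruction(current, angle, target):
    """Mirror of modules S5/S6: if the robot is already at its target,
    instruct it to stop; otherwise pick the next axis-aligned cell that
    reduces the distance to the target and return the one-step route."""
    if current == target:
        return {"route": [current], "action": "stop", "angle": angle}
    cx, cy = current
    tx, ty = target
    # Move one cell along the axis with the larger remaining distance.
    if abs(tx - cx) >= abs(ty - cy):
        nxt = (cx + (1 if tx > cx else -1), cy)
    else:
        nxt = (cx, cy + (1 if ty > cy else -1))
    return {"route": [current, nxt], "action": "move", "angle": angle}
```

The instruction dictionary stands in for whatever message format the server and robot actually exchange; only the arrived/not-arrived branch structure comes from the patent text.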
The device provided by the embodiment of the present invention comprises a first acquisition module, an interception module, a second acquisition module and a determination module. The first acquisition module is configured to receive the two-dimensional code image captured by the robot and obtain the center coordinate point and angle of the Quick Response Code from the image; the interception module is configured to crop the pictures of the optical marking points from the image according to a preset distance value and the center coordinate point of the Quick Response Code; the second acquisition module is configured to obtain the center coordinate of the optical marking point from its picture; the determination module is configured to determine the current position and angle of the robot according to the center coordinate point of the Quick Response Code, the center coordinate of the optical marking point and the shooting angle of the image. The position of the robot is thus determined from the Quick Response Code, the optical marking point and the shooting angle, which improves the accuracy of determining the robot's current position and reduces the positioning error, so that judgments made from the located position are highly accurate and the robot can be controlled to reach its destination quickly.
The above are only specific embodiments of the present invention, but the protection scope of the present invention is not limited thereto. Any person skilled in the art can easily conceive of changes or replacements within the technical scope disclosed by the present invention, and all such changes and replacements shall fall within the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (10)

1. A method for locating the position of a robot, characterized in that the method comprises:
receiving a two-dimensional code image captured by the robot, and obtaining the center coordinate point and angle of the Quick Response Code from the two-dimensional code image;
cropping the pictures of optical marking points from the two-dimensional code image according to a preset distance value and the center coordinate point of the Quick Response Code;
obtaining the center coordinate of an optical marking point from the pictures of the optical marking points;
determining the current position and angle of the robot according to the center coordinate point of the Quick Response Code, the center coordinate of the optical marking point and the shooting angle at which the two-dimensional code image was captured.
2. The method according to claim 1, characterized in that cropping the pictures of optical marking points from the two-dimensional code image according to the preset distance value and the center coordinate point of the Quick Response Code comprises:
determining, in the two-dimensional code image, a circle whose center is the center coordinate point of the Quick Response Code and whose radius is the preset distance value;
cropping screenshots of a preset size centered on points on the arc of the determined circle, and using the multiple cropped screenshots of the preset size as the pictures of the optical marking points.
3. The method according to claim 2, characterized in that obtaining the center coordinate of an optical marking point from the pictures of the optical marking points comprises:
detecting, one by one, whether each of the multiple pictures of optical marking points contains an optical marking point;
when the picture of a certain optical marking point is detected to contain an optical marking point, enlarging the optical marking point;
parsing the enlarged optical marking point to obtain the center coordinate of the optical marking point.
4. The method according to claim 1, characterized in that determining the current position and angle of the robot according to the center coordinate point of the Quick Response Code, the center coordinate of the optical marking point and the shooting angle at which the two-dimensional code image was captured comprises:
defining the line between the center coordinate point of the Quick Response Code and the center coordinate of the optical marking point as a reference line;
calculating the position and angle of the robot by trigonometric functions according to the reference line, the center coordinate point of the Quick Response Code and the shooting angle at which the two-dimensional code image was captured.
5. The method according to claim 1, characterized in that, after determining the current position and angle of the robot, the method comprises:
generating the move instruction corresponding to the robot according to the determined current position and angle of the robot and the target position corresponding to the robot, and sending the move instruction to the robot, so that the robot moves according to the move instruction.
6. A device for locating the position of a robot, characterized in that the device comprises:
a first acquisition module, configured to receive a two-dimensional code image captured by the robot and obtain the center coordinate point and angle of the Quick Response Code from the two-dimensional code image;
an interception module, configured to crop the pictures of optical marking points from the two-dimensional code image according to a preset distance value and the center coordinate point of the Quick Response Code;
a second acquisition module, configured to obtain the center coordinate of an optical marking point from the pictures of the optical marking points;
a determination module, configured to determine the current position and angle of the robot according to the center coordinate point of the Quick Response Code, the center coordinate of the optical marking point and the shooting angle at which the two-dimensional code image was captured.
7. The device according to claim 6, characterized in that the interception module comprises:
a first determining unit, configured to determine, in the two-dimensional code image, a circle whose center is the center coordinate point of the Quick Response Code and whose radius is the preset distance value;
a first interception unit, configured to crop screenshots of a preset size centered on points on the arc of the determined circle, and use the multiple cropped screenshots of the preset size as the pictures of the optical marking points.
8. The device according to claim 7, characterized in that the first interception unit comprises:
a detection sub-unit, configured to detect, one by one, whether each of the multiple pictures of optical marking points contains an optical marking point;
an enlargement sub-unit, configured to enlarge the optical marking point when the picture of a certain optical marking point is detected to contain an optical marking point;
a parsing sub-unit, configured to parse the enlarged optical marking point to obtain the center coordinate of the optical marking point.
9. The device according to claim 6, characterized in that the determination module comprises:
a second determining unit, configured to define the line between the center coordinate point of the Quick Response Code and the center coordinate of the optical marking point as a reference line;
a computing unit, configured to calculate the position and angle of the robot by trigonometric functions according to the reference line, the center coordinate point of the Quick Response Code and the shooting angle at which the two-dimensional code image was captured.
10. The device according to claim 6, characterized in that the device further comprises:
a generation module, configured to generate the move instruction corresponding to the robot according to the determined current position and angle of the robot and the target position corresponding to the robot;
a sending module, configured to send the move instruction to the robot, so that the robot moves according to the move instruction.
CN201610006619.2A 2016-01-04 2016-01-04 A kind of method and device of positioning robot position Active CN105437251B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610006619.2A CN105437251B (en) 2016-01-04 2016-01-04 A kind of method and device of positioning robot position

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610006619.2A CN105437251B (en) 2016-01-04 2016-01-04 A kind of method and device of positioning robot position

Publications (2)

Publication Number Publication Date
CN105437251A true CN105437251A (en) 2016-03-30
CN105437251B CN105437251B (en) 2017-07-18

Family

ID=55548118

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610006619.2A Active CN105437251B (en) 2016-01-04 2016-01-04 A kind of method and device of positioning robot position

Country Status (1)

Country Link
CN (1) CN105437251B (en)

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106527432A (en) * 2016-11-04 2017-03-22 浙江大学 Indoor mobile robot cooperative system based on fuzzy algorithm and two-dimensional code self correction
CN107108044A (en) * 2016-05-27 2017-08-29 深圳市大疆创新科技有限公司 Optical shield and the filming apparatus and unmanned plane using the optical shield
CN107671863A (en) * 2017-08-22 2018-02-09 广东美的智能机器人有限公司 Robot control method, device and robot based on Quick Response Code
CN107687855A (en) * 2017-08-22 2018-02-13 广东美的智能机器人有限公司 Robot localization method, apparatus and robot
CN107720237A (en) * 2017-08-31 2018-02-23 广州泰行智能科技有限公司 A kind of pile part Nesting and device based on space coordinates
CN107782305A (en) * 2017-09-22 2018-03-09 郑州郑大智能科技股份有限公司 A kind of method for positioning mobile robot based on digital alphabet identification
WO2018086158A1 (en) * 2016-11-10 2018-05-17 曾昭兴 Passenger positioning method and system
CN108073163A (en) * 2016-11-11 2018-05-25 中国科学院沈阳计算技术研究所有限公司 The control method for determining robot exact position is compensated using Quick Response Code value of feedback
CN108154210A (en) * 2016-12-02 2018-06-12 杭州海康机器人技术有限公司 A kind of Quick Response Code generation, recognition methods and device
CN108154209A (en) * 2016-12-02 2018-06-12 杭州海康机器人技术有限公司 A kind of Quick Response Code generation, recognition methods and device for localization for Mobile Robot
CN108225303A (en) * 2018-01-18 2018-06-29 水岩智能科技(宁波)有限公司 Two-dimensional code positioning label, and positioning navigation system and method based on two-dimensional code
CN109009871A (en) * 2018-08-16 2018-12-18 常州市钱璟康复股份有限公司 A kind of upper-limbs rehabilitation training robot
CN109093621A (en) * 2018-08-10 2018-12-28 北京极智嘉科技有限公司 Robot running precision monitoring method, device, robot, server and medium
CN109346148A (en) * 2018-08-16 2019-02-15 常州市钱璟康复股份有限公司 The two dimensional code location recognition method and its system of upper-limbs rehabilitation training robot
CN109397249A (en) * 2019-01-07 2019-03-01 重庆大学 The two dimensional code positioning crawl robot system algorithm of view-based access control model identification
CN110033069A (en) * 2019-04-18 2019-07-19 天津工业大学 A kind of location coding label design method of points based on QR code
WO2019154435A1 (en) * 2018-05-31 2019-08-15 上海快仓智能科技有限公司 Mapping method, image acquisition and processing system, and positioning method
CN111026105A (en) * 2019-10-18 2020-04-17 同济大学 Mobile robot field positioning system and control method thereof

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007098555A (en) * 2005-10-07 2007-04-19 Nippon Telegr & Teleph Corp <Ntt> Position indicating method, indicator and program for achieving the method
CN102944236A (en) * 2012-11-20 2013-02-27 无锡普智联科高新技术有限公司 Mobile robot positioning system and method based on a plurality of two-dimensional code readers
CN103294059A (en) * 2013-05-21 2013-09-11 无锡普智联科高新技术有限公司 Hybrid navigation belt based mobile robot positioning system and method thereof
CN104197899A (en) * 2014-09-24 2014-12-10 中国科学院宁波材料技术与工程研究所 Mobile robot location method and system
CN104933387A (en) * 2015-06-24 2015-09-23 上海快仓智能科技有限公司 Rapid positioning and identifying method based on two-dimensional code decoding
CN105138946A (en) * 2015-10-09 2015-12-09 巴罗克(济南)生物技术有限公司 Datamatrix two-dimensional code identification method by means of intelligent mobile phone

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007098555A (en) * 2005-10-07 2007-04-19 Nippon Telegr & Teleph Corp <Ntt> Position indicating method, indicator and program for achieving the method
CN102944236A (en) * 2012-11-20 2013-02-27 无锡普智联科高新技术有限公司 Mobile robot positioning system and method based on a plurality of two-dimensional code readers
CN103294059A (en) * 2013-05-21 2013-09-11 无锡普智联科高新技术有限公司 Hybrid navigation belt based mobile robot positioning system and method thereof
CN104197899A (en) * 2014-09-24 2014-12-10 中国科学院宁波材料技术与工程研究所 Mobile robot location method and system
CN104933387A (en) * 2015-06-24 2015-09-23 上海快仓智能科技有限公司 Rapid positioning and identifying method based on two-dimensional code decoding
CN105138946A (en) * 2015-10-09 2015-12-09 巴罗克(济南)生物技术有限公司 Datamatrix two-dimensional code identification method by means of intelligent mobile phone

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
LI GUANGHUI: "Research on autonomous localization and navigation mapping of mobile robots", China Master's Theses Full-text Database, Information Science and Technology *

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107108044A (en) * 2016-05-27 2017-08-29 深圳市大疆创新科技有限公司 Optical shield and the filming apparatus and unmanned plane using the optical shield
CN106527432A (en) * 2016-11-04 2017-03-22 浙江大学 Indoor mobile robot cooperative system based on fuzzy algorithm and two-dimensional code self correction
WO2018086158A1 (en) * 2016-11-10 2018-05-17 曾昭兴 Passenger positioning method and system
CN108073163A (en) * 2016-11-11 2018-05-25 中国科学院沈阳计算技术研究所有限公司 The control method for determining robot exact position is compensated using Quick Response Code value of feedback
CN108154209A (en) * 2016-12-02 2018-06-12 杭州海康机器人技术有限公司 A kind of Quick Response Code generation, recognition methods and device for localization for Mobile Robot
CN108154210A (en) * 2016-12-02 2018-06-12 杭州海康机器人技术有限公司 A kind of Quick Response Code generation, recognition methods and device
CN108154209B (en) * 2016-12-02 2021-03-26 杭州海康机器人技术有限公司 Two-dimensional code generation and identification method and device for mobile robot positioning
CN107687855A (en) * 2017-08-22 2018-02-13 广东美的智能机器人有限公司 Robot localization method, apparatus and robot
CN107671863A (en) * 2017-08-22 2018-02-09 广东美的智能机器人有限公司 Robot control method, device and robot based on Quick Response Code
CN107687855B (en) * 2017-08-22 2020-07-31 广东美的智能机器人有限公司 Robot positioning method and device and robot
CN107720237A (en) * 2017-08-31 2018-02-23 广州泰行智能科技有限公司 A kind of pile part Nesting and device based on space coordinates
CN107720237B (en) * 2017-08-31 2020-02-07 广州泰行智能科技有限公司 Stacking layout method and device based on space coordinate system
CN107782305A (en) * 2017-09-22 2018-03-09 郑州郑大智能科技股份有限公司 A kind of method for positioning mobile robot based on digital alphabet identification
CN107782305B (en) * 2017-09-22 2021-05-14 郑州郑大智能科技股份有限公司 Mobile robot positioning method based on digital letter recognition
CN108225303A (en) * 2018-01-18 2018-06-29 水岩智能科技(宁波)有限公司 Two-dimensional code positioning label, and positioning navigation system and method based on two-dimensional code
WO2019154435A1 (en) * 2018-05-31 2019-08-15 上海快仓智能科技有限公司 Mapping method, image acquisition and processing system, and positioning method
CN109093621A (en) * 2018-08-10 2018-12-28 北京极智嘉科技有限公司 Robot running precision monitoring method, device, robot, server and medium
CN109346148A (en) * 2018-08-16 2019-02-15 常州市钱璟康复股份有限公司 The two dimensional code location recognition method and its system of upper-limbs rehabilitation training robot
CN109009871A (en) * 2018-08-16 2018-12-18 常州市钱璟康复股份有限公司 A kind of upper-limbs rehabilitation training robot
CN109397249A (en) * 2019-01-07 2019-03-01 重庆大学 The two dimensional code positioning crawl robot system algorithm of view-based access control model identification
CN110033069A (en) * 2019-04-18 2019-07-19 天津工业大学 A kind of location coding label design method of points based on QR code
CN111026105A (en) * 2019-10-18 2020-04-17 同济大学 Mobile robot field positioning system and control method thereof

Also Published As

Publication number Publication date
CN105437251B (en) 2017-07-18

Similar Documents

Publication Publication Date Title
CN105437251A (en) Method and device for locating position of robot
CN111442722B (en) Positioning method, positioning device, storage medium and electronic equipment
CN110834327B (en) Robot control method and device
Levinson et al. Automatic online calibration of cameras and lasers.
KR101374643B1 (en) Object recognition method and recognition apparatus
US11486988B2 (en) Method for calibrating the alignment of a moving object sensor
CN109100707B (en) Scaling method, device, equipment and the storage medium of radar sensor
CN103170980A (en) Positioning system and positioning method for household service robot
US11619698B2 (en) Method and terminal for controlling power plant
KR20130140898A (en) Structure measuring unit for tracking, measuring and marking edges and corners of adjacent surfaces
JP5976027B2 (en) Sensor axis deviation detection device and sensor axis deviation detection method
CN102082905B (en) A kind of method detecting position of camera optic axis
KR101658465B1 (en) Location estimation method based on direction finding using tdoa of signal and direction finder thereof
TW201621273A (en) Mobile positioning apparatus and positioning method thereof
AU2021411355A1 (en) Apparatus for use in calibrating laser level
US8103286B2 (en) Wireless communication system for automatically generating a received signal strength distribution map
CN103906234A (en) Indoor positioning method based on WIFI signals
CA2958759A1 (en) Enhanced positioning method for moving target in mine shaft based on witness nodes under internet of things architecture
CN115631099A (en) Radial distortion parameter measuring method and device and electronic equipment
KR20190007304A (en) Apparatus for measuring location using light and method thereof
JP2015081814A (en) Position measurement system of movable body
WO2023236506A1 (en) Positioning method, device, and storage medium
CN115682992A (en) Method for detecting installation error of infrared emission device, electronic equipment and system
WO2022000209A1 (en) Positioning method and positioning device
US11433542B2 (en) Calibration detecting apparatus, method, and program

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20220210

Address after: 310000 room 603, South Building, No. 2, Chaofeng East Road, Yuhang Economic Development Zone, Yuhang District, Hangzhou, Zhejiang Province

Patentee after: ZHEJIANG LIBIAO ROBOT CO.,LTD.

Address before: 310000 0614-2 room, 199 business building, Xihu District Wensanlu Road, Hangzhou, Zhejiang.

Patentee before: HANGZHOU YAMEILIJIA TECHNOLOGY Co.,Ltd.
