CN104655007B - Method and system for creating global coordinates of an environment scene - Google Patents

Method and system for creating global coordinates of an environment scene Download PDF

Info

Publication number
CN104655007B
CN104655007B
Authority
CN
China
Prior art keywords
angle
robot
data
world coordinates
lrf
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201310597902.3A
Other languages
Chinese (zh)
Other versions
CN104655007A (en)
Inventor
秦召红
朱定局
张敏
王凌霄
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Institute of Advanced Technology of CAS
Original Assignee
Shenzhen Institute of Advanced Technology of CAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Institute of Advanced Technology of CAS filed Critical Shenzhen Institute of Advanced Technology of CAS
Priority to CN201310597902.3A priority Critical patent/CN104655007B/en
Publication of CN104655007A publication Critical patent/CN104655007A/en
Application granted granted Critical
Publication of CN104655007B publication Critical patent/CN104655007B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Abstract

The present invention, applicable to the field of robotic map building, provides a method and system for creating global coordinates of an environment scene. The method is applied to a system comprising a laser range finder and a robot, the laser range finder being mounted on the robot, and includes: collecting, by the laser range finder, local coordinate data of the external environment relative to the robot while the robot is travelling; obtaining the travel speed and deviation angle of the robot, and converting the local coordinate data into global coordinate data according to the travel speed and deviation angle; and creating map information of the external environment according to the global coordinate data. With the present invention, data acquisition is more comprehensive and accurate, so that the two-dimensional environment scene can be reproduced accurately.

Description

Method and system for creating global coordinates of an environment scene
Technical field
The invention belongs to the field of robotic map building, and more particularly relates to a method and system for creating global coordinates of an environment scene.
Background technology
A laser range finder measures, by laser scanning, the distance from points on surrounding surfaces to the measurement position; from these distances and the corresponding angles, the coordinates of the points relative to the laser range finder can be obtained. However, the detection range of a laser range finder is not 360 degrees. Because of this limited scanning range, each scan captures only part of the scene, and to obtain global coordinates of the scene the environment must be scanned from different angles. The data obtained in each scan are local coordinate values of the environment relative to the scanning position, i.e. data obtained at different scanning positions are not in the same coordinate system. The prior art has to match the point sets scanned at different positions in order to obtain global coordinates of the scene, which involves a large amount of computation and is not accurate enough.
Summary of the invention
An object of the embodiments of the present invention is to provide a method for creating global coordinates of an environment scene, so as to solve the problems that the prior art involves a large amount of computation and is not accurate enough when creating global coordinates of a scene.
A first aspect of the embodiments of the present invention provides a method for creating global coordinates of an environment scene, applied to a system comprising a laser range finder and a robot, the laser range finder being mounted on the robot, the method comprising:
collecting, by the laser range finder, local coordinate data of the external environment relative to the robot while the robot is travelling;
obtaining the travel speed and deviation angle of the robot, and converting the local coordinate data into global coordinate data according to the travel speed and deviation angle;
creating map information of the external environment according to the global coordinate data.
A second aspect of the embodiments of the present invention provides a system for creating global coordinates of an environment scene, the system comprising:
a laser range finder and a robot, the laser range finder being mounted on the robot;
a data acquisition module, configured to collect, by the laser range finder, local coordinate data of the external environment relative to the robot while the robot is travelling;
a data conversion module, configured to obtain the travel speed and deviation angle of the robot and convert the local coordinate data into global coordinate data according to the travel speed and deviation angle;
a data processing module, configured to create map information of the external environment according to the global coordinate data.
Compared with the prior art, the embodiments of the present invention have the following beneficial effects: the robot serves as the carrier of the laser range finder, so that the laser range finder can collect local coordinate data of the external environment more comprehensively, and the local coordinate data are converted into global coordinate data according to the travel speed and deviation angle of the robot; as a result the data are more comprehensive and accurate, and the two-dimensional environment scene can be reproduced accurately. Moreover, the embodiments of the present invention are simple to implement and place low demands on hardware, which helps reduce product cost, so they have strong usability and practicality.
Brief description of the drawings
In order to explain the technical solutions in the embodiments of the present invention more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention, and those of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 is a diagram of an application scenario to which the method for creating global coordinates of an environment scene provided by the first embodiment of the invention is applicable;
Fig. 2 is a schematic diagram of the robot provided by the first embodiment of the invention;
Fig. 3 is a flow chart of the method for creating global coordinates of an environment scene provided by the second embodiment of the invention;
Fig. 4 is a schematic diagram of the robot travelling, provided by the second embodiment of the invention;
Fig. 5 is another schematic diagram of the robot travelling, provided by the second embodiment of the invention;
Fig. 6 is a structural diagram of the system for creating global coordinates of an environment scene provided by the third embodiment of the invention.
Detailed description of the embodiments
In order to make the objects, technical solutions and advantages of the present invention clearer, the present invention is further described below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only intended to explain the present invention, not to limit it.
In order to illustrate the technical solutions of the present invention, specific embodiments are described below.
Embodiment one:
Fig. 1 shows an application scenario to which the method for creating global coordinates of an environment scene provided by the first embodiment is applicable. For ease of explanation, only the parts related to the embodiment of the present invention are shown.
As shown in Fig. 1, the application scenario includes a laser range finder 1, a robot 2 and a processor 3. The laser range finder 1 is mounted on the robot 2, specifically on an expansion board of the robot 2. Illustratively, the robot 2 is also equipped with two front wheels, a single rear wheel, a wireless network module, a Linux operating system and so on.
In this embodiment, after the robot 2 is started, its straight-line speed, its turning speed (when travelling in a straight line the two front wheels are set to the same speed; when turning, the inner front wheel is set to half the speed of the outer front wheel) and its initial travel direction are set. Illustratively, as shown in Fig. 2, the starting position of the robot 2 is taken as the coordinate origin, the initial travel direction is the y-axis, the distance between the two front wheels is d, and the scanning angle of the laser range finder 1 is (-30°, 120°).
In order to collect data of the external environment more comprehensively, the robot 2 is used as the carrier of the laser range finder 1, and the laser range finder 1 collects local coordinate data of the external environment relative to the robot 2 while the robot 2 is travelling. The laser range finder 1 measures, in a non-contact way, the size and position of objects within a preset range (5.6 m, a 240° field of view).
In this embodiment, the processor 3 sends control instructions to the robot 2; a control instruction includes the travel direction, travel speed and so on of the robot 2. After receiving a control instruction from the processor 3, the robot 2 travels according to the travel direction and travel speed in the instruction. The laser range finder 1 collects local coordinate data of the external environment relative to the robot 2 while the robot 2 is travelling and sends them to the robot 2 for storage; the robot 2 converts the local coordinate data into global coordinate data according to its own travel speed and deviation angle. When the global coordinate data reach a predetermined threshold, the robot 2 pauses for N seconds (N being an integer greater than or equal to 1) and, during the pause, sends the global coordinate data to the processor 3; specifically, the global coordinate data may be sent to the processor 3 according to the TCP/IP protocol.
After sending the currently stored global coordinate data to the processor 3, the robot 2 continues to travel, and the above data acquisition, data conversion and data transmission steps are repeated in a loop until all the required external environment data have been collected or the robot 2 receives a stop instruction from the processor 3, at which point the laser range finder 1 stops collecting data. The processor 3 creates map information of the external environment from all the received global coordinate data by means of visualisation software such as MATLAB, so as to reproduce the external two-dimensional environment scene.
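By way of illustration only, the following C sketch outlines one possible structure of the acquire-convert-send loop described above. It is a minimal sketch, not part of the embodiment: the scan and the TCP/IP transfer are stand-in stubs, the threshold, pause length and dummy distances are assumed values, and the conversion is shown only for the straight-ahead start (ch_x = ch_y = 0).

#include <math.h>
#include <stdio.h>
#include <unistd.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

#define LRF_DATA_NB    682   /* points per scan                        */
#define SCAN_THRESHOLD   5   /* assumed number of scans before a send  */
#define PAUSE_SECONDS    1   /* the robot pauses N >= 1 seconds        */

/* Stub standing in for one laser-range-finder scan (real driver not shown). */
static void lrf_scan(double dist[LRF_DATA_NB])
{
    for (int i = 0; i < LRF_DATA_NB; ++i)
        dist[i] = 2.0;                       /* dummy distances in metres */
}

/* Stub standing in for the TCP/IP transfer to processor 3. */
static void tcp_send(const double *xs, const double *ys, int n)
{
    (void)xs; (void)ys;
    printf("sending %d global-coordinate points to the processor\n", n);
}

int main(void)
{
    double dist[LRF_DATA_NB], xs[LRF_DATA_NB], ys[LRF_DATA_NB];
    int scans_stored = 0;

    for (int scan = 0; scan < 12; ++scan) {        /* drive-and-scan loop        */
        lrf_scan(dist);                            /* 1. collect local data      */
        for (int i = 0; i < LRF_DATA_NB; ++i) {    /* 2. convert (start position) */
            double angle = (i - (LRF_DATA_NB / 2.0 - 1024 / 4.0)) * 360.0 / 1024.0;
            xs[i] = dist[i] * cos(angle * M_PI / 180.0);
            ys[i] = dist[i] * sin(angle * M_PI / 180.0);
        }
        if (++scans_stored >= SCAN_THRESHOLD) {    /* 3. threshold reached       */
            sleep(PAUSE_SECONDS);                  /*    robot pauses N seconds  */
            tcp_send(xs, ys, LRF_DATA_NB);         /*    send, then keep driving */
            scans_stored = 0;
        }
    }
    return 0;
}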
The embodiment of the present invention is simple to implement, places low demands on hardware and is highly flexible. Data acquisition is comprehensive and accurate, so that the external two-dimensional environment scene can be reproduced accurately.
Embodiment two:
Fig. 3 shows the flow of the method for creating global coordinates of an environment scene provided by the second embodiment. The method can be applied to the application scenario shown in Fig. 1 and is described in detail as follows:
In step S301, the laser range finder collects local coordinate data of the external environment relative to the robot while the robot is travelling.
In this embodiment, the laser range finder is mounted on the robot, specifically on an expansion board of the robot. With the robot as the carrier of the laser range finder, the laser range finder collects local coordinate data of the external environment relative to the robot while the robot is travelling. The laser range finder measures, in a non-contact way, the size and position of objects within a preset range (5.6 m, a 240° field of view).
In this embodiment, the laser range finder scans 682 points per scan, scanning point by point from (-30°, 120°); the coordinates of each point are then obtained from its angle using the following formulas:
The angle of the i-th beam of the laser range finder is:
angle=(i-(LRF_DATA_NB/2.0-1024/4.0))*360.0/1024.0;
The coordinates of the i-th of the 682 points are:
x=kb_lrf_DistanceData[i]*cos(angle*M_PI/180.0);
y=kb_lrf_DistanceData[i]*sin(angle*M_PI/180.0);
where kb_lrf_DistanceData[i] is the distance from the laser range finder to the object within the preset range, LRF_DATA_NB has the value 682, and M_PI has the value 3.14.
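As a concrete illustration of the formulas above, the following C sketch computes the local coordinates of every point of one scan. The dummy distance values are assumptions made only for illustration; in the embodiment, kb_lrf_DistanceData[i] holds the distances actually measured by the laser range finder.

#include <math.h>
#include <stdio.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

#define LRF_DATA_NB 682

static double kb_lrf_DistanceData[LRF_DATA_NB];

int main(void)
{
    for (int i = 0; i < LRF_DATA_NB; ++i)
        kb_lrf_DistanceData[i] = 2.0;        /* assumed distances in metres */

    for (int i = 0; i < LRF_DATA_NB; ++i) {
        /* angle of the i-th beam; i = 0 corresponds to roughly -30 degrees */
        double angle = (i - (LRF_DATA_NB / 2.0 - 1024 / 4.0)) * 360.0 / 1024.0;
        double x = kb_lrf_DistanceData[i] * cos(angle * M_PI / 180.0);
        double y = kb_lrf_DistanceData[i] * sin(angle * M_PI / 180.0);
        if (i % 170 == 0)
            printf("i=%3d  angle=%7.2f deg  x=%6.3f  y=%6.3f\n", i, angle, x, y);
    }
    return 0;
}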
In step S302, the travel speed and deviation angle of the robot are obtained, and the local coordinate data are converted into global coordinate data according to the travel speed and deviation angle.
Specifically, when the robot travels along the set initial direction (e.g. the y-axis, as shown in Fig. 4), the scanning range of the laser range finder shifts but the angle of each beam remains unchanged; the x coordinate value of a point is unchanged and its y coordinate value changes. The coordinates of the i-th point converted into global coordinate data are:
x=kb_lrf_DistanceData[i]×cos(angle×M_PI/180.0);
y=ch_y+kb_lrf_DistanceData[i]×sin(angle×M_PI/180.0);
where angle=(i-(LRF_DATA_NB/2.0-1024/4.0))×360.0/1024.0, LRF_DATA_NB has the value 682, ch_y is the distance the robot has moved (it can be obtained from the travel time and the set straight-line speed), kb_lrf_DistanceData[i] is the distance from the laser range finder to the object within the preset range, and M_PI is 3.14.
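For instance, a minimal C sketch of this straight-line case could look as follows; the straight-line speed and elapsed time used to obtain ch_y are assumed example values, not values taken from the embodiment.

#include <math.h>
#include <stdio.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

#define LRF_DATA_NB 682

/* Global coordinates of the i-th point while driving along the initial y-axis:
 * x stays as in the local frame, y is shifted by the distance ch_y already driven. */
static void straight_line_point(int i, double dist_i, double ch_y,
                                double *x, double *y)
{
    double angle = (i - (LRF_DATA_NB / 2.0 - 1024 / 4.0)) * 360.0 / 1024.0;
    *x = dist_i * cos(angle * M_PI / 180.0);
    *y = ch_y + dist_i * sin(angle * M_PI / 180.0);
}

int main(void)
{
    double speed = 0.1, elapsed = 15.0;           /* assumed: 0.1 m/s for 15 s      */
    double ch_y  = speed * elapsed;               /* distance the robot has moved   */
    double x, y;

    straight_line_point(341, 2.0, ch_y, &x, &y);  /* a 2 m reading, mid-scan beam   */
    printf("global point: (%.3f, %.3f)\n", x, y);
    return 0;
}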
When the robot turns right and reaches (x2, y2) (as shown in Fig. 5), the angle of the laser range finder changes to: angle=(i-(LRF_DATA_NB/2.0-1024/4.0))*360.0/1024.0+ch_angle;
ch_angle is the step angle of the robot, with the value 360°/(N-1), where N is an integer greater than 1; specifically, N may be the number of scans the robot takes in one full circle in the experiment. When the robot reaches (x2, y2), the coordinates of the i-th point of the surrounding environment scanned by the laser range finder, converted into global coordinate data, are:
x=ch_x+1.5d×(1-cos(ch_angle×M_PI/180.0));
y=ch_y+1.5d×sin(ch_angle×M_PI/180.0);
where (ch_x, ch_y) are the global coordinates of the (i-1)-th point, ch_angle is the step angle of the robot with the value 360°/(N-1), N is an integer greater than 1 (specifically, N may be the number of scans the robot takes in one full circle in the experiment), d is the distance between the two front wheels of the robot, and M_PI has the value 3.14;
When the robot turns left, the coordinates of the i-th point converted into global coordinate data are:
x=ch_x-1.5d×sin(ch_angle×M_PI/180.0);
y=ch_y+1.5d×cos(ch_angle×M_PI/180.0);
where (ch_x, ch_y) are the global coordinates of the (i-1)-th point, ch_angle is the step angle of the robot with the value 360°/(N-1), N is an integer greater than 1 (specifically, N may be the number of scans the robot takes in one full circle in the experiment), d is the distance between the two front wheels of the robot, and M_PI has the value 3.14.
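The two turning cases can be written directly from the formulas above, as in the C sketch below. The wheel spacing d, the value of N and the starting point are assumed example values; the factor 1.5d is reproduced from the text (it appears to correspond to the turning radius of the robot centre when the inner front wheel runs at half the speed of the outer one).

#include <math.h>
#include <stdio.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

/* Global coordinates of the i-th point from the (i-1)-th point (ch_x, ch_y)
 * when the robot turns right, as given in the text. */
static void right_turn_point(double d, double ch_angle,
                             double ch_x, double ch_y, double *x, double *y)
{
    *x = ch_x + 1.5 * d * (1.0 - cos(ch_angle * M_PI / 180.0));
    *y = ch_y + 1.5 * d * sin(ch_angle * M_PI / 180.0);
}

/* The corresponding left-turn formulas, also as given in the text. */
static void left_turn_point(double d, double ch_angle,
                            double ch_x, double ch_y, double *x, double *y)
{
    *x = ch_x - 1.5 * d * sin(ch_angle * M_PI / 180.0);
    *y = ch_y + 1.5 * d * cos(ch_angle * M_PI / 180.0);
}

int main(void)
{
    double d = 0.1;                        /* assumed 10 cm between front wheels  */
    int    N = 13;                         /* assumed, giving a 30-degree step    */
    double ch_angle = 360.0 / (N - 1);     /* step angle                          */
    double x, y;

    right_turn_point(d, ch_angle, 0.0, 0.0, &x, &y);
    printf("right turn: (%.4f, %.4f)\n", x, y);
    left_turn_point(d, ch_angle, 0.0, 0.0, &x, &y);
    printf("left turn:  (%.4f, %.4f)\n", x, y);
    return 0;
}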
When the robot travels along a diagonal direction relative to the initial direction (as shown in Fig. 5, from (x2, y2) to (x3, y3)), the angle of the laser range finder changes to: angle=(i-(LRF_DATA_NB/2.0-1024/4.0))*360.0/1024.0+ch_angle;
ch_angle is the step angle of the robot, with the value 360°/(N-1), where N is an integer greater than 1; specifically, N may be the number of scans the robot takes in one full circle in the experiment. Using the circle equation (x-a)² + (y-b)² = r², where (a, b) is the coordinate origin, the coordinates of the i-th point converted into the global coordinate data for the diagonal direction are:
x=ch_x+kb_lrf_DistanceData[i]×cos(angle×M_PI/180.0);
y=ch_y+kb_lrf_DistanceData[i]×sin(angle×M_PI/180.0);
where angle=(i-(LRF_DATA_NB/2.0-1024/4.0))×360.0/1024.0+ch_angle, LRF_DATA_NB has the value 682, (ch_x, ch_y) are the global coordinates of the (i-1)-th point, ch_angle is the step angle of the robot with the value 360°/(N-1), N is an integer greater than 1 (specifically, N may be the number of scans the robot takes in one full circle in the experiment), kb_lrf_DistanceData[i] is the distance from the laser range finder to the object within the preset range, and M_PI has the value 3.14.
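A minimal C sketch of this diagonal case is given below; the values chosen for (ch_x, ch_y), ch_angle and the measured distance are assumptions used only for illustration.

#include <math.h>
#include <stdio.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

#define LRF_DATA_NB 682

/* Global coordinates of the i-th point when the robot travels along a diagonal
 * direction: the beam angle is offset by the accumulated step angle ch_angle,
 * and the point is placed relative to the previous global point (ch_x, ch_y). */
static void oblique_point(int i, double dist_i,
                          double ch_x, double ch_y, double ch_angle,
                          double *x, double *y)
{
    double angle = (i - (LRF_DATA_NB / 2.0 - 1024 / 4.0)) * 360.0 / 1024.0
                   + ch_angle;
    *x = ch_x + dist_i * cos(angle * M_PI / 180.0);
    *y = ch_y + dist_i * sin(angle * M_PI / 180.0);
}

int main(void)
{
    double x, y;
    /* assumed: previous point at (0.3, 0.5), accumulated step angle 30 degrees */
    oblique_point(341, 2.0, 0.3, 0.5, 30.0, &x, &y);
    printf("global point: (%.3f, %.3f)\n", x, y);
    return 0;
}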
In this embodiment, the local coordinate data collected by the laser range finder while the robot is travelling can thus be converted into global coordinate data using the above formulas.
Further, before the travel speed and deviation angle of the robot are obtained, this embodiment also includes:
starting the robot and setting its straight-line speed, its turning speed (when travelling in a straight line the two front wheels are set to the same speed; when turning, the inner front wheel is set to half the speed of the outer front wheel), its initial travel direction and so on.
In step S303, map information of the external environment is created according to the global coordinate data.
Specifically, the map information of the external environment may be created from all the received global coordinate data by means of visualisation software such as MATLAB, so as to reproduce the external two-dimensional environment scene.
With the embodiment of the present invention, data acquisition is more comprehensive and accurate, so that the two-dimensional environment scene can be reproduced accurately. Moreover, the embodiment is simple to implement, places low demands on hardware and is highly flexible.
Embodiment three:
Fig. 6 shows the structure of the system for creating global coordinates of an environment scene provided by the third embodiment of the invention. For ease of explanation, only the parts related to the embodiment of the present invention are shown.
The system can be applied to the application scenario shown in Fig. 1.
The system for creating global coordinates of an environment scene includes a laser range finder 1 and a robot 2, the laser range finder 1 being mounted on the robot 2. The system further includes:
Data acquisition module 11, data conversion module 21 and data processing module 31;
The data acquisition module 11 is configured to collect, by the laser range finder, local coordinate data of the external environment relative to the robot while the robot is travelling;
The data conversion module 21 is configured to obtain the travel speed and deviation angle of the robot and convert the local coordinate data into global coordinate data according to the travel speed and deviation angle;
The data processing module 31 is configured to create map information of the external environment according to the global coordinate data.
Illustratively, the data acquisition module 11 may reside in the laser range finder 1, the data conversion module 21 in the robot 2, and the data processing module 31 in the processor 3 shown in Fig. 1. Practical applications are not limited to this; for example, the data conversion module 21 may also reside in the processor 3, which is not repeated here.
Further, the robot 2 may also include a data transmission module 22, configured to send the global coordinate data to the data processing module 31;
Further, the laser range finder 2 can also include:
a setup module 23, configured to start the robot and set the straight-line speed, turning speed and initial travel direction of the robot before the travel speed and deviation angle of the robot are obtained.
Further, the data conversion module 21 is specifically configured such that:
when the robot travels along the set initial direction, the coordinates of the i-th point in the global coordinate data are:
x=kb_lrf_DistanceData[i]×cos(angle×M_PI/180.0);
y=ch_y+kb_lrf_DistanceData[i]×sin(angle×M_PI/180.0);
where angle=(i-(LRF_DATA_NB/2.0-1024/4.0))×360.0/1024.0, LRF_DATA_NB has the value 682, ch_y is the distance the robot has moved, kb_lrf_DistanceData[i] is the distance from the laser range finder to the object within the preset range, and M_PI has the value 3.14;
when the robot turns right, the coordinates of the i-th point in the global coordinate data are:
x=ch_x+1.5d×(1-cos(ch_angle×M_PI/180.0));
y=ch_y+1.5d×sin(ch_angle×M_PI/180.0);
where (ch_x, ch_y) are the global coordinates of the (i-1)-th point, ch_angle is the step angle of the robot with the value 360°/(N-1), N is an integer greater than 1, d is the distance between the two front wheels of the robot, and M_PI has the value 3.14;
when the robot turns left, the coordinates of the i-th point in the global coordinate data are:
x=ch_x-1.5d×sin(ch_angle×M_PI/180.0);
y=ch_y+1.5d×cos(ch_angle×M_PI/180.0);
where (ch_x, ch_y) are the global coordinates of the (i-1)-th point, ch_angle is the step angle of the robot with the value 360°/(N-1), N is an integer greater than 1, d is the distance between the two front wheels of the robot, and M_PI has the value 3.14;
when the robot travels in a direction other than the initial direction, the coordinates of the i-th point in the global coordinate data are:
x=ch_x+kb_lrf_DistanceData[i]×cos(angle×M_PI/180.0);
y=ch_y+kb_lrf_DistanceData[i]×sin(angle×M_PI/180.0);
where angle=(i-(LRF_DATA_NB/2.0-1024/4.0))×360.0/1024.0+ch_angle, LRF_DATA_NB has the value 682, (ch_x, ch_y) are the global coordinates of the (i-1)-th point, ch_angle is the step angle of the robot with the value 360°/(N-1), N is an integer greater than 1, kb_lrf_DistanceData[i] is the distance from the laser range finder to the object within the preset range, and M_PI has the value 3.14.
It will be clear to those skilled in the art that, for convenience and brevity of description, the division into the above functional units and modules is only used as an example. In practical applications, the above functions may be assigned to different functional units or modules as needed, i.e. the internal structure of the system may be divided into different functional units or modules to complete all or part of the functions described above. The functional units or modules in the embodiments may be integrated into one processing unit, each unit may exist physically on its own, or two or more units may be integrated into one unit; the integrated units or modules may be implemented in the form of hardware or in the form of software functional units. In addition, the specific names of the functional units and modules are only intended to distinguish them from one another and do not limit the scope of protection of the present application. For the specific working process of the units and modules of the above system, reference may be made to the corresponding process in the foregoing method embodiments, which is not repeated here.
In summary, in the embodiments of the present invention the robot is used as the carrier of the laser range finder, so that the laser range finder can collect local coordinate data of the external environment more comprehensively, and the local coordinate data are converted into global coordinate data according to the travel speed and/or deviation angle of the robot; as a result the data are more comprehensive and accurate, and the two-dimensional environment scene can be reproduced accurately. Moreover, the embodiments of the present invention are simple to implement and place low demands on hardware, which helps reduce product cost, so they have strong usability and practicality.
Those of ordinary skill in the art will further understand that all or part of the steps in the methods of the above embodiments can be implemented by a program instructing the relevant hardware; the program may be stored in a computer-readable storage medium, such as ROM/RAM, a magnetic disk or an optical disc.
The above is a further detailed description of the present invention in combination with specific preferred embodiments, but it cannot be concluded that the specific implementation of the present invention is limited to these descriptions. For those of ordinary skill in the technical field of the present invention, equivalent substitutions or obvious modifications made without departing from the concept of the present invention and having the same performance or use shall all be regarded as falling within the scope of patent protection determined by the submitted claims.

Claims (8)

1. A method for creating global coordinates of an environment scene, characterised in that it is applied to a system comprising a laser range finder and a robot, the laser range finder being mounted on the robot, and the method comprises:
collecting, by the laser range finder, local coordinate data of the external environment relative to the robot while the robot is travelling;
obtaining the travel speed and deviation angle of the robot, and converting the local coordinate data into global coordinate data according to the travel speed and deviation angle;
creating map information of the external environment according to the global coordinate data;
wherein obtaining the travel speed and deviation angle of the robot and converting the local coordinate data into global coordinate data according to the travel speed and deviation angle comprises:
when the robot travels in a straight line along the set initial direction, the coordinates of the i-th point in the global coordinate data are:
x=kb_lrf_DistanceData[i]×cos(angle×M_PI/180.0);
y=ch_y+kb_lrf_DistanceData[i]×sin(angle×M_PI/180.0);
where angle=(i-(LRF_DATA_NB/2.0-1024/4.0))×360.0/1024.0, angle denotes the angle value, LRF_DATA_NB has the value 682, ch_y is the distance the robot has moved, kb_lrf_DistanceData[i] is the distance from the laser range finder to the object within the preset range, and M_PI has the value 3.14.
2. The method according to claim 1, characterised in that, before the travel speed and deviation angle of the robot are obtained, the method comprises:
starting the robot, and setting the straight-line speed, turning speed and initial travel direction of the robot.
3. The method according to claim 1, characterised in that obtaining the travel speed and deviation angle of the robot and converting the local coordinate data into global coordinate data according to the travel speed and deviation angle comprises:
when the robot turns right, the coordinates of the i-th point in the global coordinate data are:
x=ch_x+1.5d×(1-cos(ch_angle×M_PI/180.0));
y=ch_y+1.5d×sin(ch_angle×M_PI/180.0);
where (ch_x, ch_y) are the global coordinates of the (i-1)-th point, ch_angle is the step angle of the robot with the value 360°/(N-1), N denotes the time for which the robot pauses and is an integer greater than 1, and d is the distance between the two front wheels of the robot;
when the robot turns left, the coordinates of the i-th point in the global coordinate data are:
x=ch_x-1.5d×sin(ch_angle×M_PI/180.0);
y=ch_y+1.5d×cos(ch_angle×M_PI/180.0);
where (ch_x, ch_y) are the global coordinates of the (i-1)-th point, ch_angle is the step angle of the robot with the value 360°/(N-1), N denotes the time for which the robot pauses and is an integer greater than 1, d is the distance between the two front wheels of the robot, and M_PI has the value 3.14.
4. The method according to claim 1, characterised in that obtaining the travel speed and deviation angle of the robot and converting the local coordinate data into global coordinate data according to the travel speed and deviation angle comprises:
when the robot travels along a diagonal direction relative to the initial direction, the coordinates of the i-th point in the global coordinate data are:
x=ch_x+kb_lrf_DistanceData[i]×cos(angle×M_PI/180.0);
y=ch_y+kb_lrf_DistanceData[i]×sin(angle×M_PI/180.0);
where angle=(i-(LRF_DATA_NB/2.0-1024/4.0))×360.0/1024.0+ch_angle, angle denotes the angle value, LRF_DATA_NB has the value 682, (ch_x, ch_y) are the global coordinates of the (i-1)-th point, ch_angle is the step angle of the robot with the value 360°/(N-1), N denotes the time for which the robot pauses and is an integer greater than 1, kb_lrf_DistanceData[i] is the distance from the laser range finder to the object within the preset range, and M_PI has the value 3.14.
5. A system for creating global coordinates of an environment scene, characterised in that the system comprises:
a laser range finder and a robot, the laser range finder being mounted on the robot;
a data acquisition module, configured to collect, by the laser range finder, local coordinate data of the external environment relative to the robot while the robot is travelling;
a data conversion module, configured to obtain the travel speed and deviation angle of the robot and convert the local coordinate data into global coordinate data according to the travel speed and deviation angle;
a data processing module, configured to create map information of the external environment according to the global coordinate data;
wherein the data conversion module is specifically configured such that:
when the robot travels along the set initial direction, the coordinates of the i-th point in the global coordinate data are:
x=kb_lrf_DistanceData[i]×cos(angle×M_PI/180.0);
y=ch_y+kb_lrf_DistanceData[i]×sin(angle×M_PI/180.0);
where angle=(i-(LRF_DATA_NB/2.0-1024/4.0))×360.0/1024.0, angle denotes the angle value, LRF_DATA_NB has the value 682, ch_y is the distance the robot has moved, kb_lrf_DistanceData[i] is the distance from the laser range finder to the object within the preset range, and M_PI has the value 3.14.
6. The system according to claim 5, characterised in that the system further comprises:
a setup module, configured to start the robot and set the straight-line speed, turning speed and initial travel direction of the robot before the travel speed and deviation angle of the robot are obtained.
7. The system according to claim 5, characterised in that the data conversion module is specifically configured such that:
when the robot turns right, the coordinates of the i-th point in the global coordinate data are:
x=ch_x+1.5d×(1-cos(ch_angle×M_PI/180.0));
y=ch_y+1.5d×sin(ch_angle×M_PI/180.0);
where (ch_x, ch_y) are the global coordinates of the (i-1)-th point, ch_angle is the step angle of the robot with the value 360°/(N-1), N denotes the time for which the robot pauses and is an integer greater than 1, d is the distance between the two front wheels of the robot, and M_PI has the value 3.14;
when the robot turns left, the coordinates of the i-th point in the global coordinate data are:
x=ch_x-1.5d×sin(ch_angle×M_PI/180.0);
y=ch_y+1.5d×cos(ch_angle×M_PI/180.0);
where (ch_x, ch_y) are the global coordinates of the (i-1)-th point, ch_angle is the step angle of the robot with the value 360°/(N-1), N denotes the time for which the robot pauses and is an integer greater than 1, d is the distance between the two front wheels of the robot, and M_PI has the value 3.14.
8. The system according to claim 5, characterised in that the data conversion module is specifically configured such that:
when the robot travels along a diagonal direction relative to the initial direction, the coordinates of the i-th point in the global coordinate data are:
x=ch_x+kb_lrf_DistanceData[i]×cos(angle×M_PI/180.0);
y=ch_y+kb_lrf_DistanceData[i]×sin(angle×M_PI/180.0);
where angle=(i-(LRF_DATA_NB/2.0-1024/4.0))×360.0/1024.0+ch_angle, angle denotes the angle value, LRF_DATA_NB has the value 682, (ch_x, ch_y) are the global coordinates of the (i-1)-th point, ch_angle is the step angle of the robot with the value 360°/(N-1), N denotes the time for which the robot pauses and is an integer greater than 1, kb_lrf_DistanceData[i] is the distance from the laser range finder to the object within the preset range, and M_PI has the value 3.14.
CN201310597902.3A 2013-11-22 2013-11-22 Method and system for creating global coordinates of an environment scene Active CN104655007B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310597902.3A CN104655007B (en) 2013-11-22 2013-11-22 Method and system for creating global coordinates of an environment scene

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310597902.3A CN104655007B (en) 2013-11-22 2013-11-22 Method and system for creating global coordinates of an environment scene

Publications (2)

Publication Number Publication Date
CN104655007A CN104655007A (en) 2015-05-27
CN104655007B true CN104655007B (en) 2017-08-25

Family

ID=53246419

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310597902.3A Active CN104655007B (en) 2013-11-22 2013-11-22 Method and system for creating global coordinates of an environment scene

Country Status (1)

Country Link
CN (1) CN104655007B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107188042B (en) * 2017-06-21 2018-12-18 欧浦智网股份有限公司 A kind of ground coordinate method for drafting and device for cargo handling
CN107678306B (en) * 2017-10-09 2021-04-16 驭势(上海)汽车科技有限公司 Dynamic scene information recording and simulation playback method, device, equipment and medium
CN109029257B (en) * 2018-07-12 2020-11-06 中国科学院自动化研究所 Large-scale workpiece pose measurement system and method based on stereoscopic vision and structured light vision

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE4408328C2 (en) * 1994-03-11 2002-09-26 Siemens Ag Method for setting up a cellularly structured environment map of a self-moving mobile unit, which is oriented with the aid of sensors based on wave reflection
CN100449444C (en) * 2006-09-29 2009-01-07 浙江大学 Method for moving robot simultanously positioning and map structuring at unknown environment
CN101008571A (en) * 2007-01-29 2007-08-01 中南大学 Three-dimensional environment perception method for mobile robot
JP2011129095A (en) * 2009-12-18 2011-06-30 Korea Electronics Telecommun Map creating method using autonomous traveling robot, optimal travel route computing method using the same, and robot control device carrying out the methods
CN103198751B (en) * 2013-03-06 2015-03-04 南京邮电大学 Line feature map creation method of mobile robot based on laser range finder

Also Published As

Publication number Publication date
CN104655007A (en) 2015-05-27


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
EXSB Decision made by sipo to initiate substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
EE01 Entry into force of recordation of patent licensing contract

Application publication date: 20150527

Assignee: SHENZHEN YIHUO TECHNOLOGY CO.,LTD.

Assignor: SHENZHEN INSTITUTES OF ADVANCED TECHNOLOGY CHINESE ACADEMY OF SCIENCES

Contract record no.: X2023980043401

Denomination of invention: A Method and System for Creating Global Coordinates of Environmental Scenes

Granted publication date: 20170825

License type: Common License

Record date: 20231013

EE01 Entry into force of recordation of patent licensing contract

Application publication date: 20150527

Assignee: Shenzhen Future Industry Center Co.,Ltd.

Assignor: SHENZHEN INSTITUTES OF ADVANCED TECHNOLOGY CHINESE ACADEMY OF SCIENCES

Contract record no.: X2023980043922

Denomination of invention: A Method and System for Creating Global Coordinates of Environmental Scenes

Granted publication date: 20170825

License type: Common License

Record date: 20231020

EE01 Entry into force of recordation of patent licensing contract