CN107966986A - Robot and navigation method, system, and device therefor - Google Patents
Robot and navigation method, system, and device therefor
- Publication number
- CN107966986A (application number CN201711183604.4A)
- Authority
- CN
- China
- Prior art keywords
- robot
- edge
- real-time
- sensor group
- position information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/617—Safety or protection, e.g. defining protection zones around obstacles or avoiding hazards
- G05D1/0214—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
- G05D1/0219—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory ensuring the processing of the whole working surface
- G05D1/242—Arrangements for determining position or orientation: means based on the reflection of waves generated by the vehicle
- G05D2105/10—Specific applications of the controlled vehicles: for cleaning, vacuuming or polishing
- G05D2107/75—Specific environments of the controlled vehicles: electric power generation plants
- G05D2109/10—Types of controlled vehicles: land vehicles
Abstract
This application discloses a robot and a navigation method, system, and device therefor. The method includes: while the robot moves along an edge of a supporting object, acquiring in real time the position information detected by a first position sensor group and a second position sensor group respectively, obtaining first real-time position information and second real-time position information; and correcting the travel direction of the robot according to the first real-time position information and the second real-time position information, so that after each correction the first real-time position information is position information outside the edge and the second real-time position information is position information inside the edge. This application enables the robot both to work right up to the edge of the supporting surface and to avoid slipping off and falling from that edge; that is, it lets the robot navigate on the surface of an object that has no frame.
Description
Technical field
The present invention relates to the field of robot technology, and in particular to a robot and a navigation method, system, and device therefor.
Background technology
Currently, robot industry development is swift and violent, and more and more robot kinds, which are developed, to be produced, and is applied to
Industry-by-industry, specific boundless application prospect.
Need to be applied to clean target object now with many robots, repair etc. in operating environment.Work as machine
People is needed when being moved on above-mentioned target object, it usually needs is navigated by means of the frame of target object.It is for example, traditional
Photovoltaic cleaning robot to conventional photovoltaic array when carrying out washing and cleaning operation, it is necessary to which the frame for relying on conventional photovoltaic array is walked.
However, under many circumstances, wait on the object of robot cleaning or maintenance and do not have frame, in such case
Under, how to allow robot to carry out navigation walking on a surface of an and become for a unusual stubborn problem.For example, current
Field of photovoltaic power generation in, the photovoltaic array (double glass photovoltaic arrays) based on double glazing component be in following photovoltaic art very
Important Novel photovoltaic array structure, is no frame on the photovoltaic array of this structure, at this time conventional photovoltaic cleaning robot
People just can not carry out navigation in this photovoltaic array and walk.
Summary of the invention
In view of this, an object of the present invention is to provide a robot and a navigation method, system, and device therefor that enable a robot to navigate on the surface of a frameless object. The specific scheme is as follows:
In a first aspect, the invention discloses a robot navigation method, including:
while the robot moves along an edge of a supporting object, acquiring in real time the position information detected by a first position sensor group and a second position sensor group respectively, obtaining first real-time position information and second real-time position information, wherein the first position sensor group and the second position sensor group are arranged on the outer side of the robot's chassis that is closest to the edge;
correcting the travel direction of the robot according to the first real-time position information and the second real-time position information, so that after each correction the first real-time position information is position information outside the edge and the second real-time position information is position information inside the edge.
Optionally, a first position sensor group and a second position sensor group are provided on each of the two outer sides of the robot chassis.
Optionally, the first position sensor group includes one position sensor or multiple position sensors, and the second position sensor group includes one position sensor or multiple position sensors.
Optionally, the step of correcting the travel direction of the robot according to the first real-time position information and the second real-time position information, so that after each correction the first real-time position information is position information outside the edge and the second real-time position information is position information inside the edge, includes:
when the first real-time position information is position information inside the edge, controlling the robot to shift toward the outside of the edge;
when the second real-time position information is position information outside the edge, controlling the robot to shift toward the inside of the edge.
Optionally, the robot navigation method further includes:
using a third position sensor group arranged in advance at the front end of the robot and a fourth position sensor group arranged in advance at the rear end to detect the edge of the supporting object;
when any sensor group among the third position sensor group and the fourth position sensor group detects the edge of the supporting object, controlling the robot to perform a turn-away-from-edge operation.
Optionally, the step of controlling the robot to perform the turn-away-from-edge operation when any sensor group among the third position sensor group and the fourth position sensor group detects the edge of the supporting object includes:
when any sensor group among the third position sensor group and the fourth position sensor group detects the edge of the supporting object, determining that edge to be the target edge, and controlling the robot in real time to perform the turn-away-from-edge operation, so that after the operation the front end of the robot faces along the extending direction of the line segment of the target edge; and ensuring that during the operation, whenever the third position sensor group detects the edge of the supporting object the robot is driven backward, and whenever the fourth position sensor group detects the edge of the supporting object the robot is driven forward.
Optionally, the robot navigation method further includes:
after the turn away from the edge, judging whether the current position of the robot is a standard starting position; if so, letting the robot continue moving along the edge of the supporting object; if not, fine-tuning the position of the robot until it is at the standard starting position;
wherein, when the robot is at the standard starting position, the first real-time position information is position information outside the edge of the supporting object, the second real-time position information is position information inside the edge of the supporting object, and the fourth position sensor group detects the edge of the supporting object.
In a second aspect, the invention discloses a robot navigation system, including:
a position information acquisition module, configured to, while the robot moves along an edge of a supporting object, acquire in real time the position information detected by a first position sensor group and a second position sensor group respectively, obtaining first real-time position information and second real-time position information, wherein the first position sensor group and the second position sensor group are arranged on the outer side of the robot's chassis that is closest to the edge; and
a direction correction module, configured to correct the travel direction of the robot according to the first real-time position information and the second real-time position information, so that after each correction the first real-time position information is position information outside the edge and the second real-time position information is position information inside the edge.
In a third aspect, the invention discloses a robot navigation device, including a first position sensor group, a second position sensor group, a memory, and a processor, wherein the processor implements the following steps by executing a computer program stored in the memory:
while the robot moves along an edge of a supporting object, acquiring in real time the position information detected by the first position sensor group and the second position sensor group respectively, obtaining first real-time position information and second real-time position information, wherein the first position sensor group and the second position sensor group are arranged on the outer side of the robot's chassis that is closest to the edge;
correcting the travel direction of the robot according to the first real-time position information and the second real-time position information, so that after each correction the first real-time position information is position information outside the edge and the second real-time position information is position information inside the edge.
In a fourth aspect, the invention further discloses a robot including the robot navigation device disclosed above.
As it can be seen that the present invention has disposed first position sensor group and the on the outside of a chassis nearest apart from edge in advance
Two position sensor groups, can be according to above-mentioned two position sensor group when the edge of robot along support object moves
The real-time position information detected, correction processing is carried out to the method for advance of robot, so that first after correction processing is real
When positional information and the second real-time position information be respectively support target edges external location information and internal location informa.Press
Constantly the direct of travel of robot is corrected according to above-mentioned correction mode, robot may finally be caused to reach branch
The edge in support face carries out operation, and can avoid slipping from the edge of supporting surface and drop, that is, so that robot exists
Do not have to carry out navigation walking on the body surface of frame.
Brief description of the drawings
To describe the technical solutions in the embodiments of the present invention or in the prior art more clearly, the accompanying drawings needed for describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only embodiments of the present invention; those of ordinary skill in the art can obtain other drawings from the provided drawings without creative effort.
Fig. 1 is a flowchart of a robot navigation method disclosed in an embodiment of the present invention;
Fig. 2 is a sub-flowchart of a specific robot navigation method disclosed in an embodiment of the present invention;
Fig. 3 is an application schematic diagram of a specific robot navigation method disclosed in an embodiment of the present invention;
Fig. 4 is a schematic diagram of a specific robot disclosed in an embodiment of the present invention;
Fig. 5 is an application schematic diagram of a specific robot navigation method disclosed in an embodiment of the present invention;
Fig. 6 is a schematic diagram of robot walking error disclosed in an embodiment of the present invention;
Fig. 7 is an application schematic diagram of a specific robot navigation method disclosed in an embodiment of the present invention;
Fig. 8 is a schematic diagram of a specific robot navigation method disclosed in an embodiment of the present invention;
Fig. 9 is a structural diagram of a robot navigation system disclosed in an embodiment of the present invention.
Detailed description of the embodiments
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings in the embodiments. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
Referring to Fig. 1, an embodiment of the present invention discloses a robot navigation method that includes:
Step S11: while the robot moves along an edge of a supporting object, acquire in real time the position information detected by the first position sensor group and the second position sensor group respectively, obtaining first real-time position information and second real-time position information, wherein the first position sensor group and the second position sensor group are arranged on the outer side of the robot's chassis that is closest to the edge.
It can be understood that if the robot moves along the edge of the supporting object only clockwise, or only counterclockwise, then this embodiment needs the first position sensor group and the second position sensor group only on the one outer side of the chassis closest to the edge. Of course, if the robot may move along the edge either clockwise or counterclockwise, then this embodiment requires a first position sensor group and a second position sensor group on each of the two outer sides of the chassis.
In this embodiment, the real-time position information detected by the first position sensor group may specifically be position information outside or inside the edge of the supporting object; from it, one can determine whether the first position sensor group is located in the region outside the edge of the supporting object or in the region inside that edge. Similarly, the real-time position information detected by the second position sensor group may specifically be position information outside or inside the edge of the supporting object, and from it one can determine whether the second position sensor group is located in the region outside the edge of the supporting object or in the region inside that edge.
In addition, it should be noted that in this embodiment either of the first position sensor group and the second position sensor group may include only one position sensor. When the real-time position information detected by that sensor is position information outside the edge of the supporting object, it can be concluded directly that the corresponding sensor group is located in the region outside the edge of the supporting object; similarly, when the detected real-time position information is position information inside the edge, it can be concluded directly that the corresponding sensor group is located in the region inside the edge of the supporting object.
Of course, to improve navigation accuracy and reduce errors, either of the first position sensor group and the second position sensor group in this embodiment may also include multiple position sensors. These sensors may be distributed on a straight line parallel to the side of the robot's crawler chassis; naturally, they may also be distributed on the outer side of the crawler chassis in any reasonable way required by the practical application.
In this embodiment, when either of the above position sensor groups includes multiple position sensors, correspondingly many sets of real-time position information are obtained. Among these sets, if the number of sets that are position information outside the edge of the supporting object exceeds half the total number of sets, the corresponding sensor group can be judged to be in the region outside the edge of the supporting object; if the number of sets that are position information inside the edge is greater than or equal to half the total number of sets, the corresponding sensor group can be judged to be in the region inside the edge of the supporting object.
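The group-level majority decision described above can be sketched in a few lines (a hypothetical sketch; the names `Reading` and `group_is_inside` are ours, not the patent's):

```python
from enum import Enum

class Reading(Enum):
    OUTSIDE = 0  # this sensor reports a position outside the edge
    INSIDE = 1   # this sensor reports a position inside the edge

def group_is_inside(readings: list[Reading]) -> bool:
    """Decide whether a whole sensor group counts as inside the edge.

    Per the embodiment: the group is judged to be outside the edge only
    when strictly more than half of its readings are OUTSIDE; otherwise
    (INSIDE readings make up at least half) it counts as inside.
    """
    outside = sum(r is Reading.OUTSIDE for r in readings)
    return outside <= len(readings) / 2
```

With three sensors, two OUTSIDE readings outvote one INSIDE reading, while a 1-to-1 tie counts as inside, matching the "greater than or equal to half" rule in the text.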
In addition, the position sensor in this embodiment may be a distance sensor or a sensor with Boolean output, including but not limited to an ultrasonic distance sensor, a proximity switch, a photoelectric switch, and a 3D lidar.
It should be further noted that the supporting surface of the supporting object in this embodiment may be a plane or a curved surface, a rectangular surface, or a regular or irregular surface; it is not further limited here.
Step S12: correct the travel direction of the robot according to the first real-time position information and the second real-time position information, so that after each correction the first real-time position information is position information outside the edge and the second real-time position information is position information inside the edge.
In one embodiment, the above step of correcting the travel direction of the robot according to the first real-time position information and the second real-time position information may specifically include:
when the first real-time position information is position information inside the edge, controlling the robot to shift toward the outside of the edge; when the second real-time position information is position information outside the edge, controlling the robot to shift toward the inside of the edge.
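Expressed as code, the correction rule of step S12 is a small bang-bang controller over the two sensor-group states (a minimal sketch under assumed names; `steer` returns a signed command instead of driving real hardware):

```python
OUTSIDE, INSIDE = "outside", "inside"

def steer(first_info: str, second_info: str) -> int:
    """One correction cycle of step S12.

    Target state: the first (outer) group reads OUTSIDE the edge and the
    second (inner) group reads INSIDE it.  Returns +1 to shift toward the
    outside of the edge, -1 to shift toward the inside, 0 to hold course.
    """
    if first_info == INSIDE:    # drifted inward: outer group is over the surface
        return +1               # shift toward the outside of the edge
    if second_info == OUTSIDE:  # drifted outward: inner group left the surface
        return -1               # shift toward the inside of the edge
    return 0                    # first outside, second inside: on track
```

Running this every control cycle keeps the chassis straddling the edge line, which is exactly the invariant the claims state after each correction.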
As it can be seen that the present invention has disposed first position sensor group and the on the outside of a chassis nearest apart from edge in advance
Two position sensor groups, can be according to above-mentioned two position sensor group when the edge of robot along support object moves
The real-time position information detected, correction processing is carried out to the method for advance of robot, so that first after correction processing is real
When positional information and the second real-time position information be respectively support target edges external location information and internal location informa.Press
Constantly the direct of travel of robot is corrected according to above-mentioned correction mode, robot may finally be caused to reach branch
The edge in support face carries out operation, and can avoid slipping from the edge of supporting surface and drop, that is, so that robot exists
Do not have to carry out navigation walking on the body surface of frame.
On the basis of the previous embodiment, this embodiment of the present invention makes further optimizations and explanations. Specifically:
Referring to Fig. 2, the robot navigation method in this embodiment may further include:
Step S21: use a third position sensor group arranged in advance at the front end of the robot and a fourth position sensor group arranged in advance at the rear end to detect the edge of the supporting object;
Step S22: when any sensor group among the third position sensor group and the fourth position sensor group detects the edge of the supporting object, control the robot to perform the turn-away-from-edge operation.
It can be understood that the third position sensor group and the fourth position sensor group in this embodiment may each include only one position sensor; if that sensor detects the edge of the supporting object, it can be concluded directly that the corresponding sensor group has detected the edge. Of course, to improve detection accuracy, they may also include multiple position sensors. If the third position sensor group or the fourth position sensor group includes multiple position sensors, then for either group, if the number of sensors in the group that detect the edge of the supporting object reaches a preset threshold, that group is judged to have detected the edge of the supporting object; otherwise it is judged not to have detected it yet. To improve the safety of the robot and keep it from slipping off and falling, the preset threshold can be set as small as possible, for example to 1.
In this embodiment, the above step of controlling the robot to perform the turn-away-from-edge operation when any sensor group among the third position sensor group and the fourth position sensor group detects the edge of the supporting object may specifically include:
when any sensor group among the third position sensor group and the fourth position sensor group detects the edge of the supporting object, determining that edge to be the target edge, and controlling the robot in real time to perform the turn-away-from-edge operation, so that after the operation the front end of the robot faces along the extending direction of the line segment of the target edge; during the operation, whenever the third position sensor group detects the edge of the supporting object, the robot is driven backward, and whenever the fourth position sensor group detects the edge of the supporting object, the robot is driven forward.
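The back-and-forth safeguard during the turn can be sketched as a per-cycle decision (hypothetical names; `front_at_edge` and `rear_at_edge` stand for the detection states of groups M and N, and the front group is given priority in this sketch, a detail the patent does not specify):

```python
def nudge_during_turn(front_at_edge: bool, rear_at_edge: bool) -> str:
    """Keep the robot on the surface while it turns away from the edge.

    Per the embodiment: if the front group (M) detects the edge, drive
    backward; if the rear group (N) detects it, drive forward; otherwise
    continue the turn in place.
    """
    if front_at_edge:
        return "backward"
    if rear_at_edge:
        return "forward"
    return "keep_turning"
```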
Further, the robot navigation method in this embodiment may also include:
after the turn away from the edge, judging whether the current position of the robot is the standard starting position; if so, the robot continues moving along the edge of the supporting object; if not, the position of the robot is fine-tuned until it is at the standard starting position;
wherein, when the robot is at the standard starting position, the first real-time position information is position information outside the edge of the supporting object, the second real-time position information is position information inside that edge, and the fourth position sensor group detects the edge of the supporting object.
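The three conditions defining the standard starting position combine into a single predicate (a formulation of ours, with assumed parameter names):

```python
def at_standard_start(first_outside: bool,
                      second_inside: bool,
                      rear_at_edge: bool) -> bool:
    """True when edge-following may resume after a turn.

    All three conditions in the text must hold at once: the first (outer)
    group reads outside the edge, the second (inner) group reads inside
    it, and the fourth (rear) group N still detects the edge behind the
    robot.  While any condition fails, the position is fine-tuned first.
    """
    return first_outside and second_inside and rear_at_edge
```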
In this embodiment, the number of position sensors in each of the first position sensor group and the second position sensor group on the left and right sides of the robot may be 1, as shown in Fig. 3. In Fig. 3 the supporting object is a double-glass photovoltaic array. The side of the robot close to the left edge of the array carries a first position sensor group A1 and a second position sensor group B1; the side away from the left edge carries a first position sensor group A2 and a second position sensor group B2. The front end of the robot carries the third position sensor group M, and the rear end carries the fourth position sensor group N.
In Fig. 3, the positional relationship between the first position sensor group A1 and the second position sensor group B1 has a particular feature: A1 and B1 lie on opposite sides of the left edge of the double-glass photovoltaic array. When the cleaning chassis drifts to the left while moving, sensor B1 leaves the surface of the array; a corresponding correction trigger signal is then generated, and according to the preset program the cleaning chassis turns right by a small angle (the angle is set by the program) until B1 is again directly above the array surface. The case in which the cleaning chassis drifts to the right is similar: sensor A1 goes from not being above the array surface to being directly above it, another correction trigger signal is generated, and the cleaning chassis turns left by a small angle until A1 is no longer directly above the array surface. In this example, a feature of the sensor placement is the fixed arrangement straddling the edge line. The sensor arrangement on the right side of the cleaning chassis corresponds to the same kind of edge line, for the chassis's return pass, so the position sensors on the two sides of the robot can be arranged symmetrically about the chassis centerline.
Of course, in this embodiment, the number of position sensors in each of the first position sensor group and the second position sensor group on the left and right sides of the robot may also be 2, as specifically shown in Fig. 4 and Fig. 5. In Fig. 4 and Fig. 5, the supporting object is a double-glass photovoltaic array. The side of the robot close to the left edge of the array is equipped with the first position sensor group and the second position sensor group, wherein the first position sensor group includes the laterally distributed position sensors A11 and A12, and the second position sensor group includes the laterally distributed position sensors B11 and B12. The front end of the robot is equipped with a third position sensor group M, and the rear end is equipped with a fourth position sensor group N. In Fig. 5, when the robot deviates toward the outside of the edge so that sensor B11 is triggered, the robot control system does not intervene; only when sensor B12 is triggered does it control the robot to deviate inward. Similarly, when the robot deviates toward the inside of the edge so that sensor A12 is triggered, the robot control system does not intervene until sensor A11 is triggered, at which point it controls the robot to deviate outward.
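The two-sensors-per-side dead band described for Fig. 5 can be sketched as below. The sensor flags are True when the corresponding sensor is triggered; the function name and return strings are illustrative assumptions.

```python
def correct_with_deadband(a11: bool, a12: bool, b11: bool, b12: bool) -> str:
    """Dead-band correction with two laterally spaced sensors per group.

    B11/B12 trip on outward drift and A12/A11 on inward drift; only the
    second (outermost) sensor of a pair forces a correction, so a small
    drift that trips just B11 or just A12 is tolerated without steering.
    """
    if b12:
        return "steer_inward"   # drifted outward past the dead band
    if a11:
        return "steer_outward"  # drifted inward past the dead band
    return "hold"               # within the dead band: do not interfere
```

The dead band avoids constant oscillating corrections around the edge line, at the cost of a slightly wider tracking corridor.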
Furthermore, considering that when the first position sensor group and the second position sensor group each contain only one position sensor, or when all of their position sensors are distributed laterally, walking errors are prone to occur and the robot may slip off, as shown in Fig. 6. Therefore, in this embodiment, the first position sensor group and the second position sensor group may each include multiple position sensors distributed longitudinally, that is, along the direction of travel, as specifically shown in Fig. 7. It can be understood that, in Fig. 7, the larger the mounting distance of the sensors along the direction of travel, the smaller the deflection angle at which a correction is triggered, thereby significantly reducing the possibility of walking errors.
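The claim that a larger longitudinal mounting distance yields a smaller trigger angle follows from simple geometry: if two sensors in a group are spaced `d` apart along the direction of travel and the chassis may drift laterally by at most `w` before a correction fires, the heading error at trigger time is about arctan(w/d). A small sketch (the symbol names and sample distances are assumptions, not values from the patent):

```python
import math

def max_deflection_deg(lateral_tolerance_m: float, spacing_m: float) -> float:
    """Approximate heading error (degrees) at which a correction triggers."""
    return math.degrees(math.atan2(lateral_tolerance_m, spacing_m))

# Doubling the longitudinal spacing roughly halves the trigger angle,
# which is why Fig. 7 spreads each sensor group along the travel direction.
```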
In addition, when the robot moves forward along the edge of the double-glass photovoltaic array, a turning operation is triggered if the front position sensor group arrives directly above the edge of the array. The detailed process may refer to Fig. 8. In Fig. 8, after the cleaning robot is placed on the double-glass photovoltaic array with its side parallel to the long side A at the array end, the cleaning robot moves forward until a sensor detects edge point d and then turns 90° to the right in place; the turning angle can be controlled through the steering control of the crawler chassis. The robot then backs up until the sensor at its rear end is triggered, locating array edge A behind it. If the sensor on the left side of the cleaning robot is not triggered at this time, the robot moves forward and to the left until the left sensor is triggered; after locating the left-side array edge B, the robot adjusts its body to the right by a small angle and backs up again. At this point, at array end d, the sensors on the left side and at the tail of the cleaning robot are both in the triggered state, indicating that the cleaning robot is in place as set. The cleaning robot is then started and moves forward in a straight line; during movement its course is continuously corrected along edge B according to the real-time position information detected by the sensors, so as to keep the travel route as straight as possible, until the front sensor is triggered upon reaching point e. The robot turns 90° to the right in place and moves straight ahead until the front sensor is triggered again at point f, where it turns 90° to the right in place once more. It continues straight ahead along side D, again maintaining the accuracy of the long-distance straight route by means of the real-time position information detected by the position sensors, and finally arrives back at the starting edge point g, at which time the front sensor of the cleaning robot is triggered. The robot pivots 90° to the right once more; at this point the left side of the cleaning robot has the same posture as when it was originally placed on the array, parallel with array edge A. It moves forward along side A to point d, completing the cleaning process for one double-glass photovoltaic array panel surface. The above flow is one kind of implementation; the cleaning process will differ according to the navigation mode, the array specification, and the area to be cleaned.
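The perimeter pass of Fig. 8 can be summarized as a short step list. Points d, e, f, g and edges A, B, D follow the description above; the data layout and step names are illustrative assumptions.

```python
# One perimeter pass after the initial alignment at point d (sketch).
TRAVERSAL = [
    ("advance_along_edge", "B"),   # straight run, correcting off edge B
    ("front_sensor_trigger", "e"),
    ("pivot_right_deg", 90),
    ("advance_straight", None),
    ("front_sensor_trigger", "f"),
    ("pivot_right_deg", 90),
    ("advance_along_edge", "D"),   # long straight run kept true by sensors
    ("front_sensor_trigger", "g"),
    ("pivot_right_deg", 90),       # body parallel with edge A again
    ("advance_along_edge", "A"),   # back to point d: pass complete
]

def total_turn_deg(steps) -> int:
    """Sum of the in-place pivots in the pass (excluding the turn at d)."""
    return sum(v for k, v in steps if k == "pivot_right_deg")
```

Counting the initial right turn at point d, the robot completes a full 360° around the panel face in one pass.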
Correspondingly, the embodiment of the invention also discloses a robot navigation system; referring to Fig. 9, the system includes:
a position information acquisition module 11, configured to, when the robot moves along the edge of a supporting object, acquire in real time the position information detected by the first position sensor group and the second position sensor group respectively, so as to obtain first real-time position information and second real-time position information; wherein the first position sensor group and the second position sensor group are both arranged on the outer side of the robot chassis that is at the shortest distance from the edge;
a direction correction module 12, configured to perform correction processing on the direction of travel of the robot according to the first real-time position information and the second real-time position information, so that after each correction processing the first real-time position information is position information outside the edge and the second real-time position information is position information inside the edge.
For the more specific working process of each of the above modules, reference may be made to the corresponding content disclosed in the foregoing embodiments, which will not be repeated here.
Correspondingly, the embodiment of the invention also discloses a robot navigation device, including a first position sensor group, a second position sensor group, a memory, and a processor; wherein the processor implements the following steps by executing a computer program stored in the memory:
when the robot moves along the edge of a supporting object, acquiring in real time the position information detected by the first position sensor group and the second position sensor group respectively, so as to obtain first real-time position information and second real-time position information; wherein the first position sensor group and the second position sensor group are both arranged on the outer side of the robot chassis that is at the shortest distance from the edge;
performing correction processing on the direction of travel of the robot according to the first real-time position information and the second real-time position information, so that after each correction processing the first real-time position information is position information outside the edge and the second real-time position information is position information inside the edge.
For the more specific execution process of the above processor, reference may be made to the corresponding content disclosed in the foregoing embodiments, which will not be repeated here.
In addition, the invention also discloses a robot including the robot navigation device disclosed above. The above robot may specifically be a robot for cleaning photovoltaic arrays, and the above photovoltaic array may be a tracking-type photovoltaic array or a double-glass photovoltaic array.
The embodiments in this specification are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and the same or similar parts between the embodiments may be referred to one another. As for the apparatus disclosed in the embodiments, since it corresponds to the method disclosed in the embodiments, its description is relatively simple, and for the relevant parts reference may be made to the description of the method.
Those skilled in the art will further appreciate that the units and algorithm steps of the examples described in connection with the embodiments disclosed herein can be implemented in electronic hardware, computer software, or a combination of the two. To clearly illustrate the interchangeability of hardware and software, the composition and steps of each example have been described above generally in terms of their functions. Whether these functions are performed in hardware or in software depends on the specific application and the design constraints of the technical solution. Skilled artisans may use different methods to implement the described functions for each specific application, but such implementations should not be considered to go beyond the scope of the present invention.
The steps of the method or algorithm described in connection with the embodiments disclosed herein may be implemented directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in random access memory (RAM), memory, read-only memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the technical field.
Finally, it should also be noted that, herein, relational terms such as first and second are used merely to distinguish one entity or operation from another, and do not necessarily require or imply any actual relationship or order between these entities or operations. Moreover, the terms "comprise", "include", or any other variant thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or device including a series of elements includes not only those elements but also other elements not explicitly listed, or further includes elements inherent to such a process, method, article, or device. In the absence of further limitations, an element defined by the phrase "including a ..." does not exclude the existence of other identical elements in the process, method, article, or device that includes the element.
A robot and its navigation method, system, and device provided by the present invention have been described in detail above. Specific examples are used herein to explain the principles and embodiments of the present invention; the description of the above embodiments is only intended to help understand the method of the present invention and its core idea. Meanwhile, for those of ordinary skill in the art, there will be changes in the specific embodiments and the application scope according to the idea of the present invention. In summary, the content of this specification should not be construed as a limitation of the present invention.
Claims (10)
- 1. A robot navigation method, characterized in that it comprises: when a robot moves along the edge of a supporting object, acquiring in real time the position information detected by a first position sensor group and a second position sensor group respectively, so as to obtain first real-time position information and second real-time position information, wherein the first position sensor group and the second position sensor group are both arranged on the outer side of the chassis of the robot that is at the shortest distance from the edge; and performing correction processing on the direction of travel of the robot according to the first real-time position information and the second real-time position information, so that after each correction processing the first real-time position information is position information outside the edge and the second real-time position information is position information inside the edge.
- 2. The robot navigation method according to claim 1, characterized in that a first position sensor group and a second position sensor group are provided on both outer sides of the robot chassis.
- 3. The robot navigation method according to claim 1, characterized in that the first position sensor group comprises one or more position sensors, and the second position sensor group comprises one or more position sensors.
- 4. The robot navigation method according to claim 1, characterized in that the step of performing correction processing on the direction of travel of the robot according to the first real-time position information and the second real-time position information, so that after each correction processing the first real-time position information is position information outside the edge and the second real-time position information is position information inside the edge, comprises: when the first real-time position information is position information inside the edge, controlling the robot to deviate laterally toward the outside of the edge; and when the second real-time position information is position information outside the edge, controlling the robot to deviate laterally toward the inside of the edge.
- 5. The robot navigation method according to any one of claims 1 to 4, characterized by further comprising: detecting the edge of the supporting object by means of a third position sensor group arranged in advance at the front end of the robot and a fourth position sensor group arranged in advance at the rear end; and when any sensor group among the third position sensor group and the fourth position sensor group detects the edge of the supporting object, controlling the robot to perform an operation of moving away from the edge.
- 6. The robot navigation method according to claim 5, characterized in that the step of controlling the robot to perform the operation of moving away from the edge when any sensor group among the third position sensor group and the fourth position sensor group detects the edge of the supporting object comprises: when any sensor group among the third position sensor group and the fourth position sensor group detects the edge of the supporting object, determining that edge as a target edge and controlling the robot in real time to perform the operation of moving away from the edge, so that after moving away from the edge the front-end direction of the robot is consistent with the extending direction of the line segment at the target edge, and ensuring that, during the operation of moving away from the edge, the robot is driven to move backward when the third position sensor group detects the edge of the supporting object, and the robot is driven to move forward when the fourth position sensor group detects the edge of the supporting object.
- 7. The robot navigation method according to claim 5, characterized by further comprising: after moving away from the edge, judging whether the current position of the robot is a standard starting position; if so, starting the robot to continue moving along the edge of the supporting object, and if not, fine-tuning the position of the robot until the position of the robot is the standard starting position; wherein, when the robot is at the standard starting position, the first real-time position information is position information outside the edge of the supporting object, the second real-time position information is position information inside the edge of the supporting object, and the fourth position sensor group detects the edge of the supporting object.
- 8. A robot navigation system, characterized by comprising: a position information acquisition module, configured to, when a robot moves along the edge of a supporting object, acquire in real time the position information detected by a first position sensor group and a second position sensor group respectively, so as to obtain first real-time position information and second real-time position information, wherein the first position sensor group and the second position sensor group are both arranged on the outer side of the chassis of the robot that is at the shortest distance from the edge; and a direction correction module, configured to perform correction processing on the direction of travel of the robot according to the first real-time position information and the second real-time position information, so that after each correction processing the first real-time position information is position information outside the edge and the second real-time position information is position information inside the edge.
- 9. A robot navigation device, characterized by comprising a first position sensor group, a second position sensor group, a memory, and a processor, wherein the processor implements the following steps by executing a computer program stored in the memory: when a robot moves along the edge of a supporting object, acquiring in real time the position information detected by the first position sensor group and the second position sensor group respectively, so as to obtain first real-time position information and second real-time position information, wherein the first position sensor group and the second position sensor group are both arranged on the outer side of the chassis of the robot that is at the shortest distance from the edge; and performing correction processing on the direction of travel of the robot according to the first real-time position information and the second real-time position information, so that after each correction processing the first real-time position information is position information outside the edge and the second real-time position information is position information inside the edge.
- 10. A robot, characterized by comprising the robot navigation device according to claim 9.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711183604.4A CN107966986A (en) | 2017-11-23 | 2017-11-23 | A kind of robot and its air navigation aid, system, equipment |
Publications (1)
Publication Number | Publication Date |
---|---|
CN107966986A true CN107966986A (en) | 2018-04-27 |
Family
ID=62001499
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201711183604.4A Pending CN107966986A (en) | 2017-11-23 | 2017-11-23 | A kind of robot and its air navigation aid, system, equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107966986A (en) |
Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101259000A (en) * | 2007-03-07 | 2008-09-10 | 得利诚健康生活科技股份有限公司 | Floor cleaning device |
WO2014196480A1 (en) * | 2013-06-03 | 2014-12-11 | シンフォニアテクノロジー株式会社 | Solar panel cleaning device |
WO2014208192A1 (en) * | 2013-06-27 | 2014-12-31 | シンフォニアテクノロジー株式会社 | Cleaning device |
CN105057246A (en) * | 2015-08-20 | 2015-11-18 | 北京天诚同创电气有限公司 | Cleaning device |
CN105057301A (en) * | 2015-09-17 | 2015-11-18 | 中国船舶重工集团公司第七一三研究所 | Automatic deviation rectifying method and automatic deviation rectifying system for advancement of solar panel cleaning vehicle |
CN105107772A (en) * | 2015-09-17 | 2015-12-02 | 中国船舶重工集团公司第七一三研究所 | Intelligent photovoltaic array washing car |
WO2015199198A1 (en) * | 2014-06-25 | 2015-12-30 | 株式会社未来機械 | Self-propelled robot |
CN205049978U (en) * | 2015-09-17 | 2016-02-24 | 中国船舶重工集团公司第七一三研究所 | Photovoltaic array cleaning head position appearance control system |
CN106502279A (en) * | 2016-12-27 | 2017-03-15 | 河南森源重工有限公司 | A kind of brush holder adaptive tracking system of solar panel cleaning device and method |
CN106712694A (en) * | 2016-11-17 | 2017-05-24 | 浙江国自机器人技术有限公司 | Photovoltaic array cross-panel cleaning method and device |
CN106774322A (en) * | 2016-12-20 | 2017-05-31 | 杭州华电双冠能源科技有限公司 | A kind of photovoltaic plant cruising inspection system and its operation method |
CN107362994A (en) * | 2017-08-21 | 2017-11-21 | 浙江大学 | Apparatus for work in inclined plane and its apply cleaning method in photovoltaic plant |
US20190269290A1 (en) * | 2016-03-31 | 2019-09-05 | Miraikikai, Inc. | Self-propelled robot |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110030932A (en) * | 2019-05-24 | 2019-07-19 | 广东嘉腾机器人自动化有限公司 | AGV displacement measuring method and AGV offset measuring device |
CN110030932B (en) * | 2019-05-24 | 2020-12-15 | 广东嘉腾机器人自动化有限公司 | AGV deviation measurement method and AGV deviation measurement device |
CN114305261A (en) * | 2021-12-29 | 2022-04-12 | 广州科语机器人有限公司 | Route deviation rectifying processing method and device for sweeper |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110244743B (en) | Mobile robot autonomous escaping method fusing multi-sensor information | |
CN104390612B (en) | Six-degree-of-freedom parallel robot benchmark pose scaling method for Stewart platform configuration | |
JP2009031884A (en) | Autonomous mobile body, map information creation method in autonomous mobile body and moving route specification method in autonomous mobile body | |
US10571277B2 (en) | Charger, and method, apparatus and system for finding charger based on map constructing | |
KR101297388B1 (en) | Moving apparatus and method for compensating position | |
US7873438B2 (en) | Mobile apparatus and control program therefor | |
CN103431812A (en) | Cleaning robot based on ultrasonic radar detection and travelling control method thereof | |
CN109813305B (en) | Unmanned fork truck based on laser SLAM | |
CN103592944A (en) | Supermarket shopping robot and advancing path planning method thereof | |
US9964956B2 (en) | Operating environment information generating device for mobile robot | |
CN104057456A (en) | Robot picking system and method of manufacturing a workpiece | |
JP2007310866A (en) | Robot using absolute azimuth and map creation method using it | |
JP2009093308A (en) | Robot system | |
CN107526085B (en) | Ultrasonic array ranging modeling method and system | |
EP1657612A3 (en) | Moving distance sensing apparatus for robot cleaner and method therefor | |
US20110153137A1 (en) | Method of generating spatial map using freely travelling robot, method of calculating optimal travelling path using spatial map, and robot control device performing the same | |
KR101951573B1 (en) | Device for detecting an obstacle by means of intersecting planes and detection method using such a device | |
KR100803203B1 (en) | Apparatus and method for correcting location information of mobile body, and computer-readable recording media storing computer program controlling the apparatus | |
CN105806337A (en) | Positioning method applied to indoor robot, and indoor robot | |
CN110865640B (en) | Obstacle avoidance structure of intelligent robot | |
JP2018185767A (en) | Environment maintenance robot, and control program of the same | |
CN107966986A (en) | A kind of robot and its air navigation aid, system, equipment | |
KR20180033837A (en) | Window cleaning robot and method of controlling the same | |
CN107977001A (en) | A kind of robot and its air navigation aid, system, equipment | |
KR970705678A (en) | EXCAVATION AREA SETTING SYSTEM FOR AREA LIMITING EXCAVATION CONTROL IN CONSTRUCTION MACHINES |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | Application publication date: 20180427 |