CN111829510A - Automatic navigation method, server and storage medium - Google Patents

Automatic navigation method, server and storage medium

Info

Publication number
CN111829510A
Authority
CN
China
Prior art keywords
guided vehicle
automatic guided
information
unit
moment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910300910.4A
Other languages
Chinese (zh)
Inventor
周严伟
谢恺
占兆武
罗为
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fuhuake Precision Industry Shenzhen Co ltd
Original Assignee
Fuhuake Precision Industry Shenzhen Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fuhuake Precision Industry Shenzhen Co ltd filed Critical Fuhuake Precision Industry Shenzhen Co ltd
Priority to CN201910300910.4A priority Critical patent/CN111829510A/en
Publication of CN111829510A publication Critical patent/CN111829510A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Navigation (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention provides an automatic navigation method, which comprises: controlling a camera unit to position an automatic guided vehicle, and controlling an inertia detection unit on the automatic guided vehicle to detect pose information of the automatic guided vehicle, wherein the pose information at least comprises acceleration information and yaw angle information; determining whether the positioning information obtained by the camera unit is valid; when the positioning information obtained by the camera unit is determined to be valid, updating the positioning information currently obtained by the camera unit to be the pose information currently output by the inertia detection unit; when the positioning information obtained by the camera unit is determined to be invalid, calculating the pose information of the automatic guided vehicle at time k+1 from the pose information currently detected by the inertia detection unit and the pose information of the automatic guided vehicle at time k; and generating a control instruction from the pose information of the automatic guided vehicle at time k+1 and reference point information on a virtual path prestored in the server, so as to control the automatic guided vehicle to travel on the corresponding virtual path in the navigation space.

Description

Automatic navigation method, server and storage medium
Technical Field
The present invention relates to the field of navigation, and in particular, to an automatic navigation method, a server, and a storage medium.
Background
Automated Guided Vehicles (AGVs) have become an important infrastructure in smart logistics, smart manufacturing, and digital factories. An AGV eases transport within a factory and improves the factory's production and operating efficiency. Common AGV navigation methods include magnetic stripe navigation, laser navigation, and two-dimensional code navigation. A magnetic stripe navigation AGV is relatively cheap and runs reliably and stably. However, magnetic stripes must be additionally laid in the application scene, laying them is labor-intensive, and the application site must be modified before they can be laid. If the navigation path changes, the magnetic stripes must be laid again. Moreover, the magnetism of a stripe weakens with long-term use, which hinders repeated use. An AGV based on laser navigation builds a complete indoor map with a laser radar, obtains complete information about the surrounding environment, and uses real-time laser scanning of the surroundings to navigate algorithmically. A laser navigation AGV needs no fixed track laid in the application site; however, when the laser sensor encounters a large object or a wall, the positioning may become inaccurate and the vehicle may depart from the preset path.
With the development of digital and intelligent factories, enterprises place higher demands on factory automation. At present, most logistics robots operate under rail guidance, while a small number adopt relatively expensive laser radar trackless navigation or inertial navigation with high technical complexity. Rail-guided navigation cannot adapt to path changes, and laser navigation needs additional reflecting plates. A mobile robot based on visual navigation can overcome these problems: only cameras need to be installed overhead, and accurate navigation is achieved within their coverage. However, the visual navigation scheme suffers from camera blind spots, that is, the pose information of the AGV cannot be obtained where no camera provides coverage. When the AGV travels into a blind spot, the server cannot acquire the AGV's pose information and therefore cannot issue control instructions to it. Additional sensors are then needed to assist in providing the AGV's pose information, which in turn increases hardware cost.
Disclosure of Invention
In view of the above, it is desirable to provide an automatic navigation method, a server, and a storage medium that can determine the yaw angle and pose information of an automatic guided vehicle while it travels in a camera blind spot in the navigation space, and control the vehicle to travel along a virtual path according to that information.
The invention provides an automatic navigation method based on inertia detection unit information, applied to a server, wherein the server is communicatively connected to at least one automatic guided vehicle and at least one camera unit, and the method comprises the following steps:
controlling the camera unit to position the automatic guided vehicle;
controlling an inertia detection unit arranged on the automatic guided vehicle to detect pose information of the automatic guided vehicle, wherein the pose information at least comprises acceleration information and yaw angle information;
determining whether the positioning information obtained by the camera unit is valid;
when the positioning information obtained by the camera unit is determined to be valid, updating the positioning information currently obtained by the camera unit to the pose information currently output by the inertia detection unit;
when the positioning information obtained by the camera unit is determined to be invalid, calculating the pose information of the automatic guided vehicle at time k+1 from the pose information currently detected by the inertia detection unit and the pose information of the automatic guided vehicle at time k; and
generating a control instruction from the pose information of the automatic guided vehicle at time k+1 and reference point information on a virtual path prestored in the server, so as to control the automatic guided vehicle to travel on the corresponding virtual path in a navigation space.
A second aspect of the present invention provides a server comprising:
a processor; and
a memory having stored therein a plurality of program modules that are loaded by the processor to execute the automatic navigation method.
A third aspect of the present invention provides a storage medium having stored thereon at least one computer instruction that is loaded by a processor to perform the automatic navigation method.
With the automatic navigation method, when the automatic guided vehicle travels in a blind spot where the camera unit cannot navigate it, the vehicle's driving route can be estimated from the yaw angle and pose information detected by the inertia detection unit, preventing the vehicle from deviating from the preset route.
Drawings
FIG. 1 is a diagram of an automatic navigation system according to an embodiment of the present invention.
Fig. 2 is a schematic diagram of a server in the automatic navigation system according to an embodiment of the present invention.
Fig. 3 is a schematic diagram of an automatic guided vehicle in the automatic navigation system according to an embodiment of the present invention.
FIG. 4 is a flowchart illustrating an automatic navigation method according to an embodiment of the present invention.
Description of the main elements
The following detailed description will further illustrate the invention in conjunction with the above-described figures.
Detailed Description
Referring to fig. 1, a schematic diagram of an automatic navigation system 10 according to an embodiment of the invention is shown. The automatic navigation system 10 includes, but is not limited to, a server 1, an automatic guided vehicle 2, and at least one camera unit 3. In this embodiment, the server 1 and the automatic guided vehicle 2 are communicatively connected in a wireless manner, and the server 1 and the at least one camera unit 3 are connected in a wired manner. The camera unit 3 is a camera installed in a navigation space.
In the present embodiment, the server 1 controls the automatic guided vehicle 2 to navigate in the navigation space by estimating pose information of the automatic guided vehicle 2, which will be described in detail later. The pose information includes position information and yaw angle information. The navigation space can be an indoor space (such as in a storage bin) or an outdoor space.
Referring to fig. 2, a schematic diagram of the server 1 in the automatic navigation system 10 according to an embodiment of the invention is shown. In the present embodiment, the server 1 includes, but is not limited to, a network unit 11, a memory 12, and a processor 13. The network unit 11, the memory 12 and the processor 13 are electrically connected to each other.
In this embodiment, the network unit 11 is configured to provide the server 1 with network communication through wired or wireless transmission, so that the server 1 can be communicatively connected to the automatic guided vehicle 2 over the network.
In this embodiment, the memory 12 is used to store software programs and data installed in the server 1. The memory 12 may be an internal storage unit of the server 1, for example, a hard disk or internal memory of the server 1. In other embodiments, the memory 12 may also be an external storage device of the server 1, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a flash memory card (Flash Card) provided on the server 1.
In this embodiment, the memory 12 further stores a virtual map (e.g., an electronic map) of the navigation space, and the virtual map includes a plurality of virtual paths. A virtual path is composed of reference points and the connection relations between them. The virtual path defines a plurality of reference points for the automatic guided vehicle 2 along the whole path, the position coordinates corresponding to each reference point, and the order in which each automatic guided vehicle 2 passes through the reference points.
In the present embodiment, the memory 12 stores in advance the position coordinates of each reference point on the virtual path. The position coordinates of the respective reference points on the virtual path may refer to coordinates in a coordinate system (XOY) established based on an area where the entire navigation space is located.
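As an illustration of how such a virtual map could be represented on the server, the following sketch stores each virtual path as an ordered list of reference points with coordinates in the navigation-space frame XOY. All names and types here are hypothetical and are not taken from the patent.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class ReferencePoint:
    """A reference point on a virtual path; x and y are coordinates in the
    coordinate system XOY established over the whole navigation space."""
    point_id: int
    x: float
    y: float

@dataclass
class VirtualPath:
    """An ordered sequence of reference points; the list order is the order
    in which the automatic guided vehicle must pass through them."""
    path_id: int
    points: List[ReferencePoint]

    def next_point(self, visited: int) -> Optional[ReferencePoint]:
        """Return the next reference point to head for, or None once the
        vehicle has passed every reference point on the path."""
        return self.points[visited] if visited < len(self.points) else None

# A small example path with three reference points
path = VirtualPath(1, [ReferencePoint(0, 0.0, 0.0),
                       ReferencePoint(1, 2.0, 0.0),
                       ReferencePoint(2, 2.0, 3.0)])
```

A vehicle that has passed one reference point would then be steered toward `path.next_point(1)`, i.e. the point at (2.0, 0.0).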
In this embodiment, the processor 13 may be a Central Processing Unit (CPU), or other microprocessor or other data processing chip capable of executing control functions. The processor 13 is used for executing software program codes, calculating data, and the like.
Referring to fig. 3, a schematic diagram of the automatic guided vehicle 2 in the automatic navigation system 10 according to an embodiment of the present invention is shown. In the present embodiment, the number of the automatic guided vehicle 2 may be one or more. The automatic guided vehicle 2 includes, but is not limited to, a battery 20, a wireless unit 21, a controller 22, and an inertia detection unit 23. The battery 20, the wireless unit 21, the controller 22 and the inertia detecting unit 23 are electrically connected.
In the present embodiment, the battery 20 is used to supply power to the wireless unit 21 and the controller 22.
The wireless unit 21 is used to provide a network connection between the automatic guided vehicle 2 and the server 1. The controller 22 is configured to receive a control instruction sent by the server 1 to control the automatic guided vehicle 2 to travel along a virtual path.
The inertia detection unit 23 is an Inertial Measurement Unit (IMU) used for measuring the three-axis attitude angle, angular velocity, and acceleration of an object. The inertia detection unit 23 includes a plurality of single-axis accelerometers and a plurality of single-axis gyroscopes: the accelerometers detect the acceleration of the object along the three independent axes of the carrier coordinate system, the gyroscopes detect the angular velocity of the carrier relative to the navigation coordinate system, and from the measured angular velocity and acceleration in three-dimensional space the attitude of the object is calculated.
In this embodiment, the automatic guided vehicle 2 may further include a charging unit (not shown) for providing power to the battery 20.
In the prior art, the server 1 can use video or image information captured by the camera unit 3 installed in the navigation space to help control the automatic guided vehicle 2 to travel along the virtual path. However, when the automatic guided vehicle 2 travels into a blind spot of the camera unit 3, the server 1 cannot confirm whether the vehicle is still on the virtual path. Therefore, in the present scheme, the server 1 collects the pose information of the automatic guided vehicle 2 detected by the inertia detection unit 23 installed on the vehicle and estimates from it the vehicle's pose information at the next moment. In this way, even when the automatic guided vehicle 2 travels into an area of the navigation space that the camera unit 3 cannot cover, its pose information can still be obtained, and a control instruction can be generated from that pose information and the reference point information on the virtual path to keep the vehicle travelling on the virtual path. In this embodiment, the pose information includes position information and yaw angle information.
For example, the server 1 first sends a control command to the camera unit 3 to make it position the automatic guided vehicle 2, and controls the inertia detection unit 23 mounted on the vehicle to detect its pose information, which further includes acceleration information. Determining positioning information from an image captured by a camera unit installed in the navigation space is prior art and is not described again here.
Further, it is determined whether the positioning information obtained by the camera unit 3 is valid. When it is determined to be valid, the positioning information currently obtained by the camera unit 3 is updated to the pose information currently output by the inertia detection unit 23. When it is determined to be invalid, the pose information of the automatic guided vehicle 2 at time k+1 is calculated from the pose information currently detected by the inertia detection unit 23 and the pose information of the vehicle at time k. The pose information of the automatic guided vehicle 2 at time k is either determined from an image captured by the camera unit 3 installed in the navigation space or is predetermined initial position information of the vehicle on the virtual path.
Further, a control instruction is generated from the pose information of the automatic guided vehicle 2 at time k+1 and the reference point information on the virtual path prestored in the server 1, so as to control the vehicle to travel on the corresponding virtual path in the navigation space.
Referring to fig. 4, a flowchart of an automatic navigation method according to an embodiment of the invention is shown. The order of the steps in the flow diagrams may be changed, and some steps may be omitted or combined, according to different needs.
In step S41, the camera unit 3 is controlled to position the automatic guided vehicle 2.
In step S42, the inertia detection unit 23 mounted on the automatic guided vehicle 2 is controlled to detect the pose information of the vehicle, where the pose information at least includes acceleration information and yaw angle information.
In step S43, it is determined whether the positioning information obtained by the camera unit 3 is valid.
Specifically, step S43 includes: determining whether the image captured by the camera unit 3 contains the complete automatic guided vehicle 2; when it does, determining that the positioning information obtained by the camera unit 3 is valid; and when it does not, determining that the positioning information obtained by the camera unit 3 is invalid.
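A minimal sketch of this validity test, assuming the camera unit's detector reports the vehicle as a pixel bounding box (the box and frame-size parameters are illustrative, not from the patent): the positioning is treated as valid only when the box lies entirely inside the image, i.e. the image contains the complete vehicle.

```python
def positioning_is_valid(bbox, frame_width, frame_height):
    """bbox = (x_min, y_min, x_max, y_max) of the detected vehicle in pixel
    coordinates.  Valid only when no edge of the box is clipped by the image
    border, i.e. the complete vehicle appears in the image."""
    x_min, y_min, x_max, y_max = bbox
    return (0 <= x_min and 0 <= y_min
            and x_max <= frame_width and y_max <= frame_height)
```

A vehicle half outside the frame (for example a box with a negative x_min, or one extending past the frame width) would then count as invalid positioning, and the server would fall back to the inertia detection unit.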
In step S44, when the positioning information obtained by the camera unit 3 is determined to be valid, the positioning information currently obtained by the camera unit 3 is updated to the pose information currently output by the inertia detection unit 23.
In step S45, when the positioning information obtained by the camera unit 3 is determined to be invalid, the pose information of the automatic guided vehicle 2 at time k+1 is calculated from the pose information currently detected by the inertia detection unit 23 and the pose information of the vehicle at time k.
Specifically, step S45 includes: calculating the displacement change ΔSx of the automatic guided vehicle 2 in the x-axis direction between time k and time k+1, using the calculation formula:
[Formula image BDA0002028193140000081]
wherein accX is the acceleration on the x axis of the automatic guided vehicle 2 currently detected by the inertia detection unit 23.
Calculating the displacement change ΔSy of the automatic guided vehicle 2 in the y-axis direction between time k and time k+1, using the calculation formula:
[Formula image BDA0002028193140000091]
wherein accY is the acceleration on the y axis of the automatic guided vehicle 2 currently detected by the inertia detection unit 23.
Calculating the displacement change of the automatic guided vehicle 2 between time k and time k+1:
ΔS = √(ΔSx² + ΔSy²)
Calculating the yaw angle change Δθ of the automatic guided vehicle 2 between time k and time k+1:
[Formula image BDA0002028193140000093]
Calculating the pose information of the automatic guided vehicle 2 at time k+1 as (xk+1, yk+1, θk+1) = (xk + ΔS·cos(θk + Δθ), yk + ΔS·sin(θk + Δθ), θk + Δθ), wherein the pose information pose1 = (xk, yk, θk) of the automatic guided vehicle 2 at time k is detected and determined by the camera unit 3.
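The dead-reckoning update of step S45 can be sketched as follows. The per-axis displacement and yaw-change formulas appear only as images in the source, so the forms used here (double integration of each body-frame acceleration over one sample period, and a yaw change taken as the direction of the body-frame displacement) are assumptions; only the final pose update follows the (xk+1, yk+1, θk+1) expression stated in the text.

```python
import math

def dead_reckon(pose_k, acc_x, acc_y, dt):
    """Estimate the pose (x, y, theta) of the guided vehicle at time k+1
    from its pose at time k and the body-frame accelerations accX, accY
    reported by the inertia detection unit.  The per-axis displacement and
    yaw-change formulas below are assumed, not taken from the patent."""
    x_k, y_k, theta_k = pose_k
    # Assumed: displacement from double integration over one sample period
    ds_x = 0.5 * acc_x * dt ** 2
    ds_y = 0.5 * acc_y * dt ** 2
    # Displacement magnitude between time k and k+1
    ds = math.hypot(ds_x, ds_y)
    # Assumed: yaw change is the direction of the body-frame displacement
    d_theta = math.atan2(ds_y, ds_x) if ds > 0.0 else 0.0
    # Pose update as stated in the text
    return (x_k + ds * math.cos(theta_k + d_theta),
            y_k + ds * math.sin(theta_k + d_theta),
            theta_k + d_theta)
```

For example, a vehicle at pose (0, 0, 0) that accelerates only along its own x axis keeps its heading and moves straight ahead.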
In step S46, a control instruction is generated from the pose information of the automatic guided vehicle 2 at time k+1 and the reference point information on the virtual path prestored in the server 1, so as to control the vehicle to travel on the corresponding virtual path in the navigation space.
In step S47, it is determined whether the positioning information obtained by the camera unit 3 has changed from invalid to valid.
In step S48, when the positioning information obtained by the camera unit 3 has changed from invalid to valid, the automatic guided vehicle 2 is navigated according to the positioning information obtained by the camera unit 3.
In step S49, when the positioning information obtained by the camera unit 3 is still invalid, the automatic guided vehicle 2 continues to be navigated according to its pose information at time k+1.
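Steps S43 through S49 amount to a simple source-switching rule, sketched below. The fallback propagation repeats the assumed dead-reckoning form used for step S45 (the patent shows its formulas only as images), and all function and parameter names are illustrative.

```python
import math

def navigation_pose(camera_pose, imu_acc, pose_k, dt):
    """Choose the pose used to generate the next control instruction.
    camera_pose is (x, y, theta) from the camera unit, or None while the
    vehicle is in a camera blind spot; imu_acc is (accX, accY) from the
    inertia detection unit; pose_k is the vehicle pose at time k."""
    if camera_pose is not None:
        # S44/S48: valid camera positioning wins and resets any IMU drift.
        return camera_pose
    # S45/S49: blind spot -- propagate the pose at k+1 from the IMU reading
    # (displacement and yaw-change forms are assumptions, as noted for S45).
    x, y, theta = pose_k
    acc_x, acc_y = imu_acc
    ds_x, ds_y = 0.5 * acc_x * dt ** 2, 0.5 * acc_y * dt ** 2
    ds = math.hypot(ds_x, ds_y)
    d_theta = math.atan2(ds_y, ds_x) if ds > 0.0 else 0.0
    return (x + ds * math.cos(theta + d_theta),
            y + ds * math.sin(theta + d_theta),
            theta + d_theta)
```

Running this once per sample period gives the server a continuous pose estimate: camera-derived whenever coverage exists, inertially propagated otherwise.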
Although the present invention has been described in detail with reference to the preferred embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted without departing from the spirit and scope of the invention.

Claims (10)

1. An automatic navigation method applied to a server, the server being communicatively connected to at least one automatic guided vehicle and at least one camera unit, characterized in that the method comprises the following steps:
controlling the camera unit to position the automatic guided vehicle;
controlling an inertia detection unit arranged on the automatic guided vehicle to detect pose information of the automatic guided vehicle, wherein the pose information at least comprises acceleration information and yaw angle information;
determining whether the positioning information obtained by the camera unit is valid;
when the positioning information obtained by the camera unit is determined to be valid, updating the positioning information currently obtained by the camera unit to the pose information currently output by the inertia detection unit;
when the positioning information obtained by the camera unit is determined to be invalid, calculating the pose information of the automatic guided vehicle at time k+1 from the pose information currently detected by the inertia detection unit and the pose information of the automatic guided vehicle at time k; and
generating a control instruction from the pose information of the automatic guided vehicle at time k+1 and reference point information on a virtual path prestored in the server, so as to control the automatic guided vehicle to travel on the corresponding virtual path in a navigation space.
2. The automatic navigation method of claim 1, wherein the pose information of the automatic guided vehicle at time k includes the position information and yaw angle information of the automatic guided vehicle at time k, and the position and yaw angle information at time k is either determined from the image captured by the camera unit or is initial position information of the automatic guided vehicle on the virtual path.
3. The automatic navigation method of claim 2, wherein determining whether the positioning information obtained by the camera unit is valid comprises:
determining whether the image captured by the camera unit contains the complete automatic guided vehicle;
when the image captured by the camera unit contains the complete automatic guided vehicle, determining that the positioning information obtained by the camera unit is valid; and
when the image captured by the camera unit does not contain the complete automatic guided vehicle, determining that the positioning information obtained by the camera unit is invalid.
4. The automatic navigation method of claim 1, wherein calculating the pose information of the automatic guided vehicle at time k+1 from the pose information currently detected by the inertia detection unit and the pose information of the automatic guided vehicle at time k comprises:
calculating the displacement change ΔSx of the automatic guided vehicle in the x-axis direction between time k and time k+1;
calculating the displacement change ΔSy of the automatic guided vehicle in the y-axis direction between time k and time k+1;
calculating the displacement change of the automatic guided vehicle between time k and time k+1:
ΔS = √(ΔSx² + ΔSy²);
calculating the yaw angle change Δθ of the automatic guided vehicle between time k and time k+1:
[Formula image FDA0002028193130000022]
and
calculating the pose information of the automatic guided vehicle at time k+1 as (xk+1, yk+1, θk+1) = (xk + ΔS·cos(θk + Δθ), yk + ΔS·sin(θk + Δθ), θk + Δθ), wherein the pose information pose1 = (xk, yk, θk) of the automatic guided vehicle at time k is detected and determined by the camera unit.
5. The automatic navigation method of claim 4, wherein the displacement change ΔSx of the automatic guided vehicle in the x-axis direction between time k and time k+1 is calculated by the formula:
[Formula image FDA0002028193130000031]
wherein accX is the acceleration on the x axis of the automatic guided vehicle currently detected by the inertia detection unit.
6. The automatic navigation method of claim 5, wherein the displacement change ΔSy of the automatic guided vehicle in the y-axis direction between time k and time k+1 is calculated by the formula:
[Formula image FDA0002028193130000032]
wherein accY is the acceleration on the y axis of the automatic guided vehicle currently detected by the inertia detection unit.
7. The automatic navigation method of claim 1, wherein, when a control instruction is generated according to the pose information of the automatic guided vehicle at time k+1 and the reference point information on the virtual path prestored in the server to control the automatic guided vehicle to travel on the corresponding virtual path in the navigation space, the method further comprises:
determining whether the positioning information obtained by the camera unit has changed from invalid to valid.
8. The automatic navigation method of claim 7, further comprising:
when the positioning information obtained by the camera unit is determined to have changed from invalid to valid, navigating the automatic guided vehicle according to the positioning information obtained by the camera unit; and
when the positioning information obtained by the camera unit is still invalid, continuing to navigate the automatic guided vehicle according to the pose information of the automatic guided vehicle at time k+1.
9. A server, characterized in that the server comprises:
a processor; and
a memory having stored therein a plurality of program modules that are loaded by the processor to execute the automatic navigation method of any one of claims 1-8.
10. A storage medium having stored thereon at least one computer instruction, wherein the instruction is loaded by a processor to perform the automatic navigation method of any one of claims 1-8.
CN201910300910.4A 2019-04-15 2019-04-15 Automatic navigation method, server and storage medium Pending CN111829510A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910300910.4A CN111829510A (en) 2019-04-15 2019-04-15 Automatic navigation method, server and storage medium

Publications (1)

Publication Number Publication Date
CN111829510A true CN111829510A (en) 2020-10-27

Family

ID=72915695

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910300910.4A Pending CN111829510A (en) 2019-04-15 2019-04-15 Automatic navigation method, server and storage medium

Country Status (1)

Country Link
CN (1) CN111829510A (en)

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105180934A * 2015-09-16 2015-12-23 Chengdu Siwi High-Tech Industrial Park Co., Ltd. AGV inertial navigation method
CN106647729A * 2015-11-03 2017-05-10 Nanjing Suli Robot Technology Co., Ltd. AGV navigation system based on image recognition and wireless network, and navigation method thereof
CN106774315A * 2016-12-12 2017-05-31 Shenzhen Zhimeida Technology Co., Ltd. Autonomous navigation method and device for robot
CN107087016A * 2017-03-06 2017-08-22 Tsinghua University Navigation method and system for mobile objects in buildings based on a video surveillance network
CN107179091A * 2017-06-27 2017-09-19 Guangdong Jaten Robot & Automation Co., Ltd. Vision positioning error correction method for AGV travel
WO2017161588A1 * 2016-03-25 2017-09-28 Huawei Technologies Co., Ltd. Positioning method and apparatus
CN107368074A * 2017-07-27 2017-11-21 Nanjing University of Science and Technology Autonomous robot navigation method based on video surveillance
CN107463173A * 2017-07-31 2017-12-12 Guangzhou Weishen Technology Co., Ltd. Warehouse AGV navigation method and device, computer equipment, and storage medium
CN108153302A * 2017-12-08 2018-06-12 Shenzhen Yicheng Autonomous Driving Technology Co., Ltd. Guidance method and device for automatic guided vehicle, and computer-readable storage medium
CN108318050A * 2017-12-14 2018-07-24 Fuhuake Precision Industry (Shenzhen) Co., Ltd. Central controller and system and method for mobile navigation using the central controller
WO2018169977A1 * 2017-03-14 2018-09-20 Trimble Inc. Integrated vision-based and inertial sensor systems for use in vehicle navigation
CN108733039A * 2017-04-18 2018-11-02 Guangdong University of Technology Method and apparatus for indoor robot navigation and positioning
CN108759834A * 2018-04-28 2018-11-06 Institute of Laser and Optoelectronic Intelligent Manufacturing, Wenzhou University Localization method based on global vision
CN109443345A * 2018-10-29 2019-03-08 Wenzhou University Localization method and system for surveillance navigation

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
LI, FENGYANG et al., "Development of inertial/visual integrated navigation in different application scenarios", Journal of Navigation and Positioning *
HU, XIAOPING, "Fundamentals of Navigation Technology", 31 July 2015 *

Similar Documents

Publication Publication Date Title
CN110312912B (en) Automatic vehicle parking system and method
US20090312871A1 (en) System and method for calculating location using a combination of odometry and landmarks
Ohya et al. Vision-based navigation by a mobile robot with obstacle avoidance using single-camera vision and ultrasonic sensing
TWI827649B (en) Apparatuses, systems and methods for vslam scale estimation
Burschka et al. Vision-based control of mobile robots
Harapanahalli et al. Autonomous Navigation of mobile robots in factory environment
CN105486311A (en) Indoor robot positioning navigation method and device
CN112183133B (en) Aruco code guidance-based mobile robot autonomous charging method
US20220012509A1 (en) Overhead-view image generation device, overhead-view image generation system, and automatic parking device
KR20200037548A (en) Vehicle charging robot
KR101341204B1 (en) Device and method for estimating location of mobile robot using raiser scanner and structure
CN109506652B (en) Optical flow data fusion method based on carpet migration and cleaning robot
KR102006291B1 (en) Method for estimating pose of moving object of electronic apparatus
KR101880185B1 (en) Electronic apparatus for estimating pose of moving object and method thereof
CN108458712A (en) Unmanned trolley navigation system and air navigation aid, unmanned trolley
CN115903857A (en) RFID-based unmanned grain surface inspection device and positioning method
Smith et al. PiPS: Planning in perception space
Csaba et al. Differences between Kinect and structured lighting sensor in robot navigation
CN114714357A (en) Sorting and carrying method, sorting and carrying robot and storage medium
CN111624990A (en) Automatic navigation method, server and storage medium
CN103328907B Robotic heliostat calibration system and method
CN111829510A (en) Automatic navigation method, server and storage medium
KR101650128B1 (en) Apparatus for building map of moving robot and method thereof
CN111830955A (en) Automatic navigation method, server and storage medium
Shioya et al. Minimal Autonomous Mover MG-11 for Tsukuba Challenge

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20201027