CN113686332A - Mobile robot and navigation method, device, equipment and storage medium thereof - Google Patents

Mobile robot and navigation method, device, equipment and storage medium thereof

Info

Publication number
CN113686332A
Authority
CN
China
Prior art keywords
map
mobile robot
navigation
determining
environment
Prior art date
Legal status
Pending
Application number
CN202111053614.2A
Other languages
Chinese (zh)
Inventor
王金洋 (Wang Jinyang)
王鹏飞 (Wang Pengfei)
Current Assignee
Shanghai Quicktron Intelligent Technology Co Ltd
Original Assignee
Shanghai Quicktron Intelligent Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Quicktron Intelligent Technology Co Ltd filed Critical Shanghai Quicktron Intelligent Technology Co Ltd
Priority to CN202111053614.2A
Publication of CN113686332A

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G01C21/1656 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments with passive imaging devices, e.g. cameras
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60P VEHICLES ADAPTED FOR LOAD TRANSPORTATION OR TO TRANSPORT, TO CARRY, OR TO COMPRISE SPECIAL LOADS OR OBJECTS
    • B60P3/00 Vehicles adapted to transport, to carry or to comprise special loads or objects

Abstract

The application provides a mobile robot and a navigation method, device, equipment, and storage medium thereof. The mobile robot is applied to a warehouse provided with shelves and comprises: a body, the bottom of which is provided with moving wheels; and an image acquisition device, arranged at the lateral front of the body and used to acquire environment images, the environment images including the ceiling of the warehouse and the tops of the shelves. The mobile robot can perform stable and accurate positioning and navigation in a dynamic environment.

Description

Mobile robot and navigation method, device, equipment and storage medium thereof
Technical Field
The present application relates to the field of intelligent warehousing technology, and in particular to a mobile robot and a navigation method, apparatus, device, and storage medium thereof.
Background
With the development of the e-commerce, 3C goods, and logistics industries, large-scale warehouses at home and abroad have developed very rapidly. To improve the space utilization of warehouses, the working-time and working-rhythm schemes inside large warehouses trend toward an ever faster pace and lower labor cost. Automated handling robots (mobile robots or other smart devices) are the best choice for meeting the need for a fast pace and low labor cost.
Below a certain height (for example, 3 meters) in a large indoor warehouse, the high utilization of cargo space means that the goods being handled change greatly, for example by more than 80%; such an environment is called a dynamic environment. A dynamic environment may leave the handling robot unable to position and navigate according to previously established environmental markers and maps.
Disclosure of Invention
The embodiments of the present application provide a mobile robot and a navigation method, navigation device, equipment, and storage medium thereof, which are used to solve the above problems in the related art. The technical solutions are as follows:
in a first aspect, an embodiment of the present application provides a mobile robot, which is applied to a warehouse with a shelf, and includes:
a body, the bottom of which is provided with moving wheels;
an image acquisition device, arranged at the lateral front of the body and configured to acquire environment images, the environment images including the ceiling of the warehouse and the tops of the shelves.
In one embodiment, the image capture device is angled from 55 degrees to 65 degrees from vertical.
In one embodiment, the image capture device includes a binocular structured light sensor, the structured light of which forms two concentric circles.
In one embodiment, the binocular structured light sensor is provided with an inertial measurement sensor unit.
In a second aspect, embodiments of the present application provide a warehouse, including a rack and a mobile robot according to any of the embodiments of the present application.
In one embodiment, the ceiling of the warehouse is provided with at least one spatial marker.
In a third aspect, an embodiment of the present application provides a navigation method for a mobile robot, where the navigation method is applied to the mobile robot, and the navigation method includes:
collecting an environment image, wherein the environment image includes the warehouse ceiling and the top of a shelf;
determining the current position according to the environment image;
sending a navigation request to a map server, wherein the navigation request comprises a current position and a target position, so that the map server returns a 3D map and a planned path according to the current position and the target position;
and moving according to the 3D map and the planned path.
In one embodiment, the navigation method further comprises:
determining environmental illumination information;
and the navigation request comprises the ambient illumination information so that the map server returns the 3D map corresponding to the ambient illumination information.
In one embodiment, the 3D map includes a 3D sparse map and a 3D dense map.
In one embodiment, the environment image further includes a spatial marker disposed on a ceiling, and the navigation method further includes:
performing loop closure detection based on the spatial marker.
In a fourth aspect, an embodiment of the present application provides a navigation method for a mobile robot, where the navigation method is applied to a map server, and the navigation method includes:
receiving a navigation request initiated by a mobile robot, wherein the navigation request comprises environment illumination information;
determining a 3D map to be returned according to the ambient illumination information;
returning the 3D map to the mobile robot.
In one embodiment, determining a 3D map to return from ambient lighting information includes:
determining whether the ambient light meets a preset condition or not according to the ambient light information;
determining the 3D map to be a 3D sparse map under the condition that a preset condition is met; or, in the case that the preset condition is not satisfied, determining the 3D map to be a 3D dense map.
In a fifth aspect, an embodiment of the present application provides a navigation device for a mobile robot, where the navigation device is applied to the mobile robot, and the navigation device includes:
the image acquisition module is used for acquiring an environment image, and the environment image includes the warehouse ceiling and the top of a shelf;
the position determining module is used for determining the current position according to the environment image;
the navigation request sending module is used for sending a navigation request to the map server, wherein the navigation request comprises a current position and a target position, so that the map server returns a 3D map and a planned path according to the current position and the target position;
and the movement module is used for moving according to the 3D map and the planned path.
In one embodiment, the navigation device further comprises:
the environment illumination information determining module is used for determining environment illumination information;
and the navigation request comprises the ambient illumination information so that the map server returns the 3D map corresponding to the ambient illumination information.
In one embodiment, the 3D map includes a 3D sparse map and a 3D dense map.
In one embodiment, the environment image further includes a spatial marker disposed on a ceiling, and the navigation device further includes:
and the loop closure detection module is used for performing loop closure detection according to the spatial marker.
In a sixth aspect, an embodiment of the present application provides a navigation device for a mobile robot, where the navigation device is applied to a map server, and the navigation device includes:
the navigation request receiving module is used for receiving a navigation request initiated by the mobile robot, and the navigation request comprises environment illumination information;
the 3D map determining module is used for determining a 3D map to be returned according to the ambient illumination information;
and the 3D map returning module is used for returning the 3D map to the mobile robot.
In one embodiment, the 3D map determination module includes:
the preset condition judgment submodule is used for determining whether the ambient light meets the preset condition or not according to the ambient light information;
the 3D map determining submodule is used for determining the 3D map as a 3D sparse map under the condition that a preset condition is met; or, in the case that the preset condition is not satisfied, determining the 3D map as a 3D dense map.
In a seventh aspect, an embodiment of the present application provides a navigation apparatus for a mobile robot, where the apparatus includes: at least one processor; and a memory communicatively coupled to the at least one processor; the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method for navigating the mobile robot.
In an eighth aspect, embodiments of the present application provide a computer-readable storage medium storing computer instructions that, when executed on a computer, cause the computer to perform the method of any one of the above aspects.
The advantages or beneficial effects of the above technical solutions at least include: robust and accurate positioning and navigation can be performed in a dynamic environment.
The foregoing summary is provided for the purpose of description only and is not intended to be limiting in any way. In addition to the illustrative aspects, embodiments, and features described above, further aspects, embodiments, and features of the present application will be readily apparent by reference to the drawings and following detailed description.
Drawings
In the drawings, like reference numerals refer to the same or similar parts or elements throughout the several views unless otherwise specified. The figures are not necessarily to scale. It is appreciated that these drawings depict only some embodiments in accordance with the disclosure and are therefore not to be considered limiting of its scope.
FIG. 1 shows a schematic view of a mobile robot according to an embodiment of the application;
fig. 2 shows a flow chart of a navigation method of a mobile robot according to an embodiment of the application;
FIG. 3 shows a schematic diagram of a 3D map according to an embodiment of the application;
fig. 4 shows a flow chart of a navigation method of a mobile robot according to an embodiment of the application;
fig. 5 is a diagram illustrating an application example of a navigation method of a mobile robot according to an embodiment of the present application;
fig. 6 shows a block diagram of a navigation device of a mobile robot according to an embodiment of the present application;
fig. 7 shows a block diagram of a navigation device of a mobile robot according to another embodiment of the present application;
fig. 8 shows a block diagram of a navigation device of a mobile robot according to an embodiment of the present application.
Detailed Description
In the following, only certain exemplary embodiments are briefly described. As those skilled in the art will recognize, the described embodiments may be modified in various different ways, all without departing from the spirit or scope of the present application. Accordingly, the drawings and description are to be regarded as illustrative in nature, and not as restrictive.
Fig. 1 shows a schematic structural diagram of a mobile robot according to an embodiment of the present application. As shown in fig. 1, the mobile robot includes a body 101 and an image capturing device 102. The mobile robot can move in a warehouse in which a plurality of racks are arranged to store goods. In this embodiment, the warehouse is an indoor warehouse, that is, the warehouse has a ceiling.
The bottom of the body 101 is mounted with moving wheels 103. The number of moving wheels 103 is not limited in this embodiment, as long as the mobile robot can move. The spatial intersection of the lines connecting the centers of all the moving wheels can be taken as the center of the mobile robot.
The image capturing device 102 is disposed at the lateral front of the body 101 and captures environment images. The image capturing device 102 is tilted toward the ceiling; for example, it is set at an angle of 55 to 65 degrees from the vertical, so that the ceiling and the tops of the shelves can be included in the environment image. Preferably, the angle between the image acquisition device and the vertical direction is 60 degrees. The orientation of the image capturing device 102 can be set arbitrarily, as long as its viewing angle is tilted toward the ceiling so that the environment image can include the ceiling and the shelf tops; a rough geometric check of this arrangement is sketched below.
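The following Python sketch verifies that a camera tilted 60 degrees from the vertical keeps both the shelf tops and the ceiling in view; the mounting height, distances, and field of view are illustrative assumptions, not values disclosed in this application.

import math

tilt_from_vertical = 60.0   # degrees, per the preferred embodiment above
vfov = 45.0                 # hypothetical vertical field of view (degrees)
cam_height = 0.4            # hypothetical camera mounting height (m)

def elevation(h, d):
    """Elevation angle (degrees above horizontal) to a point at height h, distance d ahead."""
    return math.degrees(math.atan2(h - cam_height, d))

center_elev = 90.0 - tilt_from_vertical            # optical axis elevation: 30 degrees
lo, hi = center_elev - vfov / 2, center_elev + vfov / 2
for name, h, d in [("shelf top", 2.5, 3.0), ("ceiling", 3.0, 6.0)]:
    e = elevation(h, d)
    print(f"{name}: {e:.1f} degrees, in view: {lo <= e <= hi}")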
In this embodiment, the image acquisition area of the mobile robot is the warehouse ceiling and the tops of the shelves. The probability that these areas change in a dynamic environment is very small, so the influence on the positioning and navigation of the mobile robot is small, and the mobile robot can position and navigate stably and accurately in a dynamic environment.
In one embodiment, the image capture device 102 may be a binocular structured light sensor. Illustratively, the structured light of the binocular structured light sensor takes the form of two concentric circles, i.e., the pattern projected onto the surface of the object is two concentric circles.
2D laser navigation and positioning uses only the 2D plane contour of the environment, draws on a limited amount of information about the surroundings, and cannot make full use of the environmental data or mine further information from it. This embodiment instead adopts a binocular structured-light sensor, which can form parallax even on weakly textured object surfaces and from it compute the 3D world coordinates of those surfaces, making it convenient for the map server to generate and store a 3D map (a 3D point cloud map); the underlying triangulation is sketched below. The 3D map contains a large amount of environmental information, so navigation precision and accuracy can be improved, the space utilization of the warehouse further raised, and dense storage facilitated. Moreover, a 3D point cloud map has the advantage of being unaffected by ambient light around the clock; the mobile robot of this embodiment can therefore adapt to the ambient light in the warehouse at any time and under any conditions, charging only when low on battery and otherwise transporting goods into and out of the warehouse efficiently and at low cost at all other times, day or night. In addition, adopting structured light of two concentric circles makes the measurement of 3D object surfaces more accurate.
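For illustration, a minimal sketch of the standard rectified-stereo triangulation by which a binocular sensor turns per-pixel disparity into camera-frame 3D coordinates; the focal lengths, principal point, and baseline are assumed example values, not parameters disclosed in this application.

import numpy as np

def disparity_to_3d(u, v, disparity, fx=600.0, fy=600.0, cx=320.0, cy=240.0, baseline=0.05):
    """Triangulate pixel (u, v) with disparity (pixels) into 3D coordinates (meters): Z = fx * B / d."""
    if disparity <= 0:
        raise ValueError("disparity must be positive")
    z = fx * baseline / disparity
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.array([x, y, z])

# Example: a surface point seen 12 px apart between the left and right cameras
print(disparity_to_3d(400, 260, 12.0))   # -> approximately [0.33, 0.08, 2.5] m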
Illustratively, the binocular structured light sensor may include: the glass lens group is a main optical system component of the binocular structure optical sensor; a mask serving as an optical path adjusting member; the front panel is a main supporting structural component at the front end of the binocular structure optical sensor and can be made of aluminum; a binocular Printed Circuit Board (PCB) module, which is a core optical imaging part of the binocular structured light sensor; RGB cameras, which are direct sensors for color imaging; the heat dissipation plate can be a specially-made metal heat dissipation plate. The binocular structured light sensor may further include a binocular camera PCB processor board, a USB cap, an aluminum back cover plate, and a dust-proof auxiliary cap.
In one embodiment, the binocular structured light sensor is provided with an inertial measurement unit (IMU) to measure the linear acceleration and angular velocity of the object. Because the IMU is built into the binocular structured light sensor, the IMU and the sensor are time-synchronized at the board level, so no external computing unit is needed to separately synchronize the timestamps of the IMU data and the sensor data, which avoids the system stalls of the computing unit that such timestamp synchronization can cause. (The alignment step that board-level synchronization makes unnecessary is sketched below.)
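The following sketch, using hypothetical sample data, shows the kind of per-stream timestamp interpolation an external computing unit would otherwise have to perform to align IMU readings with camera frames; with board-level synchronization this step is unnecessary.

import numpy as np

imu_t = np.array([0.000, 0.005, 0.010, 0.015, 0.020])       # 200 Hz IMU timestamps (s)
imu_gyro_z = np.array([0.010, 0.020, 0.015, 0.017, 0.016])  # yaw-rate samples (rad/s)
cam_t = np.array([0.0033, 0.0167])                          # 30 fps camera frame timestamps (s)

# Linearly interpolate the IMU readings onto the camera frame times
aligned = np.interp(cam_t, imu_t, imu_gyro_z)
print(aligned)   # IMU yaw rate estimated at each frame timestamp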
Continuing to refer to fig. 1, the mobile robot of this embodiment may further include: a main acousto-optic sensor 104, which can be arranged at the front top of the body 101; a first emergency stop button 105A arranged at the front side of the body 101, which triggers an emergency stop signal when pressed in an emergency; a second emergency stop button 105B, which may be arranged at the rear middle of the body 101 and likewise triggers the emergency stop signal when pressed in an emergency; rollers 106, which can be used for goods transfer; an inspection door 107, which can be used for battery inspection or debugging of the mobile robot and can be provided with a high-speed serial port; and an anti-collision emergency stop sensor 108 arranged at the front bottom of the body 101, which generates an emergency stop signal when the front of the mobile robot collides with an obstacle. In response to receiving the emergency stop signal, the control system of the mobile robot can brake within a preset time (e.g., 10 ms) and cut off the power supply.
Fig. 2 illustrates a navigation method of a mobile robot according to an embodiment of the present application, which is applied to the mobile robot. As shown in fig. 2, the navigation method includes:
step S201: collecting an environment image, wherein the environment image includes the warehouse ceiling and the tops of shelves;
step S202: determining the current position in the warehouse according to the environment image;
step S203: sending a navigation request to a map server, wherein the navigation request includes the current position and a target position in the warehouse, so that the map server returns a 3D map and a planned path according to the current position and the target position;
step S204: and moving according to the 3D map and the planned path.
The mobile robot sends a navigation request to the map server. The map server determines the corresponding current map point and target map point from the current position and target position of the mobile robot, plans a path between the two map points, and issues the planned path and the 3D map to the mobile robot. The 3D map may be a sub-map covering the area between the current map point and the target map point.
Based on this interaction between the mobile robot and the map server, the mobile robot does not need to store a 3D map itself. In a warehousing environment with a large number of mobile robots (say, more than 100), this reduces the repeated waste of computer hardware storage resources; and when the map needs updating, each mobile robot does not have to be updated individually, which reduces workload and labor cost and improves energy efficiency.
The map server can communicate with a plurality of mobile robots simultaneously, for example over the XML-RPC communication protocol, covering mapping; sub-map distribution, stitching, upgrading, and updating; map mode switching; and so on. A sketch of such an exchange follows.
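A minimal sketch of the robot-to-server exchange over XML-RPC using Python's standard library. The server address, the method name request_navigation, and the request/response fields are illustrative assumptions; the application names the protocol but does not specify the RPC interface.

import xmlrpc.client

server = xmlrpc.client.ServerProxy("http://map-server.local:8000")  # hypothetical address

request = {
    "robot_id": "amr-001",                  # hypothetical robot identifier
    "current_position": [12.4, 3.8, 0.0],   # pose estimated from the environment image
    "target_position": [45.0, 17.2, 0.0],
    "ambient_brightness": 310.0,            # reported so the server can choose the map type
}

response = server.request_navigation(request)
path = response["planned_path"]   # waypoints between the current and target map points
sub_map = response["sub_map"]     # 3D sub-map covering the planned route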
In one example, the mapping process may include: the mobile robot moves in the warehouse and plans its path based on its own obstacle detection and avoidance algorithm, so as to fully cover the whole warehouse space. Obstacles may include shelves, temporarily placed goods, people, vehicles, or other smart devices. During its movement, the mobile robot continuously collects environment images and sends them to the map server. The map server identifies and processes the environment images uploaded by the mobile robot and generates 3D sub-maps. When the area over which the mobile robot has collected environment images exceeds a preset value (such as 100 square meters), the map server can stitch, correct, and optimize the established 3D sub-maps to obtain a 3D global map. That is, the map server builds the map as the mobile robot moves, rather than building it all at once after the robot has scanned or collected all the environment images, which improves mapping efficiency.
The 3D map includes a 3D dense map and a 3D sparse map. The 3D dense map comprises discrete spatial point cloud data, while the 3D sparse map extracts and screens sparse features from the spatial point cloud data and uses the sparse feature points as map data; a simple screening scheme is sketched below.
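As an illustration only, the following sketch derives a sparse map from a dense point cloud by keeping one representative point per voxel; the application does not specify the actual feature extraction and screening criterion, so the voxel screening here is an assumption.

import numpy as np

def sparsify(points, voxel_size=0.25):
    """Keep one point per occupied voxel of side voxel_size (meters)."""
    keys = np.floor(points / voxel_size).astype(np.int64)
    _, first_idx = np.unique(keys, axis=0, return_index=True)
    return points[np.sort(first_idx)]

dense = np.random.rand(100_000, 3) * 50.0    # stand-in for a dense warehouse point cloud
sparse = sparsify(dense)
print(len(dense), "->", len(sparse), "points")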
Preferably, a plurality of mobile robots can be deployed and evenly distributed over different areas of the warehouse to further improve mapping speed and efficiency. Fig. 3 shows an example of a 3D map generated according to the mapping method of an embodiment of the present application, in which 301 denotes the planned path of the mobile robot in the warehouse, 302 denotes the representation of the mobile robot at its current position, and 303 denotes a shelf or the outer wall contour of the warehouse.
After the map building is completed, both the mobile robot and the map server leave the mapping mode. The mobile robot enters the tracking mode, that is, the process goes to step S201, while the map server enters the map distribution mode and the path planning mode. In the tracking mode, the robot completes the movement from the current map point to the target map point based on the planned path and the 3D map issued by the map server.
Illustratively, the current position and the target position are positions (a current map point and a target map point) of the mobile robot on the 3D map.
For example, a relocalization function may be performed in step S202. In the tracking mode, if the mobile robot loses its positioning, that is, it cannot determine the current position, it may trigger the relocalization function. For example: the mobile robot collects the current environment image and sends the map server a relocalization request containing that image. According to the request, the map server extracts features from the current environment image, matches them against the 3D global map, and thereby determines the corresponding map point, which is the relocalized map point; it sends this map point to the mobile robot, so that the robot determines its position in the 3D global map. An illustrative sketch of the feature-matching step follows.
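The sketch below illustrates the matching step under assumptions: the application does not specify the feature type, so brute-force descriptor matching with a ratio test stands in for whatever matching the map server actually performs, and all data are synthetic.

import numpy as np

def relocalize(query_desc, map_desc, map_points, ratio=0.8):
    """Match query-image descriptors against global-map descriptors (brute force)
    and return the 3D map points of confident matches."""
    matched = []
    for q in query_desc:
        d = np.linalg.norm(map_desc - q, axis=1)   # distance to every map descriptor
        i, j = np.argsort(d)[:2]                   # best and second-best match
        if d[i] < ratio * d[j]:                    # Lowe-style ratio test
            matched.append(map_points[i])
    return np.array(matched)

map_desc = np.random.rand(500, 32)                 # stand-in global-map descriptors
map_points = np.random.rand(500, 3)                # associated 3D map points
query_desc = map_desc[:10] + np.random.rand(10, 32) * 0.01  # noisy re-observations
print(relocalize(query_desc, map_desc, map_points).shape)   # -> (10, 3)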
Illustratively, the relative spatial attitude between the mobile robot and the currently acquired environment data is determined and fed back to the central control software; the central control software passes the data stream to the motion controller, and the motion controller adjusts the resultant velocity vector of the robot's moving wheels, thereby compensating the differential attitude of the robot's relative positioning.
In one implementation, the method of the embodiment of the present application may further include: determining ambient illumination information. Further, in step S203, the navigation request also includes the ambient illumination information, so that the map server determines the corresponding type of 3D map according to it.
For example, the mobile robot may independently detect the lighting condition of the local area where it is located, obtain ambient illumination information such as light brightness, and report it to the map server; the map server then issues a 3D dense map or a 3D sparse map to the robot according to its current lighting condition.
Illustratively, a light brightness sensor is arranged on the mobile robot to detect the ambient illumination information. Alternatively, the mobile robot can acquire an environment image and determine the ambient illumination information from that image.
In one embodiment, the environment image includes a spatial marker on the ceiling, and the navigation method of this embodiment further includes: performing loop closure detection based on the spatial marker.
After the mobile robot runs for a long time, errors of the sensor hardware and of the algorithms in the embedded computer software accumulate, the system error grows larger and larger, and the robot's capability and precision for continuous positioning and navigation are seriously affected. When the mobile robot again collects (scans) a previously seen spatial marker, loop closure detection can be performed, that is, the error continuously accumulated during motion is subjected to one global optimization, so that the system error is closed in a loop and the error accumulated over long continuous operation is eliminated. The global optimization includes global key-frame optimization or key-node optimization. A much-simplified sketch follows.
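The sketch below only illustrates the idea: when a stored ceiling marker is re-observed, the residual between its stored and re-estimated positions is spread back over the trajectory. The linear spreading is an assumed simplification of the global key-frame optimization mentioned above, not the method of this application.

import numpy as np

def close_loop(trajectory, marker_stored, marker_reobserved):
    """Spread the loop-closure residual linearly over the pose sequence."""
    residual = marker_stored - marker_reobserved           # accumulated drift at loop closure
    weights = np.linspace(0.0, 1.0, len(trajectory))[:, None]
    return trajectory + weights * residual                 # later poses are corrected more

traj = np.cumsum(np.random.randn(50, 2) * 0.1, axis=0)     # drifting 2D pose estimates
corrected = close_loop(traj, np.array([0.0, 0.0]), traj[-1])
print(np.allclose(corrected[-1], [0.0, 0.0]))              # final pose snapped back to the marker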
Fig. 4 shows a flowchart of a navigation method of a mobile robot according to an embodiment of the present application, which may be applied to a map server. As shown in fig. 4, the method includes:
step S401: receiving a navigation request initiated by a mobile robot, wherein the navigation request comprises environment illumination information;
step S402: determining a 3D map to be returned according to the ambient illumination information;
step S403: returning the 3D map to the mobile robot.
In one embodiment, step S402 may include: determining whether the ambient light meets a preset condition according to the ambient illumination information; determining the 3D map to be a 3D sparse map if the preset condition is met; or determining the 3D map to be a 3D dense map if the preset condition is not met.
Illustratively, the ambient illumination information may be the light brightness, and a light brightness threshold may be preset. When the light brightness exceeds the threshold, the preset condition is deemed satisfied, so a 3D sparse map is returned to the mobile robot, reducing the amount of computation while the ambient light is bright and still achieving stable and accurate positioning. When the light brightness is below the threshold, the preset condition is deemed not satisfied, so a 3D dense map is sent to the mobile robot, allowing it to run robustly and accurately when the ambient light is dim (such as at night). This selection rule is sketched below.
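A minimal sketch of the server-side selection rule; the threshold value is an assumed example, since the application only states that a preset threshold is compared against.

BRIGHTNESS_THRESHOLD = 150.0   # hypothetical light-brightness threshold

def select_map_type(ambient_brightness):
    """Bright environment: lighter-weight sparse map; dim environment: dense map."""
    if ambient_brightness >= BRIGHTNESS_THRESHOLD:
        return "3d_sparse"   # less computation when ambient light is bright
    return "3d_dense"        # more robust when ambient light is poor

print(select_map_type(310.0))  # -> 3d_sparse
print(select_map_type(40.0))   # -> 3d_dense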
In an application example, as shown in fig. 5, after the mobile robot is powered on, it performs a self-test, which includes the following checks (a sketch of the sequence follows the list):
(1) Check whether the emergency stop safety signal is normal. If the emergency stop signal is abnormal, enter the exception handling mechanism. The principle of judgment is to simulate an emergency stop once by test and then recover automatically; if the motor power can recover normally, the emergency stop signal is normal.
(2) Check whether the binocular structured-light camera (binocular structured light sensor) is normal: first, whether the projected laser pattern (the two-concentric-circle structured light) is normal and free of severe distortion; second, whether the parallax computation of the left and right cameras is normal; third, whether the transmission frame format of the camera's outgoing image signal is normal; and fourth, whether the structured-light emitter turns on and off normally. If any check fails, enter the exception handling mechanism.
(3) Check whether motion control is normal: check whether the signals of the motor signal controller, the wheel-set angle feedback device, and other components connected to motion control are normal; and check whether all wheel sets (moving wheels) are normal, including the wheel-set heartbeat signals and the consistency of wheel-set angle control with the angle feedback signals.
(4) Check whether the map server is normal: check whether the map data is normal; check whether the sub-map stitching module is normal; and check whether communication between the map server and every individual mobile robot is normal.
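A minimal sketch of this self-test sequence; the check functions are placeholder stubs standing in for the hardware checks of items (1) to (4).

def check_estop():            # (1) simulate an e-stop, verify motor power recovers
    return True

def check_camera():           # (2) projected pattern, disparity, frame format, emitter on/off
    return True

def check_motion_control():   # (3) motor controller signals, wheel-set angle feedback
    return True

def check_map_server():       # (4) map data, sub-map stitching, robot communication
    return True

def handle_exception(name):
    print(f"self-test failed at: {name}; entering exception handling mechanism")

def self_test():
    checks = [("emergency stop signal", check_estop),
              ("binocular structured-light camera", check_camera),
              ("motion control", check_motion_control),
              ("map server", check_map_server)]
    for name, check in checks:
        if not check():
            handle_exception(name)
            return False
    return True

print("self-test passed:", self_test())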
The embodiment of the application also provides a warehouse, including shelves and the mobile robot of any one of the above embodiments. The warehouse is an indoor warehouse, that is, it has a ceiling, and one or more spatial markers can be arranged on the ceiling for the mobile robot's loop closure detection.
Fig. 6 illustrates a navigation device of a mobile robot according to an embodiment of the present application, which is applicable to a mobile robot. As shown in fig. 6, the navigation device includes:
the image acquisition module 601 is used for acquiring an environment image, wherein the environment image includes the warehouse ceiling and the top of a shelf;
a position determining module 602, configured to determine a current position according to the environment image;
a navigation request sending module 603, configured to send a navigation request to a map server, where the navigation request includes a current location and a target location, so that the map server returns a 3D map and a planned path according to the current location and the target location;
a motion module 604 for moving according to the 3D map and the planned path.
In one embodiment, the navigation device further comprises:
the environment illumination information determining module is used for determining environment illumination information;
and the navigation request comprises the ambient illumination information so that the map server returns the 3D map corresponding to the ambient illumination information.
In one embodiment, the 3D map includes a 3D sparse map and a 3D dense map.
In one embodiment, the environment image further includes a spatial marker disposed on a ceiling, and the navigation device further includes:
and the loop closure detection module is used for performing loop closure detection according to the spatial marker.
Fig. 7 shows a navigation device of a mobile robot according to an embodiment of the present application, which is applicable to a map server. As shown in fig. 7, the navigation device includes:
a navigation request receiving module 701, configured to receive a navigation request initiated by a mobile robot, where the navigation request includes ambient lighting information;
a 3D map determining module 702, configured to determine a 3D map to be returned according to the ambient lighting information;
and a 3D map returning module 703, configured to return a 3D map to the mobile robot.
In one embodiment, the 3D map determination module 702 includes:
the preset condition judgment submodule is used for determining whether the ambient light meets the preset condition or not according to the ambient light information;
the 3D map determining submodule is used for determining the 3D map as a 3D sparse map under the condition that a preset condition is met; or, in the case that the preset condition is not satisfied, determining the 3D map as a 3D dense map.
The functions of each module in each apparatus in the embodiment of the present application may refer to corresponding descriptions in the above method, and are not described herein again.
Fig. 8 is a block diagram of a navigation apparatus of a mobile robot according to an embodiment of the present application. As shown in fig. 8, the navigation apparatus includes: a memory 801 and a processor 802, the memory 801 storing instructions executable on the processor 802. The processor 802, when executing the instructions, implements the methods in the embodiments described above. There may be one or more memories 801 and processors 802. The navigation apparatus is intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other suitable computers. It may also represent various forms of mobile devices, such as personal digital assistants, cellular phones, smart phones, wearable devices, and other similar computing devices. The components shown here, their connections and relationships, and their functions are meant as examples only and are not meant to limit the implementations of the present application described and/or claimed herein.
The navigation device may further include a communication interface 803 for communicating with external devices for interactive data transmission. The various components are interconnected using different buses and may be mounted on a common motherboard or in other manners as desired. The processor 802 can process instructions for execution within the navigation device, including instructions stored in or on the memory to display graphical information of a GUI on an external input/output apparatus (such as a display device coupled to an interface). In other embodiments, multiple processors and/or multiple buses may be used, along with multiple memories, as desired. Also, multiple navigation devices may be connected, with each device providing part of the necessary operations (e.g., as a server array, a set of blade servers, or a multi-processor system). The bus may be divided into an address bus, a data bus, a control bus, and so on. For ease of illustration, only one thick line is shown in fig. 8, but this does not mean there is only one bus or one type of bus.
Optionally, in an implementation, if the memory 801, the processor 802, and the communication interface 803 are integrated on a chip, the memory 801, the processor 802, and the communication interface 803 may complete communication with each other through an internal interface.
It should be understood that the processor may be a central processing unit (CPU), another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, and so on. A general-purpose processor may be a microprocessor or any conventional processor. It is noted that the processor may be a processor supporting the Advanced RISC Machine (ARM) architecture.
Embodiments of the present application provide a computer-readable storage medium (such as the above-mentioned memory 801) storing computer instructions, which when executed by a processor implement the methods provided in embodiments of the present application.
Alternatively, the memory 801 may include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required for at least one function; the storage data area may store data created according to use of the device, and the like. Further, the memory 801 may include high speed random access memory, and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid state storage device. In some embodiments, the memory 801 may optionally include memory located remotely from the processor 802, which may be connected to the device via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present application, "a plurality" means two or more unless specifically limited otherwise.
Any process or method descriptions in flowcharts or otherwise described herein may be understood as representing modules, segments, or portions of code that include one or more executable instructions for implementing specific logical functions or steps of the process. And the scope of the preferred embodiments of the present application includes other implementations in which functions may be performed out of the order shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved.
The logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions that can be considered to implement logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions.
It should be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. All or part of the steps of the methods of the above embodiments may be carried out by instructing the relevant hardware through a program, which may be stored in a computer-readable storage medium and which, when executed, performs one or a combination of the steps of the method embodiments.
In addition, the functional units in the embodiments of the present application may be integrated into one processing module, or each unit may exist alone physically, or two or more units may be integrated into one module. The integrated module may be implemented in the form of hardware or in the form of a software functional module. If the integrated module is implemented in the form of a software functional module and sold or used as a separate product, it may also be stored in a computer-readable storage medium. The storage medium may be a read-only memory, a magnetic disk, an optical disk, or the like.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive various changes or substitutions within the technical scope of the present application, and these should be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (20)

1. A mobile robot applied to a warehouse having a shelf, the mobile robot comprising:
the device comprises a machine body, wherein the bottom of the machine body is provided with a moving wheel;
the image acquisition device is arranged at the lateral front of the machine body and used for acquiring environment images, and the environment images include the ceiling of the warehouse and the tops of the goods shelves.
2. The mobile robot of claim 1, wherein the image capture device is angled 55 degrees to 65 degrees from vertical.
3. The mobile robot of claim 1, wherein the image capture device comprises a binocular structured light sensor having structured light in the form of two concentric circles.
4. The mobile robot according to claim 3, wherein the binocular structured light sensor is provided with an inertial measurement sensor unit.
5. Warehouse, characterized in that it comprises a rack and a mobile robot according to any of claims 1 to 4.
6. The warehouse as claimed in claim 5, wherein a ceiling of the warehouse is provided with at least one spatial marker.
7. A navigation method of a mobile robot is characterized by being applied to the mobile robot and comprising the following steps:
collecting an environment image, wherein the environment image includes a warehouse ceiling and the top of a shelf;
determining a current position according to the environment image;
sending a navigation request to a map server, wherein the navigation request comprises the current position and a target position, so that the map server returns a 3D map and a planned path according to the current position and the target position;
and moving according to the 3D map and the planned path.
8. The navigation method of claim 7, further comprising:
determining environmental illumination information;
the navigation request comprises the environment illumination information, so that the map server returns a 3D map corresponding to the environment illumination information.
9. The navigation method according to claim 8, wherein the 3D map includes a 3D sparse map and a 3D dense map.
10. The navigation method according to claim 7, wherein the environment image further includes a spatial marker disposed on the ceiling, and the navigation method further comprises:
performing loop closure detection based on the spatial marker.
11. A navigation method of a mobile robot is applied to a map server, and comprises the following steps:
receiving a navigation request initiated by a mobile robot, wherein the navigation request comprises environment illumination information;
determining a 3D map to be returned according to the environment illumination information;
returning the 3D map to the mobile robot.
12. The navigation method according to claim 11, wherein determining a 3D map to return based on the ambient lighting information comprises:
determining whether the ambient light meets a preset condition according to the ambient light information;
determining that the 3D map is a 3D sparse map under the condition that the preset condition is met; or, in the case that the preset condition is not satisfied, determining that the 3D map is a 3D dense map.
13. A navigation device for a mobile robot, applied to the mobile robot, the navigation device comprising:
the image acquisition module is used for acquiring an environment image, and the environment image includes a warehouse ceiling and the top of a shelf;
the position determining module is used for determining the current position according to the environment image;
the navigation request sending module is used for sending a navigation request to a map server, wherein the navigation request comprises the current position and the target position, so that the map server returns a 3D map and a planned path according to the current position and the target position;
and the movement module is used for moving according to the 3D map and the planned path.
14. The navigation device of claim 13, further comprising:
the environment illumination information determining module is used for determining environment illumination information;
the navigation request comprises the environment illumination information, so that the map server returns a 3D map corresponding to the environment illumination information.
15. The navigation device of claim 14, wherein the 3D map comprises a 3D sparse map and a 3D dense map.
16. The navigation device according to claim 13, wherein the environment image further includes a spatial marker disposed on the ceiling, the navigation device further comprising:
a loop closure detection module, configured to perform loop closure detection according to the spatial marker.
17. A navigation device of a mobile robot, which is applied to a map server, the navigation device comprising:
the navigation request receiving module is used for receiving a navigation request initiated by the mobile robot, and the navigation request comprises environment illumination information;
the 3D map determining module is used for determining a 3D map to be returned according to the environment illumination information;
and the 3D map returning module is used for returning the 3D map to the mobile robot.
18. The navigation device of claim 17, wherein the 3D map determination module comprises:
the preset condition judgment submodule is used for determining whether the ambient light meets the preset condition or not according to the ambient light information;
the 3D map determining submodule is used for determining the 3D map as a 3D sparse map under the condition that the preset condition is met; or, in the case that the preset condition is not satisfied, determining that the 3D map is a 3D dense map.
19. A navigation apparatus of a mobile robot, characterized by comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 7 to 12.
20. A computer readable storage medium having stored therein computer instructions which, when executed by a processor, implement the method of any one of claims 7 to 12.
CN202111053614.2A 2021-09-08 2021-09-08 Mobile robot and navigation method, device, equipment and storage medium thereof Pending CN113686332A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111053614.2A CN113686332A (en) 2021-09-08 2021-09-08 Mobile robot and navigation method, device, equipment and storage medium thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111053614.2A CN113686332A (en) 2021-09-08 2021-09-08 Mobile robot and navigation method, device, equipment and storage medium thereof

Publications (1)

Publication Number Publication Date
CN113686332A true CN113686332A (en) 2021-11-23

Family

ID=78586182

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111053614.2A Pending CN113686332A (en) 2021-09-08 2021-09-08 Mobile robot and navigation method, device, equipment and storage medium thereof

Country Status (1)

Country Link
CN (1) CN113686332A (en)


Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20070061079A (en) * 2005-12-08 2007-06-13 한국전자통신연구원 Localization system of mobile robot based on camera and landmarks and method there of
CN102519481A (en) * 2011-12-29 2012-06-27 中国科学院自动化研究所 Implementation method of binocular vision speedometer
US20160147230A1 (en) * 2014-11-26 2016-05-26 Irobot Corporation Systems and Methods for Performing Simultaneous Localization and Mapping using Machine Vision Systems
CN105549585A (en) * 2015-12-07 2016-05-04 江苏木盟智能科技有限公司 Robot navigation method and system
WO2019001237A1 (en) * 2017-06-30 2019-01-03 炬大科技有限公司 Mobile electronic device, and method in mobile electronic device
CN107843251A (en) * 2017-10-18 2018-03-27 广东宝乐机器人股份有限公司 The position and orientation estimation method of mobile robot
CN108717710A (en) * 2018-05-18 2018-10-30 京东方科技集团股份有限公司 Localization method, apparatus and system under indoor environment

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
XU DE, TAN MIN, LI YUAN: "Robot Vision Measurement and Control" (机器人视觉测量与控制), National Defense Industry Press, pages 323-327

Similar Documents

Publication Publication Date Title
CN110095752B (en) Positioning method, apparatus, device and medium
CN109506642B (en) Robot multi-camera visual inertia real-time positioning method and device
CN103793936A (en) Automated frame of reference calibration for augmented reality
CN110361027A (en) Robot path planning method based on single line laser radar Yu binocular camera data fusion
WO2022052660A1 (en) Warehousing robot localization and mapping methods, robot, and storage medium
CN111156998A (en) Mobile robot positioning method based on RGB-D camera and IMU information fusion
CN110211228A (en) For building the data processing method and device of figure
WO2023029776A1 (en) Control method, apparatus and device for transfer robot, and storage medium
CN110058591A (en) A kind of AGV system based on laser radar Yu depth camera hybrid navigation
CN112734765A (en) Mobile robot positioning method, system and medium based on example segmentation and multi-sensor fusion
CN108332750A (en) Robot localization method and terminal device
WO2021004483A1 (en) Navigation method, mobile carrier, and navigation system
CN112184914A (en) Method and device for determining three-dimensional position of target object and road side equipment
CA3142750A1 (en) System for refining a six degrees of freedom pose estimate of a target object
CN113686332A (en) Mobile robot and navigation method, device, equipment and storage medium thereof
Xuehe et al. GPU based real-time SLAM of six-legged robot
EP3392748B1 (en) System and method for position tracking in a virtual reality system
US11620846B2 (en) Data processing method for multi-sensor fusion, positioning apparatus and virtual reality device
CN113310484B (en) Mobile robot positioning method and system
CN110675445B (en) Visual positioning method, device and storage medium
CN103712603B (en) Based on 3D vision pose measuring apparatus and the measuring method thereof of plane grating
Wang et al. Agv navigation based on apriltags2 auxiliary positioning
CN115131656B (en) Space identification method and device, electronic equipment and computer readable storage medium
CN219533396U (en) Laser radar and binocular camera combined calibration platform
WO2024021340A1 (en) Robot following method and apparatus, and robot and computer-readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination