CN113741473A - Photocatalyst mobile robot and map construction method - Google Patents

Photocatalyst mobile robot and map construction method

Info

Publication number
CN113741473A
CN113741473A (application CN202111066245.0A)
Authority
CN
China
Prior art keywords
robot
local
grid
shell
depth
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111066245.0A
Other languages
Chinese (zh)
Inventor
温宇航
张小建
温美英
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Benyun International Development Co ltd
Original Assignee
Shenzhen Benyun International Development Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Benyun International Development Co ltd filed Critical Shenzhen Benyun International Development Co ltd
Priority to CN202111066245.0A priority Critical patent/CN113741473A/en
Publication of CN113741473A publication Critical patent/CN113741473A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0234Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using optical markers or beacons
    • G05D1/0236Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using optical markers or beacons in combination with a laser
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0223Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving speed control of the vehicle
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0238Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
    • G05D1/024Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors in combination with a laser
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0251Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means extracting 3D information from a plurality of images taken from different locations, e.g. stereo vision
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0255Control of position or course in two dimensions specially adapted to land vehicles using acoustic signals, e.g. ultra-sonic signals
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0257Control of position or course in two dimensions specially adapted to land vehicles using a radar
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0276Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
    • G05D1/0285Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle using signals transmitted via a public communication network, e.g. GSM network

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Electromagnetism (AREA)
  • Optics & Photonics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Acoustics & Sound (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention provides a photocatalyst mobile robot, which comprises a housing, a wireless communication module, a moving assembly, a purification assembly, a sensor module, a filter and an ROS module. The sensor module comprises a depth-of-field camera and a laser radar, both mounted on the housing; the depth-of-field camera is used for acquiring a local three-dimensional cloud image, and the laser radar is used for acquiring the distance between local feature points and the robot. The map construction method fuses a first grid map generated from the laser radar with a second grid map generated from the depth camera to produce an indoor map. Information is collected through the depth-of-field camera and the laser radar, and the ROS module uses the collected information to generate the indoor map, so that the robot can move to a target position according to an instruction from an external terminal device and drive the purification assembly to purify the air automatically. The mobile robot improves the efficiency of air purification and enlarges the range of air purification.

Description

Photocatalyst mobile robot and map construction method
Technical Field
The invention relates to the technical field of robots, in particular to a photocatalyst mobile robot and a map construction method.
Background
In modern daily life and production, the demand for air purification is growing steadily, and photocatalyst air purifiers have emerged to meet this market demand. At the same time, in a society where informatization is developing rapidly, intelligent, autonomous and convenient Internet-of-Things equipment has become a main direction of development. Existing photocatalyst air purification products, however, cannot move autonomously to purify the air of a target area automatically, so the area they can purify is limited.
Disclosure of Invention
The invention aims to provide a photocatalyst mobile robot and a map construction method, which are used for solving the problem that the purification range of an air purification product in the prior art is limited.
One aspect of the present embodiment provides a photocatalyst mobile robot, including:
a housing: comprises a shell and a chassis arranged at one end of the shell;
the wireless communication module is arranged in the shell and is used for being in wireless connection with an external terminal;
a moving assembly mounted on the chassis and used for moving the robot;
the purification assembly comprises an air inlet and an air outlet which are arranged on the surface of the shell; and an air purifier mounted within the housing;
the sensor module comprises a depth-of-field camera and a laser radar, wherein the depth-of-field camera and the laser radar are arranged on the shell, the depth-of-field camera is used for acquiring a local three-dimensional cloud picture, and the laser radar is used for acquiring the distance between a local characteristic point and the robot;
the filter is arranged in the shell and used for acquiring the real-time position and the attitude of the local characteristic point relative to the robot according to the motion state of the robot; the motion state of the robot comprises the acceleration, the angular velocity and the displacement of the robot in the current state;
and the ROS module is arranged in the shell and used for constructing an indoor map according to the local three-dimensional cloud picture and the distance between the local characteristic point and the robot.
Preferably, the sensor module further comprises an ultrasonic module and a cliff sensor, wherein the ultrasonic module is used for detecting an obstacle in front of the robot, and the cliff sensor is used for detecting the height difference of the working platform.
Preferably, the ultrasonic module comprises 3 ultrasonic detectors, and the ultrasonic detectors are arranged on the shell.
Preferably, the moving assembly comprises two driving wheels, a hub motor arranged in the driving wheels, and four auxiliary universal wheels arranged around the chassis.
In a second aspect, the present embodiment provides a map construction method based on the above photocatalyst mobile robot, the map construction method comprising:
acquiring displacement, acceleration and speed of the robot;
obtaining the distance and the angle between the local characteristic point and the robot through a laser radar;
acquiring the real-time position and posture of the local characteristic point relative to the robot according to the displacement, the acceleration and the speed of the robot and the distance and the angle between the local characteristic point and the robot so as to generate a first grid map; the local feature points represent a plurality of landmarks within a detection range of the lidar;
acquiring a local image through a depth-of-field camera to generate a local three-dimensional cloud image;
projecting the local three-dimensional cloud image to form a second grid map;
and fusing the first grid map and the second grid map to generate an indoor map.
The first grid map and the second grid map are each composed of a plurality of grids, and the state of each grid is either occupied or empty;
if a grid is empty in both the first grid map and the second grid map, the state of that grid is judged to be empty; in all other cases it is judged to be occupied.
According to the photocatalyst mobile robot, information is collected through the depth-of-field camera and the laser radar, and the ROS module generates an indoor map from the collected information, so that the robot moves to a target position according to an instruction from an external terminal device and drives the purification assembly to purify the air automatically; the mobile robot improves the efficiency of air purification and enlarges the range of air purification.
Drawings
Fig. 1 is a schematic structural diagram of a photocatalyst mobile robot according to an embodiment of the present invention;
fig. 2 is a system framework diagram of a photocatalyst mobile robot according to an embodiment of the present invention;
fig. 3 is a flowchart of a map construction method according to a second embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments.
All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that all directional indications in the embodiments of the present invention (such as up, down, left, right, front, back, inner, outer, center, etc.) are only used to explain the relative positional relationship, motion, etc. between components in a specific posture (as shown in the drawings); if the specific posture changes, the directional indication changes accordingly.
In the present invention, unless otherwise expressly stated or limited, the terms "connected", "secured" and the like are to be construed broadly; for example, "secured" may mean a fixed connection, a removable connection, or an integral part; the connection may be mechanical or electrical; it may be direct, or indirect through an intervening medium, or an internal or other suitable relationship between two elements. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to the specific situation.
In addition, the technical solutions of the embodiments of the present invention may be combined with each other, but only on the basis that a person skilled in the art can realize the combination; when technical solutions contradict each other or a combination cannot be realized, such a combination should be considered not to exist and does not fall within the protection scope of the present invention.
The following embodiments are provided:
Example one:
as shown in fig. 1 to 2, a photocatalyst mobile robot includes:
a housing 1: comprises a shell 11 and a chassis 12 arranged at one end of the shell 11;
The wireless communication module 2 is arranged in the housing 1 and is used for wireless connection with an external terminal, such as a web service platform or a mobile phone APP; instructions issued from the mobile phone or the web service platform are transmitted to the wireless communication module. The wireless communication module may be a Global System for Mobile communications (GSM), Wideband Code Division Multiple Access (WCDMA), 4G, 5G, Bluetooth, Wi-Fi, or other wireless network module.
The moving assembly 3 is mounted on the chassis and is used for moving the robot; it comprises two driving wheels, a hub motor arranged in the driving wheels, and four auxiliary universal wheels arranged around the chassis. The hub motor is a motor that integrates the power, transmission and brake systems of the vehicle into the wheel itself.
The purification assembly 4 comprises an air inlet 41 and an air outlet 42 which are arranged on the surface of the shell; and an air purifier (not shown) mounted within the housing; after the air purifier is opened, air is supplied through the air inlet 41 and is discharged through the air outlet 42, so that air circulation is realized.
The sensor module 5 comprises a depth-of-field camera 51 and a laser radar 52, both arranged on the housing. The depth-of-field camera 51 is used for acquiring a local three-dimensional cloud image. Specifically, two camera groups may be provided: a first camera for capturing a low-resolution depth image, and a second camera providing stereoscopic vision. The images from the two groups are fused to form the local three-dimensional cloud image, from which a surface normal map is calculated so as to estimate the current posture of the robot.
The laser radar 52 is used for acquiring the distance between the local feature points and the robot. Specifically, the laser radar 52 outputs the distance between the robot and each feature point, where each feature point can be understood as a landmark; the local feature points may comprise a plurality of such points, and the greater their number, the more accurate the estimate of the robot's relative position coordinates.
A filter 6, preferably an extended Kalman filter, is disposed within the housing 1; the filter is used for acquiring the real-time position and posture of the local feature points relative to the robot according to the motion state of the robot, where the motion state comprises the acceleration, angular velocity and displacement of the robot in its current state.
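The patent names an extended Kalman filter but gives no equations. As an illustrative sketch only, the following shows how such a filter could fuse the robot's motion state with a range-bearing observation of one landmark; the unicycle motion model, the measurement model, and all variable names are assumptions of this sketch, not taken from the patent.

```python
import numpy as np

def ekf_predict(x, P, v, w, dt, Q):
    """Predict robot pose x = [px, py, theta] from linear velocity v and
    angular velocity w over time step dt (assumed unicycle motion model)."""
    px, py, th = x
    x_pred = np.array([px + v * dt * np.cos(th),
                       py + v * dt * np.sin(th),
                       th + w * dt])
    # Jacobian of the motion model with respect to the state
    F = np.array([[1.0, 0.0, -v * dt * np.sin(th)],
                  [0.0, 1.0,  v * dt * np.cos(th)],
                  [0.0, 0.0, 1.0]])
    return x_pred, F @ P @ F.T + Q

def ekf_update(x, P, z, landmark, R):
    """Correct the pose with a range-bearing measurement z = [r, phi]
    of a landmark (feature point) at a known 2D position."""
    dx, dy = landmark[0] - x[0], landmark[1] - x[1]
    q = dx * dx + dy * dy
    r = np.sqrt(q)
    z_hat = np.array([r, np.arctan2(dy, dx) - x[2]])  # predicted measurement
    H = np.array([[-dx / r, -dy / r,  0.0],
                  [ dy / q, -dx / q, -1.0]])           # measurement Jacobian
    y = z - z_hat
    y[1] = (y[1] + np.pi) % (2 * np.pi) - np.pi        # wrap bearing residual
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)                     # Kalman gain
    return x + K @ y, (np.eye(3) - K @ H) @ P
```

A predict step followed by an update with a measurement that exactly matches the prediction leaves the pose unchanged while shrinking the covariance, which is a quick sanity check on the implementation.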
The ROS module 7 is arranged in the housing 1 and is used for constructing an indoor map according to the local three-dimensional cloud image and the distances between the local feature points and the robot, and for moving to a target position on the indoor map according to an instruction from the external terminal so as to drive the purification assembly to purify the air. Specifically, after the indoor map is built, it can be reported to a mobile phone through the wireless communication module, and the positions of the various areas of the indoor environment, such as the kitchen and the living room, are marked in the mobile phone APP. For example, after the position of the kitchen area is determined, a certain position in the kitchen is taken as the target position and a target track for purifying the air of the kitchen area is planned in the APP; the target position instruction is then issued to the ROS module through the wireless communication module. According to the instruction, the ROS module drives the hub motor to rotate the driving wheels, moving the robot to the kitchen area, where air purification is carried out along the target track; the air purifier can also be opened and closed remotely through the APP. This realizes fully automatic purification, enlarges the purification range and improves the purification efficiency.
Preferably, the sensor module 5 further comprises an ultrasonic module 53 for detecting obstacles in front of the robot, and a cliff sensor 54 for detecting the height difference of the working platform; when a step or a drop is detected, the cliff sensor sends a signal to the ROS module, which controls the robot to stop moving so as to prevent it from falling.
Preferably, the ultrasonic module 53 includes 3 ultrasonic detectors, and the ultrasonic detectors are disposed on the housing.
Example two:
the embodiment provides a map building method based on the photocatalyst mobile robot, as shown in fig. 3, the map building method includes,
step S100: acquiring displacement, acceleration and speed of the robot; the displacement of robot is acquireed to the number of turns of rotation of accessible in-wheel motor, can calculate the acceleration, the speed of robot through the time of passing through.
Step S101: obtaining the distance and the angle between the local feature points and the robot through the laser radar 52; the measurement range of the laser radar 52 is 0-360 degrees with a horizontal resolution of 0.2 degrees, and the laser radar 52 can perform full-circle scanning at a frequency of 10-100 Hz.
Step S102: acquiring the real-time position and posture of the local characteristic point relative to the robot according to the displacement, the acceleration and the speed of the robot and the distance and the angle between the local characteristic point and the robot so as to generate a first grid map; the local feature points represent a plurality of landmarks within the detection range of lidar 52;
step S103: acquiring a local image through the depth-of-field camera 51 to generate a local three-dimensional cloud picture, wherein the local image represents a three-dimensional image in a local range which can be detected by the depth-of-field camera 51, and calculating a surface normal map through scene surface points to estimate the real-time posture of the robot according to the real-time posture of the robot; performing nonlinear optimization according to the real-time attitude, and then performing loop detection; finally, establishing a local three-dimensional cloud picture;
step S104: optimizing the local three-dimensional cloud picture, eliminating false obstacle information, and projecting the optimized three-dimensional cloud picture to form a second grid picture; the first grid map and the second grid map are both 2D maps;
step S105: fusing the first grid map and the second grid map to generate an indoor map;
the first grid graph and the second grid graph are both composed of a plurality of grids, and the state of each grid is occupied or empty; for example, in a certain grid region in the first grid map, the ratio of the occupied area to the total grid area is higher than a preset threshold (e.g., 50%), which means that the state of the grid is occupied, and the ratio of the occupied area to the total grid area is lower than the preset threshold, which means that the state of the grid is empty.
If the states of the grids of the first grid map and the second grid map are both empty, determining that the state of the grid is empty;
if the states of the grids of the first grid map and the second grid map are occupied, determining that the states of the grids are occupied;
if the state of the grid of the first grid map is occupied and the state of the grid of the second grid map is empty, determining that the grid is occupied;
and if the state of the grid of the first grid map is empty and the state of the grid of the second grid map is occupied, determining that the grid is occupied.
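Taken together, the four rules above reduce to a per-grid logical OR of occupancy: a cell is empty only if it is empty in both maps. Assuming the grids are represented as NumPy arrays of 0/1 cells (a representation choice of this sketch, not of the patent), the fusion can be written as:

```python
import numpy as np

def fuse_grids(grid_a, grid_b):
    """Fuse the lidar grid map and the depth-camera grid map.
    A cell is empty only if it is empty in BOTH maps; every other
    combination is judged occupied (logical OR of occupancy)."""
    return np.logical_or(grid_a, grid_b).astype(np.uint8)
```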
When data association is performed in practical applications, the following problems may be encountered:
1. a feature point may have been seen in the previous scan but not in the current one;
2. a feature point may be seen in the current scan but not in later ones.
These problems seriously affect navigation and mapping. Because the depth-of-field camera 51 and the laser radar 52 each have certain detection blind areas, fusing the first grid map and the second grid map reduces the blind areas and improves the completeness of the detected feature points, largely avoiding both problems.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (5)

1. A photocatalyst mobile robot, characterized by comprising:
a housing: comprises a shell and a chassis arranged at one end of the shell;
the wireless communication module is arranged in the shell and is used for being in wireless connection with an external terminal;
the moving assembly is arranged on the chassis and used for moving the robot;
the purification assembly comprises an air inlet and an air outlet which are arranged on the surface of the shell; and an air purifier mounted within the housing;
the sensor module comprises a depth-of-field camera and a laser radar, wherein the depth-of-field camera and the laser radar are arranged on the shell, the depth-of-field camera is used for acquiring a local three-dimensional cloud picture, and the laser radar is used for acquiring the distance between a local characteristic point and the robot;
the filter is arranged in the shell and used for acquiring the real-time position and the attitude of the local characteristic point relative to the robot according to the motion state of the robot; the motion state of the robot comprises the acceleration, the angular velocity and the displacement of the robot in the current state;
the ROS module is arranged in the shell and used for constructing an indoor map according to the local three-dimensional cloud picture and the distance between the local characteristic point and the robot; so as to move to the target position of the indoor map according to the instruction of the external terminal and drive the purification component to purify the air.
2. The photocatalyst mobile robot as claimed in claim 1, wherein the sensor module further comprises an ultrasonic module for detecting an obstacle in front of the robot, and a cliff sensor for detecting a height difference of the work platform.
3. The photocatalyst mobile robot as claimed in claim 1, wherein the ultrasonic module includes 3 ultrasonic detectors, and the ultrasonic detectors are disposed on the housing.
4. The photocatalyst mobile robot as claimed in claim 1, wherein the moving member includes two driving wheels, a hub motor disposed in the driving wheels, and four auxiliary universal wheels disposed around the base plate.
5. A map construction method based on the photocatalyst mobile robot as claimed in any one of claims 1 to 4, characterized in that the method comprises,
acquiring displacement, acceleration and speed of the robot;
obtaining the distance and the angle between the local characteristic point and the robot through a laser radar;
acquiring the real-time position and posture of the local characteristic point relative to the robot according to the displacement, the acceleration and the speed of the robot and the distance and the angle between the local characteristic point and the robot so as to generate a first grid map; the local feature points represent a plurality of landmarks within a detection range of the lidar;
acquiring a local image through a depth-of-field camera to generate a local three-dimensional cloud image;
projecting the local three-dimensional cloud image to form a second grid image;
fusing the first grid map and the second grid map to generate an indoor map;
the first grid graph and the second grid graph are both composed of a plurality of grids, and the state of each grid is occupied or empty;
if the states of the grids of the first grid map and the second grid map are both empty, determining that the state of the grid is empty; and judging the other situations as occupation.
CN202111066245.0A 2021-09-13 2021-09-13 Photocatalyst mobile robot and map construction method Pending CN113741473A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111066245.0A CN113741473A (en) 2021-09-13 2021-09-13 Photocatalyst mobile robot and map construction method


Publications (1)

Publication Number Publication Date
CN113741473A 2021-12-03

Family

ID=78738217

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111066245.0A Pending CN113741473A (en) 2021-09-13 2021-09-13 Photocatalyst mobile robot and map construction method

Country Status (1)

Country Link
CN (1) CN113741473A (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130226344A1 (en) * 2012-02-29 2013-08-29 Irobot Corporation Mobile Robot
US20140207282A1 (en) * 2013-01-18 2014-07-24 Irobot Corporation Mobile Robot Providing Environmental Mapping for Household Environmental Control
CN106909156A (en) * 2017-03-30 2017-06-30 北京小米移动软件有限公司 Air purification method and device
CN108073167A (en) * 2016-11-10 2018-05-25 深圳灵喵机器人技术有限公司 A kind of positioning and air navigation aid based on depth camera and laser radar
KR20180061949A (en) * 2016-11-30 2018-06-08 주식회사 유진로봇 Obstacle Sensing Apparatus and Method for Multi-Channels Based Mobile Robot, Mobile Robot including the same
KR20190027974A (en) * 2017-09-07 2019-03-18 코웨이 주식회사 Air purifying robot
KR102008367B1 (en) * 2018-01-18 2019-08-07 소프트온넷(주) System and method for autonomous mobile robot using a.i. planning and smart indoor work management system using the robot
CN110645974A (en) * 2019-09-26 2020-01-03 西南科技大学 Mobile robot indoor map construction method fusing multiple sensors
US20200198139A1 (en) * 2017-09-27 2020-06-25 Guangdong Bona Robot Co., Ltd. Map creation method of mobile robot and mobile robot
US20200297180A1 (en) * 2019-03-19 2020-09-24 Lg Electronics Inc. Air purifying system and control method for the air purifying system


Similar Documents

Publication Publication Date Title
US10901419B2 (en) Multi-sensor environmental mapping
US11720100B2 (en) Systems and methods for utilizing semantic information for navigation of a robotic device
US8818043B2 (en) Traffic signal mapping and detection
US7054716B2 (en) Sentry robot system
EP1240562B1 (en) Autonomous multi-platform robot system
EP1365300A2 (en) Autonomous multi-platform robotic system
Löper et al. Automated valet parking as part of an integrated travel assistance
US20200257311A1 (en) Cart having leading and following function
CN110716549A (en) Autonomous navigation robot system for map-free area patrol and navigation method thereof
JP2007233771A (en) Autonomous mobile vehicle guidance system and method
JP2020079997A (en) Information processing apparatus, information processing method, and program
CN110750097A (en) Indoor robot navigation system and map building, positioning and moving method
CN113741473A (en) Photocatalyst mobile robot and map construction method
CN116513334A (en) Magnetic adsorption robot device for multi-sensor fusion map building and navigation
CN110696003A (en) Water side rescue robot based on SLAM technology and deep learning
CN112318507A (en) Robot intelligent control system based on SLAM technology
CN112298177A (en) Unmanned tractor control system and control method thereof
Albrecht et al. Generic convoying functionality for autonomous vehicles in unstructured outdoor environments
Balasooriya et al. Development of the smart localization techniques for low-power autonomous rover for predetermined environments
CN211577736U (en) Indoor robot navigation
WO2024102083A1 (en) A delivery system and mapping method for a delivery robot
CN118103674A (en) Selecting boundary targets for autonomous mapping within a space
KR20230149709A (en) Bidirectional path optimization in a grid
KR20230043028A (en) Calibration courses and targets
CN112298178A (en) Unmanned tractor control system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination