WO2019144541A1 - Robot de nettoyage (Cleaning robot) - Google Patents

Robot de nettoyage (Cleaning robot)

Info

Publication number
WO2019144541A1
WO2019144541A1 (PCT/CN2018/088143; CN2018088143W)
Authority
WO
WIPO (PCT)
Prior art keywords
cleaning robot
cleaning
obstacle
monocular camera
virtual area
Prior art date
Application number
PCT/CN2018/088143
Other languages
English (en)
Chinese (zh)
Inventor
张一茗
陈震
Original Assignee
速感科技(北京)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 速感科技(北京)有限公司
Publication of WO2019144541A1
Priority to US16/921,095 (US11654574B2)
Priority to US18/295,474 (US12042926B2)
Priority to US18/738,524 (US20240326259A1)

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 - Control of position or course in two dimensions
    • G05D1/021 - Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 - Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246 - Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00 - Manipulators not otherwise provided for
    • B25J11/008 - Manipulators for service tasks
    • B25J11/0085 - Cleaning
    • A - HUMAN NECESSITIES
    • A47 - FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L - DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00 - Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/40 - Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4011 - Regulation of the cleaning machine by electric means; Control systems and remote control systems therefor
    • A - HUMAN NECESSITIES
    • A47 - FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L - DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00 - Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/40 - Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4061 - Steering means; Means for avoiding obstacles; Details related to the place where the driver is accommodated
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00 - Controls for manipulators
    • B25J13/08 - Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
    • B25J13/088 - Controls for manipulators by means of sensing devices, e.g. viewing or touching devices with position, velocity or acceleration sensors
    • B25J13/089 - Determining the position of the robot with reference to its environment
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00 - Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02 - Sensing devices
    • B25J19/021 - Optical sensing devices
    • B25J19/023 - Optical sensing devices including video camera means
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00 - Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02 - Sensing devices
    • B25J19/04 - Viewing devices
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 - Programme-controlled manipulators
    • B25J9/16 - Programme controls
    • B25J9/1602 - Programme controls characterised by the control system, structure, architecture
    • B25J9/161 - Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 - Programme-controlled manipulators
    • B25J9/16 - Programme controls
    • B25J9/1674 - Programme controls characterised by safety, monitoring, diagnostic
    • B25J9/1676 - Avoiding collision or forbidden zones
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 - Control of position or course in two dimensions
    • G05D1/021 - Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212 - Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0219 - Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory ensuring the processing of the whole working surface
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 - Control of position or course in two dimensions
    • G05D1/021 - Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0268 - Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
    • G05D1/0274 - Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/20 - Control system inputs
    • G05D1/24 - Arrangements for determining position or orientation
    • G05D1/246 - Arrangements for determining position or orientation using environment maps, e.g. simultaneous localisation and mapping [SLAM]
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/20 - Control system inputs
    • G05D1/24 - Arrangements for determining position or orientation
    • G05D1/247 - Arrangements for determining position or orientation using signals provided by artificial sources external to the vehicle, e.g. navigation beacons
    • G05D1/249 - Arrangements for determining position or orientation using signals provided by artificial sources external to the vehicle, e.g. navigation beacons from positioning sensors located off-board the vehicle, e.g. from cameras
    • A - HUMAN NECESSITIES
    • A47 - FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L - DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L2201/00 - Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • A47L2201/04 - Automatic control of the travelling movement; Automatic obstacle detection
    • A - HUMAN NECESSITIES
    • A47 - FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L - DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L2201/00 - Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • A47L2201/06 - Control of the cleaning action for autonomous devices; Automatic detection of the surface condition before, during or after cleaning

Definitions

  • the invention relates to the field of artificial intelligence, and in particular to a cleaning robot.
  • cleaning robots have developed rapidly in recent years. The key requirement for such robots is comprehensive and efficient cleaning of unknown environments. At present, most cleaning robots have the following problems. First, the random cleaning mode of the usual cleaning robot often misses large uncleaned areas while repeatedly cleaning areas that are already clean, resulting in low cleaning efficiency. Second, some newer cleaning robots rely on expensive ranging equipment, such as a lidar range finder with laser SLAM (Simultaneous Localization And Mapping), to map, locate, and navigate in the actual work area; on the one hand, the relatively high cost of lidar increases the cost of the sweeper, and on the other hand its large size increases the overall size and weight of the device.
  • the invention provides a cleaning robot in which a monocular camera of simple structure is installed at the front of the robot at an acute elevation angle. This improves the positioning and mapping accuracy of the cleaning robot and thereby the path planning/navigation effect, avoiding the low efficiency of the random cleaning mode; it also improves object recognition and obstacle avoidance, achieving efficient cleaning at low cost. In addition, it can realize functions such as object recognition, face recognition, real-time monitoring, and security monitoring, expanding the application field of cleaning robots.
  • a cleaning robot of the present invention includes: a motion unit, a motion sensor, a monocular camera, a storage unit, a data processing unit, a cleaning unit, a power module, and a housing. The motion unit is for moving; the motion sensor is for obtaining motion parameters during movement of the cleaning robot; the monocular camera is for capturing images. The monocular camera is disposed in a slope surface at the front end of the outer casing that inclines obliquely upward in the forward motion direction, and the direction of the monocular camera is at an acute angle to the direction of forward movement of the cleaning robot. The storage unit is coupled to the data processing unit for storing data. The data processing unit is configured to acquire the images captured by the monocular camera and the motion parameters of the motion sensor, locate the cleaning robot according to the positioning and mapping algorithm, establish a map, plan a route, and control the motion unit to run according to the planned route. The cleaning unit is used to clean the ground.
  • a protective edge that protrudes above the slope surface is provided at the periphery of the slope surface.
  • the monocular camera is rotatable in a plane formed by the direction of the monocular camera and the direction of forward motion.
  • a communication module is further included, and the communication module can be wirelessly connected to the server, and the data processing unit sends data or commands to the server through the communication module, and/or acquires information of the server through the communication module.
  • the communication module may be wirelessly connected to the user terminal device; the data processing unit receives commands of the user terminal device through the communication module, and/or sends information to the user terminal device through the communication module.
  • a microphone and/or a speaker may also be included; the microphone acquires voice control commands and sends them to the data processing unit; the speaker outputs information.
  • the slope surface is covered with a light-transmissive material.
  • the light-transmissive material may be one or more of GPPS, transparent ABS, AS, PMMA, PC, and PS.
  • the direction of the monocular camera coincides with the normal direction of the slope surface.
  • the angle β between the direction of the monocular camera and the direction of forward movement of the cleaning robot ranges from 30° to 90°;
  • the monocular camera has a vertical field of view ranging from 50° to 100°.
  • the positioning and mapping algorithm comprises:
  • according to the motion parameters acquired by the motion sensor when the at least two images were captured, the displacement and posture of the cleaning robot are calculated, and the position of the cleaning robot when each image was captured is determined, realizing the positioning of the cleaning robot.
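As an illustrative sketch (not taken from the patent), one way the motion parameters complement the monocular images is by supplying metric scale: if the displacement between two captures (the baseline) is known from the odometer, the depth of a feature matched in both images follows from its pixel disparity, exactly as in stereo vision. The function name and figures below are assumptions.

```python
def feature_depth(baseline_m, focal_px, disparity_px):
    """Depth of a feature matched across two captures, where the camera
    displacement (baseline) between captures comes from the motion sensor.
    Same relation as stereo vision: depth = focal * baseline / disparity."""
    if disparity_px <= 0:
        raise ValueError("feature must shift between the two images")
    return focal_px * baseline_m / disparity_px
```

For example, a 0.1 m baseline, a 500 px focal length, and a 10 px disparity place the feature 5 m away; features on a distant ceiling shift very little between captures, which is one reason ceiling points can serve as a near-fixed reference.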
  • a collision sensor is further disposed at the front end of the cleaning robot, and transmits a collision signal to the data processing unit when it collides with an obstacle;
  • the positioning and mapping algorithm further includes:
  • the position of the obstacle is determined according to the collision signal of the collision sensor and the position of the cleaning robot at the moment of collision;
  • a map of the work area is established, realizing the mapping of the work area.
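A minimal sketch of this collision-based mapping step, assuming a grid map stored as a dict of cell indices and a fixed bumper offset ahead of the robot centre (both hypothetical choices; the patent does not fix a map representation):

```python
import math

GRID_RES = 0.05  # metres per cell (assumed resolution)

def mark_collision(grid, robot_pose, bumper_offset=0.15):
    """On a collision signal, mark the map cell just ahead of the robot
    (at the bumper) as occupied. `grid` is a dict {(i, j): value} and
    `robot_pose` is (x, y, heading) at the moment of collision."""
    x, y, theta = robot_pose
    ox = x + bumper_offset * math.cos(theta)
    oy = y + bumper_offset * math.sin(theta)
    cell = (round(ox / GRID_RES), round(oy / GRID_RES))
    grid[cell] = 1  # occupied
    return cell
```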
  • a distance sensor may be further included; when the distance sensor detects an obstacle, the position of the obstacle relative to the cleaning robot is sent to the data processing unit;
  • the positioning and mapping algorithm may further include:
  • a map of the work area is established, realizing the mapping of the work area.
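The distance-sensor variant can be sketched the same way: a hypothetical range/bearing reading, relative to the robot, is transformed into world coordinates before being written into the map.

```python
import math

def obstacle_world_position(robot_pose, rel_range, rel_bearing):
    """Convert a range/bearing reading (relative to the robot's heading)
    into world coordinates for insertion into the map."""
    x, y, theta = robot_pose
    return (x + rel_range * math.cos(theta + rel_bearing),
            y + rel_range * math.sin(theta + rel_bearing))
```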
  • the positioning and mapping algorithm may further include:
  • the cleaning robot is disposed inside the virtual area or on the virtual area boundary; the virtual area is set by a preset instruction that controls the moving distance and rotation angle of the cleaning robot, wherein the enclosed space surrounded by the trajectory the instruction causes the cleaning robot to run in the actual working area constitutes the virtual area;
  • the process of controlling the cleaning robot to clean the virtual area inside the actual working area comprises:
  • the cleaning robot is controlled to turn with the preset steering and run along the edge of the obstacle, and its trajectory along the edge is calculated in real time.
  • the process of cleaning the interior of the enclosed space may include:
  • S510: controlling the cleaning robot to run in a first cleaning direction inside the virtual area until it meets the boundary of the virtual area or the boundary of a cleaned area;
  • S520: controlling the cleaning robot to turn, with a first steering, into a first offset direction along the encountered boundary, run a first offset length, and then turn with the first steering into a second cleaning direction parallel and opposite to the first cleaning direction;
  • S530: controlling the cleaning robot to run in the second cleaning direction inside the virtual area until it meets the boundary of the virtual area or the boundary of a cleaned area;
  • S540: controlling the cleaning robot to turn, with a second steering, into a second offset direction along the encountered boundary, run a second offset length, and then turn with the second steering back into the first cleaning direction;
  • steps S510 to S540 are repeated until the trace of the cleaning robot traverses the operable area in the virtual area.
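Steps S510 to S540 can be sketched as a boustrophedon waypoint generator for an idealized rectangular virtual area (an assumption for illustration; on the real robot the boundaries are met by sensing, not computed in advance):

```python
def boustrophedon_waypoints(width, height, offset):
    """Turning points of the S510-S540 zigzag inside a rectangular virtual
    area with its origin at one corner, shifting by `offset` (nominally
    the cleaning-unit width) after each full pass."""
    points, y, going_right = [(0.0, 0.0)], 0.0, True
    while y <= height:
        x = width if going_right else 0.0
        points.append((x, y))          # run a full cleaning pass (S510/S530)
        y += offset
        if y <= height:
            points.append((x, y))      # offset along the boundary (S520/S540)
        going_right = not going_right
    return points
```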
  • the first offset length and/or the second offset length are substantially the width of the cleaning unit.
  • the cleaning robot is controlled to run along the edge of the obstacle, and the projection, onto the line perpendicular to the current cleaning direction, of the trajectory run along the obstacle edge is calculated in real time; when the projection length equals the third offset length, the cleaning robot is controlled to turn into the cleaning direction opposite to the current cleaning direction and continue running, and when the projection length equals 0, the cleaning robot is controlled to turn into the current cleaning direction and keep running.
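A sketch of this decision rule, assuming the current cleaning direction is a unit vector and the travelled edge path is a list of points (names and tolerances below are hypothetical):

```python
def bypass_decision(edge_path, cleaning_dir, third_offset):
    """Project the edge-following path onto the perpendicular of the
    current cleaning direction; leave the edge in the opposite cleaning
    direction when the projection reaches `third_offset`, and resume the
    current cleaning direction when it returns to 0."""
    px, py = cleaning_dir[1], -cleaning_dir[0]   # perpendicular unit vector
    sx, sy = edge_path[0]
    x, y = edge_path[-1]
    proj = abs((x - sx) * px + (y - sy) * py)
    if abs(proj - third_offset) < 1e-9:
        return "turn_opposite"
    if proj < 1e-9:
        return "resume_current"
    return "follow_edge"
```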
  • the positioning and mapping algorithm may further include:
  • the extended virtual areas are cleaned sequentially in order.
  • cleaning the other extended virtual areas in sequence comprises: after cleaning the current virtual area, determining the next extended virtual area to be cleaned according to the distance between the current position of the cleaning robot and the other extended virtual areas.
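A simple reading of this selection rule is a greedy nearest-first choice; the sketch below assumes each extended virtual area is summarized by a single reference point (a hypothetical simplification):

```python
def next_area(current_pos, areas):
    """Pick the next extended virtual area to clean: the one whose
    reference point is closest to the robot's current position."""
    return min(areas, key=lambda a: (a[0] - current_pos[0]) ** 2
                                    + (a[1] - current_pos[1]) ** 2)
```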
  • the instruction may include: 1) starting the cleaning robot with the current position as the starting position, controlling the cleaning robot to run a first preset length in the current direction; 2) turning with the preset steering; 3) running a second preset length in the current direction; 4) turning with the preset steering; 5) running the first preset length in the current direction; 6) turning with the preset steering; 7) running the second preset length in the current direction, back to the starting position.
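The seven-step instruction traces a closed loop; the sketch below assumes a preset steering of 90 degrees, in which case the trajectory is a rectangle that returns to the starting position:

```python
import math

def rectangle_track(start, heading, len1, len2, turn=math.pi / 2):
    """Trace the preset instruction: run len1, turn, run len2, turn,
    run len1, turn, run len2. With 90-degree steering the track closes
    on the starting position."""
    x, y = start
    th = heading
    track = [start]
    for leg in (len1, len2, len1, len2):
        x += leg * math.cos(th)
        y += leg * math.sin(th)
        track.append((round(x, 9), round(y, 9)))  # round away float noise
        th += turn
    return track
```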
  • the invention installs a monocular camera of simple structure, tilted obliquely upward at an acute angle, on the front end of the cleaning robot, thereby avoiding the difficulty of positioning and mapping caused by the sparse feature information on the ceiling, while providing, in a single image, both the objects in front of the moving direction and the features of the ceiling as fixed reference objects. This improves the accuracy of positioning and mapping of the cleaning robot, thereby improving the path planning/navigation effect, avoiding the inefficiency of the random cleaning mode, and improving object recognition and obstacle avoidance, enabling efficient cleaning at low cost; it can also realize functions such as object recognition, face recognition, real-time monitoring, and security monitoring, expanding the application field of cleaning robots.
  • FIG. 1a is a schematic view of a cleaning robot according to an embodiment of the present invention.
  • FIG. 1b is a schematic diagram showing the arrangement of a monocular camera according to an embodiment of the present invention.
  • Figure 1c is an external structural view of an embodiment of the present invention.
  • Figure 1d is another external structural view of an embodiment of the present invention.
  • FIG. 2 is a schematic view showing the combination of a cleaning robot structure according to various embodiments of the present invention.
  • FIG. 3a is a flow chart of a positioning and mapping algorithm of a cleaning robot according to an embodiment of the present invention;
  • FIG. 3c is a schematic diagram of a path in a positioning and mapping algorithm of a cleaning robot according to an embodiment of the present invention;
  • the motion unit is used for movement.
  • the specific form of the motion unit may be a wheel or a track, or of course another form capable of carrying an object, such as a magnetic-levitation or propeller-based motion unit.
  • the present invention does not limit the specific form of the motion unit.
  • the motion sensor is configured to acquire motion parameters during the movement of the cleaning robot.
  • the motion sensor includes an odometer, an IMU (inertial measurement unit, usually including a gyroscope and an accelerometer), or similar sensors for obtaining motion parameters, or various displacement sensors (such as resistive, inductive, capacitive, strain-gauge, or Hall-type displacement sensors).
  • the various motion sensors can measure or calculate one or more motion parameters such as distance, displacement, angle, and acceleration, according to their characteristics.
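As a hedged example of how such motion parameters are produced, differential-drive odometry derives (x, y, heading) increments from the left and right wheel distances; the wheel base and inputs below are assumed values, not taken from the patent:

```python
import math

def odometry_update(pose, d_left, d_right, wheel_base):
    """Differential-drive odometry: update (x, y, theta) from the
    distances travelled by the left and right wheels (e.g. from
    encoder ticks) for a robot with the given wheel base."""
    x, y, theta = pose
    d = (d_left + d_right) / 2.0              # distance of the centre point
    dtheta = (d_right - d_left) / wheel_base  # heading change
    x += d * math.cos(theta + dtheta / 2.0)   # midpoint heading approximation
    y += d * math.sin(theta + dtheta / 2.0)
    return (x, y, theta + dtheta)
```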
  • the slope "located at the front end of the casing” means that the slope is located at the front end of the outer casing of the cleaning robot in the direction of its forward movement, as shown in Figs. 1a to 1d.
  • the "forward movement direction” refers to the forward movement direction of the cleaning robot;
  • the "front end” refers to the front end of the cleaning robot in the forward movement direction.
  • the slope surface "inclined upward in the direction of the forward motion" means that the normal direction of the slope surface is at an acute angle (between 0° and 90°) to the forward motion direction of the cleaning robot, as shown in FIG. 1b.
  • the monocular camera being set "substantially in the same direction as the slope" means that the angle between the optical axis direction of the monocular camera (i.e., the direction of the monocular camera, as shown in FIG. 1b) and the normal direction of the slope (i.e., the angle α in FIG. 1b) is approximately 0, so that the direction of the monocular camera substantially coincides with the normal direction of the slope.
  • the direction of the monocular camera is at an acute angle to the direction of forward motion of the cleaning robot (i.e., the angle β in FIG. 1b, also referred to in the present invention as the elevation angle of the monocular camera; 0° < β < 90°).
  • the above description essentially states that a monocular camera facing obliquely upward and forward is provided in the outer casing at the front end of the cleaning robot in the forward motion direction, as shown in FIG. 1c, and that above the lens of the monocular camera there is a transparent slope whose normal is generally in the same direction as the optical axis. This arrangement allows light to pass through the transparent slope and be imaged on the monocular camera, so that the monocular camera captures an image containing both scene information of the ceiling above the cleaning robot's heading direction and scene information in front of it.
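The geometry can be illustrated with a small check: with elevation angle β and vertical field of view v, the image spans elevations from β − v/2 to β + v/2 above the horizontal. The thresholds below (near-horizon at 10°, ceiling above 45°) are illustrative assumptions, not values from the patent:

```python
def vertical_view_span(beta_deg, vfov_deg):
    """Elevation range covered by the image for camera elevation angle
    beta and vertical field of view vfov (degrees above horizontal)."""
    return (beta_deg - vfov_deg / 2.0, beta_deg + vfov_deg / 2.0)

def sees_front_and_ceiling(beta_deg, vfov_deg):
    """Heuristic check: the same frame should reach down near the horizon
    (front scene) and well above 45 degrees (ceiling)."""
    lo, hi = vertical_view_span(beta_deg, vfov_deg)
    return lo <= 10.0 and hi >= 45.0
```

With β = 45° and an 80° vertical field of view the frame spans 5° to 85°, covering both; a vertically mounted camera (β = 90°) with a modest field of view sees only the ceiling, matching the prior-art critique below.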
  • some sweepers of the prior art have cameras arranged vertically at the top of the sweeper. One problem with such sweepers is that a vertically oriented camera generally can only capture scene images of the ceiling in the house, and most ceilings in a house have simple features and few feature points (such as pure white ceilings, or ceilings with simple grid patterns). The sweeper therefore acquires fewer feature points, which may not be sufficient for accurate positioning and mapping.
  • the static features of the ceiling scene are used as a fixed reference (most ceiling scenes are fixed; although their features are few, they can still provide a reference) to judge whether the scene in front has moved, thereby avoiding interference of moving objects with the mapping, positioning, and navigation of the cleaning robot. Therefore, the cleaning robot of the present invention avoids the difficulty of positioning and mapping caused by the sparse feature information on the ceiling, while at the same time providing, in the same image, the static scene features of the ceiling as a fixed reference for judging whether the scene in front of the forward motion direction has moved.
  • the monocular camera is coupled to the data processing unit to transmit the captured image to the data processing unit.
  • the invention does not limit the type of the camera, which may be a standard camera, a wide-angle camera, a fisheye camera, or another type of camera; the acquired scene image may be an image under visible light or an image under infrared light, and the invention does not limit this either.
  • the storage unit is coupled to the data processing unit for storing data.
  • the stored data includes, but is not limited to, various types of sensor data (motion parameters such as distance, displacement, angle, and acceleration acquired by the motion sensor during motion of the motion unit; images captured by the monocular camera; and data acquired, calculated, or generated by other sensors such as collision sensors, laser radar, ultrasonic sensors, TOF (time-of-flight) sensors, infrared distance sensors, proximity sensors, and cliff sensors), intermediate data of data processing unit operations, and new or dumped map data and/or maps.
  • the present invention does not impose any restrictions on the type, use, and source of data stored by the storage unit.
  • the data processing unit is configured to acquire the images captured by the monocular camera and the motion parameters of the motion sensor, locate the cleaning robot according to the positioning and mapping algorithm, establish a map, plan a route, and control the motion unit to run according to the planned route.
  • the data processing unit can also perform map optimization, map correction, loop closure detection, and the like.
  • data, information, and programs may be acquired from the storage unit as needed, and the resulting data, information, and intermediate files are stored back in the storage unit.
  • the present invention does not limit the acquisition/storage of various types of data/information between the data processing unit and the storage unit.
  • the chassis of the cleaning robot includes a motion unit (such as a wheel), a motion sensor (such as an odometer), and a data processing unit (such as a processor like an MCU or a DSP); hence the chassis, as a device component, is divided functionally in the present invention into multiple functional units.
  • the outer casing may house the above-mentioned components and functional units of the cleaning robot, as shown in FIG. 1a; alternatively, one or more of the motion unit, the cleaning unit, and the motion sensor may not be housed, or not fully housed, in the outer casing.
  • a cleaning robot provides a protective edge rising above the slope surface at the periphery of the slope surface to protect the slope surface, and thereby the monocular camera, as shown in FIG. 1d. A cleaning robot equipped with a collision sensor usually detects an obstacle in front by colliding with it to determine its position, and such collisions increase the risk of damage to the monocular camera; since the protective edge stands higher than the slope surface, in a collision the protective edge takes the impact instead of the slope surface, protecting the slope surface and the monocular camera beneath it from being damaged by the obstacle in front.
  • the range of variation of the elevation angle β should remain within the light-transmitting range of the slope surface, so that the monocular camera obtains better transmittance; in one embodiment, the elevation angle β varies such that the angle between the direction of the monocular camera and the normal direction of the slope surface (i.e., the angle α in FIG. 1b) is always kept below 45°.
  • the elevation angle β can also be set to reciprocate (also referred to as scanning) at a certain frequency during the operation of the cleaning robot, to increase the scanned range.
  • the frequency can be matched to the movement speed of the cleaning robot so that at least part of the same feature of the same scene appears in multiple sweeps, allowing subsequent positioning based on features of the same scene. Accordingly, during image processing, these common features are extracted from the scanned images that share them, according to the scanning frequency and the moving speed of the cleaning robot, and used to position the cleaning robot.
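This matching condition can be sketched as a lower bound on the scan frequency: if consecutive sweeps must share part of the view, the robot may move at most the shared along-track extent per sweep period (that extent is an assumed quantity, not specified in the patent):

```python
def min_scan_frequency(speed_m_s, overlap_distance_m):
    """Lowest sweep frequency (Hz) at which the robot moves less than
    `overlap_distance_m` between consecutive sweeps, so part of the
    same scene feature appears in both: f >= speed / overlap."""
    return speed_m_s / overlap_distance_m
```

For instance, at 0.3 m/s with a 0.15 m shared view extent, sweeps must occur at 2 Hz or faster.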
  • the direction of the monocular camera can also reciprocate in the horizontal direction at a certain frequency during the working process of the cleaning robot (horizontal scanning; the horizontal direction lies in the moving plane of the cleaning robot), obtaining more scene information in the horizontal direction.
  • a cleaning robot further includes a communication module (such as a WiFi module), and the communication module is wirelessly connected to the server, as shown in FIG. 2.
  • although FIG. 2 includes various components such as sensors, a microphone, a speaker, a server, a user terminal device, and a communication module, this does not mean that all of these components must appear in the same embodiment; they can be used individually or combined with each other as needed.
  • the data processing unit transmits data or commands to the server through the communication module, and/or acquires information of the server through the communication module.
  • the captured images, the calculated self-positioning data, and the like may be sent to the server for computation, with the map and path plan established by that computation downloaded from the server to the cleaning robot through the communication module; alternatively, the data processing unit of the cleaning robot completes the positioning, mapping, and path planning itself, and only some setting information, historical information, and/or map information and data are stored on the server.
  • the present invention does not limit the allocation of stored data and functions between the server and the storage unit and data processing unit of the cleaning robot.
  • a cleaning robot further includes a communication module, and the communication module is wirelessly connected to a user terminal device (such as a mobile phone, a computer, or a tablet), as shown in FIG. 2.
  • the data processing unit receives a command of the user terminal device through the communication module; and/or the data processing unit transmits information (which may be sound, image, and/or text information) to the user terminal device through the communication module.
  • the command may include a query command of the user terminal device, a control command (such as switching the cleaning robot on/off, selecting a cleaning mode, etc.), and the like.
  • the present invention does not limit the commands that the cleaning robot receives from the user terminal device and the information transmitted to the user terminal device.
  • the cleaning robot system may include only the user terminal device, without the server of the previous embodiment, or may include the server at the same time.
  • when both are present, the communication module can exchange data/information with the user terminal device and the server separately, or the communication module, the user terminal device, and the server can transmit, compute, analyze, and integrate data/information as a whole.
  • a cleaning robot further includes a microphone and/or a speaker, as shown in FIG.
  • the microphone acquires a voice control command and sends it to the data processing unit.
  • The data processing unit translates the voice control command into a code, matches it, and executes the relevant control command according to the matching result. In some embodiments, in order to save the local computing power of the cleaning robot, the translated code may instead be sent by the communication module through the wireless network to the server and/or the user terminal device for matching, and the matching result is sent back to the cleaning robot, which executes the relevant control command.
  • the voice control command may include various commands such as a query command, switching the cleaning robot on/off, and selecting a cleaning mode.
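The local/remote split for matching described above can be sketched as follows. This is an illustrative sketch only: the command table, the control codes, and the `remote_matcher` callback are assumptions, not the patent's actual protocol.

```python
# Hypothetical command table mapping transcribed phrases to control codes;
# the entries and code names are illustrative, not from the patent.
COMMANDS = {
    "start cleaning": "POWER_ON",
    "stop cleaning": "POWER_OFF",
    "spot mode": "SET_MODE_SPOT",
}

def match_command(phrase, remote_matcher=None):
    """Match a transcribed voice phrase locally; for unknown phrases,
    fall back to a remote matcher (server / user terminal device) if
    one is available, to save local computing power."""
    action = COMMANDS.get(phrase.strip().lower())
    if action is None and remote_matcher is not None:
        action = remote_matcher(phrase)
    return action

action = match_command("Start cleaning")  # matched locally
```

A phrase missing from the local table would be forwarded to the server-side matcher, mirroring the fallback described in the text.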
  • the speaker sends out information, such as an alarm clock, a schedule reminder, a plan reminder, an alarm notification, a cleaning robot failure notification, a cleaning robot information notification (such as "cleaned designated area", “charge notification”), and the like.
  • the slope surface is covered with a light-transmitting material, such as glass or a highly translucent plastic, so that the monocular camera can obtain clear and bright images of the outside through the light-transmitting material, which at the same time protects the camera against dust and collision.
  • Optional plastic light-transmissive materials include GPPS, transparent ABS, AS, PMMA, PC, and PS.
  • PMMA (acrylic) has a light transmittance of 92% to 95%, higher than that of ordinary glass (about 80%).
  • a higher-strength light-transmissive film such as a polyvinyl chloride or a polyethylene film can also be selected.
  • the invention does not limit the light-transmitting material.
  • the direction of the monocular camera coincides with the normal direction of the slope surface, so that the maximum transmittance can be obtained, and the image captured by the monocular camera is the clearest.
  • the monocular camera has a vertical angle of view ranging from 50° to 100°, that is, a wide-angle lens is used as the monocular camera.
  • With such a wide-angle lens, image distortion can be kept low while capturing as much information as possible.
  • the use of a fisheye lens with a larger vertical field of view (with a vertical field of view of up to 180°) increases image distortion. Therefore, it is necessary to select a suitable lens as a monocular camera according to the needs and functions to be realized.
  • the positioning and mapping algorithm includes: identifying feature information in at least two images captured by the monocular camera; and, according to the motion parameters acquired by the motion sensor when the at least two images were captured, calculating the displacement and attitude of the cleaning robot, thereby obtaining the position of the cleaning robot at the moment each image was taken and realizing the positioning of the cleaning robot.
  • the main function of the monocular camera is to locate, that is, to determine the position of the cleaning robot itself.
  • the monocular camera also has functions of recognizing objects, recognizing people, and identifying scenes.
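The displacement-and-attitude step above can be sketched as a simple dead-reckoning update between camera frames. This is a minimal sketch under assumptions, not the patented algorithm: motion parameters are modeled as (distance, heading-change) pairs from the odometer and a gyroscope-like sensor.

```python
import math

def integrate_pose(pose, motion_params):
    """Dead-reckon the cleaning robot's pose (x, y, theta) from
    motion-sensor readings taken between two camera frames; each
    reading is a (distance, delta_theta) pair."""
    x, y, theta = pose
    for distance, delta_theta in motion_params:
        theta += delta_theta            # update attitude first
        x += distance * math.cos(theta)  # then advance along the new heading
        y += distance * math.sin(theta)
    return (x, y, theta)

# Robot drives 1 m forward, turns 90° counterclockwise, drives 1 m again.
pose = integrate_pose((0.0, 0.0, 0.0),
                      [(1.0, 0.0), (0.0, math.pi / 2), (1.0, 0.0)])
```

In the full algorithm, this odometric estimate would be refined against the feature matches between the two images; here only the motion-sensor side is shown.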
  • a cleaning robot further includes a distance sensor (such as an infrared ranging sensor, an ultrasonic ranging sensor, a PSD sensor, a TOF (Time of Flight) ranging sensor, etc.), as shown in FIG.
  • when the distance sensor detects an obstacle, the relative position of the obstacle with respect to the cleaning robot (which can be calculated from the relative distance and relative angle between the obstacle and the cleaning robot) is sent to the data processing unit.
  • the positioning and mapping algorithm further includes: determining the position of the obstacle according to the relative position of the obstacle sent by the distance sensor and the position of the cleaning robot at the moment the distance sensor obtained that relative position (i.e., the coordinates of the cleaning robot at that moment). To determine the position of the cleaning robot, for example, a rectangular coordinate system is formed by taking the starting point of the cleaning robot as the origin, the forward direction of the cleaning robot as the x-axis, and the x-axis rotated counterclockwise by 90° around the origin as the y-axis; at any moment, the coordinates of the cleaning robot in this two-dimensional coordinate system represent the position of the cleaning robot at that time.
  • the position of the obstacle is determined likewise (that is, the position of the obstacle is its coordinates in the coordinate system of the cleaning robot described above); according to the determined positions of the obstacles, a map of the work area is established, realizing the mapping of the work area.
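The coordinate bookkeeping above can be illustrated with a short sketch. The frame is the one defined in the text (origin at the start point, x-axis along the initial forward direction, y-axis 90° counterclockwise); the function name and the convention that the relative angle is measured counterclockwise from the robot's heading are assumptions for illustration.

```python
import math

def obstacle_position(robot_pose, rel_distance, rel_angle):
    """Convert an obstacle's relative distance/angle (from the distance
    sensor) into coordinates in the world frame described above.
    robot_pose = (x, y, theta): the robot's coordinates and heading at
    the moment of measurement; rel_angle is counterclockwise from the
    robot's heading."""
    x, y, theta = robot_pose
    ox = x + rel_distance * math.cos(theta + rel_angle)
    oy = y + rel_distance * math.sin(theta + rel_angle)
    return (ox, oy)

# Robot at (2, 0) heading along +x; obstacle 1 m away, 90° to its left.
obs = obstacle_position((2.0, 0.0, 0.0), 1.0, math.pi / 2)
```

Accumulating such obstacle points over the run is what builds the map of the work area.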
  • Step S100: setting a virtual area, with the cleaning robot located inside the virtual area or on its boundary. The virtual area is set by a preset instruction for controlling the movement distance and rotation angle of the cleaning robot, the instruction causing the closed space enclosed by the running track of the cleaning robot in the actual working area to constitute the virtual area.
  • Step S200: controlling the cleaning robot to clean the virtual area inside the actual working area.
  • Step S100 will be described below by way of example, but those skilled in the art should understand that other instructions can also cause the running track of the cleaning robot in the actual working area to enclose a closed space constituting the virtual area, so the following examples should not be construed as limiting the invention.
  • The instruction for controlling the movement distance and rotation angle of the cleaning robot can be: 1) start the cleaning robot with the current position as the starting position, and control it to run a first preset length in the current direction (measured by the odometer on the cleaning robot), as from point a to point b in FIG. 3b; 2) turn by a preset steering (the preset steering includes both the rotation direction and the rotation angle; in this example, 90° counterclockwise with reference to the current direction of the cleaning robot); 3) run a second preset length in the current direction, as from point b to point c in FIG. 3b; 4) turn by the preset steering (in this example, still 90° counterclockwise); 5) run in the current direction, and so on until the track closes.
  • the instructions cause the closed space (i.e., the rectangle abcd) enclosed by the running track of the cleaning robot in the actual working area to constitute the virtual area.
  • the first preset length and the second preset length are not limited, and can be set according to requirements.
  • the preset steering includes both the rotation direction and the rotation angle, and may be a left turn or a right turn. The angle of each turn may be set as needed, which is not limited by the present invention.
  • the first preset length is 6m
  • the second preset length is 4m
  • the preset steering is left turn
  • the steering angle is 90°
  • the closed space obtained by the instruction for controlling the movement distance and rotation angle of the cleaning robot is as shown in FIG. 3b.
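The move/turn instruction sequence in this example (6 m, turn left 90°, 4 m, turn left 90°, and so on) can be sketched as follows; the command representation and vertex rounding are illustrative assumptions, with vertex names following FIG. 3b.

```python
import math

def run_commands(start, heading, commands):
    """Execute a list of ('move', length) / ('turn', angle) commands and
    return the vertices of the resulting track. Angles are CCW radians;
    the track closes when it returns to the start point."""
    x, y = start
    vertices = [(x, y)]
    for op, value in commands:
        if op == "turn":
            heading += value
        else:  # 'move': advance along the current heading
            x += value * math.cos(heading)
            y += value * math.sin(heading)
            vertices.append((round(x, 9), round(y, 9)))
    return vertices

left = ("turn", math.pi / 2)  # the preset steering: 90° counterclockwise
rect = run_commands((0, 0), 0.0, [
    ("move", 6), left, ("move", 4), left,  # a -> b, b -> c
    ("move", 6), left, ("move", 4),        # c -> d, d -> a
])
# rect traces a -> b -> c -> d and back to a: a 6 m x 4 m rectangle.
```

The closed rectangle abcd returned here is the virtual area boundary of the example.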
  • if the cleaning robot encounters an obstacle while delimiting the virtual area (i.e., encounters an obstacle in the process of executing the instruction that controls the movement distance and rotation angle of the cleaning robot to obtain a closed space), operation logic for avoiding obstacles is added to that instruction.
  • after encountering an obstacle, the cleaning robot is controlled to turn by the preset steering (in this example, rotating 90° counterclockwise) and to run along the edge of the obstacle (it can run close to the edge, or maintain a constant or certain distance from it; in this example, the cleaning robot runs from point c1 to point c2 to point c3), while the projection length of the trajectory run along the obstacle edge onto the direction of travel before the obstacle was encountered (i.e., the direction from point b to point c1) is calculated in real time.
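The projection-length bookkeeping can be sketched as follows: the net displacement of the wall-following track is projected onto the original travel direction, so the robot knows how much of the interrupted leg it has covered while detouring. The function name and track representation are illustrative assumptions.

```python
def projection_length(track, direction):
    """Project a wall-following track (list of (x, y) points) onto the
    unit vector `direction` (the travel direction before the obstacle)
    and return the signed length covered along that direction."""
    dx, dy = direction
    (x0, y0), (x1, y1) = track[0], track[-1]
    # Net displacement dotted with the unit travel direction.
    return (x1 - x0) * dx + (y1 - y0) * dy

# Robot was heading along +x when it met the obstacle at c1, then
# detoured around it via c1 -> c2 -> c3 (hypothetical coordinates).
p = projection_length([(2, 0), (2, 1), (5, 1), (5, 0)], (1.0, 0.0))
# Only the x-extent of the detour counts: p == 3.0
```

Once this projection reaches the remaining length of the interrupted leg, the robot can resume executing the original instruction.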
  • controlling the cleaning robot to clean the virtual area inside the actual working area in the above step S200 includes:
  • Step S210: executing the instruction for controlling the movement distance and rotation angle of the cleaning robot to obtain a closed space, the boundary of which can be regarded as the boundary of the virtual area.
  • the process of executing the command is described in the previous embodiment, and is not described herein again.
  • the present invention does not limit the shape of the closed space set by the command.
  • step S220 the inside of the closed space is cleaned.
  • the cleaning robot is controlled to clean the interior of the enclosed space using a bow-shaped cleaning method, as shown in FIG. 3e, including:
  • the first cleaning direction may be determined according to the direction of the boundary where the current position of the cleaning robot is located, which is not specifically limited in the present invention.
  • the cleaning robot can also be controlled to run to one end of any boundary in the virtual area.
  • the cleaning robot is controlled to run to the nearest end of the closest boundary of the current position of the cleaning robot.
  • S520: Control the cleaning robot to make the first turn into a first offset direction that is the same as the direction of the boundary it encountered, run a first offset length, and then make the first turn again into a second cleaning direction that is parallel and opposite to the first cleaning direction.
  • the first steering and the second steering both include a rotation direction and a rotation angle.
  • the first steering may be a left turn or a right turn, and the specific steering direction and steering angle are preset by a program or determined according to a distribution characteristic of the virtual area.
  • the first offset length is a projected length of the running trajectory along the first offset direction on a perpendicular line in the first cleaning direction.
  • the first offset length and the second offset length are generally the main-brush width, preferably 10 cm to 15 cm, which enables the cleaning robot to avoid leaving uncleaned strips between adjacent cleaning passes.
  • S530: Control the cleaning robot to run in the second cleaning direction inside the virtual area until it meets the boundary of the virtual area or the boundary of a cleaned area.
  • S540: Control the cleaning robot to make the second turn into a second offset direction that is the same as the direction of the boundary it encountered, run a second offset length, and then make the second turn again into the first cleaning direction.
  • the second steering is opposite to the first steering, and the first offset direction and the second offset direction may be the same or different, but the first offset direction and the second offset direction are the same in the direction perpendicular to the cleaning direction.
  • Generally, the first offset length is the same as the second offset length; the first offset length is the projection length of the running track along the first offset direction onto the perpendicular of the first cleaning direction, and the second offset length is the projection length of the running track along the second offset direction onto the perpendicular of the second cleaning direction.
  • if the cleaning robot encounters an obstacle while running in the current cleaning direction inside the virtual area, the cleaning robot is controlled to run along the edge of the obstacle, and the projection length of the trajectory run along the obstacle edge onto the perpendicular of the current cleaning direction is calculated in real time.
  • when the projection length is equal to the third offset length, the cleaning robot is controlled to turn into the cleaning direction opposite to the current cleaning direction and continue running; when the projection length is equal to 0, the cleaning robot is controlled to turn back into the current cleaning direction and continue running.
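Steps S510-S540 amount to a classic boustrophedon sweep. A minimal grid-based sketch under stated assumptions (discrete cells one brush-width apart instead of continuous offsets, no obstacles; a simplification, not the patented method):

```python
def bow_path(width, height):
    """Return the cell-visit order of a bow-shaped (boustrophedon) sweep
    over a width x height grid: sweep one row, offset by one row (the
    'brush width'), then sweep back in the opposite direction."""
    path = []
    for row in range(height):
        # Alternate the sweep direction on each row.
        cols = range(width) if row % 2 == 0 else range(width - 1, -1, -1)
        path.extend((col, row) for col in cols)
    return path

path = bow_path(3, 2)
# Visits (0,0) (1,0) (2,0), offsets one row, then reverses: (2,1) (1,1) (0,1)
```

Choosing the row spacing equal to the main-brush width is what guarantees adjacent passes leave no uncleaned strip, as noted in the text.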
  • Step S300: dividing the area other than the virtual area in the actual working area according to the virtual area to obtain a plurality of extended virtual areas.
  • Step S400: cleaning the other extended virtual areas in sequence; after the cleaning robot finishes cleaning the current virtual area, the next extended virtual area to be cleaned is determined according to the distance between the current position and the other extended virtual areas.
  • after the cleaning robot cleans the current virtual area (including the current extended virtual area), the extended virtual area closest to the current position of the cleaning robot is set as the next extended virtual area to be cleaned.
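The selection rule in step S400 can be sketched as a nearest-first greedy choice. The region representation (a center point per area) and the use of Euclidean distance are illustrative assumptions; any representative point or path distance could stand in.

```python
import math

def next_area(robot_pos, uncleaned_areas):
    """Pick the uncleaned extended virtual area whose representative
    point is closest to the robot's current position."""
    return min(
        uncleaned_areas,
        key=lambda area: math.dist(robot_pos, area["center"]),
    )

# Hypothetical extended virtual areas left to clean.
areas = [
    {"name": "A", "center": (10.0, 0.0)},
    {"name": "B", "center": (2.0, 2.0)},
    {"name": "C", "center": (-6.0, 5.0)},
]
chosen = next_area((0.0, 0.0), areas)  # area B is nearest
```

Repeating this choice after each area is cleaned yields the sequential coverage of the whole working area described above.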
  • By installing a structurally simple monocular camera at an acute angle at the front of the cleaning robot, the cleaning robot of the invention increases the variety of features in the captured images, thereby improving the accuracy of its positioning and mapping and hence the path planning/navigation effect, avoiding the inefficiency of a random cleaning mode. It also improves the recognition and obstacle-avoidance effects for objects, enables efficient cleaning at low cost, and realizes functions such as object recognition, face recognition, real-time monitoring, and security monitoring, expanding the application field of cleaning robots.
  • The cleaning robot of the invention also has a positioning and mapping algorithm that, without an initial map, can divide the actual working area into a plurality of virtual areas covering the entire working area and, in cooperation with the images acquired by the monocular camera, clean the inside of each virtual area in turn, achieving efficient cleaning of unknown areas.

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Automation & Control Theory (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Electromagnetism (AREA)
  • Human Computer Interaction (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Software Systems (AREA)
  • Mathematical Physics (AREA)
  • Fuzzy Systems (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Electric Vacuum Cleaner (AREA)
  • Manipulator (AREA)

Abstract

The invention relates to a cleaning robot comprising a motion unit, a motion sensor, a monocular camera, a storage unit, a data processing unit, a cleaning unit, and a housing. The motion sensor obtains a motion parameter of the cleaning robot during movement. The monocular camera is used to capture images. The monocular camera is arranged in a slope located at the front end of the housing and inclined upward along the forward direction of movement, and faces approximately the same direction as the slope. The direction of the monocular camera forms an acute angle with the forward direction of movement of the cleaning robot. The data processing unit is used to obtain the images captured by the monocular camera and the motion parameters from the motion sensor, to perform positioning, mapping, and path planning for the cleaning robot, and to control the operation of the motion unit according to the planned path. By means of the monocular camera, which has a simple mounting structure and an acute pitch, the cleaning robot improves its positioning and mapping accuracy, thereby improving the navigation/path-planning effect.
PCT/CN2018/088143 2018-01-24 2018-05-24 Robot de nettoyage WO2019144541A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US16/921,095 US11654574B2 (en) 2018-01-24 2020-07-06 Cleaning robot
US18/295,474 US12042926B2 (en) 2018-01-24 2023-04-04 Cleaning robot
US18/738,524 US20240326259A1 (en) 2018-01-24 2024-06-10 Cleaning robot

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810067512.8 2018-01-24
CN201810067512.8A CN108247647B (zh) 2018-01-24 2018-01-24 一种清洁机器人

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/921,095 Continuation-In-Part US11654574B2 (en) 2018-01-24 2020-07-06 Cleaning robot

Publications (1)

Publication Number Publication Date
WO2019144541A1 true WO2019144541A1 (fr) 2019-08-01

Family

ID=62742907

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/088143 WO2019144541A1 (fr) 2018-01-24 2018-05-24 Robot de nettoyage

Country Status (3)

Country Link
US (3) US11654574B2 (fr)
CN (1) CN108247647B (fr)
WO (1) WO2019144541A1 (fr)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113110471A (zh) * 2021-04-25 2021-07-13 珠海格力电器股份有限公司 设备作业路径规划方法、装置、计算机设备和存储介质
CN114714350A (zh) * 2022-03-31 2022-07-08 北京云迹科技股份有限公司 服务机器人的控制方法、装置、设备及介质
CN114812463A (zh) * 2022-06-27 2022-07-29 山西嘉世达机器人技术有限公司 检测清洁机到达边缘的方法、检测装置、清洁机和介质
EP4011566A4 (fr) * 2019-08-09 2022-10-26 Ecovacs Robotics Co., Ltd. Dispositif mobile autonome
CN115709484A (zh) * 2023-01-09 2023-02-24 常州检验检测标准认证研究院 一种移动机器人安全仿真检测方法及系统

Families Citing this family (48)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102100476B1 (ko) 2018-05-04 2020-05-26 엘지전자 주식회사 복수의 이동 로봇 및 그 제어방법
WO2019212240A1 (fr) * 2018-05-04 2019-11-07 Lg Electronics Inc. Pluralité de robots nettoyeurs et leur procédé de commande
US11340079B1 (en) * 2018-05-21 2022-05-24 AI Incorporated Simultaneous collaboration, localization, and mapping
WO2019232803A1 (fr) * 2018-06-08 2019-12-12 珊口(深圳)智能科技有限公司 Procédé de commande mobile, robot mobile et support de stockage informatique
CN108803618A (zh) * 2018-07-12 2018-11-13 上海常仁信息科技有限公司 基于机器人身份证的机器人越界警戒系统
CN108908365A (zh) * 2018-07-26 2018-11-30 佛山市神风航空科技有限公司 一种会做家务的飞行机器人
CN109124497B (zh) * 2018-09-21 2021-11-02 何晨亮 一种基于移动支付的室内智能清扫设备
CN210704858U (zh) * 2018-09-28 2020-06-09 成都家有为力机器人技术有限公司 一种具有双目摄像头的清洁机器人
CN111240310A (zh) * 2018-11-13 2020-06-05 北京奇虎科技有限公司 机器人避障处理的方法、装置及电子设备
CN111202470B (zh) * 2018-11-21 2024-09-20 北京石头世纪科技股份有限公司 智能清洁设备、重定位方法及装置、存储介质、电子设备
CN109571470A (zh) * 2018-12-03 2019-04-05 江西洪都航空工业集团有限责任公司 一种机器人
DE102018132175A1 (de) * 2018-12-13 2020-06-18 Vorwerk & Co. Interholding Gmbh System mit zwei Bodenbearbeitungsgeräten und Verfahren dafür
KR20200084449A (ko) * 2018-12-26 2020-07-13 삼성전자주식회사 청소 로봇 및 그의 태스크 수행 방법
KR102234641B1 (ko) * 2019-01-17 2021-03-31 엘지전자 주식회사 이동 로봇 및 복수의 이동 로봇의 제어방법
KR102155095B1 (ko) * 2019-03-26 2020-09-11 엘지전자 주식회사 로봇 청소기
KR102201002B1 (ko) * 2019-03-26 2021-01-12 엘지전자 주식회사 로봇 청소기
CN109976350B (zh) * 2019-04-15 2021-11-19 上海钛米机器人科技有限公司 多机器人调度方法、装置、服务器及计算机可读存储介质
WO2020213955A1 (fr) * 2019-04-16 2020-10-22 주식회사 유진로봇 Procédé et système de diagnostic d'initialisation d'un robot mobile
US11471813B2 (en) * 2019-04-26 2022-10-18 Lg Electronics Inc. Air cleaner
JP7331462B2 (ja) * 2019-05-24 2023-08-23 京セラドキュメントソリューションズ株式会社 ロボットシステム、ロボット制御方法及び電子装置
CN110919644B (zh) * 2019-06-11 2022-02-08 远形时空科技(北京)有限公司 一种利用摄像头设备和机器人进行定位交互的方法及系统
CN110545223A (zh) * 2019-09-03 2019-12-06 珠海格力电器股份有限公司 应用于拖把的语音交互控制方法和拖把
CN110507254A (zh) * 2019-09-24 2019-11-29 小狗电器互联网科技(北京)股份有限公司 一种扫地机
KR20210051014A (ko) * 2019-10-29 2021-05-10 엘지전자 주식회사 로봇 청소기 및 그의 동작 방법
CN112773261B (zh) * 2019-11-04 2022-06-21 美智纵横科技有限责任公司 规避障碍物的方法、装置及扫地机器人
CN112799389B (zh) * 2019-11-12 2022-05-13 苏州宝时得电动工具有限公司 自动行走区域路径规划方法及自动行走设备
KR20220101140A (ko) 2019-11-12 2022-07-19 넥스트브이피유 (상하이) 코포레이트 리미티드 이동 로봇
CN111198565B (zh) * 2020-01-09 2023-07-28 上海山科机器人有限公司 行走机器人、控制行走机器人的方法和行走机器人系统
CN111207753A (zh) * 2020-02-13 2020-05-29 苏州大学 一种多玻璃隔断环境下的同时定位与建图的方法
CN111309032A (zh) * 2020-04-08 2020-06-19 江苏盛海智能科技有限公司 一种无人驾驶车辆的自主避障方法及控制端
CN114355871B (zh) * 2020-09-30 2024-10-18 北京牛特机器人科技有限公司 一种自行走装置及其控制方法
JP2022072184A (ja) * 2020-10-29 2022-05-17 avatarin株式会社 コミュニケーションシステム、ロボット、及び記憶媒体
CN112506182B (zh) * 2020-10-29 2023-03-21 久瓴(江苏)数字智能科技有限公司 扫地机器人定位方法、装置、计算机设备和存储介质
CN112379675B (zh) * 2020-11-27 2024-05-24 广东盈峰智能环卫科技有限公司 环卫机器人的控制方法及系统、环卫机器人
CN112484718B (zh) * 2020-11-30 2023-07-28 海之韵(苏州)科技有限公司 一种基于环境地图修正的边沿导航的装置和方法
CN114680732A (zh) * 2020-12-25 2022-07-01 苏州宝时得电动工具有限公司 一种清洁机器人及其清洁控制方法
US20220206505A1 (en) * 2020-12-30 2022-06-30 Southeast University Geometric folding full coverage path for robot and method for generating same
EP4059403B8 (fr) * 2021-01-19 2023-12-27 Yituo Electric Co., Ltd. Procédé de commande et dispositif de commande d'equipment de nettoyage
CN113050633B (zh) * 2021-03-12 2024-05-31 苏州挚途科技有限公司 清扫轨迹的确定方法、装置和自动清扫设备
CN113510716A (zh) * 2021-04-28 2021-10-19 哈尔滨理工大学 一种基于球形电机的仿人型护理机器人
CN113467468B (zh) * 2021-07-23 2024-03-29 合肥工业大学 一种基于嵌入式的机器人智能避障系统及方法
CN113341851A (zh) * 2021-08-04 2021-09-03 深圳康易世佳科技有限公司 一种ai语音交互及视觉定位回充的扫地机控制电路
CN114617484A (zh) * 2021-11-30 2022-06-14 追觅创新科技(苏州)有限公司 清洁设备的清洁方法、清洁设备及存储介质
CN114348579A (zh) * 2021-12-31 2022-04-15 深圳云天励飞技术股份有限公司 搬运机器人的控制方法及相关设备
CN114808810B (zh) * 2022-04-12 2022-12-30 吉林大学 一种半自主交互式无人清扫车清扫系统及清扫方法
CN114747982B (zh) * 2022-04-18 2023-03-24 麦岩智能科技(北京)有限公司 一种k型的清洁机器人摄像头排布方法
CN116027794B (zh) * 2023-03-30 2023-06-20 深圳市思傲拓科技有限公司 一种基于大数据的泳池机器人自动定位管理系统及方法
CN116403359B (zh) * 2023-06-07 2023-08-15 深圳市华安泰智能科技有限公司 一种多模态图像识别算法的生产安全预警系统

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20110010329A (ko) * 2009-07-24 2011-02-01 주식회사 모뉴엘 청소로봇의 먼지인식 방법
CN104586322A (zh) * 2013-10-31 2015-05-06 Lg电子株式会社 移动机器人及其工作方法
TW201705897A (zh) * 2015-07-29 2017-02-16 Lg電子股份有限公司 行動機器人及其控制方法
CN107569181A (zh) * 2016-07-04 2018-01-12 九阳股份有限公司 一种智能清洁机器人及清扫方法
CN206910290U (zh) * 2016-08-16 2018-01-23 美国iRobot公司 自主移动机器人

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2261762A3 (fr) * 2009-06-12 2014-11-26 Samsung Electronics Co., Ltd. Robot nettoyeur et son procédé de commande
RU2012122469A (ru) 2009-11-06 2013-12-20 Эволюшн Роботикс, Инк. Способы и системы для полного охвата поверхности автономным роботом
DE102011000009A1 (de) * 2011-01-03 2012-07-05 Vorwerk & Co. Interholding Gmbh Verfahren zur gleichzeitigen Bestimmung und Kartenbildung
KR101566207B1 (ko) * 2011-06-28 2015-11-13 삼성전자 주식회사 로봇 청소기 및 그 제어방법
US9764472B1 (en) * 2014-07-18 2017-09-19 Bobsweep Inc. Methods and systems for automated robotic movement
US9519289B2 (en) 2014-11-26 2016-12-13 Irobot Corporation Systems and methods for performing simultaneous localization and mapping using machine vision systems
US9630319B2 (en) * 2015-03-18 2017-04-25 Irobot Corporation Localization and mapping using physical features
US9868211B2 (en) 2015-04-09 2018-01-16 Irobot Corporation Restricting movement of a mobile robot
CN204618113U (zh) * 2015-04-30 2015-09-09 李东梅 监控清洁机器人
CN205889214U (zh) * 2016-06-30 2017-01-18 广东宝乐机器人股份有限公司 一种带旋转摄像头的智能机器人
CN206414227U (zh) * 2016-09-22 2017-08-18 中国科学院深圳先进技术研究院 一种具有声音识别功能的智能扫地机器人
CN107340768B (zh) * 2016-12-29 2020-08-28 珠海市一微半导体有限公司 一种智能机器人的路径规划方法
CN106805856A (zh) * 2016-12-31 2017-06-09 鸿奇机器人股份有限公司 控制清洁机器人的方法
CN107356252B (zh) * 2017-06-02 2020-06-16 青岛克路德机器人有限公司 一种融合视觉里程计与物理里程计的室内机器人定位方法
CN107544507A (zh) * 2017-09-28 2018-01-05 速感科技(北京)有限公司 可移动机器人移动控制方法及装置
CN207424680U (zh) * 2017-11-20 2018-05-29 珊口(上海)智能科技有限公司 移动机器人
CN109984678B (zh) 2017-12-29 2021-08-06 速感科技(北京)有限公司 一种清洁机器人及清洁机器人的清洁方法
CN109984689B (zh) 2017-12-29 2021-09-17 速感科技(北京)有限公司 一种清洁机器人及清洁机器人的路径优化方法


Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP4011566A4 (fr) * 2019-08-09 2022-10-26 Ecovacs Robotics Co., Ltd. Dispositif mobile autonome
CN113110471A (zh) * 2021-04-25 2021-07-13 珠海格力电器股份有限公司 设备作业路径规划方法、装置、计算机设备和存储介质
CN113110471B (zh) * 2021-04-25 2023-03-21 珠海格力电器股份有限公司 设备作业路径规划方法、装置、计算机设备和存储介质
CN114714350A (zh) * 2022-03-31 2022-07-08 北京云迹科技股份有限公司 服务机器人的控制方法、装置、设备及介质
CN114714350B (zh) * 2022-03-31 2024-03-26 北京云迹科技股份有限公司 服务机器人的控制方法、装置、设备及介质
CN114812463A (zh) * 2022-06-27 2022-07-29 山西嘉世达机器人技术有限公司 检测清洁机到达边缘的方法、检测装置、清洁机和介质
CN115709484A (zh) * 2023-01-09 2023-02-24 常州检验检测标准认证研究院 一种移动机器人安全仿真检测方法及系统

Also Published As

Publication number Publication date
US12042926B2 (en) 2024-07-23
US20230241783A1 (en) 2023-08-03
CN108247647A (zh) 2018-07-06
US20240326259A1 (en) 2024-10-03
US20200331148A1 (en) 2020-10-22
US11654574B2 (en) 2023-05-23
CN108247647B (zh) 2021-06-22

Similar Documents

Publication Publication Date Title
WO2019144541A1 (fr) Robot de nettoyage
CN109890573B (zh) 移动机器人的控制方法、装置、移动机器人及存储介质
CN109947109B (zh) 机器人工作区域地图构建方法、装置、机器人和介质
JP5946147B2 (ja) 可動式ヒューマンインターフェースロボット
JP6732746B2 (ja) 機械視覚システムを使用した、同時位置測定マッピングを実施するためのシステム
CN110376934B (zh) 清洁机器人、清洁机器人控制方法及终端控制方法
US20190235490A1 (en) Mobile robot and control method of mobile robot
CN112867424B (zh) 导航、划分清洁区域方法及系统、移动及清洁机器人
KR101813922B1 (ko) 로봇 청소기 및 이의 제어 방법
CN112739244A (zh) 移动机器人清洁系统
CN114847803A (zh) 机器人的定位方法及装置、电子设备、存储介质
US20230057965A1 (en) Robot and control method therefor
KR20180080498A (ko) 공항용 로봇 및 그의 동작 방법
GB2527207A (en) Mobile human interface robot
KR20210053239A (ko) 다중-센서 slam 시스템들을 위한 장치 및 방법들
CN108544494B (zh) 一种基于惯性和视觉特征的定位装置、方法及机器人
CN105919517B (zh) 自动清扫机器人装置
CN212044739U (zh) 一种基于惯性数据和视觉特征的定位装置及机器人
AU2015202200A1 (en) Mobile human interface robot
JP7354528B2 (ja) 自律移動装置、自律移動装置のレンズの汚れ検出方法及びプログラム
TWI824503B (zh) 自移動設備及其控制方法
US20240233340A9 (en) Method and electronic device for training neural network model by augmenting image representing object captured by multiple cameras
KR20240057297A (ko) 신경망 모델을 학습시키는 방법 및 전자 장치
TW202344863A (zh) 語意距離地圖的建構方法及其相關移動裝置
JP2022540594A (ja) 移動ロボット及びその制御方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18902762

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18902762

Country of ref document: EP

Kind code of ref document: A1