CN218606375U - Intelligent cleaning robot

Info

Publication number
CN218606375U
CN218606375U (application CN202222948138.7U)
Authority
CN
China
Prior art keywords
robot
camera
laser
view
intelligent cleaning
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202222948138.7U
Other languages
Chinese (zh)
Inventor
李振
邹强斌
乐毅
Current Assignee (the listed assignee may be inaccurate)
Shanghai Gaussian Automation Technology Development Co Ltd
Original Assignee
Shanghai Gaussian Automation Technology Development Co Ltd
Priority date (the priority date is an assumption and is not a legal conclusion)
Filing date
Publication date
Application filed by Shanghai Gaussian Automation Technology Development Co Ltd
Priority to CN202222948138.7U
Application granted
Publication of CN218606375U
Legal status: Active

Classifications

  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The utility model relates to the technical field of cleaning robots, and in particular to an intelligent cleaning robot. It aims to solve the problem that existing intelligent cleaning robots have insufficient recognition capability and struggle to adapt to complex indoor environments. The robot comprises a head-up laser radar, a turntable, a front oblique-view depth camera, a front oblique-view RGB camera and a rear oblique-view RGB camera. The head-up laser radar emits laser in the horizontal direction and receives the reflected laser to obtain the distance to an obstacle; the turntable is arranged on top of the robot housing and drives the head-up laser radar in rotation; the front oblique-view depth camera collects three-dimensional images; the front oblique-view RGB camera collects color images; the rear oblique-view RGB camera likewise collects color images. Through the cooperation of these sensors, the robot's recognition capability is effectively improved, obstacles are reasonably avoided during cleaning, adaptability to complex indoor environments is enhanced, and cleaning efficiency is increased.

Description

Intelligent cleaning robot
Technical Field
The utility model relates to the technical field of cleaning robots, and in particular to an intelligent cleaning robot.
Background
Indoor environments such as hotels, office buildings, shopping malls and supermarkets place high demands on intelligent robots because of heavy foot traffic, large floor areas and frequently changing surroundings. Existing intelligent cleaning robots identify the environment, garbage and moving objects with poor accuracy and sensitivity, cannot operate smoothly in complex indoor environments, and deliver poor cleaning efficiency and results.
SUMMARY OF THE UTILITY MODEL
An object of the utility model is to provide an intelligent cleaning robot that solves the problem of existing intelligent cleaning robots having insufficient recognition capability and being ill-suited to complex indoor environments.
To solve this technical problem, the utility model provides the following technical solution:
The utility model provides an intelligent cleaning robot comprising a head-up laser radar, a turntable, a front oblique-view depth camera, a front oblique-view RGB camera and a rear oblique-view RGB camera. The head-up laser radar emits laser in the horizontal direction and receives the reflected laser to obtain the distance to an obstacle. The turntable is arranged on top of the robot housing; the head-up laser radar is mounted on the turntable, which can rotate about its own axis to drive the head-up laser radar in rotation. The front oblique-view depth camera is arranged on the front side of the robot housing in the forward direction and collects three-dimensional images; its coverage is a sector spanning 73° in a vertical plane, with the lower boundary at 60° to the horizontal. The front oblique-view RGB camera is arranged on the front side of the robot housing in the forward direction, to the left or right of the front oblique-view depth camera, and collects color images; its coverage is a sector spanning 83° in a vertical plane, with the lower boundary at 65° to the horizontal. The rear oblique-view RGB camera is arranged on the rear side of the robot housing in the forward direction and collects color images; its coverage is likewise a sector spanning 83° in a vertical plane, with the lower boundary at 65° to the horizontal.
Furthermore, the intelligent cleaning robot also comprises a wall-following sensor arranged on a side face of the robot housing in the forward direction. The wall-following sensor comprises a PSD single-point wall-following laser and a small-aperture wall-following line laser. The PSD single-point wall-following laser is arranged on the side close to the chassis, comprises a transmitting end and a receiving end, and detects the distance to the wall. The small-aperture wall-following line laser is arranged above the PSD single-point wall-following laser and likewise detects the distance to the wall.
Further, the intelligent cleaning robot also comprises a top-view camera mounted on top of the robot housing to acquire images of the space above.
Furthermore, the top-view camera is a depth camera for acquiring three-dimensional images of the space above.
Further, the intelligent cleaning robot also comprises a bottom line laser arranged on the front side of the robot housing in the forward direction, below the front oblique-view depth camera, for detecting the distance to obstacles ahead.
Furthermore, the bottom line laser is directed obliquely downward, its illumination direction forming a 30° angle with the horizontal.
Furthermore, the intelligent cleaning robot also comprises a front-view fill light arranged on the front side of the robot housing in the forward direction. The front-view fill light is located on the side of the front oblique-view depth camera facing away from the front oblique-view RGB camera, is positioned higher than the front oblique-view depth camera, and illuminates the forward direction.
Furthermore, the intelligent cleaning robot also comprises a rear-view fill light arranged on the rear side of the robot housing in the forward direction to illuminate the backward direction.
Furthermore, the intelligent cleaning robot also comprises safety contact edges arranged on the front and rear sides of the robot housing, configured to trigger when compressed and stop the travel motor.
Furthermore, the intelligent cleaning robot also comprises a pile-alignment sensor arranged on the rear side of the robot housing in the forward direction. The pile-alignment sensor emits infrared light and receives its reflection to measure the distance to the charging pile.
In summary, the technical effects the utility model can achieve are as follows:
The utility model provides an intelligent cleaning robot comprising a head-up laser radar, a turntable, a front oblique-view depth camera, a front oblique-view RGB camera and a rear oblique-view RGB camera. The head-up laser radar emits laser in the horizontal direction and receives the reflected laser to obtain the distance to an obstacle. The turntable is arranged on top of the robot housing; the head-up laser radar is mounted on the turntable, which can rotate about its own axis to drive the head-up laser radar in rotation. The front oblique-view depth camera is arranged on the front side of the robot housing in the forward direction and collects three-dimensional images; its coverage is a sector spanning 73° in a vertical plane, with the lower boundary at 60° to the horizontal. The front oblique-view RGB camera is arranged on the front side of the robot housing in the forward direction, to the left or right of the front oblique-view depth camera, and collects color images; its coverage is a sector spanning 83° in a vertical plane, with the lower boundary at 65° to the horizontal. The rear oblique-view RGB camera is arranged on the rear side of the robot housing in the forward direction and collects color images; its coverage is likewise a sector spanning 83° in a vertical plane, with the lower boundary at 65° to the horizontal.
The intelligent cleaning robot provided by the utility model monitors obstacles in all directions in real time during movement through the cooperation of the head-up laser radar, the turntable, the front oblique-view depth camera, the front oblique-view RGB camera and the rear oblique-view RGB camera. The head-up laser radar and the turntable together monitor surrounding objects for navigation and obstacle avoidance, while the front oblique-view depth camera covers the laser radar's blind zone in the forward direction. The front and rear oblique-view RGB cameras then acquire environmental information, identifying objects such as electric wires, spilled water and carpets, and provide the robot with information for determining a cleaning strategy and planning a path. Through the cooperation of these sensors, the robot's recognition capability is effectively improved, blind zones are eliminated, obstacles are reasonably avoided during cleaning, adaptability to complex indoor environments is enhanced, and cleaning efficiency is increased.
Drawings
To illustrate the embodiments of the utility model or the technical solutions in the prior art more clearly, the drawings required for describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the utility model; those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a front view of an intelligent cleaning robot provided in an embodiment of the present invention;
fig. 2 is a top view of an intelligent cleaning robot provided in an embodiment of the present invention;
fig. 3 is a left side view of the intelligent cleaning robot provided in the embodiment of the present invention;
fig. 4 is a rear view of the intelligent cleaning robot provided by the embodiment of the present invention.
Reference numerals: 100-head-up laser radar; 200-turntable; 300-front oblique-view depth camera; 400-front oblique-view RGB camera; 500-rear oblique-view RGB camera; 610-PSD single-point wall-following laser; 620-small-aperture wall-following line laser; 700-top-view camera; 800-bottom line laser; 810-line laser; 900-front-view fill light; 1000-rear-view fill light; 1110-edge brush; 1200-left fill light; 1300-right fill light.
Detailed Description
To make the objects, technical solutions and advantages of the embodiments of the utility model clearer, the embodiments are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the utility model. The components of the embodiments, as generally described and illustrated in the figures, may be arranged and designed in a wide variety of configurations.
Some embodiments of the utility model are described in detail below with reference to the accompanying drawings. The embodiments below and their features can be combined with one another where no conflict arises.
Existing intelligent cleaning robots identify the environment, garbage and moving objects with poor accuracy and sensitivity, cannot operate smoothly in complex indoor environments, and deliver poor cleaning efficiency and results.
In view of this, the utility model provides an intelligent cleaning robot comprising a head-up laser radar 100, a turntable 200, a front oblique-view depth camera 300, a front oblique-view RGB camera 400 and a rear oblique-view RGB camera 500. The head-up laser radar 100 emits laser in the horizontal direction and receives the reflected laser to obtain the distance to an obstacle. The turntable 200 is arranged on top of the robot housing; the head-up laser radar 100 is mounted on the turntable 200, which can rotate about its own axis to drive the head-up laser radar 100 in rotation. The front oblique-view depth camera 300 is arranged on the front side of the robot housing in the forward direction and collects three-dimensional images; its coverage is a sector spanning 73° in a vertical plane, with the lower boundary at 60° to the horizontal. The front oblique-view RGB camera 400 is arranged on the front side of the robot housing in the forward direction, to the left or right of the front oblique-view depth camera 300, and collects color images; its coverage is a sector spanning 83° in a vertical plane, with the lower boundary at 65° to the horizontal. The rear oblique-view RGB camera 500 is arranged on the rear side of the robot housing in the forward direction and collects color images; its coverage is likewise a sector spanning 83° in a vertical plane, with the lower boundary at 65° to the horizontal.
The intelligent cleaning robot provided by the utility model monitors obstacles in all directions in real time during movement through the cooperation of the head-up laser radar 100, the turntable 200, the front oblique-view depth camera 300, the front oblique-view RGB camera 400 and the rear oblique-view RGB camera 500. The head-up laser radar 100 and the turntable 200 together monitor surrounding objects for navigation and obstacle avoidance, while the front oblique-view depth camera 300 covers the laser radar's blind zone in the forward direction. The front oblique-view RGB camera 400 and the rear oblique-view RGB camera 500 then acquire environmental information, identifying objects such as electric wires, spilled water and carpets, and provide the robot with information for determining a cleaning strategy and planning a path. Through the cooperation of these sensors, the robot's recognition capability is effectively improved, obstacles are reasonably avoided during cleaning, adaptability to complex indoor environments is enhanced, and cleaning efficiency is increased.
The structure and shape of the intelligent cleaning robot provided in the present embodiment will be described in detail below with reference to fig. 1 to 4.
In this embodiment, the head-up laser radar 100, driven by the turntable 200, emits laser and receives the laser reflected by obstacles. The distance between the robot and an obstacle is determined from the time difference between emission and reception, which in turn gives the obstacle's position relative to the robot and enables navigation and obstacle avoidance. Through the high-speed rotation of the turntable 200 in cooperation with the head-up laser radar 100, obstacles and environmental changes can be recognized rapidly, so that disturbances caused by the flow of people are detected in time and collisions are avoided.
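The time-difference distance computation described above can be sketched as follows. This is an illustrative sketch only; the function name and the example pulse timing are assumptions, not values taken from the patent.

```python
# Time-of-flight ranging as used by a scanning laser radar: the pulse
# travels to the obstacle and back, so the one-way distance is half
# the round-trip time multiplied by the speed of light.
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_range_m(emit_time_s: float, receive_time_s: float) -> float:
    """Range to an obstacle from a laser pulse's round-trip time."""
    round_trip_s = receive_time_s - emit_time_s
    return 0.5 * round_trip_s * SPEED_OF_LIGHT

# A return 20 ns after emission corresponds to roughly 3 m.
print(round(tof_range_m(0.0, 20e-9), 3))  # 2.998
```

Sweeping this measurement around the vertical axis as the turntable rotates yields the 360° range profile used for mapping and obstacle avoidance.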
Furthermore, the top of the robot housing in this embodiment is an inclined plane that slopes downward from back to front in the forward direction. The head-up laser radar 100 is therefore placed at the top of the robot housing, on the side nearer the rear, so that the inclined plane does not block its laser. The surrounding environment can thus be scanned and measured without occlusion, enabling map construction, autonomous movement, obstacle avoidance, fall prevention and related functions.
In this embodiment, the front oblique-view depth camera 300 is mounted on the front side of the robot housing in the forward direction, at the center in the height direction. Its viewing angle faces obliquely downward; its coverage is a sector spanning 73° in a vertical plane, with the upper boundary at 13° and the lower boundary at 60° to the horizontal. The front oblique-view depth camera 300 operates by emitting a set of infrared laser pulses that are reflected back into the camera after striking an object; from the time difference or phase difference between emission and reflection it computes a set of depth data, from which a three-dimensional image can be obtained and a three-dimensional model constructed. The front oblique-view depth camera 300 assists the head-up laser radar 100 in obstacle avoidance by covering the laser radar's blind zone, detecting obstacles near the robot, primarily those on the ground.
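The phase-difference variant of this time-of-flight principle can be sketched as below. This is a hypothetical illustration: the 20 MHz modulation frequency and the function names are assumptions, and a real sensor must also resolve phase wrap-around beyond its unambiguous range.

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def phase_tof_depth_m(phase_shift_rad: float, mod_freq_hz: float) -> float:
    """Depth from the phase shift of amplitude-modulated infrared light.

    The light covers 2 * depth, so depth = c * phase / (4 * pi * f).
    Only valid within the unambiguous range c / (2 * f).
    """
    return SPEED_OF_LIGHT * phase_shift_rad / (4.0 * math.pi * mod_freq_hz)

# A half-cycle shift (pi radians) at 20 MHz modulation reads as ~3.75 m.
print(round(phase_tof_depth_m(math.pi, 20e6), 2))  # 3.75
```

Repeating this per pixel across the sensor array produces the distance-depth data set from which the three-dimensional image is formed.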
In this embodiment, the front oblique-view RGB camera 400 senses the three primary color components red, green and blue on separate channels, so that a color image is composed of the three primaries. By sensing the color image, the front oblique-view RGB camera 400 allows the host computer to classify the objects in the image and instruct the robot to take the corresponding action, avoiding the obstacle or cleaning it. The front oblique-view RGB camera 400 is mounted to the right or left of the front oblique-view depth camera 300 and is mainly used to observe and inspect the surroundings while the robot is running and to detect objects on the ground such as electric wires, spilled water, carpets and floor mats, thereby helping the robot plan its path and determine a cleaning strategy for accurate, efficient cleaning. The coverage of the front oblique-view RGB camera 400 is larger than, and fully contains, that of the front oblique-view depth camera 300. Specifically, its coverage is a sector spanning 83° in a vertical plane, with the upper boundary at 18° and the lower boundary at 65° to the horizontal; both boundaries exceed those of the front oblique-view depth camera 300 by 5°, effectively assisting it in acquiring three-dimensional image information.
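The angular relationship stated above can be checked with simple arithmetic, using the boundary angles given for this embodiment (measured from the horizontal, so a sector's total span is its upper plus lower boundary angle). The dictionary layout is purely illustrative.

```python
# Vertical field-of-view boundaries, in degrees from the horizontal.
depth_cam = {"upper": 13.0, "lower": 60.0}  # front oblique-view depth camera
rgb_cam = {"upper": 18.0, "lower": 65.0}    # front oblique-view RGB camera

def span_deg(fov):
    """Total vertical FOV: upper boundary angle plus lower boundary angle."""
    return fov["upper"] + fov["lower"]

def encloses(outer, inner):
    """True if the outer FOV contains the inner FOV on both boundaries."""
    return outer["upper"] >= inner["upper"] and outer["lower"] >= inner["lower"]

print(span_deg(depth_cam))           # 73.0 -> the 73-degree sector
print(span_deg(rgb_cam))             # 83.0 -> the 83-degree sector
print(encloses(rgb_cam, depth_cam))  # True, with a 5-degree margin per boundary
```

The 5° margin on each boundary is what lets the RGB camera supply color context for every point the depth camera can range.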
In this embodiment, the rear oblique-view RGB camera 500 is located on the rear side of the robot housing in the forward direction, above the charging port. The rear oblique-view RGB camera 500 and the front oblique-view RGB camera 400 are both RGB cameras with the same working principle and similar functions; the main difference is that the rear oblique-view RGB camera 500 identifies ground objects while the robot moves backward. In this embodiment, its coverage is the same as that of the front oblique-view RGB camera 400.
In an alternative provided by this embodiment, the intelligent cleaning robot further comprises a wall-following sensor. Garbage close to a wall is difficult to clean: if the cleaning robot cannot get close enough to the wall, the cleaning result is poor, yet cleaning along the wall carries the risk of striking it. The wall-following cleaning task is accomplished by an edge-mounted sensor that senses the distance between the robot and the wall. Specifically, the wall-following sensors are mounted on both sides of the robot housing in the forward direction and comprise two devices, a PSD single-point wall-following laser 610 and a small-aperture wall-following line laser 620, installed at different positions on the same side of the robot. PSD stands for Position Sensitive Detector; the PSD laser is located on the front side of a chassis drive wheel, consists of a light-emitting end and a receiving end, and detects the distance to the wall. The small-aperture wall-following line laser 620 also assists the robot in wall-following cleaning; it is arranged at the upper middle of the robot's edge brush 1110, higher than the PSD single-point wall-following laser 610, and mainly detects the wall-following distance, covering the blind zone of the PSD single-point wall-following laser 610 and preventing collisions during wall-following cleaning caused by factors such as wall decorations.
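PSD sensors of this kind typically range by laser triangulation, though the patent does not spell the method out, so the sketch below is an assumption; the baseline, focal length and spot offset are invented for illustration.

```python
def psd_range_m(baseline_m: float, focal_len_m: float, spot_offset_m: float) -> float:
    """Triangulated distance to the wall (assumed PSD principle).

    The emitter projects a spot on the wall; the receiving lens images it
    onto the position-sensitive detector. By similar triangles,
    distance = baseline * focal_length / image_spot_offset.
    """
    return baseline_m * focal_len_m / spot_offset_m

# With a 20 mm emitter-receiver baseline and 4 mm lens, a 0.8 mm
# spot offset on the detector reads as a 0.1 m wall distance.
print(round(psd_range_m(0.020, 0.004, 0.0008), 3))  # 0.1
```

As the spot offset shrinks, the reported distance grows, which is why a single-point PSD gives a fast, precise reading at the short wall-following ranges involved here.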
In an alternative provided by this embodiment, the intelligent cleaning robot further comprises a bottom line laser 800. The bottom line laser 800 is located behind the front collision plate at the bottom of the robot, lower than the front oblique-view depth camera 300, and consists of three identical line lasers 810 mounted 30° downward, i.e. the illumination direction forms a 30° angle with the horizontal. Its main purpose is to cover the blind zone of the front oblique-view depth camera 300, detecting low obstacles on the ground close to the robot so that the robot does not strike them. Specifically, one line laser 810 is arranged in the middle and two line lasers 810 are arranged near the sides of the robot housing to enlarge the detection area.
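The 30° downward mounting angle fixes where each line laser meets the floor. The sketch below computes that strike distance for an assumed mounting height; the height is invented for illustration, since the patent specifies only the angle.

```python
import math

def strike_distance_m(mount_height_m: float, tilt_down_deg: float = 30.0) -> float:
    """Horizontal distance at which a downward-tilted line laser meets
    flat ground: d = h / tan(tilt). A smaller tilt looks farther ahead;
    a larger tilt watches the ground closer to the robot."""
    return mount_height_m / math.tan(math.radians(tilt_down_deg))

# A laser mounted 0.25 m up and tilted 30 degrees down strikes ~0.43 m ahead.
print(round(strike_distance_m(0.25), 3))  # 0.433
```

Any obstacle standing inside that strike distance intercepts the laser line early, deforming it in a way the robot can detect before contact.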
In an alternative provided by this embodiment, the intelligent cleaning robot further includes a top-view camera 700. The top-view camera 700 is located on top of the robot housing, in front of the head-up laser radar 100 and lower than it, so as not to interfere with the laser radar's operation. The top-view camera 700 is a depth camera that assists the head-up laser radar 100 in positioning the robot, improving navigation accuracy.
In an alternative provided by this embodiment, the intelligent cleaning robot further comprises a pile-alignment sensor. When its battery runs low during cleaning, or after the cleaning task is finished, the robot needs to return and automatically align itself to the charging pile for charging. The pile-alignment sensor uses an infrared emitter-receiver pair: the infrared emitting end transmits infrared light, and the receiving end receives the reflected light to judge the distance to the charging pile, assisting the rear oblique-view RGB camera 500 in completing the pile-alignment maneuver.
In an alternative provided by this embodiment, the intelligent cleaning robot further comprises safety contact edges. The safety contact edge is the last line of defense for the robot's safe operation. It is located behind the bottom collision plate; when the robot collides while moving, the collision plate presses against the safety contact edge and triggers it, and once the sensed force exceeds a set threshold the host computer commands the travel motor to stop.
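The trigger logic just described, stopping the travel motor once the sensed force passes a set threshold, can be sketched as follows. The threshold value and all names are assumptions for illustration; the patent gives no numbers.

```python
FORCE_THRESHOLD_N = 5.0  # assumed threshold; the patent does not state a value

def travel_motor_should_stop(sensed_force_n: float) -> bool:
    """Safety contact edge logic: the host computer stops the travel
    motor once the sensed compression force exceeds the threshold."""
    return sensed_force_n > FORCE_THRESHOLD_N

print(travel_motor_should_stop(2.0))  # False: light touch, keep moving
print(travel_motor_should_stop(8.0))  # True: hard press, stop the motor
```

Keeping the threshold above the light touch of the collision plate itself prevents spurious stops while still halting the drive on any real impact.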
In the alternative provided by this embodiment, the intelligent cleaning robot further includes a front-view fill light 900 and a rear-view fill light 1000 so that the vision sensors can work normally where lighting is insufficient. The front-view fill light 900 is arranged on the front side of the robot housing in the forward direction, on the side of the front oblique-view depth camera 300 facing away from the front oblique-view RGB camera 400. The front-view fill light 900 is level with or above the front oblique-view depth camera 300 and provides fill light for the front vision cameras; the rear-view fill light is located above the rear oblique-view RGB camera and provides fill light for it.
Further, a left fill light 1200 and a right fill light 1300 are arranged on the two sides of the bottom line laser 800 to assist the front-view fill light 900 and enhance illumination near the ground.
The intelligent robot provided by this embodiment arranges multiple sensors rationally so that they complement and reinforce one another, improving the robot's intelligent operation and preventing blind zones from affecting its work. The PSD single-point wall-following laser 610 and the small-aperture wall-following line laser 620 cooperate to achieve wall-following cleaning while avoiding the danger of striking the wall. The front oblique-view depth camera 300 cooperates with the front oblique-view RGB camera 400, the head-up laser radar 100, the bottom line laser 800 and so on to recognize obstacles in the forward direction and plan the trajectory. The rear oblique-view RGB camera 500 cooperates with the pile-alignment sensor to recognize obstacles at the rear and identify the charging pile.
Finally, it should be noted that the above embodiments are only intended to illustrate, not limit, the technical solution of the utility model. Although the utility model has been described in detail with reference to the foregoing embodiments, those skilled in the art will understand that the technical solutions described therein may still be modified, or some or all of their technical features equivalently replaced, without such modifications and substitutions departing from the spirit and scope of the utility model.

Claims (10)

1. An intelligent cleaning robot, characterized by comprising a head-up laser radar (100), a turntable (200), a front oblique-view depth camera (300), a front oblique-view RGB camera (400) and a rear oblique-view RGB camera (500);
the head-up laser radar (100) is used for emitting laser in the horizontal direction and receiving the reflected laser to acquire the distance to an obstacle;
the turntable (200) is arranged on top of the robot housing, the head-up laser radar (100) is mounted on the turntable (200), and the turntable (200) can rotate about its own axis to drive the head-up laser radar (100) in rotation;
the front oblique-view depth camera (300) is arranged on the front side of the robot housing in the forward direction and is used for collecting three-dimensional images; its coverage is a sector spanning 73° in a vertical plane, with the lower boundary at 60° to the horizontal;
the front oblique-view RGB camera (400) is arranged on the front side of the robot housing in the forward direction, to the left or right of the front oblique-view depth camera (300), and is used for collecting color images; its coverage is a sector spanning 83° in a vertical plane, with the lower boundary at 65° to the horizontal;
the rear oblique-view RGB camera (500) is arranged on the rear side of the robot housing in the forward direction and is used for collecting color images; its coverage is a sector spanning 83° in a vertical plane, with the lower boundary at 65° to the horizontal.
2. The intelligent cleaning robot of claim 1, further comprising a wall-following sensor disposed on a side face of the robot housing in the forward direction;
the wall-following sensor comprises a PSD single-point wall-following laser (610) and a small-aperture wall-following line laser (620);
the PSD single-point wall-following laser (610) is arranged on the side close to the chassis, comprises a transmitting end and a receiving end, and is used for detecting the distance to the wall;
the small-aperture wall-following line laser (620) is arranged above the PSD single-point wall-following laser (610) and is used for detecting the distance to the wall.
3. The intelligent cleaning robot as claimed in claim 2, further comprising a top view camera (700), wherein the top view camera (700) is mounted on top of the robot housing for capturing images above.
4. The intelligent cleaning robot as claimed in claim 3, wherein the top view camera (700) is a depth camera for acquiring a three-dimensional image of the top.
5. The intelligent cleaning robot of claim 4, further comprising a bottom line laser (800) disposed on the front side of the robot housing in the direction of travel, below the front oblique-view depth camera (300), and used to detect the distance to obstacles ahead.
6. The intelligent cleaning robot of claim 5, wherein the bottom line laser (800) is directed obliquely downward at an angle of 30° to the horizontal.
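With the bottom line laser tilted 30° below horizontal (claim 6), a range measured along the beam converts to a horizontal distance by simple trigonometry, and a return shorter than the expected floor intersection implies an obstacle in the beam path. The mounting height used here is an assumed value, not one given in the claims:

```python
import math

TILT_DEG = 30.0  # claim 6: beam angle below the horizontal

def horizontal_distance(range_m, tilt_deg=TILT_DEG):
    """Horizontal distance to the point the beam hits."""
    return range_m * math.cos(math.radians(tilt_deg))

def obstacle_ahead(range_m, mount_height_m, tilt_deg=TILT_DEG, tol_m=0.02):
    """On clear floor the beam travels mount_height / sin(tilt) before
    hitting the ground; a significantly shorter return means something
    is standing in the beam path."""
    clear_floor_range = mount_height_m / math.sin(math.radians(tilt_deg))
    return range_m < clear_floor_range - tol_m

# With an assumed 0.10 m mounting height, the clear-floor range is
# 0.10 / sin(30°) = 0.20 m; any return well under that flags an obstacle.
```

A steeper tilt shortens the clear-floor range and thus the detection horizon, so the 30° angle trades early warning against keeping the laser's floor intersection close enough to stay bright and unambiguous.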
7. The intelligent cleaning robot of claim 6, further comprising a front-view fill light (900) disposed on the front side of the robot housing in the direction of travel;
the front-view fill light (900) is located on the side of the front oblique-view depth camera (300) facing away from the front oblique-view RGB camera (400);
the front-view fill light (900) is mounted higher than the front oblique-view depth camera (300) and provides illumination in the direction of travel.
8. The intelligent cleaning robot of claim 7, further comprising a rear-view fill light (1000) disposed on the rear side of the robot housing in the direction of travel, providing illumination toward the rear.
9. The intelligent cleaning robot of claim 8, further comprising safety contact edges disposed on the front and rear sides of the robot housing; when pressed, they trigger and stop the travel motor.
10. The intelligent cleaning robot of claim 9, further comprising a pile-docking sensor;
the pile-docking sensor is disposed on the rear side of the robot housing in the direction of travel;
the pile-docking sensor emits infrared light and receives its reflection to measure the distance to the charging pile.
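The docking behavior implied by claim 10 can be sketched as a simple speed schedule: reverse toward the charging pile while the infrared-measured distance decreases, creeping in close range and stopping at contact. All thresholds, speeds, and the interface below are assumptions for illustration, not part of the claims:

```python
def docking_step(distance_m, contact_m=0.03, slow_m=0.30):
    """Return a reverse speed (m/s) for the current measured pile
    distance: decelerate inside slow_m, stop at contact_m."""
    if distance_m <= contact_m:
        return 0.0    # docked: stop and let charging begin
    if distance_m <= slow_m:
        return 0.05   # creep in close range for a gentle contact
    return 0.15       # normal reverse speed on approach

# e.g. readings of 0.50 m, 0.10 m, and 0.02 m map to 0.15, 0.05, and 0 m/s
```

In practice the safety contact edges of claim 9 would act as the final backstop: if the robot presses the pile harder than expected, the travel motor stops regardless of the measured distance.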
CN202222948138.7U 2022-11-04 2022-11-04 Intelligent cleaning robot Active CN218606375U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202222948138.7U CN218606375U (en) 2022-11-04 2022-11-04 Intelligent cleaning robot

Publications (1)

Publication Number Publication Date
CN218606375U true CN218606375U (en) 2023-03-14

Family

ID=85423363

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202222948138.7U Active CN218606375U (en) 2022-11-04 2022-11-04 Intelligent cleaning robot

Country Status (1)

Country Link
CN (1) CN218606375U (en)

Similar Documents

Publication Publication Date Title
US10394248B2 (en) Charging pile, method and device for recognizing the charging pile
US20210251450A1 (en) Automatic cleaning device and cleaning method
CN110852312B (en) Cliff detection method, mobile robot control method, and mobile robot
EP2457486B1 (en) Robot cleaner and control method thereof
CN109674402B (en) Information processing method and related equipment
CN112415998A (en) Obstacle classification and obstacle avoidance control system based on TOF camera
CN111035327A (en) Cleaning robot, carpet detection method, and computer-readable storage medium
CN109213137A (en) sweeping robot, sweeping robot system and its working method
CN105982624A (en) Automatic cleaning equipment and anti-jamming handling method and device for automatic cleaning equipment
CN103941306A (en) Cleaning robot and method for controlling same to avoid obstacle
US20220299650A1 (en) Detecting objects using a line array
CN110269547A (en) Self-movement robot and its avoidance processing method
CN205144444U (en) Floor sweeping robot
KR20180074537A (en) Cleaning robot
CN213934205U (en) Self-moving equipment
CN218606375U (en) Intelligent cleaning robot
CN212521687U (en) Autonomous mobile device
CN110088701B (en) Operating method for a self-propelled cleaning device and such a cleaning device
CN115151174A (en) Cleaning robot and cleaning control method thereof
CN209770256U (en) floor sweeping robot
CN208864213U (en) Self-movement robot
EP3949818A1 (en) Robot cleaner
US20230225580A1 (en) Robot cleaner and robot cleaner control method
CN114879690A (en) Scene parameter adjusting method and device, electronic equipment and storage medium
CN209966275U (en) Floor sweeping robot

Legal Events

Date Code Title Description
GR01 Patent grant