CN212346419U - Cleaning robot


Info

Publication number
CN212346419U
CN212346419U (application CN202020017250.7U)
Authority
CN
China
Legal status
Active
Application number
CN202020017250.7U
Other languages
Chinese (zh)
Inventor
谭跃 (Tan Yue)
Current Assignee
Shanghai Flyco Electrical Appliance Co Ltd
Original Assignee
Shenzhen Feike Robot Co ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Feike Robot Co., Ltd.
Priority to CN202020017250.7U
Application granted
Publication of CN212346419U
Status: Active

Landscapes

  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Electric Vacuum Cleaner (AREA)

Abstract

The utility model provides a cleaning robot including a main body and a camera device fixed on the main body. The distance between the nearest obstacle the camera device can detect and the main body is less than the maximum distance the cleaning robot moves per second, so the cleaning robot can avoid an obstacle in advance without colliding with it, thereby improving the obstacle detection accuracy of the cleaning robot.

Description

Cleaning robot
Technical Field
The utility model relates to the technical field of robots, and in particular to a cleaning robot.
Background
Typically, a cleaning robot collects images of obstacles in its running environment through a camera and measures the distance to those obstacles in order to construct a map, plan a path, and so on. However, the camera has a certain detection blind area and cannot acquire obstacle information inside it. If the blind area is too large, the cleaning robot cannot detect an obstacle in advance and may perform dangerous actions. There is therefore a need to reduce the detection blind area of the camera in accordance with the movement performance of the cleaning robot.
SUMMARY OF THE UTILITY MODEL
To solve this problem, the utility model provides a cleaning robot that can improve obstacle detection accuracy.
A first aspect of embodiments of the utility model provides a cleaning robot including a main body and a camera device fixed on the main body, wherein the distance between the nearest obstacle the camera device can detect and the main body is less than the maximum distance the cleaning robot moves per second.
In one implementation, the distance between the nearest obstacle the camera device can detect and the cleaning robot ranges from 25 mm to 200 mm.
In one implementation, the vertical distance between the center of the lens of the camera device and the walking surface is greater than one half of the thickness of the main body.
In one implementation, the height of the lens center of the camera device above the walking surface ranges from 32 mm to 150 mm.
In one implementation, the maximum longitudinal field angle of the camera device ranges from 50° to 70°, and the maximum lateral field angle of the camera device ranges from 110° to 130°.
In one implementation, the cleaning robot further includes a bumper and a laser radar. The bumper is disposed on at least part of the peripheral side of the main body; the laser radar is disposed on the main body and exposed from the main body in the direction the cleaning robot advances when moving. The distance between the nearest obstacle the laser radar can detect and the center of the laser radar is less than the distance between the bumper and the center of the laser radar.
In one implementation, the main body further has a housing portion, the top surface defines an opening communicating with the housing portion, the laser radar is fixed in the housing portion and exposed from the top surface, and the camera device is spaced apart from the laser radar.
In one implementation, the laser radar is located on a central axis of the cleaning robot, and the central axis is parallel to an advancing direction of the cleaning robot when the cleaning robot moves.
In an implementation, the lidar is symmetric about the central axis.
In one implementation, the cleaning robot further includes a protective cover disposed over the laser radar. A light-transmitting area is provided on the protective cover at the position corresponding to the laser emitting hole to pass the laser emitted by the laser radar; the light-transmitting area is a through hole whose aperture is less than 8.5 mm.
In the cleaning robot provided by the embodiments of the utility model, because the distance between the nearest obstacle the camera device can detect and the cleaning robot is less than the maximum distance the cleaning robot moves per second, the shooting blind area of the camera device is small: the camera device acquires information about the nearest obstacle before the obstacle falls into the blind area. The cleaning robot can then plan its movement path from this information, avoid the obstacle in advance without collision, and thereby improve its obstacle detection accuracy.
Drawings
To more clearly illustrate the technical solutions in the embodiments of the present invention or in the prior art, the drawings used in their description are briefly introduced below. Obviously, the following drawings show only some embodiments of the present invention, and those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a side view of a cleaning robot according to an embodiment of the present invention.
Fig. 2 is a partially exploded perspective view of the cleaning robot shown in fig. 1.
Fig. 3 is a sectional view of the cleaning robot shown in fig. 1 moving in a forward direction.
Fig. 4 is a block diagram of a lidar in an embodiment.
Fig. 5 is a plan view of the cleaning robot shown in fig. 1.
Fig. 6 is a front view of the cleaning robot shown in fig. 1.
Fig. 7 is a block diagram of a cleaning robot in an embodiment.
Fig. 8 is a block diagram of a circuit board in an embodiment.
Fig. 9 is a schematic diagram illustrating that an optical axis of the image capturing apparatus in one embodiment is tilted in a first direction with respect to a central axis of a main body of the cleaning robot.
Fig. 10 is a schematic diagram illustrating that an optical axis of the image capturing apparatus in an embodiment is inclined in a second direction with respect to a central axis of the main body of the cleaning robot.
Fig. 11 is a bottom view of the cleaning robot in an embodiment.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly and completely with reference to the accompanying drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only some embodiments of the present invention, not all embodiments. Based on the embodiments in the present invention, all other embodiments obtained by a person skilled in the art without creative efforts belong to the protection scope of the present invention.
Referring to fig. 1 and 2, fig. 1 is a side view of a cleaning robot according to an embodiment of the present invention, and fig. 2 is a partially exploded perspective view of the cleaning robot shown in fig. 1. The cleaning robot 100 includes a main body 10, a laser radar 30, and a camera device 70. The laser radar 30 is provided on the main body 10, and the laser emitting hole 31 of the laser radar 30 is exposed at the top surface 15 of the main body 10. The main body 10 also includes a bottom surface 17 disposed opposite the top surface 15. The camera device 70 is fixed to the forward portion 11 of the main body 10, spaced apart from the laser radar 30, and located between the top surface 15 and the bottom surface 17. In the present embodiment, the forward portion 11 is the portion of the main body 10 that faces the forward direction when the cleaning robot 100 moves.
Since the laser emitting hole 31 of the laser radar 30 is exposed at the top surface 15 of the main body 10, the cleaning robot 100 can acquire environmental information above the top surface 15 through the laser emitted by the laser radar 30. The camera device 70 is fixed on the forward portion 11, and the cleaning robot 100 acquires environmental information in the direction the forward portion 11 faces through the images captured by the camera device 70. Because the field of view of the camera device 70 is large, the environmental information it acquires includes information about the traveling direction at heights lower than the top surface 15 of the main body 10. In other words, the camera device 70 can acquire obstacle information inside the detection blind area of the laser radar 30, which improves the detection accuracy of the cleaning robot 100, reduces the possibility of collisions with obstacles during operation, and improves operating efficiency.
Referring to fig. 3, fig. 3 is a cross-sectional view of the cleaning robot shown in fig. 1 moving along a forward direction on a walking surface. The main body 10 includes the forward portion 11 and a rearward portion 13. As described above, the forward portion 11 is the portion of the main body 10 that faces the forward direction when the cleaning robot 100 moves; the rearward portion 13 is the portion opposite the forward portion 11. The main body 10 also includes an oppositely disposed top surface 15 and bottom surface 17; when the cleaning robot 100 travels on the walking surface 200, the bottom surface 17 is the surface nearer the walking surface 200. The main body 10 is further provided with a housing portion 19 for fixing the laser radar 30; the housing portion 19 is a groove-like structure, and the top surface 15 defines an opening 151 communicating with the housing portion 19. The main body 10 has a central axis γ parallel to the advancing direction of the cleaning robot 100 when it moves. It is understood that the central axis γ is not limited to the position shown in fig. 3; it may be a central axis at another position on the longitudinal section of the main body 10, for example one whose height above the bottom surface 17 is one half of the thickness of the main body 10.
The cleaning robot 100 further includes a bumper 20. The bumper 20 is installed on at least part of the circumferential side of the forward portion of the main body 10 to buffer the impact force when the cleaning robot 100 collides with an obstacle. In the present embodiment, the bumper 20 is provided on the peripheral side of the forward portion of the main body 10.
The laser radar 30 is fixed to the housing portion 19. The laser emitting hole 31 (also shown in fig. 6) of the laser radar 30 is exposed at the top surface 15 of the main body 10 and emits laser to detect environmental information above the top surface 15. In one embodiment, in the forward direction of the cleaning robot 100, the distance from the center of the laser radar 30 to the nearest obstacle the laser radar 30 can detect is smaller than the distance from the center of the laser radar 30 to the bumper 20. Thus, when the cleaning robot 100 moves forward, the blind area of the laser radar 30 is small, and an obstacle ahead can be detected before the forward bumper 20 is triggered, so the cleaning robot 100 can avoid the obstacle in time without the bumper 20 striking it. This greatly reduces the probability of triggering the bumper 20 and reduces damage to the cleaning robot.
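The constraint above can be phrased as a simple design check. A minimal sketch, where the function name and the example distances are illustrative rather than values stated in the patent:

```python
def lidar_sees_past_bumper(min_range_mm: float, bumper_dist_mm: float) -> bool:
    """True when the lidar's nearest detectable obstacle distance (measured
    from the lidar's center) is smaller than the distance from the lidar's
    center to the bumper, i.e. a frontal obstacle is reported before the
    bumper could ever reach it."""
    return min_range_mm < bumper_dist_mm


# e.g. a minimum range of 150 mm with the bumper 180 mm from the lidar center
ok = lidar_sees_past_bumper(150.0, 180.0)
```

With these hypothetical numbers the check passes; a lidar whose minimum range exceeded the bumper distance would fail it and could only report the obstacle after contact.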
Referring to fig. 4, fig. 4 is a block diagram of a laser radar 30 according to an embodiment, which includes a laser emitting part 33, a laser receiving part 35, and a lidar processor 37. In the present embodiment, the laser emitted by the laser radar 30 is substantially parallel to the central axis γ, so the laser radar 30 can detect an obstacle whose height is not lower than the top surface 15, such as the obstacle 201 shown in fig. 3. The area below the top surface 15 is the detection blind area of the laser radar 30; the laser radar 30 cannot detect an obstacle within it, such as the obstacle 203 shown in fig. 3. The laser emitting part 33 emits laser, which is reflected when it strikes an obstacle, and the reflected laser is received by the laser receiving part 35. Obstacles at different distances from the cleaning robot 100 are imaged at different positions on the laser receiving part 35; that is, there is a correspondence between the distance of an obstacle and its imaging position, so the distance of an obstacle can be obtained from its imaging position on the laser receiving part 35. The lidar processor 37 performs calculations on the image acquired by the laser receiving part 35 to obtain environmental information such as the relative distance between the obstacle and the cleaning robot 100.
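The position-to-distance correspondence described above is characteristic of triangulation rangefinders. A minimal sketch under a pinhole model, with all symbols (baseline, focal length, spot offset) illustrative assumptions, not parameters from the patent:

```python
def triangulation_distance_mm(baseline_mm: float, focal_mm: float,
                              spot_offset_mm: float) -> float:
    """Distance to the reflecting obstacle for a triangulation rangefinder:
    similar triangles give d = s * f / x, where s is the emitter-receiver
    baseline, f the receiver focal length, and x the laser spot's offset on
    the receiver from its zero-parallax position. Nearer obstacles image at
    a larger offset, which is the correspondence the text relies on."""
    return baseline_mm * focal_mm / spot_offset_mm
```

For example, with an assumed 50 mm baseline, 4 mm focal length, and a 0.4 mm spot offset, the computed obstacle distance is 500 mm; halving the offset doubles the distance, so far obstacles crowd near the zero-parallax position.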
Referring to fig. 3 again, the cleaning robot 100 further includes a protective cover 50. The protective cover 50 covers the laser radar 30 to protect it, reducing the possibility that the laser radar 30 is crushed during movement of the cleaning robot 100. The protective cover 50 is provided with a light-transmitting area 51 exposed at the top surface 15 of the main body 10. The light-transmitting area 51 passes the laser emitted from the laser emitting hole 31 and the laser reflected by an obstacle back to the laser receiving part 35 of the laser radar 30. In this embodiment, the light-transmitting area 51 is a through hole with an aperture smaller than 8.5 mm, so that an external object (e.g., a finger) cannot enter the protective cover 50 and damage the laser radar 30. It is understood that the size of the light-transmitting area 51 is not limited; the light-transmitting area 51 may also be made of a light-transmitting material, such as transparent glass, to prevent impurities such as dust from entering the protective cover 50 and damaging the laser radar 30.
The camera device 70 is fixed to the forward portion 11 to capture images and acquire environmental information in the direction the forward portion 11 of the cleaning robot 100 faces. The camera device 70 is located between the top surface 15 and the bottom surface 17 of the main body 10. The bumper 20 defines a light-transmitting area 21 at the position corresponding to the camera device 70 so that light outside the cleaning robot 100 can reach the camera device 70. Because the field of view of the camera device 70 is large, the environmental information it can acquire includes obstacle information at heights lower than the top surface 15 in the direction the forward portion 11 faces. In other words, the camera device 70 can acquire obstacle information in the detection blind area of the laser radar 30, improving the detection accuracy of the cleaning robot 100, reducing the possibility of collisions during operation, and improving operating efficiency. In the present embodiment, the camera device 70 can recognize an obstacle from the captured image. The camera device 70 has a lens center 700 (shown in fig. 6), and the optical axis of the camera device 70 passes through the lens center 700.
In the present embodiment, the height of the laser emitting hole 31 above the bottom surface 17 is greater than the height of the lens center 700 above the bottom surface 17, the laser radar 30 is located on the central axis γ of the cleaning robot 100, the optical axis of the camera device 70 is coaxial with the central axis γ, and the central axis γ passes through the laser radar 30. This simplifies data processing such as image recognition, obstacle ranging, path planning, and map construction, thereby improving the obstacle detection accuracy of the cleaning robot 100. That the laser radar 30 is located on the central axis γ may mean that the center of the laser radar 30 lies on the central axis γ, or that the center is offset from the central axis γ while the central axis γ still passes through the laser radar 30.
It is understood that the positions of the laser radar 30 and the camera device 70 are not limited. For example, the laser radar 30 may be disposed near the central axis γ of the main body 10 without the central axis γ passing through it. In some embodiments, the optical axis of the camera device 70 does not coincide with the central axis γ. In some embodiments, the lens center of the camera device 70 is located on the central axis γ. It suffices that the laser radar 30 is disposed on the main body 10 to emit laser and the camera device 70 is fixed to the main body 10 to capture images.
It is understood that there may be a plurality of camera devices 70. In some embodiments, the number of camera devices 70 is even, and they are disposed on the main body 10 symmetrically about the central axis γ; in other words, the same number of camera devices 70 lie on each side of the central axis γ, and their positions correspond one-to-one across it. In some embodiments, an even number of camera devices 70 may instead be disposed asymmetrically, adjacent to the central axis γ.
In some embodiments, the number of the cameras 70 is odd, and the lens center 700 of one camera 70 is disposed on or adjacent to the central axis γ, so as to simplify data processing of the cleaning robot 100, such as image recognition, obstacle distance measurement, path planning, and map construction, and improve the obstacle detection accuracy.
In some embodiments, the plurality of camera devices 70 may form a module; modularizing the camera devices 70 improves ranging accuracy and the consistency of installation on the cleaning robot 100.
Assuming that the vertical field angle of the camera device 70 is a (as shown in fig. 3) and the horizontal field angle is b (as shown in fig. 5), the field of view of the camera device 70 is the two-dimensional field formed by the vertical field angle a and the horizontal field angle b. A larger field angle gives the camera device 70 a wider field of view but a smaller optical magnification, and an excessively large field angle tends to distort the acquired image and degrade detection accuracy. In one embodiment, the maximum longitudinal field angle a ranges from 50° to 70° and the maximum lateral field angle b ranges from 110° to 130°, so that the camera device 70 captures high-quality images while keeping a good field of view.
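The trade-off between field angle and coverage can be made concrete with a flat-target approximation. A sketch under that assumption (the function name is illustrative):

```python
import math

def view_width_mm(distance_mm: float, fov_deg: float) -> float:
    """Linear extent of the field of view across a flat target at a given
    distance for an angular field of view fov_deg: w = 2 * d * tan(fov / 2)."""
    return 2.0 * distance_mm * math.tan(math.radians(fov_deg) / 2.0)
```

For instance, with the lateral field angle b at the upper value of 120°, the view spans roughly 693 mm across at a 200 mm distance, wide enough to cover the robot's own width well before contact; at 110° the same distance spans only about 571 mm.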
The mounting height of the camera device 70 also greatly affects its ranging accuracy. The higher the lens center 700 of the camera device 70 is above the walking surface 200 (height h, shown in fig. 6), the larger the distance between an obstacle on the walking surface 200 and the optical axis. At the same object distance and the same pixel detection accuracy, the farther the obstacle is from the optical axis, the smaller the ranging error. Therefore, provided the obstacle remains visible, raising the lens center 700 by 1 mm improves the ranging accuracy by roughly 0.5 to 1 mm. In the present embodiment, the height of the optical axis of the camera device 70 above the walking surface 200 is in the range of 32 to 150 mm; the vertical distance between the optical axis and the walking surface 200 is greater than half of the thickness of the main body 10 (i.e., half the vertical distance between the bottom surface 17 and the top surface 15). Constrained by the installation height of the whole machine, however, the height of the optical axis above the walking surface 200 does not exceed 150 mm. If the lens center 700 were too high, the blind area of the camera device 70 would increase; a larger blind area increases uncertainty after the camera device 70 recognizes an object, because the camera device 70 must estimate the position of an obstacle inside the blind area from its own motion, which increases the obstacle detection error of the cleaning robot 100.
The cleaning robot further includes a traveling unit 80, and the traveling unit 80 is movably disposed on the main body 10 and partially protrudes from the bottom surface 17.
Referring to fig. 7, fig. 7 is a block diagram of the cleaning robot. The cleaning robot 100 further includes a processor 110 and a driving unit 120, both disposed on the main body 10. The driving unit 120 drives the traveling unit 80. The laser radar 30, the camera device 70, and the driving unit 120 are all connected to the processor 110. The processor 110 constructs a map and plans a movement path according to the environment information acquired by the laser radar 30 and the camera device 70, and controls the driving unit 120 to drive the traveling unit 80 along the planned path. The driving unit 120 may be a driving device such as a motor. The environment information acquired by the laser radar 30 is defined as first environment information, and that acquired by the camera device 70 as second environment information.
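The flow above can be sketched as a toy fusion-and-planning loop. This is purely illustrative (not the patent's implementation); the grid representation, class, and method names are all assumptions:

```python
class Processor:
    """Illustrative sketch of the described control flow: obstacle detections
    from the lidar ("first environment information") and the camera ("second
    environment information") are fused into one map before planning."""

    def __init__(self) -> None:
        self.obstacles: set[tuple[int, int]] = set()

    def update(self, lidar_cells, camera_cells) -> None:
        # The camera contributes cells lying inside the lidar's blind area.
        self.obstacles |= set(lidar_cells) | set(camera_cells)

    def next_move(self, pos: tuple[int, int]) -> tuple[int, int]:
        # Toy planner: advance one grid cell unless it is occupied,
        # otherwise sidestep, avoiding the obstacle before any contact.
        ahead = (pos[0] + 1, pos[1])
        return ahead if ahead not in self.obstacles else (pos[0], pos[1] + 1)
```

A real planner would build a metric map and run path search, but the shape of the loop (sense from both sensors, fuse, plan, drive) matches the block diagram.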
In the present embodiment, the distance between the nearest obstacle the camera device 70 can detect and the main body 10 is smaller than the maximum distance the cleaning robot 100 moves per second. The nearest obstacle is the obstacle with the smallest distance from the cleaning robot 100 that the camera device 70 can detect; that is, the length of the shooting blind area of the camera device 70 in the advancing direction is smaller than the maximum movement distance of the cleaning robot 100 per second. For example, the second obstacle 203 shown in fig. 3 is the obstacle nearest the cleaning robot 100 when it moves forward, and c denotes the distance between the second obstacle 203 and the cleaning robot 100. The camera device 70 acquires second environment information including detection information about the nearest obstacle, such as its distance from the cleaning robot 100; the processor 110 plans the movement path of the cleaning robot 100 according to the first and second environment information and controls the driving unit 120 to drive the traveling unit 80 along the planned path.
Since the distance between the nearest obstacle the camera device 70 can detect and the cleaning robot 100 is limited to be smaller than the maximum distance the cleaning robot 100 moves per second, the blind area of the camera device 70 is small. The camera device 70 acquires the second environment information, including the nearest obstacle detection information, before the obstacle falls into its blind area, and the cleaning robot 100 plans its movement path according to the first and second environment information, so that it can avoid the obstacle in advance without collision.
Illustratively, the distance between the nearest obstacle the camera device 70 can detect and the cleaning robot 100 ranges from 25 to 200 mm. For example, if the maximum speed of the cleaning robot 100 during operation is 0.2 m/s, its maximum movement distance per second is 200 mm, and the nearest detectable obstacle lies within that distance.
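The numeric condition above is easy to state as a check. A sketch (function and parameter names are illustrative):

```python
def blind_zone_acceptable(nearest_detectable_mm: float,
                          max_speed_m_per_s: float) -> bool:
    """Condition from the text: the forward shooting blind zone, bounded by
    the nearest detectable obstacle distance, must be shorter than the
    distance the robot covers in one second at its maximum speed."""
    return nearest_detectable_mm < max_speed_m_per_s * 1000.0
```

At the stated 0.2 m/s maximum speed the per-second travel is 200 mm, so a nearest detectable distance of, say, 25 mm satisfies the condition, while 250 mm would leave the robot able to outrun its own blind zone.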
Further, referring to fig. 8, the circuit board 733 of the camera device 70 includes an image sensor 7331 and a camera processor 7335. The light entering the lens 731 is received by the image sensor 7331, and obstacles at different distances from the cleaning robot 100 are imaged at different positions on the image sensor 7331; that is, there is a correspondence between an obstacle's distance and its imaging position, so the distance of a detected obstacle can be obtained from its imaging position on the image sensor 7331. The camera processor 7335 performs calculations on the image acquired by the image sensor 7331 to obtain the second environment information, which includes the distance between the obstacle and the cleaning robot 100, the obstacle's name, and the like.
In some embodiments, the optical axis of the camera device 70 is inclined with respect to the horizontal plane; since the central axis γ of the cleaning robot 100 is parallel to the horizontal plane, the optical axis of the camera device 70 is thus inclined with respect to the central axis γ of the main body 10. The lens center 700 of the camera device 70 is located on the central axis γ, and the included angle between the optical axis and the central axis γ ranges from 0.5° to 10°. It is understood that the lens center 700 need not be located on the central axis γ, and the optical axis may alternatively be inclined upward with respect to the horizontal plane. Referring to fig. 9, fig. 9 is a schematic diagram of the optical axis inclined in a first direction relative to the central axis γ of the main body; the first direction (the Y direction in fig. 9) is perpendicular to the central axis γ and points away from the bottom surface 17, and the optical axis δ of the camera device 70 is inclined 5° in the first direction relative to the central axis γ of the main body 10. Compared with the optical axis being coaxial with the central axis γ, when the optical axis is inclined in the first direction, the visible angle of the camera device 70 with respect to the walking surface 200 decreases in the forward direction of the main body 10, and the shooting blind area with respect to the walking surface 200 increases.
Referring to fig. 10, fig. 10 is a schematic diagram of the optical axis of the camera device inclined in a second direction relative to the central axis of the main body of the cleaning robot. The optical axis δ of the camera device 70 is inclined, relative to the central axis γ of the main body 10, in a second direction opposite the first direction. Compared with the optical axis lying on the central axis γ, when the optical axis is inclined in the second direction, the visible angle of the camera device 70 with respect to the walking surface 200 increases in the forward direction of the main body 10, and the shooting blind area with respect to the walking surface 200 decreases. However, if the inclination angle between the optical axis δ and the central axis γ is too large, most of the image collected by the camera device 70 is the walking surface 200, which degrades the user's viewing and monitoring experience. In a specific implementation, whether the optical axis of the camera device 70 is inclined with respect to the central axis γ is decided according to the structural space constraints and the required imaging characteristics.
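The effect of tilt on the floor blind zone follows from simple geometry. A sketch assuming a view-cone model with the lens at height h above the walking surface (all names and example values are illustrative, not patent parameters):

```python
import math

def floor_blind_distance_mm(lens_height_mm: float, vertical_fov_deg: float,
                            tilt_down_deg: float = 0.0) -> float:
    """Horizontal distance from the lens to the nearest visible point on the
    walking surface. The lowest ray of the view cone points (fov/2 + tilt_down)
    below the horizontal; tilting the axis down (positive tilt, the "second
    direction") shrinks the floor blind zone, tilting it up (negative tilt,
    the "first direction") enlarges it, as described above."""
    depression = math.radians(vertical_fov_deg) / 2.0 + math.radians(tilt_down_deg)
    if depression <= 0.0:
        return math.inf  # lowest ray never meets the floor
    return lens_height_mm / math.tan(depression)
```

For example, with an assumed lens height of 100 mm and a 60° vertical field angle, the blind zone on the floor is about 173 mm with a level axis, shrinks to roughly 143 mm with a 5° downward tilt, and grows to about 214 mm with a 5° upward tilt.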
Referring to fig. 7 again, the cleaning robot 100 further includes a memory 130, a power supply unit 150, and a communication bus 170. The communication bus 170 connects the laser radar 30, the camera device 70, the memory 130, the power supply unit 150, and the processor 110.
The memory 130 may be integrated in the processor 110 or provided separately from it. In some embodiments, the lidar processor 37 in the laser radar 30 may be omitted, with the processor 110 obtaining the first environment information from the information of the laser receiving part 35; likewise, the camera processor 7335 in the camera device 70 may be omitted, with the processor 110 obtaining the second environment information from the image information of the image sensor 7331.
For ease of illustration, only one memory and processor are shown in FIG. 7. In actual practice, there may be multiple processors and memories.
The processor 110 may be implemented by a dedicated processing chip, a processing circuit, a processor, or a general-purpose chip.
It should be understood that, in the embodiments of the present application, the processor may be a central processing unit (CPU), or another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like.
The memory 130 is used to store instructions and data, including but not limited to map data and temporary data generated while controlling the operation of the cleaning robot 100, such as position data and speed data of the cleaning robot 100. The processor 110 may read the instructions stored in the memory 130 to perform the corresponding functions. The memory 130 may include a random access memory (RAM) and a non-volatile memory (NVM). The non-volatile memory may include a hard disk drive (HDD), a solid state drive (SSD), a silicon disk drive (SDD), a read-only memory (ROM), a compact disc read-only memory (CD-ROM), a magnetic tape, a floppy disk, an optical data storage device, and the like.
The power supply unit 150 includes one or more rechargeable batteries, a charging circuit connected to the rechargeable batteries, and electrodes of the rechargeable batteries, and supplies the power required for the operation of the cleaning robot 100. The electrodes may be provided on a side or at the bottom of the main body of the cleaning robot 100. The power supply unit 150 may also include a battery parameter detection component for detecting battery parameters such as voltage, current, and battery temperature. When the operation mode of the cleaning robot 100 is switched to the recharging mode, the cleaning robot 100 starts to search for the charging pile and uses the charging pile to charge itself.
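As a rough illustration of the mode switch described above, the sketch below models a battery-parameter check that moves the robot into the recharging mode. All names and the voltage threshold are assumptions for demonstration, not values disclosed in the patent.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Mode(Enum):
    CLEANING = auto()
    RECHARGING = auto()   # robot searches for the charging pile in this mode

@dataclass
class BatteryStatus:
    """Parameters reported by the battery parameter detection component."""
    voltage_v: float
    current_a: float
    temperature_c: float

def update_mode(mode: Mode, battery: BatteryStatus,
                low_voltage_v: float = 14.0) -> Mode:
    """Switch to the recharging mode when the detected voltage drops
    below a threshold; otherwise keep the current mode."""
    if mode is Mode.CLEANING and battery.voltage_v < low_voltage_v:
        return Mode.RECHARGING
    return mode

# A reading below the (assumed) 14 V threshold triggers recharging.
print(update_mode(Mode.CLEANING, BatteryStatus(13.2, 1.5, 30.0)).name)  # RECHARGING
```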
It should be noted that the connection relationship between the units or components in the cleaning robot 100 is not limited to the connection relationship shown in fig. 7. For example, the processor 110 may be communicatively coupled to other units or components via wireless communication.
Referring to fig. 11, fig. 11 is a bottom view of a cleaning robot according to an embodiment. The cleaning robot 100 further includes a cleaning unit 190. The driving unit 140 may drive the cleaning unit 190 and the traveling unit 80 under the control of the processor 110.
The traveling unit 80 includes a first traveling wheel 81 and a second traveling wheel 83 (which may also be referred to as the left wheel and the right wheel) arranged symmetrically on opposite sides of the bottom of the main body 10 of the cleaning robot 100, and a guide wheel 85. While performing a task, the traveling unit 80 carries out motion operations including moving forward, moving backward, and rotating. The guide wheel 85 may be provided at the front or the rear of the main body 10.
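The forward, backward, and rotating motions above follow from standard differential-drive kinematics of the two symmetric traveling wheels. The sketch below is a generic illustration under assumed wheel speeds and a hypothetical wheel-base value, not a disclosure of the patent.

```python
def differential_drive(v_left: float, v_right: float, wheel_base: float):
    """Body-frame motion of a two-wheel differential drive.

    v_left, v_right: linear speeds of the left and right traveling wheels (m/s)
    wheel_base:      distance between the two wheels (m)
    Returns (forward speed v, angular speed omega); omega > 0 turns left.
    """
    v = (v_left + v_right) / 2.0
    omega = (v_right - v_left) / wheel_base
    return v, omega

# Equal wheel speeds give straight motion; opposite speeds rotate in place.
print(differential_drive(0.3, 0.3, 0.25))   # (0.3, 0.0)
print(differential_drive(-0.2, 0.2, 0.25))  # (0.0, 1.6)
```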
The cleaning unit 190 includes a main brush 190 and one or more side brushes 192. The main brush 190 is installed on the main body 10 of the cleaning robot 100 and protrudes from the bottom surface 17 of the main body 10. Optionally, the main brush 190 is a drum-shaped brush that rotates against the contact surface in a roller fashion. The side brushes 192 are mounted at the left and right edge portions of the front end of the bottom surface 17 of the cleaning robot 100, that is, roughly in front of the first traveling wheel 81 and/or the second traveling wheel 83. The side brushes 192 clean areas that the main brush 190 cannot reach; they not only rotate in place but may also be installed to protrude beyond the outline of the cleaning robot 100, enlarging the area the cleaning robot 100 can sweep. It is understood that the cleaning unit 190 is not limited to brushes and may instead use a rag or other cleaning elements.
It is understood that in one or more embodiments, the cleaning robot 100 may further include an input-output unit, a wireless communication unit, a display unit, and the like.
It should be noted that the cleaning robot 100 may further include other units or components, or may omit some of the units or components described above (for example, in other embodiments the cleaning robot 100 may not include the cleaning unit 190). This embodiment is not limited in this respect; the cleaning robot 100 is described above merely as an example.
The above disclosure describes only preferred embodiments of the present utility model and is not to be construed as limiting its scope, which is defined by the appended claims.

Claims (10)

1. A cleaning robot, characterized by comprising a main body and a camera device fixed on the main body, wherein the distance from the nearest obstacle that the camera device can detect to the main body is less than the maximum distance the cleaning robot moves per second.
2. The cleaning robot according to claim 1, wherein the distance from the nearest obstacle that the camera device can detect to the cleaning robot is in a range of 25 mm to 200 mm.
3. The cleaning robot according to claim 1, wherein the vertical distance between the center of the lens of the camera device and the traveling surface is greater than one half of the thickness of the main body.
4. The cleaning robot according to claim 1, wherein the height of the lens center of the camera device above the traveling surface is in a range of 32 mm to 150 mm.
5. The cleaning robot according to claim 1, wherein a maximum longitudinal field angle of the camera device ranges from 50° to 70°, and a maximum lateral field angle of the camera device ranges from 110° to 130°.
6. The cleaning robot according to any one of claims 1 to 5, further comprising a bumper provided at least partially on a peripheral side of a forward portion of the main body, and a lidar provided on and exposed from the main body, wherein the distance from the nearest obstacle that the lidar can detect to the center of the lidar is smaller than the distance from the bumper to the center of the lidar in the forward direction of the cleaning robot when the cleaning robot moves.
7. The cleaning robot according to claim 6, wherein the main body further has a receiving portion, the top surface of the main body has an opening communicating with the receiving portion, the lidar is fixed in the receiving portion and exposed at the top surface, and the camera device is disposed at a distance from the lidar.
8. The cleaning robot of claim 6, wherein the lidar is positioned on a central axis of the cleaning robot, the central axis being parallel to a direction of travel of the cleaning robot as it moves.
9. The cleaning robot of claim 6, wherein the lidar is symmetric about a central axis of the cleaning robot.
10. The cleaning robot according to claim 6, further comprising a protective cover provided on the laser radar, wherein a light-transmitting region for passing the laser light emitted from the laser radar is provided at a position corresponding to the laser emitting hole of the laser radar on the protective cover, the light-transmitting region being a through hole having a hole diameter of less than 8.5 mm.
CN202020017250.7U 2020-01-03 2020-01-03 Cleaning robot Active CN212346419U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202020017250.7U CN212346419U (en) 2020-01-03 2020-01-03 Cleaning robot


Publications (1)

Publication Number Publication Date
CN212346419U true CN212346419U (en) 2021-01-15

Family

ID=74137525



Legal Events

Date Code Title Description
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20220810

Address after: 201600 555 Guangfulin East Road, Songjiang District, Shanghai

Patentee after: SHANGHAI FLYCO ELECTRICAL APPLIANCE Co.,Ltd.

Address before: 518109 area 401f, building D, gangzhilong Science Park, 6 Qinglong Road, Qinghua community, Longhua street, Longhua District, Shenzhen City, Guangdong Province

Patentee before: SHENZHEN FEIKE ROBOT Co.,Ltd.
