CN210673216U - Light filtering type robot - Google Patents

Light filtering type robot

Info

Publication number
CN210673216U
CN210673216U (Application CN201920797650.1U)
Authority
CN
China
Prior art keywords
receiving sensor
robot
light source
light
machine body
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201920797650.1U
Other languages
Chinese (zh)
Inventor
曹晶瑛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Stone Innovation Technology Co ltd
Original Assignee
Beijing Rockrobo Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Rockrobo Technology Co Ltd filed Critical Beijing Rockrobo Technology Co Ltd
Priority to CN201920797650.1U priority Critical patent/CN210673216U/en
Application granted granted Critical
Publication of CN210673216U publication Critical patent/CN210673216U/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The utility model relates to a light-filtering robot, comprising: a machine body (110), a position determining device (121), and a control system (130). The position determining device is located on a side of the machine body (110), is connected to the control system (130), and provides the robot's position information to the control system. The position determining device (121) comprises: at least one light source (1211) and at least one receiving sensor (1212), the at least one light source (1211) and the at least one receiving sensor being located on a side of the machine body (110); and a filtering device (1213) arranged in front of the at least one receiving sensor, which, under the control of the control system, shields any receiving point on the receiving sensor where the intensity of light reflected from the at least one light source exceeds a preset threshold. The utility model arranges a light-transmitting dot matrix screen in front of the receiving sensor and projects an occlusion onto receiving points where the light intensity is too strong, reducing modulation noise and making target detection more accurate.

Description

Light filtering type robot
Technical Field
The present utility model belongs to the technical field of mechanical equipment, and in particular relates to a light-filtering robot.
Background
Obstacle avoidance, environment mapping, and work planning are important topics in current robotics research. To complete its tasks well, a sweeping robot needs to build a map of its working environment and then plan the most reasonable working path based on its understanding of that map, thereby improving cleaning efficiency.
Sweeping robots currently popular on the market navigate by contact, based on collision sensors: the robot switches its working path after a collision. Because it cannot make intelligent judgments about obstacles, such as their size and type, this approach suffers from low cleaning coverage, and collisions with fragile objects can cause danger or damage. Existing intelligent obstacle-avoidance methods for sweeping robots are mainly based on ultrasonic ranging, laser ranging, and similar techniques.
Because ultrasonic ranging, laser ranging, and similar methods all require mechanical structures of considerable volume, the obstacle detection component of prior-art sweeping robots is fixed on top of the robot as a protruding structure. Its limited viewing angle leaves low obstacles, such as cables or door strips on the floor, unrecognized, so the sweeping robot is easily stuck during operation. In addition, because the detection component is mechanical, it is easily damaged in collisions and has low reliability.
For existing sweepers that use a time-of-flight (tof) sensor, light loses energy as it propagates, so the reflection of the area-array or dot-matrix light source emitted by the transmitting end is generally stronger from a near object than from a far one. In some special cases, therefore, a near object and a far object fall within the same measurement frame: the strong reflection from the near object scatters or overexposes the sensor, which can interfere with measuring the signal reflected from the far object. As shown in fig. 10, the exposure levels of a far obstacle and a near obstacle can differ markedly.
Through long-term research, the inventor of the present utility model has gradually developed a light-filtering sweeping robot with at least one time-of-flight sensor to solve the above technical problems.
SUMMARY OF THE UTILITY MODEL
An embodiment of the present utility model provides a robot to solve at least one of the above technical problems.
Based on an embodiment of the present utility model, a light-filtering robot is provided, comprising: a machine body 110, a position determining device 121, and a control system 130; wherein the position determining device 121 is located on a side of the machine body 110, is connected to the control system 130, and provides the control system 130 with the robot's position information;
the position determining means 121 comprises:
at least one light source 1211 and at least one receiving sensor 1212, the at least one light source 1211 and the at least one receiving sensor 1212 being located on the machine body 110;
and at least one filtering device 1213 disposed in front of the at least one receiving sensor 1212, wherein, under the control of the control system, the filtering device blocks any receiving point on the receiving sensor at which the intensity of light reflected from the at least one light source is greater than a preset threshold.
Optionally, the at least one filtering device 1213 is a dot matrix liquid crystal panel.
Optionally, the position determining device 121 includes: two light sources and one receiving sensor, which are located in front of the machine body 110.
Optionally, the position determining device 121 further includes:
a third light source and a second receiving sensor, which are located at the rear of the machine body 110.
Optionally, the light source 1211 is a surface light source or a line light source.
Optionally, the receiving sensor is a surface receiving sensor or a line receiving sensor.
Optionally, the position determining device 121 further includes: a data processing unit 1216, connected to the at least one receiving sensor 1212, for processing data received by the at least one receiving sensor 1212.
Optionally, the area receiving sensor includes an area array CCD; the line receiving sensor comprises a linear array CCD.
Optionally, the machine body 110 includes a forward portion 111 and a rearward portion 112, the two light sources and one receiving sensor being located at the forward portion 111; the third light source and the second receiving sensor are located at the rearward portion 112.
Optionally, the forward portion 111 and the rearward portion 112 are each semi-circular.
The above scheme of the embodiments of the present utility model has at least the following beneficial effects:
the utility model discloses a robot, at least one time of flight sensor along the setting of casing lateral wall detects the barrier. By more than one flight time sensor, an environment image with a larger visual angle and within a detection height can be obtained, so that the detection range of the sweeping robot to the surrounding environment is enlarged. The front of the receiving end sensor is provided with a light-transmitting dot matrix screen which is connected with the central processing unit and is provided with a reflected light intensity threshold value, when the light intensity of a certain point received by the sensor exceeds the threshold value, the central processing unit controls the light-transmitting dot matrix screen to project and shield the receiving point with stronger light intensity, thereby reducing modulation noise and enabling target detection to be more accurate.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed in the description of the embodiments are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present invention, and that those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic view of an application scenario of photoelectric detection of a robot according to an embodiment of the present invention;
fig. 2 is a top view of a robot structure provided in an embodiment of the present application;
fig. 3 is a bottom view of a robot structure provided in an embodiment of the present application;
FIG. 4 is a front view of a robot structure provided by an embodiment of the present application;
fig. 5 is a perspective view of a robot structure provided in an embodiment of the present application;
FIG. 6 is a block diagram of a robot according to an embodiment of the present disclosure;
fig. 7 is a schematic view of a robot area array photodetector provided in an embodiment of the present application;
fig. 8 is a schematic diagram of a robot linear array photodetector provided in an embodiment of the present application;
fig. 9 is a schematic diagram of a light filtering of a liquid crystal lattice robot according to an embodiment of the present disclosure;
fig. 10 is a schematic diagram of a robot detection in the prior art.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention clearer, the present invention will be described in further detail with reference to the accompanying drawings, and it is obvious that the described embodiments are only some embodiments of the present invention, not all embodiments. Based on the embodiments in the present invention, all other embodiments obtained by a person skilled in the art without creative efforts belong to the protection scope of the present invention.
The terminology used in the embodiments of the present application is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in the examples of this application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise, and "a plurality" typically includes at least two.
It should be understood that the term "and/or" as used herein merely describes an association between associated objects, meaning that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone. In addition, the character "/" herein generally indicates that the objects before and after it are in an "or" relationship.
It should be understood that although the terms first, second, third, etc. may be used in the embodiments of the present application to describe various elements, those elements should not be limited by these terms, which serve only to distinguish one element from another. For example, a first element could also be termed a second element, and similarly a second element could be termed a first element, without departing from the scope of the embodiments herein.
Depending on the context, the word "if" as used herein may be interpreted as "when", "upon", "in response to determining", or "in response to detecting". Similarly, the phrases "if determined" or "if detected (a stated condition or event)" may be interpreted as "when determined", "in response to determining", "when detected (a stated condition or event)", or "in response to detecting (a stated condition or event)", depending on the context.
It should also be noted that the terms "comprises", "comprising", and any variations thereof are intended to cover a non-exclusive inclusion, such that an article or system comprising a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such article or system. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of additional identical elements in the article or system comprising that element.
The preferred embodiments of the present invention will be described in detail below with reference to the accompanying drawings. To describe the behavior of the robot more clearly, the following directional definitions are made:
as shown in fig. 5, the robot 100 may travel over the ground through various combinations of movements relative to the following three mutually perpendicular axes defined by the body 110: a front-back axis X, a lateral axis Y, and a central vertical axis Z. The forward driving direction along the forward-rearward axis X is denoted as "forward", and the rearward driving direction along the forward-rearward axis X is denoted as "rearward". The transverse axis Y extends substantially along the axis defined by the center points of the drive wheel modules 141 between the right and left wheels of the robot.
The robot 100 may rotate about the Y axis: "pitch up" when its forward portion tilts up and its rearward portion tilts down, and "pitch down" when its forward portion tilts down and its rearward portion tilts up. In addition, the robot 100 may rotate about the Z axis: in the robot's forward direction, rotation toward the right of the X axis is a "right turn", and rotation toward the left of the X axis is a "left turn".
Referring to fig. 1, a possible application scenario provided in the embodiment of the present application includes a robot, such as a sweeping robot, a mopping robot, a vacuum cleaner, or a weeding machine. In some embodiments, the robot may in particular be a sweeping robot or a mopping robot. In operation, the robot emits a detection light source through the transmitting unit E, receives the light reflected from an obstacle through the receiving unit R, transmits the received data to the DPU for calculation, and then passes the result to the control system to control the robot's traveling direction. In other embodiments, the robot may be provided with a touch-sensitive display to receive operation instructions input by the user. The robot may also be provided with wireless communication modules, such as a WiFi module or a Bluetooth module, to connect with a smart terminal and receive, through the wireless communication module, operation instructions transmitted by the user from the smart terminal.
The structure of the relevant robot is described below, as shown in fig. 2-5:
the robot 100 includes a robot body 110, a sensing system 120, a control system, a drive system 140, a cleaning system, an energy system, and a human-machine interaction system 170. As shown in fig. 2.
The machine body 110 includes a forward portion 111 and a rearward portion 112 and has an approximately circular shape (circular at both front and rear); it may also have other shapes, including but not limited to an approximate D-shape that is flat at the front and circular at the rear.
As shown in fig. 4, the sensing system 120 includes a position determining device 121 located above the machine body 110, a bumper 122 located at the forward portion 111 of the machine body 110, a cliff sensor 123, and an ultrasonic sensor, an infrared sensor, a magnetometer, an accelerometer, a gyroscope, an odometer, and the like, which provide the control system 130 with various position and motion state information about the machine. The position determining device 121 includes, but is not limited to, a camera or a laser distance measuring device (LDS), and may be located on the top or side of the robot. The following takes a triangulation-based laser distance measuring device as an example to describe how position determination is performed. The basic principle of triangulation rests on the geometry of similar triangles and is not described here.
The laser ranging device includes a light emitting unit and a light receiving unit. The light emitting unit may include a light source that emits light, and the light source may include a light emitting element, such as an infrared or visible light emitting diode (LED) that emits infrared or visible light. Preferably, the light source is a light emitting element that emits a laser beam; in the present embodiment, a laser diode (LD) is taken as the example. Because of its monochromaticity, directionality, and collimation, a laser beam allows more accurate measurement than other light. For example, compared with a laser beam, the infrared or visible light emitted by an LED is affected by ambient factors, such as the color or texture of an object, which may reduce measurement accuracy. The laser diode (LD) may be a point laser, which measures two-dimensional position information of an obstacle, or a line laser, which measures three-dimensional position information of an obstacle within a certain range.
The light receiving unit may include an image sensor on which a light spot reflected or scattered by an obstacle is formed. The image sensor may be a set of a plurality of unit pixels of a single row or a plurality of rows. These light receiving elements can convert optical signals into electrical signals. The image sensor may be a Complementary Metal Oxide Semiconductor (CMOS) sensor or a Charge Coupled Device (CCD) sensor, and is preferably a Complementary Metal Oxide Semiconductor (CMOS) sensor due to cost advantages. Also, the light receiving unit may include a light receiving lens assembly. Light reflected or scattered by the obstruction may travel through a light receiving lens assembly to form an image on the image sensor. The light receiving lens assembly may comprise a single or multiple lenses.
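The similar-triangles principle behind this arrangement can be illustrated with a short sketch: the obstacle distance follows from the baseline between emitter and receiver, the lens focal length, and the displacement of the reflected spot on the image sensor. The function name and all numeric values below are illustrative assumptions, not the device's actual parameters.

```python
# Hedged sketch of triangulation ranging: d = f * b / x, where b is the
# emitter-receiver baseline, f the receiving lens focal length, and x the
# displacement of the reflected spot on the image sensor.
def triangulate_distance(baseline_m, focal_length_m, spot_offset_m):
    """Obstacle distance in meters from the spot offset on the sensor."""
    if spot_offset_m <= 0:
        raise ValueError("no measurable spot offset: object effectively at infinity")
    return focal_length_m * baseline_m / spot_offset_m

# 50 mm baseline, 4 mm focal length, 0.2 mm spot offset -> 1.0 m
d = triangulate_distance(0.050, 0.004, 0.0002)
```

Note how the spot offset shrinks as distance grows; once it falls below one pixel pitch, range can no longer be resolved, which is the pixel-size limit discussed further below.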
The base may support the light emitting unit and the light receiving unit, which are disposed on the base and spaced apart from each other by a certain distance. In order to measure obstacles in all 360 degrees around the robot, the base may be rotatably disposed on the main body 110, or a rotating element may be provided so that the emitted and received light rotate without rotating the base itself. The rotational angular speed of the rotating element can be obtained with an optocoupler and a coded disc: the optocoupler senses the tooth gaps on the coded disc, and the instantaneous angular speed is obtained by dividing the angular distance of one tooth gap by the time the gap takes to slide past. The denser the tooth gaps on the coded disc, the higher the accuracy and precision of the measurement, but the more delicate the structure and the heavier the computation; conversely, the sparser the tooth gaps, the lower the accuracy and precision, but the simpler the structure, the lighter the computation, and the lower the cost.
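The instantaneous angular speed computation just described (tooth-gap pitch divided by transit time) can be sketched as follows; the gap count and transit time are illustrative values, not the actual coded disc's.

```python
import math

# Sketch of the coded-disc speed measurement: the optocoupler times one
# tooth gap sliding past, and omega = gap pitch / transit time.
def instantaneous_angular_velocity(gaps_on_disc, gap_transit_s):
    """Angular velocity in rad/s from one tooth-gap transit."""
    gap_pitch_rad = 2 * math.pi / gaps_on_disc  # angular distance of one gap
    return gap_pitch_rad / gap_transit_s

# 360 gaps on the disc, one gap passes in 1 ms -> about 17.45 rad/s
omega = instantaneous_angular_velocity(360, 0.001)
```

Doubling the gap count halves the pitch, so the same transit time reports half the speed with finer angular resolution, matching the accuracy/cost trade-off described above.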
A data processing device, e.g., a DSP, connected to the light receiving unit records the obstacle distance values at every angle of the 360-degree sweep relative to the robot and transmits them to a data processing unit in the control system 130, e.g., an application processor (AP) comprising a CPU. The CPU runs a particle-filter-based positioning algorithm to obtain the robot's current position and draws a map from that position for navigation. The positioning algorithm preferably uses simultaneous localization and mapping (SLAM).
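The particle-filter idea named above can be shown with a toy, one-dimensional update: each candidate position is weighted by how well it explains a range reading, then the particle set is resampled in proportion to those weights. The Gaussian likelihood and all values are illustrative assumptions, not the robot's actual algorithm.

```python
import math
import random

# Toy 1-D particle-filter update: weight particles by sensor likelihood,
# then resample them in proportion to those weights.
def pf_update(particles, likelihood):
    """Resample the particle set according to each particle's likelihood."""
    weights = [likelihood(p) for p in particles]
    return random.choices(particles, weights=weights, k=len(particles))

random.seed(42)
# Candidate positions along a corridor; a range reading suggests x near 2.0
particles = [0.0, 1.0, 2.0, 3.0, 4.0]
resampled = pf_update(particles, lambda p: math.exp(-(p - 2.0) ** 2))
# After resampling, particles cluster near the measurement-consistent position
```

A real SLAM implementation would also propagate particles through a motion model between updates; this sketch shows only the measurement-resampling step.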
Although in principle a triangulation-based laser ranging device can measure distance values at effectively unlimited range beyond a certain minimum distance, long-distance measurement, e.g., beyond 6 meters, is difficult in practice, mainly because of the size limit of the pixel units on the light receiving unit's sensor; it is also limited by the sensor's photoelectric conversion speed, the data transmission speed between the sensor and the connected DSP, and the DSP's computation speed. Temperature also introduces variations in the measured values that the system cannot tolerate, mainly because thermal expansion of the structure between the light emitting unit and the light receiving unit changes the angle between the incident and emergent light, and the units themselves suffer temperature drift. After long-term use, deformation accumulated from temperature changes, vibration, and other factors likewise seriously affects the measurement results. The accuracy of the measurements directly determines the accuracy of the map, forms the basis of the robot's further strategies, and is therefore particularly important.
As shown in fig. 3, the forward portion 111 of the machine body 110 may carry a bumper 122. While the drive wheel module 141 propels the robot across the floor during cleaning, the bumper 122 detects one or more events in the travel path of the robot 100 via a sensor system, such as an infrared sensor. The robot can respond to events detected by the bumper 122, such as obstacles or walls, by controlling the drive wheel module 141, for example to move away from the obstacle.
The control system 130 is disposed on a circuit board in the machine body 110 and includes non-transitory memory, such as a hard disk, flash memory, or random access memory, and a communication/computation processor, such as a central processing unit and an application processor. Using a positioning algorithm such as SLAM, the application processor draws an instant map of the robot's environment from the obstacle information fed back by the laser ranging device. Combining the distance and speed information fed back by the bumper 122, cliff sensor 123, ultrasonic sensor, infrared sensor, magnetometer, accelerometer, gyroscope, odometer, and other sensing devices, it comprehensively judges the sweeper's current working state, for example crossing a threshold, climbing onto a carpet, standing at a cliff edge, being stuck above or below, having a full dust box, or being picked up, and issues a specific next action strategy for each situation, so that the robot works more in line with its owner's requirements and gives a better user experience. Further, the control system 130 can plan the most efficient and reasonable cleaning path and cleaning mode based on the map information drawn by SLAM, greatly improving the robot's cleaning efficiency.
The drive system 140 may steer the robot 100 across the ground based on drive commands having distance and angle information, e.g., x, y, and theta components. The drive system 140 includes a drive wheel module 141, which can control both the left and right wheels; to control the machine's motion more precisely, the drive wheel module 141 preferably comprises a left drive wheel module and a right drive wheel module, opposed along a transverse axis defined by the body 110. To move more stably or with greater mobility over the ground, the robot may include one or more driven wheels 142, including but not limited to universal wheels. Each drive wheel module includes a traveling wheel, a drive motor, and a control circuit for the drive motor, and may also connect to a circuit for measuring drive current and to an odometer. The drive wheel module 141 may be detachably coupled to the main body 110 for easy disassembly and maintenance. The drive wheel may have a biased drop-type suspension system, movably secured, e.g., rotatably attached, to the robot body 110 and receiving a spring bias directed downward and away from the robot body 110. The spring bias allows the drive wheels to maintain contact and traction with the floor with a certain landing force, while the cleaning elements of the robot 100 also contact the floor with a certain pressure.
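A drive command with a theta component is typically turned into separate left and right wheel speeds by standard differential-drive kinematics. The sketch below is a hedged illustration of that conversion; the track width and velocity values are made-up, not the robot's real parameters.

```python
# Differential-drive kinematics: convert a body command (forward speed v
# in m/s, turn rate omega in rad/s) into left/right wheel speeds.
def wheel_speeds(v, omega, track_width):
    """Return (v_left, v_right) wheel speeds in m/s."""
    v_left = v - omega * track_width / 2.0
    v_right = v + omega * track_width / 2.0
    return v_left, v_right

straight = wheel_speeds(0.3, 0.0, 0.23)  # both wheels at 0.3 m/s
spin = wheel_speeds(0.0, 1.0, 0.23)      # wheels counter-rotate: turn in place
```

Equal wheel speeds drive the robot straight; opposite-signed speeds spin it about the central vertical axis Z, matching the "right turn"/"left turn" definitions given earlier.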
The cleaning system may be a dry cleaning system and/or a wet cleaning system. In a dry cleaning system, the main cleaning function comes from the sweeping system 151, formed by the roller brush, the dust box, the fan, the air outlet, and the connecting parts between them. The roller brush, which has a certain interference with the ground, sweeps the garbage on the floor and carries it to the front of the dust suction opening between the roller brush and the dust box, from where it is drawn into the dust box by the suction airflow generated by the fan and passing through the dust box. The dust removal capability of a sweeper can be characterized by its sweeping efficiency, which is affected by the structure and material of the roller brush, by the wind utilization rate of the air duct formed by the dust suction opening, the dust box, the fan, the air outlet, and their connecting parts, and by the type and power of the fan; it is a complex system design problem. Compared with an ordinary plug-in vacuum cleaner, improving dust removal capability matters more for a cleaning robot with limited energy, because it directly and effectively reduces the energy requirement: a machine that can clean 80 square meters of floor on one charge can be developed into one that cleans 100 square meters or more. A battery that needs fewer charging cycles also lasts much longer, so the user has to replace it less often.
More intuitively and importantly, improved dust removal capability is the most visible and important part of the user experience: the user directly judges whether the floor is swept/mopped clean. The dry cleaning system may also include an edge brush 152 having an axis of rotation that is angled relative to the floor, for moving debris into the roller brush area of the cleaning system.
The energy system includes a rechargeable battery, such as a nickel metal hydride battery or a lithium battery. The rechargeable battery can be connected to a charging control circuit, a battery pack charging temperature detection circuit, and a battery under-voltage monitoring circuit, which are in turn connected to the single chip microcomputer control circuit. The host charges by connecting to the charging pile through charging electrodes arranged on the side or underside of the machine body. If dust adheres to an exposed charging electrode, charge accumulation during charging may melt and deform the plastic body around the electrode, or even deform the electrode itself, so that normal charging can no longer continue.
The human-computer interaction system 170 includes keys on the host panel for the user to select functions; it may further include a display screen and/or an indicator light and/or a speaker, which show the user the machine's current state or function selections; and it may further include a mobile phone client program. For path-navigation cleaning equipment, the mobile phone client can show the user a map of the environment where the equipment is located and the position of the machine, and can provide richer and more user-friendly function items.
Fig. 6 is a block diagram of a sweeping robot according to the present invention.
The sweeping robot according to the current embodiment may include: a microphone array unit for recognizing a user's voice, a communication unit for communicating with a remote control device or other devices, a moving unit for driving the main body, a cleaning unit, and a memory unit for storing information. An input unit (a key of the sweeping robot, etc.), an object detection sensor, a charging unit, a microphone array unit, a direction detection unit, a position detection unit, a communication unit, a driving unit, and a memory unit may be connected to the control unit to transmit or receive predetermined information to or from the control unit.
The microphone array unit may compare the voice input through the receiving unit with the information stored in the memory unit to determine whether the input voice corresponds to a specific command. If so, the corresponding command is transmitted to the control unit. If the detected voice matches nothing stored in the memory unit, it may be treated as noise and ignored.
For example, suppose the detected voice corresponds to the word "come", and a control command (come) corresponding to that word is stored in the information of the memory unit. In this case, the corresponding command may be transmitted to the control unit.
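The match-or-ignore behavior described above can be sketched as a lookup against stored command information. The command table and command names below are hypothetical, chosen only to illustrate the flow.

```python
# Hypothetical stored command information: recognized word -> control command
VOICE_COMMANDS = {"come": "MOVE_TOWARD_SPEAKER", "stop": "HALT"}

def match_voice(detected_word):
    """Return the stored control command for a recognized word,
    or None so the input is treated as noise and ignored."""
    return VOICE_COMMANDS.get(detected_word.strip().lower())

cmd = match_voice("Come")   # matches the stored (come) command
noise = match_voice("hum")  # no stored entry -> treated as noise
```

A `None` result corresponds to the "treat as noise" branch: nothing is transmitted to the control unit.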
The direction detecting unit may detect the direction of the voice by using the time difference or level of the voice arriving at the plurality of receiving units, and transmits the detected direction to the control unit. The control unit may determine the moving path using the voice direction detected by the direction detecting unit.
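For the time-difference case, a standard far-field, two-microphone estimate recovers the arrival angle from the inter-microphone delay. This is a generic sketch of that technique, not the patent's implementation; the microphone spacing and delay values are illustrative assumptions.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degC

# Far-field direction of arrival from the delay between two microphones:
# theta = asin(c * dt / d), measured from broadside, in degrees.
def direction_of_arrival_deg(mic_spacing_m, time_delay_s):
    """Estimate the voice direction from the inter-microphone time delay."""
    ratio = SPEED_OF_SOUND * time_delay_s / mic_spacing_m
    ratio = max(-1.0, min(1.0, ratio))  # clamp against measurement noise
    return math.degrees(math.asin(ratio))

# 10 cm spacing and ~145.8 us inter-mic delay -> roughly 30 degrees
theta = direction_of_arrival_deg(0.10, 145.8e-6)
```

Zero delay means the speaker is straight ahead of the pair (broadside); larger arrays refine this by combining pairwise delays.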
The position detection unit may detect coordinates of the subject within predetermined map information. In one embodiment, the information detected by the camera and the map information stored in the memory unit may be compared with each other to detect the current position of the subject. The position detection unit may use a Global Positioning System (GPS) in addition to the camera.
In a broad sense, the position detection unit may detect whether the main body is disposed at a specific position. For example, the position detection unit may include a unit for detecting whether the main body is disposed on the charging pile.
For example, whether the main body is disposed at the charging position may be detected according to whether power is being input into the charging unit. As another example, whether the main body is disposed at the charging position may be detected by a charging-position detecting unit disposed on the main body or on the charging pile.
The communication unit may transmit/receive predetermined information to/from a remote control device or other devices. The communication unit may update map information of the sweeping robot.
The driving unit may operate the moving unit and the cleaning unit. The driving unit may move the moving unit along the moving path determined by the control unit.
The memory unit stores therein predetermined information related to the operation of the sweeping robot. For example, map information of an area where the sweeping robot is arranged, control command information corresponding to a voice recognized by the microphone array unit, direction angle information detected by the direction detection unit, position information detected by the position detection unit, and obstacle information detected by the object detection sensor may be stored in the memory unit.
The control unit may receive information detected by the receiving unit, the camera, and the object detection sensor. The control unit may recognize a voice of the user, detect a direction in which the voice occurs, and detect a position of the sweeping robot based on the transmitted information. Further, the control unit may also operate the moving unit and the cleaning unit.
Specifically, an embodiment of the utility model provides a robot, including: a machine body 110, a position determining device 121, and a control system 130; the position determining device 121 is located at a side of the machine body 110 and provides the control system 130 with position information of the robot;
the position determining means 121 comprises: at least one light source 1211 and at least one receiving sensor 1212, the at least one light source 1211 and the at least one receiving sensor 1212 being located at a side of the machine body 110; at least one filtering device 1213 disposed in front of the at least one receiving sensor 1212, the filtering device 1213 being capable of passing signals greater than a preset threshold.
Optionally, the position determining device 121 includes: two light sources and one receiving sensor, which are located at the front side of the machine body 110, and a third light source and a second receiving sensor, which are located at the rear side of the machine body 110.
The position of the obstacle is determined by a TOF method implemented by a light source and a receiver.
TOF is short for Time of Flight. Time-of-flight 3D imaging acquires the distance to an object by continuously sending light pulses toward it, receiving the light returned from the object with a sensor, and measuring the round-trip time of flight of the pulses. As shown in fig. 1.
The TOF ranging method belongs to the two-way ranging techniques: it measures the distance between nodes mainly by the time a signal takes to travel back and forth between two asynchronous transceivers (or reflecting surfaces). Conventional ranging techniques are classified into two-way ranging techniques and one-way ranging techniques. Under good signal-level modulation or in non-line-of-sight environments, the ranging method based on the Received Signal Strength Indication (RSSI) gives more reliable estimates; in line-of-sight environments, the TOF-based distance estimation method can compensate for the shortcomings of the RSSI-based method.
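The round-trip relationship underlying TOF ranging reduces to a one-line formula. A minimal sketch (function name is illustrative; real TOF sensors measure the round-trip time in hardware):

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_distance(round_trip_time_s):
    """Distance to the target from the round-trip time of a light pulse:
    d = c * t / 2 (the pulse travels to the object and back)."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

# A pulse returning after 20 ns corresponds to a target roughly 3 m away.
assert abs(tof_distance(20e-9) - 2.99792458) < 1e-9
```

In practice the nanosecond-scale timing is done by the sensor itself (directly, or via phase shift of a modulated signal); the formula above is only the geometric conversion.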
Alternatively, the at least one light source 1211 may be a surface light source or a line light source. The at least one receiving sensor 1212 may be a surface receiving sensor or a line receiving sensor. The positions and the number of the at least one light source 1211 and the at least one receiving sensor 1212 may be selected according to actual conditions.
For an embodiment of the utility model, the following configurations may be selected:
One surface light source, one line light source, and one surface receiving sensor. That is, an area-array receiver is used with two light sources, a surface light source and a line light source, which measure alternately: the line light source can be used for positioning, two-dimensional map building, and navigation, and the surface light source can be used for obstacle avoidance and three-dimensional map building.
Or, one line light source and one line receiving sensor. That is, a line receiver and a line light source, which can be used for accurate positioning, two-dimensional map building, and navigation. Optionally, the surface receiving sensor includes an area-array CCD, and the line receiving sensor includes a linear-array CCD.
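The alternating measurement of the two light sources in the first configuration can be sketched as a simple frame schedule. This is a hypothetical illustration: the names "line" and "area" and the strict one-to-one alternation are assumptions, not specified by the utility model.

```python
from itertools import cycle

def frame_schedule(n_frames):
    """Assign a light source to each measurement frame, alternating between
    the line source (positioning / 2-D mapping / navigation) and the
    area source (obstacle avoidance / 3-D mapping)."""
    sources = cycle(["line", "area"])
    return [next(sources) for _ in range(n_frames)]

assert frame_schedule(4) == ["line", "area", "line", "area"]
```

Because the two sources fire in different frames, the single area-array receiver can serve both measurement modes without the returns interfering.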
As shown in fig. 7, area-array CCDs come in three structures. The first is the frame-transfer CCD, which consists of an upper part, a photosensitive area where the pixels are concentrated, and a lower part, a light-shielded storage area where the vertical registers are concentrated. Its advantages are a simple structure, easy increase of the pixel count, and suitability for large CCDs. The second is the interline-transfer CCD, in which the pixel array and the vertical registers lie in the same plane. The third is the frame-interline-transfer CCD, a composite of the first two; its structure is more complex, but vertical smear is greatly reduced.
As shown in fig. 8, a linear-array CCD scans across the picture with a single row of pixels, making three exposures corresponding to red, green, and blue filters; as the name indicates, a linear sensor captures a one-dimensional image.
Optionally, the position determining device 121 further includes: a data processing unit 1216, connected to the receiving sensor, for processing the data received by the receiving sensor.
Optionally, the machine body 110 includes a forward portion 111 and a rearward portion 112, the two light sources and one receiving sensor being located at the forward portion 111; the third light source and the second receiving sensor are located at the rearward portion 112. The forward and rearward portions 111, 112 are each semi-circular.
As shown in fig. 9, a light-transmitting dot-matrix screen is arranged in front of the receiving sensor and connected to the control system 130, and a reflected-light-intensity threshold is set by the control system 130. When the light intensity received at a certain point of the sensor exceeds the threshold, the control system 130 controls the dot-matrix screen to project a shield over that receiving point. This reduces the light intensity in the corresponding local area of the screen and lowers the probability of overexposure; because the reduction is local, the other reflecting points are hardly affected, so their measurement accuracy is preserved and target detection becomes more accurate.
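The per-point shielding decision described above can be sketched as a threshold mask over the received intensity frame. This is a minimal illustration, assuming the frame is a 2-D grid of intensities and each screen cell can be darkened individually; the names and the fixed threshold value are illustrative.

```python
THRESHOLD = 200  # preset reflected-intensity threshold (assumed units)

def screen_mask(frame):
    """Return True for each cell whose received intensity exceeds the
    threshold; those cells are the receiving points the dot-matrix
    screen shields by darkening the area in front of them."""
    return [[intensity > THRESHOLD for intensity in row] for row in frame]

# Only the over-threshold point (250) is shielded; the rest pass unchanged.
frame = [[50, 250], [180, 90]]
assert screen_mask(frame) == [[False, True], [False, False]]
```

Because the mask is computed per point, attenuating one strong return does not reduce the signal arriving at the neighboring points, which is the accuracy benefit the embodiment claims.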
As one implementation, the light-transmitting dot-matrix screen may use, for example, a liquid crystal light valve for local control of the light; the specific control method is not the focus of this utility model and is not repeated here.
The threshold may be obtained experimentally: the reflected light intensity of obstacles within the normal detection range is measured, and a value just above that range is taken as the threshold. Interfering obstacles at very short distances reflect light with intensity well above that of normal obstacles, so such a threshold provides an effective filtering effect.
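The experimental threshold selection described above can be sketched as a simple statistic over calibration samples. This is a hedged illustration: the mean-plus-margin rule and all names are assumptions, not the patent's method; the patent only requires a value near the upper end of the normal range.

```python
import statistics

def calibrate_threshold(normal_intensities, margin=3.0):
    """Place the threshold a few standard deviations above the mean
    reflected intensity measured from obstacles at normal detection
    range, so only the much stronger returns from very close
    interfering surfaces get shielded."""
    mean = statistics.mean(normal_intensities)
    stdev = statistics.stdev(normal_intensities)
    return mean + margin * stdev

# The calibrated threshold sits above every normal-range sample.
samples = [100, 110, 105, 95, 90]
assert calibrate_threshold(samples) > max(samples)
```

Any rule that separates the normal-range band from the much stronger close-range reflections would serve; the statistic above is just one conventional choice.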
The utility model discloses a robot in which at least one time-of-flight sensor arranged along the side wall of the housing detects obstacles. With more than one time-of-flight sensor, an image of the environment with a larger viewing angle within the detection height can be obtained, enlarging the sweeping robot's detection range of its surroundings. A light-transmitting dot-matrix screen is arranged in front of the receiving sensor and connected to the central processing unit, with a reflected-light-intensity threshold set; when the light intensity received at a certain point of the sensor exceeds the threshold, the central processing unit controls the dot-matrix screen to project a shield over that receiving point, thereby reducing modulation noise and making target detection more accurate.
The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solution of the present invention, and not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it should be understood by those skilled in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not depart from the spirit and scope of the present invention in its corresponding aspects.

Claims (10)

1. A filter robot, comprising: a machine body (110), a position determination device (121), and a control system (130); wherein the position determining device (121) is positioned on the side surface of the machine body (110), is connected with the control system (130), and provides the control system (130) with the position information of the robot;
the position determining device (121) comprises:
at least one light source (1211) and at least one receiving sensor (1212), the at least one light source (1211) and at least one receiving sensor (1212) being located on the machine body (110);
and the filtering device (1213) is arranged in front of the at least one receiving sensor (1212), and the filtering device (1213) is used for shielding a receiving point on the receiving sensor (1212), which receives the reflected light of the at least one light source and has the intensity greater than a preset threshold value, through the control of the control system (130).
2. A robot according to claim 1, characterized in that said at least one filter means (1213) is a dot matrix liquid crystal screen.
3. The robot according to claim 2, characterized in that said position determining means (121) comprise: two light sources and one receiving sensor, which are located in front of the machine body (110).
4. A robot as claimed in claim 3, characterized in that the position determining means (121) further comprise:
a third light source and a second receiving sensor, which are located behind the machine body (110).
5. The robot according to claim 4, wherein the light source (1211) is a surface light source or a line light source.
6. A robot as set forth in claim 5, wherein the receiving sensor is a surface receiving sensor or a line receiving sensor.
7. The robot according to claim 6, characterized in that said position determining means (121) further comprises: a data processing unit (1216) connected with the at least one receiving sensor (1212), for processing data received by the at least one receiving sensor (1212).
8. The robot of claim 7, wherein said area receiving sensor comprises an area array CCD; the line receiving sensor comprises a linear array CCD.
9. The robot according to claim 4, characterized in that said machine body (110) comprises a forward portion (111) and a backward portion (112), said two light sources and one receiving sensor being located at said forward portion (111); the third light source and second receiving sensor are located in the rearward portion (112).
10. Robot according to claim 9, characterized in that the forward part (111) and the backward part (112) are each semicircular.
CN201920797650.1U 2019-05-30 2019-05-30 Light filtering type robot Active CN210673216U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201920797650.1U CN210673216U (en) 2019-05-30 2019-05-30 Light filtering type robot


Publications (1)

Publication Number Publication Date
CN210673216U true CN210673216U (en) 2020-06-05

Family

ID=70881142

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201920797650.1U Active CN210673216U (en) 2019-05-30 2019-05-30 Light filtering type robot

Country Status (1)

Country Link
CN (1) CN210673216U (en)

Similar Documents

Publication Publication Date Title
US20210251450A1 (en) Automatic cleaning device and cleaning method
EP3951544A1 (en) Robot working area map constructing method and apparatus, robot, and medium
EP4014827B1 (en) Cleaning robot and control method therefor
CN109932726B (en) Robot ranging calibration method and device, robot and medium
EP3998007A1 (en) Automatic cleaning device control method and apparatus, device and medium
EP4026473A1 (en) Cleaning robot and control method therefor
CN111857153B (en) Distance detection device and robot sweeps floor
CN114010102B (en) Cleaning robot
WO2022048153A1 (en) Positioning method and apparatus for robot, and storage medium
CN210673215U (en) Multi-light-source detection robot
CN210673216U (en) Light filtering type robot
CN210931183U (en) Cleaning robot
CN211270533U (en) Camera device and cleaning robot
EP4062816A1 (en) Camera device and cleaning robot
AU2022204218B2 (en) Camera Apparatus and Cleaning Robot
CN112244705B (en) Intelligent cleaning device, control method and computer storage medium
CN117970356A (en) Distance detection device and self-walking equipment
CN116977858A (en) Ground identification method, device, robot and storage medium
CN117406296A (en) Cliff detection device and method and self-moving equipment

Legal Events

Date Code Title Description
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20220428

Address after: 102200 No. 8008, floor 8, building 16, yard 37, Chaoqian Road, Changping Park, Zhongguancun Science and Technology Park, Changping District, Beijing

Patentee after: Beijing Stone Innovation Technology Co.,Ltd.

Address before: No. 6016, 6017 and 6018, Block C, No. 8 Heiquan Road, Haidian District, Beijing 100085

Patentee before: Beijing Roborock Technology Co.,Ltd.