CN112244705B - Intelligent cleaning device, control method and computer storage medium - Google Patents

Intelligent cleaning device, control method and computer storage medium

Info

Publication number
CN112244705B
CN112244705B (Application CN202010948471.0A)
Authority
CN
China
Prior art keywords
region
triggered
interest
moving object
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010948471.0A
Other languages
Chinese (zh)
Other versions
CN112244705A (en)
Inventor
于炀
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Stone Innovation Technology Co ltd
Original Assignee
Beijing Stone Innovation Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Stone Innovation Technology Co ltd filed Critical Beijing Stone Innovation Technology Co ltd
Priority to CN202010948471.0A
Publication of CN112244705A
Application granted
Publication of CN112244705B

Links

Images

Classifications

    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00 Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/40 Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4011 Regulation of the cleaning machine by electric means; Control systems and remote control systems therefor
    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00 Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/40 Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4061 Steering means; Means for avoiding obstacles; Details related to the place where the driver is accommodated
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 Systems determining position data of a target
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/93 Lidar systems specially adapted for specific applications for anti-collision purposes
    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L2201/00 Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L2201/00 Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • A47L2201/04 Automatic control of the travelling movement; Automatic obstacle detection

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses a device control method for an intelligent cleaning device, the intelligent cleaning device itself, and a computer storage medium. The method comprises the following steps: acquiring, in real time, video of a spatial scene shot by a camera on the intelligent cleaning device, and decomposing the video containing a moving target to be triggered, within a preset time interval, into multiple frames of video images; dividing a first region of interest ROI1 and a second region of interest ROI2 in the video images; and, within the preset time interval, sending a first control signal if the motion index of the moving target to be triggered falling within the first region of interest ROI1 across the multiple frames reaches a first control threshold, and sending a second control signal if the motion index of the moving target falling within the second region of interest ROI2 reaches a second control threshold. By using the camera to recognize the user's limb movements, the invention enables control of device start, stop, and the like.

Description

Intelligent cleaning device, control method and computer storage medium
Technical Field
The invention relates to the technical field of intelligent cleaning devices, and in particular to a control method for an intelligent cleaning device, the intelligent cleaning device, and a computer storage medium.
Background
At present, intelligent cleaning devices such as sweeping robots have taken over part of household labor, but there is still room for improvement in reducing the household burden. The physical buttons of existing sweeping robots are mounted on the robot body, close to the ground, which makes operations such as starting and stopping the robot particularly inconvenient for elderly users who are not adept with smartphones and who have difficulty bending down.
Accordingly, there is a need to provide a control method for an intelligent cleaning device, and an intelligent cleaning device, that at least partially solve the above problems.
Disclosure of Invention
This summary introduces a series of concepts in simplified form that are described in further detail in the detailed description. It is not intended to identify the key or essential features of the claimed subject matter, nor to be used as an aid in determining the scope of the claimed subject matter.
In one aspect, the present invention provides a device control method for an intelligent cleaning device, the control method comprising: acquiring, in real time, video of a spatial scene shot by a camera on the intelligent cleaning device, and decomposing the video containing a moving target to be triggered, within a preset time interval, into multiple frames of video images; dividing a first region of interest ROI1 and a second region of interest ROI2 in the video images; and, within the preset time interval, sending a first control signal if the motion index of the moving target to be triggered falling within the first region of interest ROI1 across the multiple frames reaches a first control threshold, and sending a second control signal if the motion index of the moving target falling within the second region of interest ROI2 reaches a second control threshold.
According to the control method provided by the invention, operations such as starting and stopping the device can be triggered by using the camera to recognize the user's limb movements; the user does not need to reach the buttons on the device body, which greatly improves convenience of use.
Preferably, the first region of interest ROI1 and the second region of interest ROI2 are planar regions perpendicular to the axis of the camera.
In this way, the first and second regions of interest can be divided simply and accurately, providing a basis for the subsequent motion decomposition of the moving target to be triggered.
Preferably, the first region of interest ROI1 is a rectangle with long sides in the x-axis direction in a plane region perpendicular to the axis of the camera; the second region of interest ROI2 is a rectangle with long sides in the y-axis direction in a plane region perpendicular to the axis of the camera.
Thereby, the first region of interest ROI1 and the second region of interest ROI2 can be accurately divided.
Preferably, the first region of interest ROI1 is: the mapping range formed on the imaging plane of the camera when the moving target to be triggered, at a height exceeding a first preset distance above the camera, moves horizontally at uniform speed through a second preset distance referenced to the camera axis.
In this way, the first region of interest ROI1 is certain to contain the moving target to be triggered, which helps the intelligent cleaning device execute the corresponding action according to that target's movement.
Preferably, the second region of interest ROI2 is: the mapping range formed on the imaging plane of the camera when the moving target to be triggered, at a height exceeding the first preset distance above the camera, moves horizontally at uniform speed through a third preset distance, different from the second preset distance, referenced to the camera axis.
In this way, the second region of interest ROI2 is certain to contain the moving target to be triggered, which helps the intelligent cleaning device execute the corresponding action according to that target's movement.
Preferably, the motion index includes any one or a combination of the coverage, the action amplitude, and the motion-center size of the movement of the moving target to be triggered.
In this way, setting the motion index improves the accuracy with which the action to be performed is triggered.
Preferably, a feature extraction algorithm is used to filter out the overhead background information and the stationary targets in each frame of video image so as to extract the moving target to be triggered.
In this way, a clean moving target to be triggered can be obtained.
Preferably, if the camera exhibits distortion, a distortion model (k1, k2, p1, p2, k3) is applied when the moving target to be triggered is extracted, and distortion correction is completed according to a pre-calibration result, where the distortion model is:
xc = x(1 + k1·r² + k2·r⁴ + k3·r⁶) + [2·p1·x·y + p2·(r² + 2x²)]
yc = y(1 + k1·r² + k2·r⁴ + k3·r⁶) + [p1·(r² + 2y²) + 2·p2·x·y]
wherein (x, y) are the coordinates of the distorted moving target to be triggered on the imaging plane of the camera, (xc, yc) are the coordinates of the corrected moving target on the imaging plane of the camera, and r² = x² + y².
Distortion correction thus avoids the inaccuracy in extracting the moving target to be triggered that camera distortion would otherwise cause.
In another aspect, the present invention provides an intelligent cleaning device comprising a memory, a processor, and a computer program stored on the memory and runnable on the processor, the processor implementing the steps of the method described in any of the embodiments above when executing the program.
According to the intelligent cleaning device of the invention, operations such as starting and stopping can be triggered by using the camera to recognize the user's limb movements; the user does not need to reach the buttons on the device body, which greatly improves convenience of use.
In a further aspect, the invention provides a computer storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the method described in any of the embodiments above.
Drawings
The following drawings are included to provide an understanding of the invention and are incorporated in and constitute a part of this specification. The drawings illustrate embodiments of the invention and, together with their description, serve to explain the principles of the invention.
In the accompanying drawings:
FIG. 1 is a schematic illustration of an intelligent cleaning device in an operational scenario according to a preferred embodiment of the present invention;
FIG. 2 shows a first region of interest ROI1 and a second region of interest ROI2;
FIG. 3 shows the motion trajectory of a moving target to be triggered within the first region of interest ROI1 and the second region of interest ROI2 over a period of operation of the intelligent cleaning device, in one embodiment;
FIG. 4 shows the detection result when the motion index of the moving target to be triggered in the first region of interest ROI1 reaches the first control threshold;
FIG. 5 shows the motion trajectory of a moving target to be triggered within the first region of interest ROI1 and the second region of interest ROI2 over a period of operation of the intelligent cleaning device, in another embodiment;
FIG. 6 shows the detection result when the motion index of the moving target to be triggered in the second region of interest ROI2 reaches the second control threshold;
FIG. 7 shows the motion trajectory of a moving target to be triggered within the first region of interest ROI1 and the second region of interest ROI2 over a period of operation of the intelligent cleaning device, in yet another embodiment; and
FIG. 8 shows the detection result when the motion index of the moving target to be triggered reaches neither control threshold in the first region of interest ROI1 nor in the second region of interest ROI2.
Detailed Description
In the following description, numerous specific details are set forth in order to provide a more thorough understanding of the present invention. It will be apparent, however, to one skilled in the art that embodiments of the invention may be practiced without one or more of these details. In other instances, well-known features have not been described in detail in order to avoid obscuring the embodiments of the invention.
It is noted that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of exemplary embodiments according to the present invention. As used herein, the singular is intended to include the plural unless the context clearly indicates otherwise. Furthermore, it will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
Exemplary embodiments according to the present invention will now be described in more detail with reference to the accompanying drawings. These exemplary embodiments may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. It should be appreciated that these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of these exemplary embodiments to those skilled in the art. In the drawings, thicknesses of layers and regions are exaggerated for clarity, and the same reference numerals are used to denote the same elements, so that descriptions thereof will be omitted.
The intelligent cleaning device provided by the present disclosure may be, but is not limited to, a floor sweeping robot, a floor mopping robot, or an integrated sweeping and mopping robot, and may include a machine body, a sensing system, a control system, a driving system, a cleaning system, an energy system, and a human-machine interaction system. The machine body includes a forward portion and a rearward portion and has an approximately circular shape (circular at both front and rear), though it may have other shapes, including but not limited to an approximate D shape that is flat at the front and rounded at the rear.
The sensing system comprises a position determining device located above the machine body, together with a bumper at the forward portion of the machine body, cliff sensors, ultrasonic sensors, infrared sensors, a magnetometer, an accelerometer, a gyroscope, an odometer, and other sensing devices, and provides the control system with the machine's position and motion-state information. Position determining devices include, but are not limited to, cameras and laser distance sensors (LDS). The following takes a laser ranging device based on triangulation as an example of position determination. The basic principle of triangulation is the proportional relationship between similar triangles and is not elaborated here.
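By way of illustration only (this sketch is not part of the patent text, and the parameter values are hypothetical), the similar-triangles relationship for a triangulation rangefinder with emitter-to-lens baseline s, lens focal length f, and laser-spot offset x on the image sensor gives the object distance q = f·s/x:

```python
def triangulation_distance(baseline_m: float, focal_px: float, offset_px: float) -> float:
    """Distance from similar triangles: q = f * s / x.

    baseline_m: emitter-to-lens baseline s (meters)
    focal_px:   lens focal length f expressed in pixels
    offset_px:  laser-spot offset x on the sensor (pixels)
    """
    if offset_px <= 0:
        raise ValueError("spot offset must be positive; zero offset means infinite range")
    return focal_px * baseline_m / offset_px

# Hypothetical example: 5 cm baseline, 700 px focal length, 35 px spot offset -> 1.0 m
print(triangulation_distance(0.05, 700.0, 35.0))
```

This relation also explains why long ranges are hard: as distance grows, the spot offset x shrinks toward the size of a single pixel, as discussed below.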
The laser ranging device comprises a light emitting unit and a light receiving unit. The light emitting unit may include a light source that emits light, such as a light emitting diode (LED) emitting infrared or visible light. Preferably, the light source is a light emitting element that emits a laser beam; in the present embodiment, a laser diode (LD) is taken as the example. Owing to the monochromatic, directional, and collimated properties of a laser beam, a laser light source allows more accurate measurement than other light sources. For example, compared with a laser beam, the infrared or visible light emitted by an LED is affected by ambient factors such as the color or texture of the object, which may reduce measurement accuracy. The laser diode may be a point laser, measuring two-dimensional position information of an obstacle, or a line laser, measuring three-dimensional position information of obstacles within a certain range.
The light receiving unit may include an image sensor on which the light spot reflected or scattered by an obstacle is formed. The image sensor may be a collection of unit pixels in a single row or in multiple rows; these light receiving elements convert the optical signal into an electrical signal. The image sensor may be a complementary metal oxide semiconductor (CMOS) sensor or a charge coupled device (CCD) sensor, with CMOS preferable for its cost advantage. The light receiving unit may also include a light receiving lens assembly, through which light reflected or scattered by the obstacle travels to form an image on the image sensor; the assembly may comprise a single lens or multiple lenses.
The base may support the light emitting unit and the light receiving unit, which are disposed on the base and spaced apart from each other by a certain distance. In order to measure obstacles in all 360 degrees around the robot, the base may be rotatably arranged on the main body; alternatively, the base itself may remain fixed while a rotating element rotates the emitted and received light. The angular velocity of the rotating element can be obtained using an optocoupler and a code wheel: the optocoupler senses the tooth gaps on the code wheel, and the instantaneous angular velocity is obtained by dividing the tooth-gap spacing by the time taken to traverse one gap. The denser the tooth gaps on the code wheel, the higher the accuracy and precision of the measurement, but the more precise the structure must be and the greater the computational load; conversely, sparser tooth gaps reduce accuracy and precision but allow a simpler structure, a smaller computational load, and lower cost.
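A minimal sketch of that tooth-gap timing calculation (illustrative only; the gap count and traversal time are hypothetical values, not from the patent):

```python
import math

def instantaneous_angular_velocity(gap_count: int, gap_traverse_time_s: float) -> float:
    """Angular velocity from one code-wheel tooth gap.

    The wheel has gap_count evenly spaced gaps, so one gap spans
    2*pi/gap_count radians; dividing by the time the optocoupler
    measures for that gap gives the instantaneous angular velocity.
    """
    gap_angle_rad = 2.0 * math.pi / gap_count
    return gap_angle_rad / gap_traverse_time_s

# Hypothetical: 360 gaps, one gap traversed in 0.5 ms -> ~34.9 rad/s (~5.6 rev/s)
print(instantaneous_angular_velocity(360, 0.0005))
```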
The data processing device connected to the light receiving unit, such as a DSP, records the obstacle distance values at all angles relative to the robot's 0-degree heading and transmits them to a data processing unit in the control system, such as an application processor (AP) containing a CPU. The CPU runs a particle-filter-based positioning algorithm to obtain the robot's current position and builds a map from it for navigation. The positioning algorithm preferably uses simultaneous localization and mapping (SLAM).
Although a laser ranging device based on triangulation can in principle measure distances without limit beyond a certain range, long-distance measurement, for example beyond 6 meters, is in practice very difficult, mainly because the pixel size on the sensor of the light receiving unit is limited, and also because of the photoelectric conversion speed of the sensor, the data transmission speed between the sensor and the connected DSP, and the computation speed of the DSP. Measured values are also affected by temperature in ways the system cannot tolerate, mainly because thermal expansion of the structure between the light emitting and light receiving units changes the angle between incident and emergent light, and the units themselves are subject to temperature drift. After long-term use, deformation accumulated from temperature changes, vibration, and other factors can also seriously affect measurement results. The accuracy of the measurement directly determines the accuracy of the map, which is the basis for the robot's further strategy decisions and is therefore especially important.
The forward portion of the machine body may carry a bumper. As the drive wheel module propels the robot across the floor during cleaning, the bumper detects one or more events (or objects) in the robot's path, such as obstacles or walls, via a sensor system such as an infrared sensor, and the robot can control the drive wheel module to respond to those events, for example by moving away from the obstacle.
The control system is arranged on a circuit board in the machine body and comprises non-transitory memory, such as a hard disk, flash memory, and random access memory, and a communication and computing processor, such as a central processing unit or an application processor. The application processor uses a positioning algorithm, such as simultaneous localization and mapping (SLAM), to draw a live map of the robot's environment from the obstacle information fed back by the laser ranging device. Combining this with the distance and speed information fed back by the bumper, cliff sensor, ultrasonic sensor, infrared sensor, magnetometer, accelerometer, gyroscope, odometer, and other sensing devices, the control system comprehensively judges the sweeper's current working state, such as crossing a threshold, moving onto a carpet, approaching a cliff edge, being stuck from above or below, having a full dust box, or being picked up, and can issue a specific next-action strategy for each situation, so that the robot's work better meets the owner's requirements and provides a better user experience. Furthermore, the control system can plan the most efficient and reasonable cleaning path and cleaning mode based on the live map drawn by SLAM, greatly improving the robot's cleaning efficiency.
The drive system may maneuver the robot to travel across the ground based on drive commands having distance and angle information, such as x, y, and θ components. The drive system comprises a drive wheel module that can control the left and right wheels simultaneously; preferably, to control the machine's movement more precisely, the drive wheel module comprises a left drive wheel module and a right drive wheel module, opposed along a transverse axis defined by the main body. In order for the robot to move more stably or with greater capability on the ground, it may include one or more driven wheels, including but not limited to universal wheels. The drive wheel module comprises a travelling wheel, a drive motor, and a control circuit for the drive motor, and may also be connected to a circuit for measuring drive current and to the odometer. The drive wheel module can be detachably connected to the main body for convenient assembly, disassembly, and maintenance. Each drive wheel may have a biased drop-down suspension system, movably secured, e.g., rotatably attached, to the robot body and receiving a spring bias directed downward and away from the robot body. The spring bias allows the drive wheel to maintain contact and traction with the floor over a certain footprint while the cleaning elements also contact the floor with a certain pressure.
The cleaning system may be a dry cleaning system and/or a wet cleaning system. In a dry cleaning system, the main cleaning function comes from the assembly of the rolling brush structure, the dust box structure, the fan structure, the air outlet, and the connecting parts between the four. The rolling brush, which has a certain interference with the ground, sweeps the garbage on the floor and carries it to the dust suction inlet between the rolling brush and the dust box, from where it is drawn into the dust box by the airflow the fan generates through the dust box. The dust removal capability of the sweeper can be characterized by the dust pick-up efficiency (DPU), which is influenced by the structure and material of the rolling brush, by the wind-power utilization rate of the air duct formed by the dust suction inlet, the dust box, the fan, the air outlet, and their connecting parts, and by the type and power of the fan; it is thus a complex system design problem. Compared with an ordinary plug-in vacuum cleaner, improved dust removal capability matters more for a cleaning robot with limited energy, because it directly and effectively reduces the energy requirement: a machine that can clean 80 square meters of floor on one charge can evolve into one that cleans 180 square meters or more. Battery life also increases greatly as the number of charge cycles falls, so the frequency with which the user must replace the battery decreases as well. Most intuitively and importantly, improved dust removal is the most immediate and significant element of user experience: the user directly judges whether the floor has been swept or mopped clean. The dry cleaning system may also include a side brush having a rotating shaft that is angled relative to the floor, for moving debris into the rolling brush area of the cleaning system.
The energy system includes a rechargeable battery, such as a nickel-metal hydride battery or a lithium battery. The rechargeable battery may be connected to a charging control circuit, a battery pack charging temperature detection circuit, and a battery under-voltage monitoring circuit, which are in turn connected to the single-chip microcontroller control circuit. The machine charges by connecting charging electrodes, arranged on the side or bottom of the body, to a charging dock.
The human-machine interaction system comprises buttons on the host panel for the user to select functions; it may also comprise a display screen and/or an indicator light and/or a speaker, which show the user the machine's current state or function selection; and it may further include a mobile phone client program. For a path-navigating cleaning device, the mobile phone client can show the user a map of the environment in which the device is located, along with the machine's position, and can offer richer and more user-friendly function items.
The intelligent cleaning device is provided with an image acquisition unit and a ranging unit: the image acquisition unit acquires image data, and the ranging unit acquires ranging data. Both may be included in the position determining device of the sensing system mentioned above. For example, the image acquisition unit may be a camera and the ranging unit a laser ranging device. Alternatively, the two may be integrated in the camera, for example a depth camera with a time-of-flight (TOF) function or a camera employing 3D structured light technology. Of course, the present disclosure does not limit the specific hardware form of the image acquisition unit and the ranging unit.
Based on the structure of the intelligent cleaning device described above, the present invention provides a control method for the intelligent cleaning device. The method may include:
acquiring, in real time, video of a spatial scene shot by a camera on the intelligent cleaning device, and decomposing the video containing a moving target to be triggered, within a preset time interval, into multiple frames of video images;
dividing a first region of interest ROI1 and a second region of interest ROI2 in the video images;
and, within the preset time interval, sending a first control signal if the motion index of the moving target to be triggered falling within the first region of interest ROI1 across the multiple frames reaches a first control threshold, and sending a second control signal, different from the first control signal, if the motion index of the moving target falling within the second region of interest ROI2 reaches a second control threshold.
The camera may acquire the spatial-scene video using infrared or depth-of-field technology. The "preset time interval" may be understood as follows: video acquired during the short period just after the intelligent cleaning device starts up may be unstable, so it need not be analyzed and can be discarded; a period during which the device operates stably can then be taken as the preset time interval. It will be appreciated that, during this preset time interval, the device requires video containing the moving target to be triggered (e.g., an arm).
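A non-authoritative sketch of this acquisition step (the patent prescribes no implementation; the OpenCV capture source, interval length, and sampling rate below are assumptions):

```python
import time
import cv2

def decompose_video(source=0, interval_s: float = 3.0, sample_fps: float = 10.0):
    """Decompose live video into frames sampled over a preset time interval.

    Frames captured while the device is still starting up (and possibly
    unstable) can simply be discarded by the caller before this is invoked.
    """
    cap = cv2.VideoCapture(source)
    frames = []
    for _ in range(int(interval_s * sample_fps)):
        ok, frame = cap.read()
        if not ok:
            break
        frames.append(frame)
        time.sleep(1.0 / sample_fps)  # crude pacing; a real device would use its frame clock
    cap.release()
    return frames
```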
The spatial scene captured depends on the mounting angle of the camera on the intelligent cleaning device, and this mounting angle also influences how the first region of interest ROI1 and the second region of interest ROI2 of the video image are divided in the subsequent step.
After the video within the preset time interval is acquired and decomposed into multiple frames of video images, the region of interest of each frame is divided. Since each frame contains the moving target to be triggered, the first region of interest ROI1 may specifically be: the mapping range formed on the imaging plane S of the camera when the arm, at a height exceeding the first preset distance above the camera, moves horizontally (left and right) at uniform speed through the second preset distance, referenced to the camera axis L (see FIG. 1). It will be appreciated that the first region of interest ROI1 is a region in the imaging plane S perpendicular to the camera axis L.
As shown in FIGS. 1 and 2, the first predetermined distance may be set between 1 meter and 1.5 meters. The second predetermined distance is the distance shifted left or right from the reference point where the hand intersects the camera axis L. It may be 30 cm; of course, it may be set to other values, such as 35 cm or 40 cm, according to actual needs.
Likewise, the second region of interest ROI2 may be: the mapping range formed on the imaging plane S of the camera when the arm, at a height exceeding the first predetermined distance above the camera, moves horizontally (left and right) at uniform speed through the third predetermined distance, referenced to the camera axis L (see FIG. 1). It will be appreciated that the second region of interest ROI2 is a region in the imaging plane S perpendicular to the camera axis L.
Similarly, the third predetermined distance is a distance, different from the second predetermined distance, shifted left or right from the reference point where the hand intersects the camera axis L. It may be 50 cm, or other values such as 55 cm or 60 cm according to actual needs.
The dashed box shown in FIG. 2 is the real spatial scene region corresponding to the first region of interest ROI1 and the second region of interest ROI2 within the imaging plane S of the camera. As FIG. 2 shows, the first region of interest ROI1 is a rectangle whose long sides run along the x-axis in the imaging plane S perpendicular to the camera axis L; the second region of interest ROI2 is a rectangle whose long sides run along the y-axis in that same plane.
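To make the mapping concrete, the following hedged sketch projects the physical offsets onto pixel extents under a simple pinhole model with lens distortion ignored; the focal length, hand height, principal point, and the short half-extent of each rectangle are all assumed values, not taken from the patent:

```python
def physical_offset_to_pixels(offset_m: float, height_m: float, focal_px: float) -> float:
    """Pinhole projection: a horizontal offset X at height Z maps to x = f * X / Z pixels."""
    return focal_px * offset_m / height_m

# Hypothetical camera: 600 px focal length; hand 1.2 m above the lens.
f_px, z_m = 600.0, 1.2
half_roi1 = physical_offset_to_pixels(0.30, z_m, f_px)  # +/-150 px about the axis (30 cm sweep)
half_roi2 = physical_offset_to_pixels(0.50, z_m, f_px)  # +/-250 px about the axis (50 cm sweep)
cx, cy = 320, 240  # assumed principal point of a 640x480 image

# ROI1: long sides along x; ROI2: long sides along y (the 40 px short half-extent is assumed).
roi1 = (cx - half_roi1, cy - 40, cx + half_roi1, cy + 40)
roi2 = (cx - 40, cy - half_roi2, cx + 40, cy + half_roi2)
print(roi1, roi2)
```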
As described above, after the regions of interest of the video images containing the moving target to be triggered have been divided, the relationship between the target's motion index and the control thresholds is determined next. If, within the preset time interval, the motion index of the moving target falling within the first region of interest ROI1 across the multiple frames reaches the first control threshold, a first control signal (for example, a start signal) is sent; if the motion index of the moving target falling within the second region of interest ROI2 reaches the second control threshold, a second control signal (for example, a stop signal), different from the first, is sent. The first and second control signals may also be configured as signals for switching between different operating modes.
The first region of interest ROI1 may be understood as the region onto which the X-direction component of the target's motion is projected, and the second region of interest ROI2 as the region onto which the Y-direction component is projected.
In a concrete example: if, during the period in which the moving target (a hand) moves, its motion tends toward the horizontal direction, the hand's motion index in the first region of interest ROI1 meets the first control threshold, and the first control signal switches the intelligent cleaning device on. If the hand's motion tends toward the vertical direction, its motion index in the second region of interest ROI2 meets the second control threshold, and the second control signal stops the intelligent cleaning device.
Specifically, the motion index of the moving target to be triggered may include any one or a combination of the coverage, the action amplitude, and the motion-center size of the target's movement. Setting the motion index in this way improves the accuracy with which the action to be performed is triggered.
The control thresholds for the motion index are empirical values, chosen so that an ordinary palm-waving motion observed during operation is reliably detected.
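Because the patent leaves the index computation and thresholds open, the following sketch is one plausible reading rather than the patent's prescribed formula: coverage as the fraction of trajectory samples inside the ROI, amplitude as the normalized peak-to-peak extent, and the motion center as the distance of the trajectory centroid from the ROI center; the equal weighting and the placeholder thresholds are assumptions:

```python
import numpy as np

def motion_index(points_xy: np.ndarray, roi: tuple) -> float:
    """Combine coverage, action amplitude, and motion-center size for one ROI.

    points_xy: (N, 2) trajectory of the moving target across the frames.
    roi: (x0, y0, x1, y1) rectangle in image coordinates.
    """
    x0, y0, x1, y1 = roi
    inside = points_xy[
        (points_xy[:, 0] >= x0) & (points_xy[:, 0] <= x1)
        & (points_xy[:, 1] >= y0) & (points_xy[:, 1] <= y1)
    ]
    if len(inside) == 0:
        return 0.0
    coverage = len(inside) / len(points_xy)            # fraction of samples in the ROI
    span = inside.max(axis=0) - inside.min(axis=0)
    amplitude = span.max() / max(x1 - x0, y1 - y0)     # normalized peak-to-peak extent
    center_dist = np.linalg.norm(inside.mean(axis=0) - [(x0 + x1) / 2, (y0 + y1) / 2])
    centering = 1.0 - min(1.0, 2 * center_dist / max(x1 - x0, y1 - y0))
    return (coverage + amplitude + centering) / 3.0    # equal weighting is an assumption

def dispatch(points_xy, roi1, roi2, thr1=0.5, thr2=0.5):
    """thr1/thr2 stand in for the empirical, calibrated control thresholds."""
    if motion_index(points_xy, roi1) >= thr1:
        return "FIRST_CONTROL_SIGNAL"   # e.g., start
    if motion_index(points_xy, roi2) >= thr2:
        return "SECOND_CONTROL_SIGNAL"  # e.g., stop
    return None
```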
It can be understood that obtaining the motion index of the moving target to be triggered first requires obtaining the target's coordinates within the regions of interest. To this end, as images are acquired by the camera, a feature extraction algorithm can be used to filter out the overhead background information and the stationary targets in each frame of video image so as to extract the moving target to be triggered.
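The patent names only "a feature extraction algorithm"; one common realization, assumed here rather than prescribed by the patent, is background subtraction, which suppresses the static overhead background and keeps only moving pixels:

```python
import cv2
import numpy as np

# MOG2 models the static background (ceiling, fixtures) and flags moving pixels.
subtractor = cv2.createBackgroundSubtractorMOG2(history=100, varThreshold=25, detectShadows=False)

def extract_moving_target(frame: np.ndarray):
    """Return the centroid of the largest moving blob, or None if the frame is static."""
    mask = subtractor.apply(frame)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))  # drop noise specks
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    m = cv2.moments(largest)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])  # (x, y) centroid in pixels
```

The centroids collected frame by frame form the trajectory from which the motion indices above are computed.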
If the camera exhibits distortion, a distortion model (k1, k2, p1, p2, k3) can be applied when the moving target to be triggered is extracted, with distortion correction completed according to a pre-calibration result. The distortion model is:
xc = x(1 + k1·r² + k2·r⁴ + k3·r⁶) + [2·p1·x·y + p2·(r² + 2x²)]
yc = y(1 + k1·r² + k2·r⁴ + k3·r⁶) + [p1·(r² + 2y²) + 2·p2·x·y]
wherein (x, y) are the coordinates of the distorted moving target to be triggered on the imaging plane of the camera, (xc, yc) are the coordinates of the corrected moving target on the imaging plane of the camera, and r² = x² + y².
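A minimal sketch applying this model to normalized image-plane coordinates (the coefficient values are hypothetical calibration results; given a calibrated camera matrix, OpenCV's cv2.undistortPoints performs an equivalent correction):

```python
def correct_distortion(x: float, y: float, k1: float, k2: float, p1: float, p2: float, k3: float):
    """Brown-Conrady correction on normalized image-plane coordinates (x, y)."""
    r2 = x * x + y * y
    radial = 1.0 + k1 * r2 + k2 * r2 ** 2 + k3 * r2 ** 3
    xc = x * radial + 2.0 * p1 * x * y + p2 * (r2 + 2.0 * x * x)
    yc = y * radial + p1 * (r2 + 2.0 * y * y) + 2.0 * p2 * x * y
    return xc, yc

# Hypothetical pre-calibration result, for illustration only.
print(correct_distortion(0.2, -0.1, k1=-0.28, k2=0.07, p1=0.001, p2=-0.0005, k3=0.0))
```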
In the following, the control principle of an intelligent cleaning device applying the control method of the present invention during testing is described with reference to FIGS. 3 to 8, for a better understanding of the invention.
As shown in FIG. 3, the darker line represents the motion trajectory of the moving target to be triggered in the X direction, and the lighter line its trajectory in the Y direction. The X-direction trajectory has the larger amplitude, and its motion index meets the set first control threshold; the detection result is therefore as shown in FIG. 4, and the start action of the intelligent cleaning device is triggered.
As shown in FIG. 5, the darker line again represents the X-direction trajectory and the lighter line the Y-direction trajectory. Here the Y-direction trajectory has the larger amplitude, and its motion index meets the set second control threshold; the detection result is as shown in FIG. 6, and the stop action of the intelligent cleaning device is triggered.
As shown in FIG. 7, the darker line represents the X-direction trajectory and the lighter line the Y-direction trajectory. Neither motion index meets the set first or second control threshold, so no action is triggered, as shown in FIG. 8.
The invention also provides an intelligent cleaning device, whose specific structure has been described in detail above and is not repeated here. In addition, the invention provides a computer storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the method described in any of the above embodiments.
According to the control method provided by the invention, operations such as starting and stopping the device can be triggered by using the camera to recognize the user's limb movements; the user does not need to reach the buttons on the device body, which greatly improves convenience of use.
The processes and steps described in all the preferred embodiments above are examples only. Unless adverse effects would result, the processing operations may be performed in an order different from the flow described above, and steps may be added to, combined in, or deleted from that flow according to actual requirements.
Further, the commands, command numbers, and data items described in all the preferred embodiments above are examples only; they may be set in any manner as long as the same functions are achieved. The elements of the terminals of the preferred embodiments may likewise be integrated, further divided, or pruned as needed.
The present invention has been described in terms of the above embodiments, but it should be understood that the above embodiments are for purposes of illustration and description only and are not intended to limit the invention to the embodiments described. In addition, it will be understood by those skilled in the art that the present invention is not limited to the embodiments described above, and that many variations and modifications may be made in accordance with the teachings of the present invention, which fall within the scope of the claimed invention. The scope of the invention is defined by the appended claims and equivalents thereof.

Claims (6)

1. A device control method for an intelligent cleaning device, the control method comprising:
acquiring a video of a space scene shot by a camera on the intelligent cleaning equipment in real time, and decomposing the video containing a moving target to be triggered into a plurality of frames of video images according to a preset time interval;
dividing a first region of interest (ROI1) and a second region of interest (ROI2) of the video image, wherein the first region of interest (ROI1) is a rectangle with long sides in the x-axis direction, the second region of interest (ROI2) is a rectangle with long sides in the y-axis direction, and the first region of interest (ROI1) and the second region of interest (ROI2) are imaging-plane regions perpendicular to the axis of the camera;
and if the motion index of the to-be-triggered moving object in the multi-frame video image falling into the first region of interest ROI1 reaches a first control threshold value, a first control signal is sent, and if the motion index of the to-be-triggered moving object in the multi-frame video image falling into the second region of interest ROI2 reaches a second control threshold value, a second control signal different from the first control signal is sent.
2. The apparatus control method according to claim 1, wherein the motion index includes any one or a combination of the coverage, the action amplitude, and the motion-center size of the movement of the moving target to be triggered.
3. The apparatus control method according to claim 1, wherein the overhead background information and stationary targets in each frame of video image are filtered out using a feature extraction algorithm to extract the moving target to be triggered.
4. The apparatus control method according to claim 3, wherein, if the camera is distorted, a distortion model (k1, k2, p1, p2, k3) is used when the moving target to be triggered is extracted, and distortion correction is performed according to a pre-calibration result, the distortion model being:
xc = x(1 + k1·r² + k2·r⁴ + k3·r⁶) + [2·p1·x·y + p2·(r² + 2x²)]
yc = y(1 + k1·r² + k2·r⁴ + k3·r⁶) + [p1·(r² + 2y²) + 2·p2·x·y]
wherein (x, y) are the coordinates of the distorted moving target to be triggered on the imaging plane of the camera, (xc, yc) are the coordinates of the corrected moving target on the imaging plane of the camera, and r² = x² + y².
5. An intelligent cleaning device comprising a memory, a processor, and a computer program stored on the memory and runnable on the processor, characterized in that the processor implements the steps of the method of any one of claims 1 to 4 when executing the program.
6. A computer storage medium having stored thereon a computer program, characterized in that the program, when executed by a processor, implements the steps of the method according to any one of claims 1 to 4.
CN202010948471.0A 2020-09-10 2020-09-10 Intelligent cleaning device, control method and computer storage medium Active CN112244705B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010948471.0A CN112244705B (en) 2020-09-10 2020-09-10 Intelligent cleaning device, control method and computer storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010948471.0A CN112244705B (en) 2020-09-10 2020-09-10 Intelligent cleaning device, control method and computer storage medium

Publications (2)

Publication Number Publication Date
CN112244705A CN112244705A (en) 2021-01-22
CN112244705B (en) 2023-05-23

Family

ID=74232154

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010948471.0A Active CN112244705B (en) 2020-09-10 2020-09-10 Intelligent cleaning device, control method and computer storage medium

Country Status (1)

Country Link
CN (1) CN112244705B (en)

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2457711A1 (en) * 2001-08-16 2003-02-27 Humanbeams, Inc. Music instrument system and method
US20140152566A1 (en) * 2012-12-05 2014-06-05 Brent A. Safer Apparatus and methods for image/sensory processing to control computer operations
CN103257713B (en) * 2013-05-31 2016-05-04 华南理工大学 A kind of gesture control method
CN104422066B (en) * 2013-08-23 2017-03-15 珠海格力电器股份有限公司 The system of intelligent air condition control, method and air-conditioning
CN105725932B (en) * 2016-01-29 2018-12-28 江西智能无限物联科技有限公司 intelligent sweeping robot
US10261183B2 (en) * 2016-12-27 2019-04-16 Gerard Dirk Smits Systems and methods for machine perception
CN107656443A (en) * 2017-09-18 2018-02-02 成都易慧家科技有限公司 A kind of intelligent home control system and method based on deep learning
US11771283B2 (en) * 2017-12-06 2023-10-03 BISSELL , Inc. Method and system for manual control of autonomous floor cleaner
CN109991859B (en) * 2017-12-29 2022-08-23 青岛有屋科技有限公司 Gesture instruction control method and intelligent home control system
CN108107796B (en) * 2018-01-23 2024-03-08 广东美的厨房电器制造有限公司 Range hood, gesture recognition control method and device thereof and readable storage medium
CN108758728B (en) * 2018-03-29 2023-02-17 青岛海尔智能技术研发有限公司 Head gesture-based range hood control method and range hood
CN110069137B (en) * 2019-04-30 2022-07-08 徐州重型机械有限公司 Gesture control method, control device and control system
CN111062269B (en) * 2019-11-25 2021-09-07 珠海格力电器股份有限公司 User state identification method and device, storage medium and air conditioner
CN111461059A (en) * 2020-04-21 2020-07-28 哈尔滨拓博科技有限公司 Multi-zone multi-classification extensible gesture recognition control device and control method

Also Published As

Publication number Publication date
CN112244705A (en) 2021-01-22

Similar Documents

Publication Publication Date Title
AU2018100726A4 (en) Automatic cleaning device and cleaning method
CN109431381B (en) Robot positioning method and device, electronic device and storage medium
CN109947109B (en) Robot working area map construction method and device, robot and medium
CN110623606B (en) Cleaning robot and control method thereof
CN114521836A (en) Automatic cleaning equipment
CN109932726B (en) Robot ranging calibration method and device, robot and medium
CN106239517A (en) Robot and the method for the autonomous manipulation of realization, device
CN106226755B (en) Robot
US20240029298A1 (en) Locating method and apparatus for robot, and storage medium
CN111857153B (en) Distance detection device and robot sweeps floor
CN112244705B (en) Intelligent cleaning device, control method and computer storage medium
CN217792839U (en) Automatic cleaning equipment
CN210673215U (en) Multi-light-source detection robot
CN208207201U (en) robot
CN210931183U (en) Cleaning robot
CN211270533U (en) Camera device and cleaning robot
CN114608520A (en) Distance measuring method, device, robot and storage medium
CN210673216U (en) Light filtering type robot
CN214231240U (en) Cleaning robot
CN209911548U (en) Distance measuring device and autonomous mobile robot
CN117426709A (en) Self-mobile equipment and ranging method thereof
CN117008148A (en) Method, apparatus and storage medium for detecting slip state
CN116942017A (en) Automatic cleaning device, control method, and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20220415

Address after: 102299 No. 8008, floor 8, building 16, yard 37, Chaoqian Road, Zhongguancun Science and Technology Park, Changping District, Beijing

Applicant after: Beijing Stone Innovation Technology Co.,Ltd.

Address before: 100192 No. 6016, 6017, 6018, Block C, No. 8 Heiquan Road, Haidian District, Beijing

Applicant before: Beijing Roborock Technology Co.,Ltd.

GR01 Patent grant
GR01 Patent grant