CN112244705A - Intelligent cleaning device, control method and computer storage medium - Google Patents

Intelligent cleaning device, control method and computer storage medium

Info

Publication number
CN112244705A
CN112244705A
Authority
CN
China
Prior art keywords
region
camera
triggered
interest
control method
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010948471.0A
Other languages
Chinese (zh)
Other versions
CN112244705B (en)
Inventor
于炀
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Stone Innovation Technology Co ltd
Original Assignee
Beijing Rockrobo Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Rockrobo Technology Co Ltd filed Critical Beijing Rockrobo Technology Co Ltd
Priority to CN202010948471.0A priority Critical patent/CN112244705B/en
Publication of CN112244705A publication Critical patent/CN112244705A/en
Application granted granted Critical
Publication of CN112244705B publication Critical patent/CN112244705B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • AHUMAN NECESSITIES
    • A47FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47LDOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/40Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4011Regulation of the cleaning machine by electric means; Control systems and remote control systems therefor
    • AHUMAN NECESSITIES
    • A47FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47LDOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/40Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4061Steering means; Means for avoiding obstacles; Details related to the place where the driver is accommodated
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/93Lidar systems specially adapted for specific applications for anti-collision purposes
    • AHUMAN NECESSITIES
    • A47FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47LDOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L2201/00Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • AHUMAN NECESSITIES
    • A47FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47LDOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L2201/00Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • A47L2201/04Automatic control of the travelling movement; Automatic obstacle detection

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses a device control method for an intelligent cleaning device, the intelligent cleaning device, and a computer storage medium. The method comprises the following steps: acquiring, in real time, video of the spatial scene captured by a camera on the intelligent cleaning device, and decomposing the video containing the moving target to be triggered into multiple frames of video images at a preset time interval; dividing each video image into a first region of interest ROI1 and a second region of interest ROI2; and, within the preset time interval, sending a first control signal if the motion index of the moving target to be triggered that falls into the first region of interest ROI1 across the multiple video frames reaches a first control threshold, and sending a second control signal if the motion index of the moving target that falls into the second region of interest ROI2 reaches a second control threshold. By using the camera to recognize the user's limb movements, the invention can control the starting and stopping of the device.

Description

Intelligent cleaning device, control method and computer storage medium
Technical Field
The invention relates to the technical field of intelligent cleaning devices, and in particular to a control method for an intelligent cleaning device and the intelligent cleaning device itself.
Background
At present, intelligent cleaning devices such as sweeping robots are increasingly used to take over part of the housework in the home, but there remains room for further reducing the housework burden. The physical buttons of existing sweeping robots are arranged on the robot body, relatively close to the ground, so starting and stopping the robot is particularly inconvenient for elderly users who are not adept at using smartphones or who have difficulty bending their waist and legs.
Therefore, it is desirable to provide a control method for an intelligent cleaning device, and an intelligent cleaning device, that at least partially solve the above problems.
Disclosure of Invention
In this summary, concepts in a simplified form are introduced that are further described in the detailed description. This summary of the invention is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
The invention provides a device control method for an intelligent cleaning device, comprising the following steps: acquiring, in real time, video of the spatial scene captured by a camera on the intelligent cleaning device, and decomposing the video containing the moving target to be triggered into multiple frames of video images at a preset time interval; dividing each video image into a first region of interest ROI1 and a second region of interest ROI2; and, within the preset time interval, sending a first control signal if the motion index of the moving target to be triggered that falls into the first region of interest ROI1 across the multiple video frames reaches a first control threshold, and sending a second control signal if the motion index of the moving target that falls into the second region of interest ROI2 reaches a second control threshold.
With this control method, the device can be started and stopped by recognizing the user's limb movements through the camera; the user does not need to operate the buttons on the device body in person, which brings great convenience.
Preferably, the first region of interest ROI1 and the second region of interest ROI2 are planar regions perpendicular to the axis of the camera.
In this way, the first region of interest and the second region of interest can be divided simply and accurately, which safeguards the subsequent decomposition of the motion of the moving target to be triggered.
Preferably, the first region of interest ROI1 is a rectangle lying in a plane perpendicular to the axis of the camera, with its long side in the x-axis direction; the second region of interest ROI2 is a rectangle in the same plane with its long side in the y-axis direction.
Thereby, the first region of interest ROI1 and the second region of interest ROI2 can be accurately divided.
Preferably, the first region of interest ROI1 is the region swept when the moving target to be triggered, at a height a first preset distance above the camera, moves horizontally at a constant speed over a second preset distance within the mapping range formed on the imaging plane of the camera, taking the axis of the camera as the reference.
In this way, the moving target to be triggered is certain to fall within the first region of interest ROI1, which helps the intelligent cleaning device perform the corresponding action according to the motion of the target.
Preferably, the second region of interest ROI2 is the region swept when the moving target to be triggered, at a height a first preset distance above the camera, moves horizontally at a constant speed over a third preset distance within the mapping range formed on the imaging plane of the camera, taking the axis of the camera as the base point, wherein the third preset distance differs from the second preset distance.
In this way, the moving target to be triggered is certain to fall within the second region of interest ROI2, which helps the intelligent cleaning device perform the corresponding action according to the motion of the target.
Preferably, the motion index includes any one of, or a combination of, the coverage range, the action amplitude, and the size of the motion centre of the moving target to be triggered.
Therefore, the accuracy of triggering the action to be executed can be improved by setting the motion index.
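As a hedged illustration of how such indices might be computed, the definitions below are assumptions for the sketch only; the patent does not fix the exact formulas. Given the pixels flagged as moving inside one ROI, the coverage, amplitude, and motion centre can be derived as follows:

```python
def motion_indices(points):
    """Illustrative motion indices for the moving pixels inside an ROI:
    coverage (pixel count), amplitude (bounding-box extent), and motion
    centre (centroid). The exact definitions are assumptions, not taken
    from the patent."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    coverage = len(points)
    amplitude = (max(xs) - min(xs), max(ys) - min(ys))
    center = (sum(xs) / coverage, sum(ys) / coverage)
    return coverage, amplitude, center

# Three moving pixels: coverage 3, bounding box 4x2, centroid (2, 2/3).
cov, amp, cen = motion_indices([(0, 0), (4, 0), (2, 2)])
```

Any of the three values (or a combination) could then be compared against the per-ROI control threshold.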
Preferably, a feature extraction algorithm is used to filter out the background information and stationary objects in the scene from each frame of the video image, so as to extract the moving target to be triggered.
Thereby, a clear target to be triggered can be obtained.
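The patent does not name the feature extraction algorithm. As a minimal sketch under that caveat, plain frame differencing suppresses the static background and stationary objects, keeping only the pixels that changed between consecutive frames:

```python
def moving_mask(prev_frame, curr_frame, threshold=25):
    """Binary mask of pixels that changed between two grayscale frames.
    Static background and stationary objects difference to ~0 and are
    filtered out. Frames are lists of rows of 0-255 intensities.
    Simple frame differencing, an assumed stand-in for the patent's
    unspecified feature extraction algorithm."""
    return [
        [1 if abs(c - p) > threshold else 0 for p, c in zip(prow, crow)]
        for prow, crow in zip(prev_frame, curr_frame)
    ]

prev = [[10, 10], [10, 10]]
curr = [[10, 200], [10, 10]]  # one pixel changed, e.g. an arm entering
mask = moving_mask(prev, curr)
```

In practice a production system would use a more robust background model, but the principle of removing everything that does not move is the same.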
Preferably, if the camera lens exhibits distortion, a (k1, k2, p1, p2, k3) distortion model is applied when extracting the moving target to be triggered, and distortion correction is completed according to a pre-calibration result, where the distortion model is:
xc = x(1 + k1·r² + k2·r⁴ + k3·r⁶) + [2·p1·xy + p2·(r² + 2x²)]
yc = y(1 + k1·r² + k2·r⁴ + k3·r⁶) + [2·p1·xy + p2·(r² + 2y²)]
where (x, y) are the coordinates of the distorted moving target to be triggered on the imaging plane of the camera, (xc, yc) are the corrected coordinates on the imaging plane, and r² = x² + y².
Therefore, the influence of inaccuracy of the extracted to-be-triggered moving target caused by camera distortion can be avoided through distortion correction.
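A direct point-by-point transcription of the (k1, k2, p1, p2, k3) model stated above (with r² = x² + y²; the coefficient values in the example are placeholders, not calibration results):

```python
def undistort_point(x, y, k1, k2, k3, p1, p2):
    """Correct one imaging-plane point with the radial/tangential model
    stated above: radial factor (1 + k1 r^2 + k2 r^4 + k3 r^6) plus the
    bracketed tangential term."""
    r2 = x * x + y * y
    radial = 1.0 + k1 * r2 + k2 * r2 ** 2 + k3 * r2 ** 3
    tangential_x = 2.0 * p1 * x * y + p2 * (r2 + 2.0 * x * x)
    tangential_y = 2.0 * p1 * x * y + p2 * (r2 + 2.0 * y * y)
    return x * radial + tangential_x, y * radial + tangential_y

# With all coefficients zero the point is unchanged; a negative k1
# pulls points toward the centre (barrel-distortion correction).
xc, yc = undistort_point(0.1, 0.2, k1=-0.05, k2=0.0, k3=0.0, p1=0.0, p2=0.0)
```

The pre-calibration step mentioned in the text would supply the actual coefficient values for a given camera.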
Another aspect of the present invention provides an intelligent cleaning device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the program, implements the steps of the method of any of the above embodiments.
With this intelligent cleaning device, starting and stopping can be controlled by recognizing the user's limb movements through the camera; the user does not need to operate the buttons on the device body in person, which brings great convenience.
A further aspect of the invention provides a computer storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the method of any of the above embodiments.
Drawings
The following drawings of the invention are included to provide a further understanding of the invention. The drawings illustrate embodiments of the invention and, together with the description, serve to explain the principles of the invention.
In the drawings:
FIG. 1 is a schematic view of a smart cleaning appliance in a work scenario, according to a preferred embodiment of the present invention;
fig. 2 shows a first region of interest ROI1 and a second region of interest ROI2;
fig. 3 is a motion trajectory of a moving object to be triggered in a first region of interest ROI1 and a second region of interest ROI2 during a period of time during operation of the intelligent cleaning device in one embodiment;
fig. 4 is a detection result representing that the motion index of the moving object to be triggered in the first region of interest ROI1 reaches the first control threshold;
fig. 5 is a movement track of a moving object to be triggered in a first region of interest ROI1 and a second region of interest ROI2 during a period of time during the operation of the intelligent cleaning device in another embodiment;
fig. 6 shows the detection result representing that the motion index of the moving object to be triggered in the second region of interest ROI2 reaches the second control threshold;
fig. 7 is a movement track of a moving object to be triggered in a first region of interest ROI1 and a second region of interest ROI2 during a period of time during the operation of the intelligent cleaning device according to yet another embodiment; and
fig. 8 is a detection result representing that the motion index of the motion object to be triggered in the first region of interest ROI1 and the second region of interest ROI2 does not reach the control threshold.
Detailed Description
In the following description, numerous specific details are set forth in order to provide a more thorough understanding of the present invention. It will be apparent, however, to one skilled in the art, that embodiments of the invention may be practiced without one or more of these specific details. In other instances, well-known features have not been described in detail so as not to obscure the embodiments of the invention.
It should be noted that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of exemplary embodiments according to the invention. As used herein, the singular is intended to include the plural unless the context clearly dictates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
Exemplary embodiments according to the present invention will now be described in more detail with reference to the accompanying drawings. These exemplary embodiments may, however, be embodied in many different forms and should not be construed as limited to only the embodiments set forth herein. It is to be understood that these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of these exemplary embodiments to those skilled in the art. In the drawings, the thicknesses of layers and regions are exaggerated for clarity, and the same elements are denoted by the same reference numerals, and thus the description thereof will be omitted.
The intelligent cleaning device provided by the present disclosure may be (but is not limited to) a sweeping robot, a mopping robot, or a combined sweeping and mopping robot, and may include a machine body, a sensing system, a control system, a driving system, a cleaning system, an energy system, and a human-computer interaction system. The machine body includes a forward portion and a rearward portion and has an approximately circular shape (circular front and back), but may have other shapes, including but not limited to an approximate D-shape that is flat at the front and rounded at the rear.
The sensing system includes a position determining device located above the machine body, a bumper located at a forward portion of the machine body, cliff sensors and ultrasonic sensors, infrared sensors, magnetometers, accelerometers, gyroscopes, odometers, and like sensing devices, providing various positional and kinematic state information of the machine to the control system. The position determining device includes, but is not limited to, a camera, a laser distance measuring device (LDS). The following describes how position determination is performed by taking a laser distance measuring device of the triangulation method as an example. The basic principle of the triangulation method is based on the geometric relation of similar triangles, and is not described herein.
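Although the text declines to detail the triangulation principle, the similar-triangles relation can be sketched numerically: with a baseline b between the emitter and the receiving lens, focal length f, and measured laser-spot offset x on the image sensor, the obstacle distance is roughly d = f·b / x. The values below are assumptions for illustration, not the device's actual geometry:

```python
def triangulate_distance(focal_length_mm: float, baseline_mm: float,
                         spot_offset_mm: float) -> float:
    """Estimate obstacle distance from the laser-spot offset on the
    image sensor via similar triangles: d = f * b / x. A simplified
    single-spot model for illustration only."""
    if spot_offset_mm <= 0:
        raise ValueError("spot offset must be positive")
    return focal_length_mm * baseline_mm / spot_offset_mm

# A nearer obstacle produces a larger spot offset, hence a smaller distance.
near = triangulate_distance(focal_length_mm=4.0, baseline_mm=50.0, spot_offset_mm=0.2)
far = triangulate_distance(focal_length_mm=4.0, baseline_mm=50.0, spot_offset_mm=0.05)
```

The inverse relation between offset and distance is also why long-range accuracy degrades: at large distances the spot offset shrinks toward the size of a single pixel, as the passage on measurement limits below discusses.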
The laser ranging device includes a light emitting unit and a light receiving unit. The light emitting unit may include a light source that emits light, and the light source may include a light emitting element, such as an infrared or visible Light Emitting Diode (LED) that emits infrared light or visible light. Preferably, the light source may be a light emitting element that emits a laser beam. In the present embodiment, a Laser Diode (LD) is taken as an example of the light source. In particular, a light source using a laser beam may make the measurement more accurate than other lights due to the monochromatic, directional, and collimation characteristics of the laser beam. For example, infrared or visible light emitted by a Light Emitting Diode (LED) is affected by ambient environmental factors (e.g., color or texture of an object) as compared to a laser beam, and may be reduced in measurement accuracy. The Laser Diode (LD) may be a spot laser for measuring two-dimensional position information of an obstacle, or a line laser for measuring three-dimensional position information of an obstacle within a certain range.
The light receiving unit may include an image sensor on which a light spot reflected or scattered by an obstacle is formed. The image sensor may be a set of a plurality of unit pixels of a single row or a plurality of rows. These light receiving elements can convert optical signals into electrical signals. The image sensor may be a Complementary Metal Oxide Semiconductor (CMOS) sensor or a Charge Coupled Device (CCD) sensor, and is preferably a Complementary Metal Oxide Semiconductor (CMOS) sensor due to cost advantages. Also, the light receiving unit may include a light receiving lens assembly. Light reflected or scattered by the obstruction may travel through a light receiving lens assembly to form an image on the image sensor. The light receiving lens assembly may comprise a single or multiple lenses.
The base may support the light emitting unit and the light receiving unit, which are disposed on the base and spaced apart from each other by a certain distance. To measure obstacles in all 360 degrees around the robot, the base may be rotatably disposed on the main body, or a rotating element may rotate the emitted and received light without rotating the base itself. The angular speed of the rotating element can be obtained with an optocoupler and a coded disc: the optocoupler senses the tooth gaps on the coded disc, and the instantaneous angular speed is obtained by dividing the tooth-gap pitch by the time the gap takes to pass. The denser the tooth gaps on the coded disc, the higher the measurement accuracy and precision, but the more delicate the structure and the higher the computational load; conversely, the sparser the tooth gaps, the lower the accuracy and precision, but the simpler the structure, the smaller the computational load, and the lower the cost.
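The angular-speed computation described above reduces to dividing the angular pitch of one tooth gap by its transit time past the optocoupler. A minimal sketch, assuming a 360-gap disc (the gap count is an illustrative choice, not taken from the text):

```python
import math

def instantaneous_angular_speed(gap_pitch_rad: float, transit_time_s: float) -> float:
    """Angular speed from one optocoupler tooth-gap transit: the pitch
    angle of a single gap divided by the time the gap takes to pass."""
    if transit_time_s <= 0:
        raise ValueError("transit time must be positive")
    return gap_pitch_rad / transit_time_s

# A coded disc with 360 tooth gaps has a pitch of 2*pi/360 rad per gap.
pitch = 2 * math.pi / 360
speed = instantaneous_angular_speed(pitch, transit_time_s=0.001)  # rad/s
```

A denser disc (smaller pitch) samples the speed more often and more finely, at the cost of the mechanical and computational overhead noted above.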
A data processing device connected to the light receiving unit, such as a DSP, records the obstacle distance values at each angle relative to the robot's 0-degree heading and transmits them to a data processing unit in the control system, such as an application processor (AP) containing a CPU. The CPU runs a particle-filter-based positioning algorithm to obtain the current position of the robot and builds a map from these positions for navigation. The positioning algorithm preferably uses simultaneous localization and mapping (SLAM).
Although a laser ranging device based on triangulation can in principle measure distances beyond a certain range without limit, in practice long-range measurement, for example beyond 6 meters, is difficult, mainly because of the size limitation of the pixel units on the sensor of the light receiving unit, and also because of the photoelectric conversion speed of the sensor, the data transmission speed between the sensor and the connected DSP, and the computation speed of the DSP. The measured values are also affected by temperature in ways the system cannot tolerate, mainly because thermal expansion of the structure between the light emitting unit and the light receiving unit changes the angle between the incident and emergent light, and the light emitting and receiving units themselves may suffer temperature drift. After long-term use, deformation accumulated from factors such as temperature change and vibration likewise seriously affects the measurement results. The accuracy of the measurements directly determines the accuracy of the map, forms the basis for the robot's further strategies, and is therefore particularly important.
The forward portion of the machine body may carry a bumper. As the drive wheel module propels the robot across the floor during cleaning, the bumper detects one or more events (or objects) in the travel path of the robot 100, such as obstacles or walls, via a sensor system such as an infrared sensor, and the robot can then control the drive wheel module to respond to these events (or objects), for example by moving away from the obstacle.
The control system is arranged on a circuit board in the machine body and comprises a non-transitory memory, such as a hard disk, flash memory, or random access memory, and a communication and computation processor, such as a central processing unit or an application processor. Using a positioning algorithm such as SLAM, the application processor draws an instant map of the robot's environment from the obstacle information fed back by the laser ranging device. Combining the distance and speed information fed back by the bumper, cliff sensors, ultrasonic sensors, infrared sensors, magnetometer, accelerometer, gyroscope, odometer, and other sensing devices, it comprehensively judges the sweeper's current working state, for example crossing a threshold, moving onto a carpet, approaching a cliff, being stuck above or below, having a full dust box, or being picked up, and issues a specific next-step action strategy for each situation, so that the robot works more in line with the owner's requirements and delivers a better user experience. Furthermore, based on the instant map information drawn by SLAM, the control system can plan the most efficient and reasonable cleaning path and cleaning mode, greatly improving the robot's cleaning efficiency.
The drive system may steer the robot 100 across the floor based on drive commands containing distance and angle information, such as x, y, and theta components. The drive system includes drive wheel modules that can control the left and right wheels; for more precise control of the machine's motion, it preferably includes a left drive wheel module and a right drive wheel module. The left and right drive wheel modules are opposed along a transverse axis defined by the body. To move more stably or with greater mobility over the ground, the robot may include one or more driven wheels, including but not limited to universal wheels. Each drive wheel module comprises a travel wheel, a drive motor, and a control circuit for the motor, and may also connect to a circuit for measuring drive current and to an odometer. The drive wheel module can be detachably connected to the main body, making disassembly, assembly, and maintenance convenient. The drive wheel may have a biased drop-type suspension system, movably secured, for example rotatably attached, to the robot body, which receives a spring bias directed downward and away from the robot body. The spring bias allows the drive wheel to maintain contact and traction with the floor with a certain landing force, while the cleaning elements also contact the floor with a certain pressure.
The cleaning system may be a dry cleaning system and/or a wet cleaning system. In a dry cleaning system, the main cleaning function comes from the sweeping system composed of the rolling brush, the dust box, the fan, the air outlet, and the connecting parts between them. The rolling brush, which has a certain interference with the floor, sweeps the garbage on the floor and carries it to the front of the dust suction opening between the rolling brush and the dust box; the garbage is then drawn into the dust box by the suction airflow that the fan generates through the dust box. The dust removal capability of the sweeper can be characterized by the dust pick-up efficiency (DPU), which is influenced by the rolling brush structure and materials, by the wind-power utilization of the air duct formed by the dust suction opening, the dust box, the fan, the air outlet, and the connecting parts between them, and by the type and power of the fan; it is a complex system-design problem. Compared with an ordinary corded vacuum cleaner, improving dust removal capability matters more for a cleaning robot with limited energy, since it directly and effectively reduces the energy requirement: a machine that can clean 80 square meters of floor on one charge could be developed into one that cleans 180 square meters or more on one charge. Reducing the number of charging cycles also greatly extends the service life of the battery, so the user needs to replace the battery less often.
More intuitively and importantly, improved dust removal is the most obvious and important part of the user experience: the user can directly judge whether the floor has been swept or wiped clean. The dry cleaning system can also include an edge brush with a rotation axis angled relative to the floor, for moving debris into the rolling brush area of the cleaning system.
The energy system includes rechargeable batteries, such as nickel-metal-hydride and lithium batteries. The rechargeable battery can be connected to a charging control circuit, a battery-pack charging-temperature detection circuit, and a battery undervoltage monitoring circuit, which in turn connect to the single-chip microcontroller control circuit. The main unit charges by connecting to the charging dock through charging electrodes arranged on the side or underside of the machine body.
The human-computer interaction system comprises buttons on the host panel for the user to select functions; it may also comprise a display screen and/or indicator lights and/or a speaker, which present the machine's current state or function options to the user; and it may further comprise a mobile phone client program. For path-navigating cleaning devices, the mobile client can show the user a map of the environment and the machine's position, and can offer richer and more user-friendly function items.
The intelligent cleaning equipment provided by the disclosure is provided with an image acquisition unit and a distance measurement unit; the image acquisition unit is used for acquiring image data, and the ranging unit is used for acquiring ranging data. The image acquisition unit and the distance measurement unit can be contained in the position determination device of the sensing system. For example, the image acquisition unit may be a camera and the ranging unit may be a laser ranging device. For another example, the image acquisition unit and the ranging unit may be integrated in a camera; for example, a depth-sensing camera having a TOF (Time of flight) function, or a camera using a 3D structured light technique may be employed. Of course, the present disclosure does not limit the specific hardware form of the image acquisition unit and the ranging unit.
Based on the structure of the intelligent cleaning equipment, the invention provides a control method of the intelligent cleaning equipment. The method can comprise the following steps:
the method comprises the steps of obtaining a video of a space scene shot by a camera on the intelligent cleaning equipment in real time, and decomposing the video containing a moving target to be triggered into a plurality of frames of video images according to a preset time interval;
dividing a first region of interest ROI1 and a second region of interest ROI2 of the video image;
if the motion index of the to-be-triggered moving target in the multi-frame video image, which falls into the first region of interest ROI1, reaches a first control threshold value within a preset time interval, a first control signal is sent, and if the motion index of the to-be-triggered moving target in the multi-frame video image, which falls into the second region of interest ROI2, reaches a second control threshold value, a second control signal different from the first control signal is sent.
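Once the per-ROI motion indices for the preset interval are available, the final step of the method reduces to two threshold comparisons. A minimal sketch, in which the signal names and the priority given to ROI1 when both thresholds are met are assumptions, not from the patent:

```python
def classify_gesture(roi1_index: float, roi2_index: float,
                     roi1_threshold: float, roi2_threshold: float):
    """Decide which control signal (if any) to emit from the per-ROI
    motion indices accumulated over the preset interval. Signal names
    are placeholders; ROI1 is checked first by assumption."""
    if roi1_index >= roi1_threshold:
        return "SIGNAL_1"   # e.g. start cleaning
    if roi2_index >= roi2_threshold:
        return "SIGNAL_2"   # e.g. stop cleaning
    return None             # neither threshold reached: no trigger

signal = classify_gesture(roi1_index=0.9, roi2_index=0.1,
                          roi1_threshold=0.8, roi2_threshold=0.8)
```

The None branch corresponds to the case illustrated in fig. 8, where the motion index reaches neither control threshold and no signal is sent.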
The camera may acquire the spatial-scene video using infrared or depth-of-field technology. The "preset time interval" can be understood as follows: the video acquired during the short period just after the intelligent cleaning equipment is powered on may be unstable, so it does not belong to the video to be analyzed and can be cut off; the period during which the equipment operates stably can be regarded as the preset time interval. It will be appreciated that, during the preset time interval, the intelligent cleaning equipment requires a video containing the moving target to be triggered (e.g., an arm).
The spatial scene shot by the camera depends on the angle at which the camera is installed on the intelligent cleaning equipment, and the division of the first region of interest ROI1 and the second region of interest ROI2 of the video image in the subsequent step is likewise affected by the camera's installation angle.
After the video within the preset time interval is acquired and decomposed into multiple frames of video images, a region of interest is divided in each frame. Since the frames all contain the moving target to be triggered, the first region of interest ROI1 may specifically be: the region, within the mapping range formed on the imaging plane S of the camera, corresponding to the arm moving horizontally (leftwards and rightwards) at a constant speed by a second predetermined distance, at a position higher than the camera by a first predetermined distance, with the axis L of the camera as the reference (refer to fig. 1). It will be appreciated that the first region of interest ROI1 is a region in the imaging plane S perpendicular to the axis L of the camera.
As shown in figs. 1 and 2, the first predetermined distance may be set between 1 m and 1.5 m. The second predetermined distance is the distance translated leftward or rightward, taking the intersection of the hand and the axis L of the camera as the reference point. The second predetermined distance may be 30 cm; of course, it may be set to other values according to actual requirements, such as 35 cm or 40 cm.
Likewise, the second region of interest ROI2 may be: the region, within the mapping range formed on the imaging plane S of the camera, corresponding to the arm moving horizontally (leftwards and rightwards) at a constant speed by a third predetermined distance, at a position higher than the camera by the first predetermined distance, with the axis L of the camera as the reference (refer to fig. 1). It will be appreciated that the second region of interest ROI2 is a region in the imaging plane S perpendicular to the axis L of the camera.
Similarly, the third predetermined distance is the distance shifted leftward or rightward beyond the second predetermined distance, taking the intersection of the hand and the axis L of the camera as the reference point. The third predetermined distance may be 50 cm; of course, it may be set to other values according to actual requirements, such as 55 cm or 60 cm.
The dashed box shown in fig. 2 is the real spatial-scene region corresponding to the first region of interest ROI1 and the second region of interest ROI2 in the imaging plane S of the camera. As fig. 2 shows, ROI1 is a rectangle in the imaging plane S perpendicular to the axis L of the camera whose long side lies along the x-axis; ROI2 is a rectangle in the same plane whose long side lies along the y-axis.
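Under a simple pinhole-camera assumption (focal length expressed in pixels, principal point at the image center), the physical extents just described can be mapped to pixel rectangles in the imaging plane S. This is an illustrative sketch only; the focal length, image size, and depth values below are assumptions, not values from the patent:

```python
def roi_pixels(f_px, depth_m, half_width_m, half_height_m, cx, cy):
    """Project a physical rectangle centered on the camera axis L at depth
    `depth_m` onto the imaging plane, pinhole model: size_px = f * size_m / Z."""
    hw = f_px * half_width_m / depth_m   # half-width in pixels
    hh = f_px * half_height_m / depth_m  # half-height in pixels
    return (cx - hw, cy - hh, cx + hw, cy + hh)

# ROI1: long side along the x-axis, e.g. a 30 cm sweep to each side of axis L
# seen from roughly 1.2 m; ROI2 would swap the long and short sides.
roi1 = roi_pixels(f_px=600, depth_m=1.2, half_width_m=0.30, half_height_m=0.10,
                  cx=320, cy=240)
```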
As described above, after the region of interest of the video image containing the moving target to be triggered has been divided, the relationship between the motion index of the target and the control thresholds is determined next. If, within the preset time interval, the motion index of the target falling into the first region of interest ROI1 across the multiple frames of video images reaches the first control threshold, a first control signal (for example, a start signal) is sent; if the motion index of the target falling into the second region of interest ROI2 reaches the second control threshold, a second control signal different from the first (for example, a stop signal) is sent. The first and second control signals may also be set as signals for switching between different operation modes.
The first region of interest ROI1 may be understood as the region in which the movement of the moving target to be triggered is decomposed into an X-direction component, and the second region of interest ROI2 as the region in which it is decomposed into a Y-direction component.
With reference to the specific example: if, during the movement period of the moving target (the hand), the hand sweeps predominantly in the horizontal direction, the motion index of the hand in the first region of interest ROI1 satisfies the first control threshold, and the intelligent cleaning equipment is turned on by the first control signal. If the hand sweeps predominantly in the vertical direction, the motion index of the hand in the second region of interest ROI2 satisfies the second control threshold, and the intelligent cleaning equipment is stopped by the second control signal.
Specifically, the motion index of the moving target to be triggered may include any one or a combination of the coverage of the motion, the action amplitude, and the size of the motion center. Setting the motion index in this way improves the accuracy of triggering the action to be executed.
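A sketch of how the three named quantities (coverage, action amplitude, motion center) might be computed from a per-frame trajectory. The patent does not specify exact formulas, so this decomposition is an assumption:

```python
import numpy as np

def motion_index(points):
    """Illustrative decomposition of the motion index into the three named
    quantities: coverage (bounding-box area), action amplitude (largest
    single-axis sweep), and the motion center of the trajectory."""
    pts = np.asarray(points, dtype=float)
    span = pts.max(axis=0) - pts.min(axis=0)  # per-axis extent of the motion
    coverage = float(span[0] * span[1])       # area covered by the motion
    amplitude = float(span.max())             # largest sweep on one axis
    center = tuple(pts.mean(axis=0))          # motion center
    return coverage, amplitude, center
```

Any one of the three outputs, or a weighted combination, could then be compared against the control thresholds.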
The thresholds for the motion index are empirical values, chosen so that normal palm movements are reliably observed in the optical flow map.
It can be understood that, to obtain the motion index of the moving target to be triggered, the coordinates of the target in the region of interest must first be obtained. When images are collected by the camera, background information and stationary objects in the scene can be filtered out of each frame by a feature extraction algorithm so as to extract the moving target to be triggered.
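The text does not name the feature extraction algorithm. One common stand-in is per-pixel differencing against a median background estimate, sketched below; the threshold value is assumed:

```python
import numpy as np

def extract_moving_target(frames, thresh=25):
    """Stand-in for the unnamed feature extraction algorithm: estimate the
    static background as the per-pixel median over the frame stack, then mark
    pixels of the latest frame that deviate strongly as the moving target."""
    stack = np.asarray(frames, dtype=float)
    background = np.median(stack, axis=0)  # background + stationary objects
    diff = np.abs(stack[-1] - background)  # change in the latest frame
    return diff > thresh                   # boolean foreground mask
```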
If the camera lens exhibits distortion, a (k1, k2, p1, p2, k3) distortion model can be adopted when extracting the moving target to be triggered, and distortion correction is completed according to a pre-calibration result. The distortion model is:

xc = x(1 + k1·r² + k2·r⁴ + k3·r⁶) + [2·p1·x·y + p2·(r² + 2x²)]

yc = y(1 + k1·r² + k2·r⁴ + k3·r⁶) + [2·p1·x·y + p2·(r² + 2y²)]

wherein (x, y) are the coordinates of the distorted moving target to be triggered on the imaging plane of the camera, and (xc, yc) are the coordinates of the corrected moving target on the imaging plane of the camera.
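The corrected coordinates can be computed directly from the model as printed. Note that the patent does not define r; the sketch below assumes the standard convention r² = x² + y², and keeps the tangential term exactly as printed (even though it differs slightly from the usual Brown-Conrady arrangement for the y equation):

```python
def distort_correct(x, y, k1, k2, p1, p2, k3):
    """Evaluate the (k1, k2, p1, p2, k3) model exactly as printed above,
    assuming r^2 = x^2 + y^2 (the convention is not stated in the text)."""
    r2 = x * x + y * y
    radial = 1 + k1 * r2 + k2 * r2**2 + k3 * r2**3
    xc = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    yc = y * radial + 2 * p1 * x * y + p2 * (r2 + 2 * y * y)
    return xc, yc
```

With all five coefficients zero the mapping reduces to the identity, which is a quick sanity check on a calibration pipeline.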
The control principle of intelligent cleaning equipment applying the control method of the invention during testing is described below with reference to figs. 3-8, to facilitate a better understanding of the invention.
As shown in fig. 3, the darker lines represent the motion trajectory of the to-be-triggered moving target in the X direction, and the lighter lines its trajectory in the Y direction. The amplitude of the X-direction trajectory is large, and its motion index meets the set first control threshold; the detection result is therefore as shown in fig. 4, and the action of starting the intelligent cleaning equipment is triggered.
As shown in fig. 5, the darker lines again represent the X-direction trajectory and the lighter lines the Y-direction trajectory. Here the amplitude of the Y-direction trajectory is large, and its motion index meets the set second control threshold; the detection result is therefore as shown in fig. 6, and the action of stopping the intelligent cleaning equipment is triggered.
As shown in fig. 7, the darker lines represent the X-direction trajectory and the lighter lines the Y-direction trajectory. The motion index meets neither the first nor the second control threshold, so no action is triggered.
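A minimal sketch of the decision rule illustrated by figs. 3-8, assuming the motion index is reduced to per-axis trajectory amplitude; the function and signal names are illustrative, not from the patent:

```python
def classify_gesture(xs, ys, x_thresh, y_thresh):
    """Decision rule sketched from figs. 3-8: compare per-axis trajectory
    amplitudes against the control thresholds (names are illustrative)."""
    x_amp = max(xs) - min(xs)
    y_amp = max(ys) - min(ys)
    if x_amp >= x_thresh and x_amp > y_amp:
        return "start"  # dominant horizontal sweep: first control signal
    if y_amp >= y_thresh and y_amp > x_amp:
        return "stop"   # dominant vertical sweep: second control signal
    return None         # neither threshold met: no action (as in fig. 7)
```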
The invention also provides an intelligent cleaning device, the specific structure of which is described in detail above and is not described in detail herein. Furthermore, the present invention also provides a computer storage medium having a computer program stored thereon, which when executed by a processor, performs the steps of the method described in any of the above embodiments.
According to the above control method, starting and stopping of the equipment can be controlled by recognizing the user's limb movements through the camera; the user does not need to operate the keys on the body of the intelligent cleaning equipment in person, which is a great convenience.
The flows and steps described in all the preferred embodiments described above are only examples. Unless an adverse effect occurs, various processing operations may be performed in a different order from the order of the above-described flow. The above-mentioned steps of the flow can be added, combined or deleted according to the actual requirement.
Further, the commands, command numbers, and data items described in all the preferred embodiments described above are only examples, and thus the commands, command numbers, and data items may be set in any manner as long as the same functions are achieved. The units of the terminal of the preferred embodiments may also be integrated, further divided or subtracted according to actual needs.
The present invention has been described in terms of the above embodiments, but it should be understood that the above embodiments are for purposes of illustration and description only and are not intended to limit the invention to the scope of the described embodiments. Furthermore, it will be understood by those skilled in the art that the present invention is not limited to the embodiments described above, and that many variations and modifications may be made in accordance with the teachings of the present invention, which variations and modifications fall within the scope of the present invention as claimed. The scope of the invention is defined by the appended claims and equivalents thereof.

Claims (10)

1. An appliance control method for an intelligent cleaning appliance, the control method comprising:
acquiring a video of a space scene shot by a camera on the intelligent cleaning equipment in real time, and decomposing the video containing the moving target to be triggered into a plurality of frames of video images according to a preset time interval;
dividing a first region of interest ROI1 and a second region of interest ROI2 of the video image;
if the motion index of the to-be-triggered moving target falling into the first region of interest ROI1 in the multi-frame video image reaches a first control threshold value within the preset time interval, a first control signal is sent, and if the motion index of the to-be-triggered moving target falling into the second region of interest ROI2 in the multi-frame video image reaches a second control threshold value, a second control signal different from the first control signal is sent.
2. The device control method according to claim 1, characterized in that the first region of interest ROI1 and the second region of interest ROI2 are imaging plane regions perpendicular to an axis of the camera.
3. The apparatus control method according to claim 2, wherein the first region of interest ROI1 is a rectangle within the imaging plane region perpendicular to the axis of the camera whose long side lies along the x-axis direction; and the second region of interest ROI2 is a rectangle within the same imaging plane region whose long side lies along the y-axis direction.
4. The device control method according to claim 1, characterized in that the first region of interest ROI1 is: and horizontally moving at a constant speed a second preset distance in a mapping range formed by an imaging plane of the camera by taking the axis of the camera as a reference at a position higher than the first preset distance of the camera corresponding to the moving target to be triggered.
5. The device control method according to claim 4, characterized in that the second region of interest ROI2 is: and horizontally moving a third preset distance at a constant speed within a mapping range formed by an imaging plane of the camera by taking the axis of the camera as a reference at a position higher than the first preset distance of the camera corresponding to the moving target to be triggered, wherein the third preset distance is different from the second preset distance.
6. The device control method according to claim 1, wherein the motion index includes any one or a combination of a coverage of motion, an action amplitude, and a size of a motion center of the motion target to be triggered.
7. The device control method according to claim 1, wherein background information and stationary objects in the scene are filtered out of each frame of the video image by a feature extraction algorithm so as to extract the moving target to be triggered.
8. The apparatus control method according to claim 7, wherein, if the camera lens is distorted, when the moving target to be triggered is extracted, distortion correction is performed according to a pre-calibration result using a (k1, k2, p1, p2, k3) distortion model:

xc = x(1 + k1·r² + k2·r⁴ + k3·r⁶) + [2·p1·x·y + p2·(r² + 2x²)]

yc = y(1 + k1·r² + k2·r⁴ + k3·r⁶) + [2·p1·x·y + p2·(r² + 2y²)]

wherein (x, y) are the coordinates of the distorted moving target to be triggered on the imaging plane of the camera, and (xc, yc) are the coordinates of the corrected moving target on the imaging plane of the camera.
9. An intelligent cleaning device comprising a memory, a processor and a computer program stored on the memory and running on the processor, characterized in that the steps of the method of any of claims 1 to 8 are implemented when the program is executed by the processor.
10. A computer storage medium having a computer program stored thereon, wherein the program, when executed by a processor, performs the steps of the method of any one of claims 1 to 8.
CN202010948471.0A 2020-09-10 2020-09-10 Intelligent cleaning device, control method and computer storage medium Active CN112244705B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010948471.0A CN112244705B (en) 2020-09-10 2020-09-10 Intelligent cleaning device, control method and computer storage medium


Publications (2)

Publication Number Publication Date
CN112244705A true CN112244705A (en) 2021-01-22
CN112244705B CN112244705B (en) 2023-05-23

Family

ID=74232154

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010948471.0A Active CN112244705B (en) 2020-09-10 2020-09-10 Intelligent cleaning device, control method and computer storage medium

Country Status (1)

Country Link
CN (1) CN112244705B (en)

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050241466A1 (en) * 2001-08-16 2005-11-03 Humanbeams, Inc. Music instrument system and methods
CN103257713A (en) * 2013-05-31 2013-08-21 华南理工大学 Gesture control method
US20140152566A1 (en) * 2012-12-05 2014-06-05 Brent A. Safer Apparatus and methods for image/sensory processing to control computer operations
CN104422066A (en) * 2013-08-23 2015-03-18 珠海格力电器股份有限公司 Intelligent air conditioner control system and method and air conditioner
CN105725932A (en) * 2016-01-29 2016-07-06 江西智能无限物联科技有限公司 Intelligent sweeping robot
CN107656443A (en) * 2017-09-18 2018-02-02 成都易慧家科技有限公司 A kind of intelligent home control system and method based on deep learning
CN108107796A (en) * 2018-01-23 2018-06-01 广东美的厨房电器制造有限公司 Kitchen ventilator and its gesture identification control method, device and readable storage medium storing program for executing
CN108758728A (en) * 2018-03-29 2018-11-06 青岛海尔智能技术研发有限公司 A kind of range hood control method and range hood based on impetus
US20190167059A1 (en) * 2017-12-06 2019-06-06 Bissell Inc. Method and system for manual control of autonomous floor cleaner
CN109991859A (en) * 2017-12-29 2019-07-09 青岛有屋科技有限公司 A kind of gesture instruction control method and intelligent home control system
CN110069137A (en) * 2019-04-30 2019-07-30 徐州重型机械有限公司 Gestural control method, control device and control system
CN110226184A (en) * 2016-12-27 2019-09-10 杰拉德·迪尔克·施密茨 For machine sensible system and method
CN111062269A (en) * 2019-11-25 2020-04-24 珠海格力电器股份有限公司 User state identification method and device, storage medium and air conditioner
CN111461059A (en) * 2020-04-21 2020-07-28 哈尔滨拓博科技有限公司 Multi-zone multi-classification extensible gesture recognition control device and control method


Also Published As

Publication number Publication date
CN112244705B (en) 2023-05-23


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20220415

Address after: 102299 No. 8008, floor 8, building 16, yard 37, Chaoqian Road, Zhongguancun Science and Technology Park, Changping District, Beijing

Applicant after: Beijing Stone Innovation Technology Co.,Ltd.

Address before: 100192 No. 6016, 6017, 6018, Block C, No. 8 Heiquan Road, Haidian District, Beijing

Applicant before: Beijing Roborock Technology Co.,Ltd.

GR01 Patent grant