CN116982883A - Method and device for executing cleaning operation, storage medium and electronic device

Info

Publication number
CN116982883A
CN116982883A (application CN202210441621.8A)
Authority
CN
China
Prior art keywords
target object
gesture recognition
feature
biological
cleaning robot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210441621.8A
Other languages
Chinese (zh)
Inventor
吴飞
郁顺昌
汤盛浩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dreame Innovation Technology Suzhou Co Ltd
Original Assignee
Dreame Innovation Technology Suzhou Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dreame Innovation Technology Suzhou Co Ltd
Priority to CN202210441621.8A
Priority to PCT/CN2023/088040 (published as WO2023207611A1)
Publication of CN116982883A
Legal status: Pending

Classifications

    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00 Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/40 Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4011 Regulation of the cleaning machine by electric means; Control systems and remote control systems therefor
    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00 Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/24 Floor-sweeping machines, motor-driven
    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00 Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/40 Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 Movements or behaviour, e.g. gesture recognition
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 Movements or behaviour, e.g. gesture recognition
    • G06V40/28 Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L2201/00 Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • A47L2201/06 Control of the cleaning action for autonomous devices; Automatic detection of the surface condition before, during or after cleaning

Abstract

The application discloses a method and a device for executing a cleaning operation, a storage medium and an electronic device, wherein the method comprises the following steps: acquiring a biological recognition result of a target object to obtain the biological feature of the target object, wherein the target object is located in a first scanning area of a mobile terminal or in a second scanning area of a cleaning robot; acquiring a gesture recognition feature of the target object under the condition that the biological feature passes verification; and acquiring a control instruction corresponding to the gesture recognition feature and controlling the cleaning robot to execute the cleaning operation corresponding to the control instruction. The method solves the problems in the related art that a sweeping robot is controlled directly by voice recognition without verifying the voice, and that the loud noise produced while the sweeping robot works degrades voice recognition and therefore the user's experience of controlling the robot.

Description

Method and device for executing cleaning operation, storage medium and electronic device
[ Technical Field ]
The present application relates to the field of robots, and in particular, to a method and apparatus for performing a cleaning operation, a storage medium, and an electronic apparatus.
[ Background Art ]
In recent years, with rising living standards and the rapid development of technology, sweeping robots have brought convenience to daily life, freeing people's hands to some extent and reducing housework. Controlling the sweeping mode of the sweeping robot, however, remains a difficult problem.
In the prior art, a sweeping robot is controlled by voice recognition: the user sends sweeping commands to the robot through voice interaction, directing it to clean a specified area, so as to guarantee the robot's working efficiency and cleanliness. However, this approach overlooks the loud noise the robot produces while running, which degrades voice wake-up and voice interaction and thus the user experience; moreover, once a voice command is accepted, the robot simply executes the cleaning operation, and there is no scheme for identifying the object that issued the command.
In the related art, controlling the sweeping robot directly by voice recognition does not verify the voice, and the loud noise produced while the robot works degrades voice recognition and thus the user's experience of controlling the robot; no effective solution to these problems has yet been proposed.
[ Summary of the Invention ]
The embodiments of the invention provide a method and a device for executing a cleaning operation, a storage medium, and an electronic device, so as to at least solve the problems in the related art that controlling a sweeping robot by voice recognition suffers from loud noise, poor recognition, and a poor user experience.
According to an aspect of an embodiment of the present invention, there is provided a method of performing a cleaning operation, including: acquiring a biological identification result of a target object to obtain the biological characteristics of the target object, wherein the target object is positioned in a first scanning area of a mobile terminal or in a second scanning area of a cleaning robot; acquiring gesture recognition characteristics of the target object under the condition that the biological characteristics pass verification; and acquiring a control instruction corresponding to the gesture recognition feature, and controlling the cleaning robot to execute a cleaning operation corresponding to the control instruction.
In an exemplary embodiment, before acquiring the gesture recognition feature of the target object in a case where the biometric feature is verified, the method further includes: determining whether the cleaning robot recognizes a wake-up gesture recognition feature of the target object, wherein the wake-up gesture recognition feature is used for indicating the cleaning robot to start a target recognition function, and the target recognition function is used for determining a control instruction corresponding to the gesture recognition feature; and starting the target recognition function when the cleaning robot recognizes the wake gesture recognition feature of the target object.
In an exemplary embodiment, after turning on the target recognition function, the method further comprises: and closing the target recognition function under the condition that the gesture recognition characteristic of the current object is not recognized within a preset time period after the current moment.
In an exemplary embodiment, acquiring the biological recognition result of the target object to obtain the biological feature of the target object includes at least one of the following: performing biological feature recognition on the target object through the cleaning robot to obtain a biological recognition result, and parsing the biological feature of the target object from the biological recognition result; and receiving a biological recognition result obtained by the mobile terminal performing biological feature recognition on the target object, and parsing the biological feature of the target object from the biological recognition result.
In an exemplary embodiment, before acquiring the control instruction corresponding to the gesture recognition feature, the method further includes: receiving a setting operation of the target object; and setting the corresponding relation between different gesture recognition features and different control instructions in response to the setting operation.
In an exemplary embodiment, while controlling the cleaning robot to perform the cleaning operation corresponding to the control instruction, the method includes: determining whether the cleaning robot recognizes a termination gesture recognition feature of the target object; and if the termination gesture recognition feature is recognized, terminating the cleaning operation that is being executed.
In an exemplary embodiment, obtaining a control instruction corresponding to the gesture recognition feature includes: acquiring a target corresponding relation corresponding to the biological characteristics of the target object, wherein the target corresponding relation is a corresponding relation between gesture recognition characteristics of the target object and control instructions, and the biological characteristics of different target objects correspond to different corresponding relations; and acquiring a control instruction corresponding to the gesture recognition feature from the target corresponding relation.
According to another aspect of the embodiment of the present invention, there is also provided an apparatus for performing a cleaning operation, including: the first acquisition unit is used for acquiring a biological identification result of a target object to obtain the biological characteristics of the target object, wherein the target object is positioned in a first scanning area of the mobile terminal or in a second scanning area of the cleaning robot; a second obtaining unit, configured to obtain a gesture recognition feature of the target object if the biometric feature passes verification; and the first control unit is used for acquiring a control instruction corresponding to the gesture recognition characteristic and controlling the cleaning robot to execute a cleaning operation corresponding to the control instruction.
In an exemplary embodiment, the executing device further includes: a first determining unit, configured to determine whether the cleaning robot recognizes a wake-up gesture recognition feature of the target object, wherein the wake-up gesture recognition feature is used to instruct the cleaning robot to turn on a target recognition function, and the target recognition function is used to determine the control instruction corresponding to a gesture recognition feature; and configured to turn on the target recognition function when the cleaning robot recognizes the wake-up gesture recognition feature of the target object.
In an exemplary embodiment, the first determining unit is further configured to turn off the target recognition function if the gesture recognition feature of the current object is not recognized within a preset period of time after the current time.
In an exemplary embodiment, the first obtaining unit is further configured to perform biometric identification on the target object by using the cleaning robot, obtain the biometric result, and parse the biometric of the target object from the biometric result; and receiving a biological recognition result obtained by the mobile terminal for carrying out biological feature recognition on the target object, and analyzing the biological feature of the target object from the biological recognition result.
In an exemplary embodiment, the second obtaining unit is further configured to receive a setting operation of the target object; and setting the corresponding relation between different gesture recognition features and different control instructions in response to the setting operation.
In an exemplary embodiment, the first control unit further includes: a second determining unit, configured to determine whether the cleaning robot recognizes a termination gesture recognition feature of the target object, and to terminate the cleaning operation that is being executed if the termination gesture recognition feature is recognized.
In an exemplary embodiment, the second obtaining unit is further configured to obtain a target corresponding relationship corresponding to a biological feature of the target object, where the target corresponding relationship is a corresponding relationship between a gesture recognition feature of the target object and a control instruction, and different biological features of different target objects correspond to different corresponding relationships; and acquiring a control instruction corresponding to the gesture recognition feature from the target corresponding relation.
According to a further aspect of embodiments of the present invention, there is also provided a computer-readable storage medium having a computer program stored therein, wherein the computer program is configured to execute the execution method of the cleaning operation described above when run.
According to still another aspect of the embodiments of the present invention, there is further provided an electronic device including a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor executes the above-mentioned cleaning operation executing method by the computer program.
In the embodiments of the invention, the biological recognition result of the target object is acquired to obtain the biological feature of the target object, i.e. the identity of the target object is recognized, wherein the target object is located in the first scanning area of the mobile terminal or in the second scanning area of the cleaning robot, i.e. identity recognition is performed through either the mobile terminal or the cleaning robot; the gesture recognition feature of the target object is acquired when the biological feature passes verification; and the control instruction corresponding to the gesture recognition feature is acquired and the cleaning robot is controlled to execute the corresponding cleaning operation. This solves the problems in the related art that a sweeping robot controlled directly by unverified voice recognition suffers, in its noisy working environment, from poor voice recognition and a poor user experience. Because the identity of the target object issuing the gesture is recognized first, the safety and usability of the robot are improved, and the noise that would otherwise impair the efficiency of the robot's intelligent control is avoided.
[ Description of the Drawings ]
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and together with the description, serve to explain the principles of the application.
In order to more clearly illustrate the embodiments of the application or the technical solutions of the prior art, the drawings which are used in the description of the embodiments or the prior art will be briefly described, and it will be obvious to a person skilled in the art that other drawings can be obtained from these drawings without inventive effort.
FIG. 1 is a schematic illustration of a hardware environment for an alternative method of performing a cleaning operation in accordance with an embodiment of the present application;
FIG. 2 is a flow chart of an alternative method of performing a cleaning operation according to an embodiment of the present application;
FIG. 3 is a flow chart of another alternative method of performing a cleaning operation according to an embodiment of the present application;
FIG. 4 is a block diagram of an alternative cleaning operation performing device according to an embodiment of the present application;
fig. 5 is a block diagram of an alternative electronic device according to an embodiment of the application.
[ Detailed Description ]
The application will be described in detail hereinafter with reference to the drawings in conjunction with embodiments.
It should be noted that, without conflict, the embodiments of the present application and features of the embodiments may be combined with each other.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present application and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the application described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
According to one embodiment of the present application, a method of performing a cleaning operation is provided. Optionally, in the present embodiment, the above method may be applied to a hardware environment constituted by the robot 102 and the server 104 as shown in fig. 1. As shown in fig. 1, the robot 102 may be connected through a network to a server 104 (e.g., an internet of things platform or cloud server) that controls the robot 102.
The network may include, but is not limited to, at least one of: wired network, wireless network. The wired network may include, but is not limited to, at least one of: a wide area network, a metropolitan area network, a local area network, and the wireless network may include, but is not limited to, at least one of: WIFI (Wireless Fidelity ), bluetooth, infrared.
The method for performing the cleaning operation according to the embodiment of the present application may be performed by the robot 102 or the server 104 alone, or may be performed by both the robot 102 and the server 104. The method for performing the cleaning operation by the robot 102 according to the embodiment of the present application may be performed by a client installed thereon.
Taking the method of performing the cleaning operation in the present embodiment by the robot 102 as an example, fig. 2 is a schematic flow chart of an alternative method of performing the cleaning operation according to an embodiment of the present application, and as shown in fig. 2, the flow of the method may include the following steps:
step S202, acquiring a biological identification result of a target object to obtain a biological characteristic of the target object, wherein the target object is positioned in a first scanning area of a mobile terminal or in a second scanning area of a cleaning robot;
step S204, acquiring gesture recognition characteristics of the target object under the condition that the biological characteristics pass verification;
step S206, a control instruction corresponding to the gesture recognition feature is acquired, and the cleaning robot is controlled to execute a cleaning operation corresponding to the control instruction.
Through the above steps, the biological recognition result of the target object is acquired to obtain the biological feature of the target object, i.e. the identity of the target object is recognized, wherein the target object is located in the first scanning area of the mobile terminal or in the second scanning area of the cleaning robot, i.e. identity recognition is performed through either device; the gesture recognition feature of the target object is acquired when the biological feature passes verification; and the control instruction corresponding to the gesture recognition feature is acquired and the cleaning robot is controlled to execute the corresponding cleaning operation. This solves the problems in the related art that a sweeping robot controlled directly by unverified voice recognition suffers, in its noisy working environment, from poor voice recognition and a poor user experience; because the identity of the target object can be recognized through the mobile terminal or the cleaning robot, the safety of using the robot is improved.
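For illustration only, the flow of steps S202 to S206 may be sketched in Python as follows; every object and method name here (scan_biometric, verify_feature, recognize_gesture, lookup_instruction, execute) is a hypothetical placeholder introduced for the sketch, not part of the disclosed implementation:

```python
# A minimal sketch of steps S202-S206 under assumed interfaces;
# every method name below is a hypothetical placeholder.

def perform_cleaning_flow(robot, mobile_terminal):
    # Step S202: obtain a biometric result from whichever scanning
    # area the target object is in, and parse out the biometric feature.
    result = mobile_terminal.scan_biometric() or robot.scan_biometric()
    feature = result["biometric_feature"]

    # Step S204: only a verified biometric feature unlocks gesture input.
    if not robot.verify_feature(feature):
        return

    gesture = robot.recognize_gesture()

    # Step S206: map the gesture recognition feature to its bound
    # control instruction and execute the cleaning operation.
    instruction = robot.lookup_instruction(gesture)
    robot.execute(instruction)
```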
In an exemplary embodiment, before acquiring the gesture recognition feature of the target object in a case where the biometric feature is verified, the method further includes: determining whether the cleaning robot recognizes a wake-up gesture recognition feature of the target object, wherein the wake-up gesture recognition feature is used for indicating the cleaning robot to start a target recognition function, and the target recognition function is used for determining a control instruction corresponding to the gesture recognition feature; starting the target recognition function under the condition that the cleaning robot recognizes the wake gesture recognition feature of the target object; and prohibiting the starting of the target recognition function under the condition that the cleaning robot does not recognize the wake gesture recognition characteristic of the target object.
That is, when the biological feature passes verification, it is first determined whether the cleaning robot recognizes the wake-up gesture recognition feature of the target object. Once that feature is recognized, the robot is instructed to turn on the target recognition function, which is used to recognize the control instruction corresponding to the target object's gesture recognition feature and so to control the robot; if the wake-up gesture recognition feature is not acquired, turning on the target recognition function is prohibited. Requiring the wake-up gesture in this way prevents misoperation. Take face recognition by a cleaning robot as an example: a household contains a male owner, a female owner and a child, and the two owners have different habitual motions, so the wake-up gesture recognition features they record differ; that is, each owner has a personal set of gesture recognition features for controlling the robot. If one owner accidentally lets the robot acquire a biological feature that passes verification while the child plays nearby, some of the child's random motions might falsely trigger control instructions of the robot. To avoid this, after the biological feature passes verification, a wake-up gesture recognition feature can be added so that only this specific gesture turns on the target recognition function; if the wake-up gesture recognition feature is not acquired, it is determined that the target recognition function is not to be turned on.
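A minimal sketch of this gating logic, assuming hypothetical method names on the robot object:

```python
# Sketch only: gate the target recognition function behind the
# wake-up gesture; method names are assumptions, not a disclosed API.

def handle_wake_gesture(robot):
    if robot.recognize_wake_gesture():
        # The specific wake-up gesture turns the function on.
        robot.enable_target_recognition()
    else:
        # No wake-up gesture recognized: prohibit turning it on.
        robot.disable_target_recognition()
```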
In an exemplary embodiment, after turning on the target recognition function, the method further comprises: and closing the target recognition function under the condition that the gesture recognition characteristic of the current object is not recognized within a preset time period after the current moment.
It can be understood that, in actual operation, the user may accidentally make the gesture corresponding to the wake-up gesture recognition feature, or may be delayed by other matters after waking the cleaning robot and never send the subsequent gesture recognition feature. In such cases, leaving the robot permanently in the state of having the target recognition function turned on after being awakened would harm that function and waste energy. A preset period is therefore set for the robot: after being awakened, if it does not recognize a gesture recognition feature of the target object within the preset period, it actively turns the target recognition function off.
For example, an owner may accidentally let the robot acquire a biological feature that passes identity verification, and a child in the home, imitating an adult's motion, may then make the corresponding wake-up gesture. After recognizing the wake-up gesture recognition feature, the cleaning robot turns on the target recognition function and waits to receive a gesture recognition feature; but the child sends no further gesture recognition feature to the robot, and the robot cannot remain in the receiving state indefinitely. Therefore, after being awakened, if the robot does not recognize a gesture recognition feature of the target object within the preset period, it actively turns the target recognition function off to avoid harm to the system.
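A sketch of this timeout behaviour, assuming a polling interface and a hypothetical preset period of 30 seconds (the disclosure only says "preset"):

```python
import time

PRESET_PERIOD_S = 30.0  # assumed value; the disclosure only says "preset"

def await_gesture_after_wake(robot):
    """After wake-up, actively close the target recognition function
    if no gesture recognition feature arrives within the preset period."""
    deadline = time.monotonic() + PRESET_PERIOD_S
    while time.monotonic() < deadline:
        gesture = robot.try_recognize_gesture()  # None if nothing seen
        if gesture is not None:
            return gesture
        time.sleep(0.1)  # poll at a modest rate
    robot.disable_target_recognition()  # timeout: turn the function off
    return None
```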
In an exemplary embodiment, acquiring the biological recognition result of the target object to obtain the biological feature of the target object includes at least one of the following: performing biological feature recognition on the target object through the cleaning robot to obtain a biological recognition result, and parsing the biological feature of the target object from the biological recognition result; and receiving a biological recognition result obtained by the mobile terminal performing biological feature recognition on the target object, and parsing the biological feature of the target object from the biological recognition result.
In practical application, the user can perform identity verification through the recognition unit of the cleaning robot, or through the mobile terminal device bound to the robot, which makes it convenient to control the robot anytime and anywhere. For example, a user at work suddenly remembers that a visitor is coming today while the home has not yet been cleaned; to avoid the bad impression an untidy home would make, the user controls the cleaning robot through the mobile phone and directs it to clean certain designated areas of the home.
It should be noted that the mobile terminal device may be a mobile smart device such as a mobile phone, a tablet, or a smart watch, which is not limited by the present application.
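Both identification paths can be sketched as one helper; the result format and method names are assumptions introduced here:

```python
def obtain_biometric_feature(robot, terminal_result=None):
    """Sketch of the two acquisition paths: either parse a result the
    bound mobile terminal sent over, or recognize locally on the robot."""
    result = terminal_result if terminal_result is not None \
        else robot.recognize_biometric()
    return result["biometric_feature"]  # parse the feature from the result
```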
In an exemplary embodiment, before acquiring the control instruction corresponding to the gesture recognition feature, the method further includes: receiving a setting operation of the target object; and setting the corresponding relation between different gesture recognition features and different control instructions in response to the setting operation.
To ensure that the user can accurately direct the cleaning robot to clean a designated area through gesture recognition features, the user can, before using the robot, enter gesture recognition features into it and bind each feature to its corresponding control instruction. For example, the user enters a circle-drawing gesture recognition feature and, once the robot has received it, immediately clicks the clean-whole-house command; the robot then binds that gesture recognition feature to the clean-whole-house control instruction, and every subsequent recognition of the circle-drawing gesture recognition feature executes the clean-whole-house command.
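As an illustration of the setting operation, the correspondence could be stored per user as a simple mapping; the names and storage format below are assumptions, not the disclosed data structure:

```python
# Hypothetical per-user table: biometric identity -> {gesture: instruction}.
correspondences = {}

def set_binding(user_id, gesture_feature, instruction):
    """Respond to a setting operation by binding one gesture
    recognition feature to one control instruction for this user."""
    correspondences.setdefault(user_id, {})[gesture_feature] = instruction

# e.g. binding the circle-drawing gesture to the clean-whole-house command
set_binding("owner_a", "draw_circle", "CLEAN_WHOLE_HOUSE")
```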
Based on the above steps, controlling the cleaning robot to execute the cleaning operation corresponding to the control instruction includes: determining whether the cleaning robot recognizes a termination gesture recognition feature of the target object; and if the termination gesture recognition feature is recognized, terminating the cleaning operation that is being executed.
While executing the cleaning operation corresponding to the recognized control instruction, the cleaning robot also keeps monitoring for other gesture recognition features the user may send, such as a termination gesture recognition feature. For instance, while the robot is cleaning a designated area, a child may be active in that area; to keep the child from being tripped by the robot and exposed to a safety hazard, the user wants the robot to stop and therefore sends the termination gesture recognition feature mid-operation. If the robot recognizes the termination gesture recognition feature while running, it immediately stops the cleaning operation being executed.
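A sketch of this monitoring loop during execution, under the same assumed robot interface:

```python
def run_with_termination_watch(robot, instruction, stop_gesture="terminate"):
    """Execute the cleaning operation while continuously watching for
    the termination gesture recognition feature (names are assumptions)."""
    robot.start_cleaning(instruction)
    while robot.is_cleaning():
        gesture = robot.try_recognize_gesture()
        if gesture == stop_gesture:
            robot.stop_cleaning()  # immediately terminate the ongoing work
            break
```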
In an exemplary embodiment, obtaining a control instruction corresponding to the gesture recognition feature includes: acquiring a target corresponding relation corresponding to the biological characteristics of the target object, wherein the target corresponding relation is a corresponding relation between gesture recognition characteristics of the target object and control instructions, and the biological characteristics of different target objects correspond to different corresponding relations; and acquiring a control instruction corresponding to the gesture recognition feature from the target corresponding relation.
Different users in a household may have different gesture habits. When acquiring the control instruction corresponding to a gesture recognition feature, the cleaning robot therefore first obtains, according to the acquired biological feature of the target object, the correspondence that this target object has set between gesture recognition features and control instructions, and then acquires from that correspondence the control instruction corresponding to the gesture recognition feature.
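Continuing the hypothetical table sketched above, the two-stage lookup might read:

```python
def lookup_instruction(correspondences, user_id, gesture_feature):
    """First fetch the target correspondence for this user's biometric
    identity, then the instruction bound to the recognized gesture."""
    user_table = correspondences.get(user_id, {})  # per-user correspondence
    return user_table.get(gesture_feature)         # None if nothing is bound
```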
For a better understanding of the technical solutions of the embodiments and alternative embodiments of the invention, the flow of the above method of performing a cleaning operation is described below by way of example; the example does not limit the technical solutions of the embodiments of the invention.
As shown in fig. 3, the flow of the method for performing the cleaning operation in this alternative embodiment may include the following steps:
step S302, inputting face recognition data to a server through a mobile phone APP matched with a sweeper (equivalent to the cleaning robot);
step S304, starting;
step S306, the floor sweeper (equivalent to the cleaning robot) performs face recognition identity verification on the user through a preset AI camera; if the verification is passed, step S308 is executed, and if not, step S316 is executed;
step S308, capturing a gesture recognition characteristic image of a user through an AI camera;
step S310, recognizing the captured gesture recognition feature image through the gesture recognition algorithm model built into the sweeper (equivalent to the cleaning robot), comparing the recognized gesture recognition feature with the gesture recognition features in a model library, and judging whether it is a stored gesture recognition feature; if not, executing step S312; if yes, executing step S314;
step S312: prompting the user to input correct gesture recognition features;
step S314: acquiring a control instruction corresponding to the gesture recognition characteristic, and sending the control instruction to a motion control module to control a sweeper (equivalent to the cleaning robot) to run;
step S316: and (5) ending.
In this embodiment, a gesture recognition algorithm is provided for the sweeper: the gesture recognition feature sent by the user is captured and recognized, compared against the gesture recognition features in the model library, and, when the comparison succeeds, the corresponding control instruction is issued and the sweeper executes the corresponding sweeping operation. The sweeper prevents false recognition through the double verification of face recognition plus wake-up gesture recognition. These steps solve the problems in the related art that loud noise degrades voice recognition and thus the user's experience of controlling a sweeping robot, and avoid letting noise impair the efficiency of the robot's intelligent control.
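For illustration only, the fig. 3 flow (steps S304 to S316) can be sketched end to end; the AI-camera and model-library interfaces below are assumptions, not disclosed APIs:

```python
def sweeper_flow(camera, gesture_model, motion_control):
    if not camera.verify_face():                   # S306: identity check
        return                                     # S316: end
    while True:
        image = camera.capture_gesture_image()     # S308: capture gesture image
        feature = gesture_model.match(image)       # S310: compare with library
        if feature is None:                        # not a stored feature
            print("Please enter a correct gesture recognition feature")  # S312
            continue
        instruction = gesture_model.instruction_for(feature)  # S314
        motion_control.run(instruction)            # drive the sweeper
        return
```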
It should be noted that, for simplicity of description, the foregoing method embodiments are all described as a series of acts, but it should be understood by those skilled in the art that the present application is not limited by the order of acts described, as some steps may be performed in other orders or concurrently in accordance with the present application. Further, those skilled in the art will also appreciate that the embodiments described in the specification are all preferred embodiments, and that the acts and modules referred to are not necessarily required for the present application.
From the description of the above embodiments, it will be clear to a person skilled in the art that the method according to the above embodiments may be implemented by software plus the necessary general-purpose hardware platform, or by hardware, though in many cases the former is preferred. Based on this understanding, the technical solution of the present application, in essence or in the part contributing to the prior art, may be embodied in the form of a software product stored in a storage medium (e.g. ROM (Read-Only Memory)/RAM (Random Access Memory), magnetic disk, or optical disk) and comprising instructions for causing a terminal device (which may be a mobile phone, a computer, a server, a network device, etc.) to perform the method according to the embodiments of the present application.
According to still another aspect of the embodiments of the present application, there is also provided an execution apparatus for a cleaning operation for executing the execution method of the cleaning operation described above. Fig. 4 is a block diagram of an alternative cleaning operation performing device according to an embodiment of the present application, as shown in fig. 4, the device may include:
a first obtaining unit 42, configured to acquire a biometric result of a target object to obtain a biometric feature of the target object, where the target object is located in a first scanning area of the mobile terminal or in a second scanning area of the cleaning robot;
a second obtaining unit 44, configured to obtain a gesture recognition feature of the target object if the biometric feature passes verification;
the first control unit 46 is configured to acquire a control instruction corresponding to the gesture recognition feature, and control the cleaning robot to perform a cleaning operation corresponding to the control instruction.
It should be noted that, the first obtaining unit 42 in this embodiment may be used to perform the step S202 described above, the second obtaining unit 44 in this embodiment may be used to perform the step S204 described above, and the first control unit 46 in this embodiment may be used to perform the step S206 described above.
The device acquires the biological recognition result of the target object to obtain the biological feature of the target object, i.e. recognizes the identity of the target object, wherein the target object is located in the first scanning area of the mobile terminal or in the second scanning area of the cleaning robot, i.e. identity recognition is performed through either device; acquires the gesture recognition feature of the target object when the biological feature passes verification; and acquires the control instruction corresponding to the gesture recognition feature and controls the cleaning robot to execute the corresponding cleaning operation. This solves the problems in the related art that a sweeping robot controlled directly by unverified voice recognition suffers, in its noisy working environment, from poor voice recognition and a poor user experience; because the identity of the target object can be recognized through the mobile terminal or the cleaning robot, the safety of using the robot is improved.
That is, when the biological feature passes verification, it is first determined whether the cleaning robot recognizes the wake-up gesture recognition feature of the target object. Once that feature is recognized, the robot is instructed to turn on the target recognition function, which is used to recognize the control instruction corresponding to the target object's gesture recognition feature and so to control the robot; if the wake-up gesture recognition feature is not acquired, turning on the target recognition function is prohibited. Requiring the wake-up gesture in this way prevents misoperation. Take face recognition by a cleaning robot as an example: a household contains a male owner, a female owner and a child, and the two owners have different habitual motions, so the wake-up gesture recognition features they record differ; that is, each owner has a personal set of gesture recognition features for controlling the robot. If one owner accidentally lets the robot acquire a biological feature that passes verification while the child plays nearby, some of the child's random motions might falsely trigger control instructions of the robot. To avoid this, after the biological feature passes verification, a wake-up gesture recognition feature can be added so that only this specific gesture turns on the target recognition function; if the wake-up gesture recognition feature is not recognized, the recognition is judged incorrect and the target recognition function is not turned on.
In an exemplary embodiment, the first determining unit is further configured to turn off the target recognition function if the gesture recognition feature of the current object is not recognized within a preset period of time after the current time.
It can be understood that, in actual operation, the user may accidentally make the gesture corresponding to the wake-up gesture recognition feature, or may be delayed by other matters after waking the cleaning robot and never send the subsequent gesture recognition feature. In such cases, leaving the robot permanently in the state of having the target recognition function turned on after being awakened would harm that function and waste energy. A preset period is therefore set for the robot: after being awakened, if it does not recognize a gesture recognition feature of the target object within the preset period, it actively turns the target recognition function off.
For example, an owner may accidentally let the robot acquire a biological feature that passes identity verification, and a child in the home, imitating an adult's motion, may then make the corresponding wake-up gesture. After recognizing the wake-up gesture recognition feature, the cleaning robot turns on the target recognition function and waits to receive a gesture recognition feature; but the child sends no further gesture recognition feature to the robot, and the robot cannot remain in the receiving state indefinitely. Therefore, after being awakened, if the robot does not recognize a gesture recognition feature of the target object within the preset period, it actively turns the target recognition function off to avoid harm to the system.
In an exemplary embodiment, the first obtaining unit is further configured to perform biometric identification on the target object by using the cleaning robot, obtain the biometric result, and parse the biometric of the target object from the biometric result; and receiving a biological recognition result obtained by the mobile terminal for carrying out biological feature recognition on the target object, and analyzing the biological feature of the target object from the biological recognition result.
In practical application, the user can perform identity verification through the recognition unit of the cleaning robot, or through the mobile terminal device bound to the robot, which makes it convenient to control the robot anytime and anywhere. For example, a user at work suddenly remembers that a visitor is coming today while the home has not yet been cleaned; to avoid the bad impression an untidy home would make, the user controls the cleaning robot through the mobile phone and directs it to clean certain designated areas of the home.
In an exemplary embodiment, the second obtaining unit is further configured to receive a setting operation of the target object; and setting the corresponding relation between different gesture recognition features and different control instructions in response to the setting operation.
To ensure that the user can accurately direct the cleaning robot to clean a designated area through gesture recognition features, the user can, before using the robot, enter gesture recognition features into it and bind each feature to its corresponding control instruction. For example, the user enters a circle-drawing gesture recognition feature and, once the robot has received it, immediately clicks the clean-whole-house command; the robot then binds that gesture recognition feature to the clean-whole-house control instruction, and every subsequent recognition of the circle-drawing gesture recognition feature executes the clean-whole-house command.
In an exemplary embodiment, the first control unit further includes: a second determining unit, configured to determine whether the cleaning robot recognizes a termination gesture recognition feature of the target object, and to terminate the cleaning operation that is being executed if the termination gesture recognition feature is recognized.
While executing the cleaning operation corresponding to the recognized control instruction, the cleaning robot also keeps monitoring for other gesture recognition features the user may send, such as a termination gesture recognition feature. For instance, while the robot is cleaning a designated area, a child may be active in that area; to keep the child from being tripped by the robot and exposed to a safety hazard, the user wants the robot to stop and therefore sends the termination gesture recognition feature mid-operation. If the robot recognizes the termination gesture recognition feature while running, it immediately stops the cleaning operation being executed.
In an exemplary embodiment, the second obtaining unit is further configured to obtain a target corresponding relationship corresponding to a biological feature of the target object, where the target corresponding relationship is a corresponding relationship between a gesture recognition feature of the target object and a control instruction, and different biological features of different target objects correspond to different corresponding relationships; and acquiring a control instruction corresponding to the gesture recognition feature from the target corresponding relation.
Different users in a household may have different gesture habits. When acquiring the control instruction corresponding to a gesture recognition feature, the cleaning robot therefore first obtains, according to the acquired biological feature of the target object, the correspondence that this target object has set between gesture recognition features and control instructions, and then acquires from that correspondence the control instruction corresponding to the gesture recognition feature.
An embodiment of the invention also provides a storage medium having a computer program stored therein, wherein the computer program is arranged to perform the steps of any of the method embodiments described above when run.
Alternatively, in the present embodiment, the above-described storage medium may be configured to store a computer program for performing the steps of:
S1, acquiring a biological identification result of a target object to obtain the biological characteristics of the target object, wherein the target object is positioned in a first scanning area of a mobile terminal or a second scanning area of a cleaning robot;
S2, under the condition that the biological characteristics pass verification, acquiring gesture recognition characteristics of the target object;
S3, acquiring a control instruction corresponding to the gesture recognition feature, and controlling the cleaning robot to execute a cleaning operation corresponding to the control instruction.
Embodiments of the present invention also provide a computer readable storage medium having a computer program stored therein, wherein the computer program is arranged to perform the steps of any of the method embodiments described above when run.
In one exemplary embodiment, the computer readable storage medium may include, but is not limited to: a usb disk, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), a removable hard disk, a magnetic disk, or an optical disk, or other various media capable of storing a computer program.
An embodiment of the invention also provides an electronic device comprising a memory having stored therein a computer program and a processor arranged to run the computer program to perform the steps of any of the method embodiments described above.
In an exemplary embodiment, the electronic apparatus may further include a transmission device connected to the processor, and an input/output device connected to the processor.
In an exemplary embodiment, the above-mentioned processor may be arranged to perform the following steps by means of a computer program:
s1, acquiring a biological identification result of a target object to obtain the biological characteristics of the target object, wherein the target object is positioned in a first scanning area of a mobile terminal or a second scanning area of a cleaning robot;
S2, under the condition that the biological characteristics pass verification, acquiring gesture recognition characteristics of the target object;
S3, acquiring a control instruction corresponding to the gesture recognition feature, and controlling the cleaning robot to execute a cleaning operation corresponding to the control instruction.
It will be appreciated by those skilled in the art that the modules or steps of the invention described above may be implemented by a general-purpose computing device; they may be concentrated on a single computing device or distributed across a network of computing devices; they may be implemented in program code executable by computing devices, so that they may be stored in a storage device and executed by a computing device; in some cases the steps shown or described may be performed in an order different from that described here; and they may be fabricated separately as individual integrated-circuit modules, or multiple modules or steps among them may be fabricated as a single integrated-circuit module. The invention is thus not limited to any specific combination of hardware and software.
Fig. 5 is a block diagram of an alternative electronic device, according to an embodiment of the present application, including a processor 502, a communication interface 504, a memory 506, and a communication bus 508, as shown in fig. 5, wherein the processor 502, the communication interface 504, and the memory 506 communicate with each other via the communication bus 508, wherein,
a memory 506 for storing a computer program;
the processor 502 is configured to execute the computer program stored in the memory 506, and implement the following steps:
s1, acquiring a biological identification result of a target object to obtain the biological characteristics of the target object, wherein the target object is positioned in a first scanning area of a mobile terminal or a second scanning area of a cleaning robot;
S2, under the condition that the biological characteristics pass verification, acquiring gesture recognition characteristics of the target object;
S3, acquiring a control instruction corresponding to the gesture recognition feature, and controlling the cleaning robot to execute a cleaning operation corresponding to the control instruction.
Optionally, in the present embodiment, the communication bus may be a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, or the like, and may be classified as an address bus, a data bus, a control bus, etc. For ease of illustration only one thick line is drawn in fig. 5, but this does not mean there is only one bus or one type of bus. The communication interface is used for communication between the electronic device and other equipment.
The memory may include RAM or nonvolatile memory (non-volatile memory), such as at least one disk memory. Optionally, the memory may also be at least one memory device located remotely from the aforementioned processor.
As an example, the memory 506 may include, but is not limited to, the first obtaining unit 42, the second obtaining unit 44 and the first control unit 46 of the above device for performing a cleaning operation. It may further include other module units of that device, which are not described again in this example.
The processor may be a general purpose processor and may include, but is not limited to: CPU (Central Processing Unit ), NP (Network Processor, network processor), etc.; but also DSP (Digital Signal Processing, digital signal processor), ASIC (Application Specific Integrated Circuit ), FPGA (Field-Programmable Gate Array, field programmable gate array) or other programmable logic device, discrete gate or transistor logic device, discrete hardware components.
Alternatively, specific examples in this embodiment may refer to examples described in the foregoing embodiments, and this embodiment is not described herein.
It will be understood by those skilled in the art that the structure shown in fig. 5 is only illustrative. The device implementing the above cleaning method may be a terminal device such as a smart phone (e.g. an Android or iOS phone), a tablet computer, a palmtop computer, a mobile internet device (Mobile Internet Devices, MID) or a PAD. Fig. 5 does not limit the structure of the electronic device; for example, the electronic device may include more or fewer components (e.g. a network interface, a display device) than shown in fig. 5, or have a different configuration.
Those of ordinary skill in the art will appreciate that all or part of the steps in the various methods of the above embodiments may be implemented by a program for instructing a terminal device to execute in association with hardware, the program may be stored in a computer readable storage medium, and the storage medium may include: flash disk, ROM, RAM, magnetic or optical disk, etc.
The foregoing embodiment numbers of the present application are merely for the purpose of description, and do not represent the advantages or disadvantages of the embodiments.
The integrated units in the above embodiments may be stored in the above-described computer-readable storage medium if implemented in the form of software functional units and sold or used as separate products. Based on such understanding, the technical solution of the present application may be embodied in essence or a part contributing to the prior art or all or part of the technical solution in the form of a software product stored in a storage medium, comprising several instructions for causing one or more computer devices (which may be personal computers, servers or network devices, etc.) to perform all or part of the steps of the method described in the embodiments of the present application.
In the foregoing embodiments of the present application, each embodiment is described with its own emphasis; for any part not detailed in one embodiment, reference may be made to the related descriptions of other embodiments.
In the several embodiments provided by the present application, it should be understood that the disclosed client may be implemented in other manners. The apparatus embodiments described above are merely exemplary. For example, the division into units is merely a logical functional division; in actual implementation there may be other divisions, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be through some interfaces, units, or modules, and may be electrical or in other forms.
The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the present embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The foregoing is merely a preferred embodiment of the present application. It should be noted that those skilled in the art may make various modifications and adaptations without departing from the principles of the present application, and such modifications and adaptations shall also fall within the scope of protection of the present application.

Claims (10)

1. A method of performing a cleaning operation, comprising:
acquiring a biometric recognition result of a target object to obtain a biometric feature of the target object, wherein the target object is located in a first scanning area of a mobile terminal or in a second scanning area of a cleaning robot;
acquiring a gesture recognition feature of the target object in a case where the biometric feature passes verification;
and acquiring a control instruction corresponding to the gesture recognition feature, and controlling the cleaning robot to perform a cleaning operation corresponding to the control instruction.
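For illustration only (not part of the claims): a minimal Python sketch of the claim 1 flow. All names are hypothetical; the gesture set, instruction names, and verification logic are placeholders, not anything specified by this application.

```python
# Illustrative only; all names are hypothetical.

GESTURE_TO_INSTRUCTION = {
    "swipe_left": "clean_left_zone",
    "swipe_right": "clean_right_zone",
    "circle": "spot_clean",
}

class CleaningRobot:
    """Stub standing in for the real robot controller."""
    def execute(self, instruction):
        print(f"executing cleaning operation: {instruction}")

def verify_biometric(biometric_feature, enrolled_features):
    """Stand-in verification: the feature passes only if it matches an
    enrolled feature of an authorized user."""
    return biometric_feature in enrolled_features

def handle_request(robot, biometric_feature, gesture_feature, enrolled_features):
    # Step 1: the biometric feature must pass verification first.
    if not verify_biometric(biometric_feature, enrolled_features):
        return None
    # Step 2: map the recognized gesture feature to a control instruction.
    instruction = GESTURE_TO_INSTRUCTION.get(gesture_feature)
    if instruction is None:
        return None
    # Step 3: have the robot perform the corresponding cleaning operation.
    robot.execute(instruction)
    return instruction

# Example: an enrolled user gestures a circle, triggering a spot clean.
handle_request(CleaningRobot(), "alice_faceprint", "circle", {"alice_faceprint"})
```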
2. The method of claim 1, wherein, before acquiring the gesture recognition feature of the target object in the case where the biometric feature passes verification, the method further comprises:
determining whether the cleaning robot recognizes a wake-up gesture recognition feature of the target object, wherein the wake-up gesture recognition feature is used for instructing the cleaning robot to start a target recognition function, and the target recognition function is used for determining the control instruction corresponding to the gesture recognition feature;
and starting the target recognition function in a case where the cleaning robot recognizes the wake-up gesture recognition feature of the target object.
3. The method of claim 2, wherein, after the target recognition function is started, the method further comprises:
closing the target recognition function in a case where no gesture recognition feature of the current object is recognized within a preset time period after the current moment.
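Purely as illustration (not part of the claims): one way the wake-up gesture of claim 2 and the idle timeout of claim 3 might be modeled. The 30-second default and the gesture label "wake_up" are arbitrary assumptions.

```python
import time

class TargetRecognitionFunction:
    """Hypothetical state holder for claims 2 and 3: a wake-up gesture
    starts the function; it closes again if no gesture recognition
    feature is seen within a preset time period."""

    def __init__(self, timeout_seconds=30.0):
        self.timeout_seconds = timeout_seconds  # the "preset time period"
        self.enabled = False
        self.last_gesture_at = None

    def on_gesture(self, gesture_feature):
        now = time.monotonic()
        if gesture_feature == "wake_up":
            self.enabled = True          # claim 2: wake-up gesture starts the function
            self.last_gesture_at = now
        elif self.enabled:
            self.last_gesture_at = now   # any recognized gesture resets the idle clock

    def tick(self):
        # claim 3: close the function once the preset period elapses with no gesture
        if self.enabled and time.monotonic() - self.last_gesture_at > self.timeout_seconds:
            self.enabled = False
```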
4. The method of claim 1, wherein acquiring the biometric recognition result of the target object to obtain the biometric feature of the target object comprises at least one of:
performing biometric feature recognition on the target object by the cleaning robot to obtain the biometric recognition result, and parsing the biometric feature of the target object from the biometric recognition result;
and receiving a biometric recognition result obtained by the mobile terminal performing biometric feature recognition on the target object, and parsing the biometric feature of the target object from the biometric recognition result.
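As an illustrative aside (not part of the claims): claim 4 allows the biometric recognition result to come from either the robot's own sensor or the paired mobile terminal. A sketch of that branching, with both acquisition paths stubbed out under hypothetical names.

```python
def obtain_biometric_result(source, target_object):
    """Return a biometric recognition result for target_object, acquired
    either by the cleaning robot itself or received from the mobile
    terminal (both stubbed below)."""
    if source == "robot":
        return robot_side_recognition(target_object)
    if source == "mobile":
        return result_from_mobile_terminal(target_object)
    raise ValueError(f"unknown source: {source}")

def robot_side_recognition(target_object):
    # Stub: the robot's own camera/sensor pipeline would run here.
    return {"biometric_feature": f"feature_of_{target_object}"}

def result_from_mobile_terminal(target_object):
    # Stub: a result pushed by the paired mobile terminal over the network.
    return {"biometric_feature": f"feature_of_{target_object}"}

def parse_biometric_feature(result):
    # Parse the biometric feature out of the recognition result.
    return result["biometric_feature"]
```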
5. The method of claim 1, wherein, before acquiring the control instruction corresponding to the gesture recognition feature, the method further comprises:
receiving a setting operation of the target object;
and setting correspondences between different gesture recognition features and different control instructions in response to the setting operation.
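For illustration only (not part of the claims): the setting operation of claim 5 amounts to populating a mapping from gesture recognition features to control instructions. A minimal sketch with a plain dictionary; the gesture and instruction names are invented.

```python
# Hypothetical mutable mapping filled in by the user's setting operation.
correspondence = {}

def on_setting_operation(gesture_feature, control_instruction):
    """Record one gesture-to-instruction binding in response to a
    setting operation received from the target object (the user)."""
    correspondence[gesture_feature] = control_instruction

# Example setting operations:
on_setting_operation("thumbs_up", "start_full_clean")
on_setting_operation("palm_out", "pause_cleaning")
```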
6. The method of claim 1, wherein, in the process of controlling the cleaning robot to perform the cleaning operation corresponding to the control instruction, the method further comprises:
determining whether the cleaning robot recognizes a termination gesture recognition feature of the target object;
and terminating the cleaning operation being performed in a case where the termination gesture recognition feature is recognized.
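By way of illustration (not part of the claims): a sketch of the in-operation check of claim 6, assuming a robot object with a hypothetical stop() method and a stream of recognized gesture features; the "fist" label is invented.

```python
def supervise_cleaning(robot, gesture_stream, termination_feature="fist"):
    """While a cleaning operation runs, watch recognized gesture features
    and terminate the in-progress operation on the termination feature."""
    for gesture_feature in gesture_stream:
        if gesture_feature == termination_feature:
            robot.stop()   # claim 6: terminate the operation being performed
            return True
    return False
```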
7. The method of claim 5, wherein acquiring the control instruction corresponding to the gesture recognition feature comprises:
acquiring a target correspondence corresponding to the biometric feature of the target object, wherein the target correspondence is a correspondence between gesture recognition features of the target object and control instructions, and biometric features of different target objects correspond to different correspondences;
and acquiring the control instruction corresponding to the gesture recognition feature from the target correspondence.
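For illustration only (not part of the claims): claim 7 keys the gesture-to-instruction table by the user's biometric feature, so different users can bind the same gesture to different instructions. A sketch with a nested dictionary; all identities and bindings are invented.

```python
# Hypothetical per-user tables: biometric feature -> {gesture -> instruction}.
user_correspondences = {
    "alice_faceprint": {"circle": "spot_clean", "swipe_left": "clean_kitchen"},
    "bob_faceprint":   {"circle": "edge_clean", "swipe_left": "clean_hallway"},
}

def instruction_for(biometric_feature, gesture_feature):
    # First fetch the target correspondence keyed by this user's biometric
    # feature, then look the gesture recognition feature up inside it.
    table = user_correspondences.get(biometric_feature, {})
    return table.get(gesture_feature)

# The same gesture resolves to different instructions per user.
assert instruction_for("alice_faceprint", "circle") == "spot_clean"
assert instruction_for("bob_faceprint", "circle") == "edge_clean"
```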
8. An apparatus for performing a cleaning operation, comprising:
a first acquiring unit, configured to acquire a biometric recognition result of a target object to obtain a biometric feature of the target object, wherein the target object is located in a first scanning area of a mobile terminal or in a second scanning area of a cleaning robot;
a second acquiring unit, configured to acquire a gesture recognition feature of the target object in a case where the biometric feature passes verification;
and a first control unit, configured to acquire a control instruction corresponding to the gesture recognition feature and control the cleaning robot to perform a cleaning operation corresponding to the control instruction.
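As a final illustrative sketch (not part of the claims): the three units of the claim 8 apparatus expressed as one method per unit. This is a structural sketch with stubbed internals, not an implementation prescribed by the application.

```python
class CleaningOperationApparatus:
    """Structural sketch of the claim 8 apparatus: three cooperating units."""

    def __init__(self, gesture_to_instruction):
        # Mapping from gesture recognition features to control instructions.
        self.gesture_to_instruction = gesture_to_instruction

    def first_acquiring_unit(self, biometric_result):
        # Obtain the biometric recognition result and extract the feature.
        return biometric_result.get("biometric_feature")

    def second_acquiring_unit(self, gesture_result, feature_verified):
        # Hand back a gesture feature only once verification has passed.
        return gesture_result.get("gesture_feature") if feature_verified else None

    def first_control_unit(self, robot, gesture_feature):
        # Map the gesture to a control instruction and drive the robot.
        instruction = self.gesture_to_instruction.get(gesture_feature)
        if instruction is not None:
            robot.execute(instruction)
        return instruction
```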
9. A computer-readable storage medium, characterized in that the computer-readable storage medium comprises a stored program, wherein the program, when run, performs the method of any one of claims 1 to 7.
10. An electronic device comprising a memory and a processor, characterized in that the memory has a computer program stored therein, and the processor is configured to execute the method of any one of claims 1 to 7 by means of the computer program.
CN202210441621.8A 2022-04-25 2022-04-25 Method and device for executing cleaning operation, storage medium and electronic device Pending CN116982883A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202210441621.8A CN116982883A (en) 2022-04-25 2022-04-25 Method and device for executing cleaning operation, storage medium and electronic device
PCT/CN2023/088040 WO2023207611A1 (en) 2022-04-25 2023-04-13 Cleaning operation execution method and apparatus, storage medium, and electronic apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210441621.8A CN116982883A (en) 2022-04-25 2022-04-25 Method and device for executing cleaning operation, storage medium and electronic device

Publications (1)

Publication Number Publication Date
CN116982883A true CN116982883A (en) 2023-11-03

Family

ID=88517460

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210441621.8A Pending CN116982883A (en) 2022-04-25 2022-04-25 Method and device for executing cleaning operation, storage medium and electronic device

Country Status (2)

Country Link
CN (1) CN116982883A (en)
WO (1) WO2023207611A1 (en)

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014190886A1 (en) * 2013-05-27 2014-12-04 上海科斗电子科技有限公司 Intelligent interaction system and software system thereof
DE102016102644A1 (en) * 2016-02-15 2017-08-17 RobArt GmbH Method for controlling an autonomous mobile robot
CN107773169A (en) * 2017-12-04 2018-03-09 珠海格力电器股份有限公司 A kind of control method of sweeping robot, control device and sweeping robot
CN109199240B (en) * 2018-07-24 2023-10-20 深圳市云洁科技有限公司 Gesture control-based sweeping robot control method and system
CN112545373B (en) * 2019-09-26 2022-08-05 珠海一微半导体股份有限公司 Control method of sweeping robot, sweeping robot and medium
CN113116224B (en) * 2020-01-15 2022-07-05 科沃斯机器人股份有限公司 Robot and control method thereof
CN113679298B (en) * 2021-08-27 2022-05-10 美智纵横科技有限责任公司 Robot control method, robot control device, robot, and readable storage medium

Also Published As

Publication number Publication date
WO2023207611A1 (en) 2023-11-02

Similar Documents

Publication Publication Date Title
CN104010539A (en) Identity recognition method based on electronic cigarette and corresponding electronic cigarette
CN109002387B (en) User reminding method and device of application program, terminal equipment and storage medium
CN109858227B (en) Fingerprint input method and device, electronic equipment and storage medium
US11216543B2 (en) One-button power-on processing method and terminal thereof
CN106845267B (en) The processing method and mobile terminal of applicating history information
CN111461337B (en) Data processing method, device, terminal equipment and storage medium
CN108847216B (en) Voice processing method, electronic device and storage medium
CN111638651A (en) Intelligent household control panel, setting method thereof, server and storage medium
EP3226623A1 (en) Method for awakening wireless-fidelity network and terminal
CN109067628A (en) Sound control method, control device and the intelligent appliance of intelligent appliance
CN112331213A (en) Intelligent household equipment control method and device, electronic equipment and storage medium
CN112558911B (en) Voice interaction method and device for massage chair
CN105022945A (en) Human face biological information based screen unlocking method and mobile device
CN107515765B (en) Alarm turn-off method, system and terminal equipment
CN109857929A (en) A kind of man-machine interaction method and device for intelligent robot
CN116982883A (en) Method and device for executing cleaning operation, storage medium and electronic device
CN110390856B (en) Express cabinet operation guiding method and device, express cabinet and storage medium
CN108170336A (en) A kind of terminal desktop application display control method and system
CN110765911A (en) Alarm clock turning-off method and device
CN105069337A (en) Palm biological information based screen unlocking method and mobile device
CN109923593B (en) Fingerprint module, fingerprint identification system, control method and intelligent lock
CN108459838B (en) Information processing method and electronic equipment
CN209453563U (en) A kind of intelligent robot
CN117953882A (en) Control method and device of cleaning equipment, storage medium and electronic device
CN113253835A (en) Man-machine interaction equipment control method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination