CN116088326A - Pet robot control method for controlling household equipment by pets


Info

Publication number
CN116088326A
CN116088326A (Application No. CN202211593146.2A)
Authority
CN
China
Prior art keywords
pet
command
preset
robot
pet robot
Prior art date
Legal status
Pending
Application number
CN202211593146.2A
Other languages
Chinese (zh)
Inventor
姜新桥
邓文拔
Current Assignee
Zhuhai 1 Micro Robot Technology Co ltd
Original Assignee
Zhuhai 1 Micro Robot Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Zhuhai 1 Micro Robot Technology Co., Ltd.
Priority to CN202211593146.2A
Publication of CN116088326A

Classifications

    • G: PHYSICS
        • G05: CONTROLLING; REGULATING
            • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
                • G05B15/00: Systems controlled by a computer
                    • G05B15/02: Systems controlled by a computer, electric
                • G05B19/00: Programme-control systems
                    • G05B19/02: Programme-control systems, electric
                        • G05B19/418: Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
                • G05B2219/00: Program-control systems
                    • G05B2219/20: Pc systems
                        • G05B2219/26: Pc applications
                            • G05B2219/2642: Domotique, domestic, home control, automation, smart house
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
        • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
            • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
                • Y02P90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
                    • Y02P90/02: Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Manufacturing & Machinery (AREA)
  • Quality & Reliability (AREA)
  • Toys (AREA)

Abstract

The invention discloses a pet robot control method for controlling home devices by a pet. While the pet robot moves autonomously outside the pet's body, it determines the pet's real intention to control a home device from multiple dimensions (including repeatedly guiding the pet to perform the corresponding actions). The pet's effective control range over the home device is first delimited; the owner guidance instruction associated with the home device is then used to guide the pet to repeatedly execute a preset command within that range until the number of successful executions meets the success target ratio, whereupon it is determined that the pet effectively controls the home device. The pet robot then sends a control command to the home device effectively controlled by the pet according to the preset command, and the home device is started so as to run according to the function corresponding to the control command. With the pet robot as an intermediary, the pet thus indirectly controls the home device by performing actions or making sounds within the effective control range.

Description

Pet robot control method for controlling household equipment by pets
Technical Field
The invention belongs to the technical field of pet electronic equipment, and particularly relates to a pet robot control method for controlling household equipment by a pet.
Background
Currently, research and development of intelligent hardware in the pet industry has drawn wide attention, with products such as anti-loss trackers, smart feeders and smart collars. For example, existing wearable pet-monitoring devices collect data on the pet's behavioral and body-surface characteristics through various sensors and upload the data through a remote wireless communication module to a background server for computation and analysis, with the results fed back to the user, so that the user can learn the animal's health condition promptly and accurately.
In the prior art, wearable pet devices generally collect body-surface data in order to feed back pet behavior-state parameters. Because such a device follows the pet's movement and can only obtain local body-surface characteristics of the pet, the collected data are unstable and cannot comprehensively reflect the overall actions the pet performs, so control of the home devices associated with owner guidance instructions through a wearable pet device is unstable or of low accuracy.
Disclosure of Invention
The application discloses a pet robot control method for controlling home devices by a pet, the technical scheme of which is as follows:
a pet robot control method for controlling home appliances by a pet, the pet robot control method comprising: step S1, a pet robot establishes network connection with at least one household device; s2, detecting the position of the pet in a mode that the pet robot moves autonomously; step S3, if the pet robot detects that the pet is in a target area corresponding to one of the household devices, selecting an owner guiding instruction associated with the one of the household devices from a plurality of preset owner guiding instructions; step S4, the pet robot guides the pet to repeatedly execute a preset command for controlling one of the household devices in the target area according to the host guiding command associated with the one of the household devices until the number of times of repeated execution reaches a preset training number, and then determines whether the pet effectively controls the one of the household devices according to the proportion of the number of times of successful execution of the preset command; wherein the pet robot keeps detecting the pet and sets the time maintained by executing the preset command each time as a first preset control time; step S5, when the pet is determined not to effectively control one of the household devices, the pet robot rotates at least one circle or linearly moves back and forth for a preset distance, then changes the preset command, repeatedly executes the step S4 and the step S5 until the pet is determined to effectively control one of the household devices, and then executes the step S6; and S6, when the pet is determined to effectively control one of the household devices, the pet robot sends a control command to the one of the household devices according to the preset command, and the one of the household devices is started to operate according to the function corresponding to the control command.
Further, each time it is determined that the pet effectively controls one of the home devices, the time for which that home device runs according to the function corresponding to the control command is a second preset control time; when the running time of that home device exceeds the second preset control time, that home device is controlled to switch off, and steps S2 to S6 are then executed again. The second preset control time is longer than the first preset control time.
Further, the target area corresponding to one of the home devices is a regular geometric region surrounding that home device, and the projection onto the horizontal floor of the sensor device through which that home device communicates with the pet robot occupies the centre of the region; the farthest distance between the edge of the region and its centre is twice the pet's body length. The pet robot and that home device are configured to be bound together so that a preset command is associated with that home device.
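As a rough illustration of this geometric condition, the sketch below checks whether the pet lies within a circular target area of radius twice its body length centred on the device's sensor; the circular shape and the function names are assumptions made for the example.

```python
import math

def in_target_area(pet_xy, sensor_xy, pet_body_length):
    """Return True if the pet is inside the target area.

    Assumes a circular regular region whose farthest edge point is
    2 x the pet's body length from the sensor's floor projection.
    """
    radius = 2.0 * pet_body_length
    return math.hypot(pet_xy[0] - sensor_xy[0],
                      pet_xy[1] - sensor_xy[1]) <= radius

# Example: a 0.5 m long cat 0.8 m from the device's sensor is inside
# the 1.0 m radius target area.
print(in_target_area((0.8, 0.0), (0.0, 0.0), 0.5))  # True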
Further, while steps S2 to S6 are executed: the pet robot moves in such a way that it keeps collecting images of the pet and keeps collecting the pet's sound signals without touching the pet, so that it captures action commands and/or sound commands issued by the pet in real time as it moves. When the pet robot detects from the images that the pet is in the target area corresponding to one of the home devices, then each time the pet is guided in the target area to execute a preset command for controlling that home device, the image of the pet collected in real time is matched against a pre-stored pet-behavior digital image associated with the owner guidance instruction selected in step S3; a matched pet-behavior digital image is parsed into an action command, and it is determined that the pet robot has captured the action command and that the pet has executed it successfully. Likewise, each time the pet is guided in the target area to execute a preset command for controlling that home device, the pet's sound signal collected in real time is matched against pre-stored pet audio data associated with the owner guidance instruction selected in step S3; matched pet audio data are parsed into a sound command, and it is determined that the pet robot has captured the sound command and that the pet has executed it successfully. The working state of that home device is then set according to the sound command and/or the action command, achieving the goal of the pet controlling that home device. The action command includes the pet's activity-position parameters, and the sound command includes sound-source position parameters. The position information of the target area corresponding to that home device is preset in the pet robot and acquired in real time in the form of map coordinates, and the image of that home device is preconfigured as a landmark image. The preset command comprises an action command and/or a sound command.
Further, in step S4, the method by which the pet robot guides the pet, according to the owner guidance instruction associated with one of the home devices, to repeatedly execute within the target area a preset command for controlling that home device until the number of repetitions reaches the preset training count, and then determines from the proportion of successful executions of the preset command whether the pet effectively controls that home device, specifically comprises: step S41, when the pet robot receives the owner guidance instruction associated with that home device, the pet robot issues to the pet a preset command for controlling that home device, such that the pet is triggered to perform the training action corresponding to the preset command when the pet robot walks along a specific route in the target area, walks to a specific position, or plays a specific sound signal at a specific position; step S42, the pet robot captures the pet's training action for the preset command in the target area and judges whether the training action is executed successfully; step S43, if the pet robot judges that the pet executed the training action successfully, a first preset voice is played to indicate one successful execution of the preset command, and the number of successful executions is counted; step S44, if the pet robot judges that the pet did not execute the training action, a second preset voice is played to indicate that the preset command was not executed successfully this time; step S45, when the number of repetitions of steps S41 to S44 reaches the preset training count and the ratio of successful executions to the preset training count is greater than or equal to the success target ratio, it is determined that the pet effectively controls that home device, so that the owner's manual control of the home device associated with the owner guidance instruction is replaced and the home device runs according to the function corresponding to the control command; when the number of repetitions of steps S41 to S44 reaches the preset training count and the ratio of successful executions to the preset training count is less than the success target ratio, it is determined that the pet does not effectively control that home device.
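Steps S41 to S45 amount to a counted training loop closed by a success-ratio test; the sketch below is one way of writing it, with the guidance, capture/judgment and voice prompts abstracted into assumed callables and the constants set to example values only.

```python
# Hypothetical sketch of steps S41-S45; issue_command, capture_and_judge and
# play_voice are assumed callables, and the constants are example values.
PRESET_TRAINING_COUNT = 30
SUCCESS_TARGET_RATIO = 0.6

def train(issue_command, capture_and_judge, play_voice):
    successes = 0
    for _ in range(PRESET_TRAINING_COUNT):
        issue_command()                 # S41: guide the pet via route/position/sound
        if capture_and_judge():         # S42: did the training action match the command?
            play_voice("success")       # S43: first preset voice, count the success
            successes += 1
        else:
            play_voice("try_again")     # S44: second preset voice
    # S45: effective control iff the success ratio meets the success target ratio.
    return successes / PRESET_TRAINING_COUNT >= SUCCESS_TARGET_RATIO
```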
Further, in step S42, whether the training action is executed successfully is judged as follows: the pet robot judges whether the captured training action matches the preset command; if yes, it is determined that the pet executed the training action successfully and executed the preset command successfully, achieving one successful control of the home device; otherwise it is determined that the pet did not execute the training action successfully and that the preset command was not executed successfully. Matching between the training action and the preset command includes matching of the pet's behavior-state parameters.
Further, in step S5, after the pet robot rotates at least one full turn or moves linearly back and forth over the preset distance, changing the preset command comprises: repeating steps S41 to S44; while repeating steps S41 to S44 the pet robot captures failed training actions in the target area, counts the number of repeated occurrences of the same failed training action or the number of repeated occurrences of the pet failing to execute the preset command, and records the currently counted number of repeated occurrences as the failure count, a training action captured by the pet robot being a failed training action whenever it does not match the preset command; before the number of repetitions of steps S41 to S44 reaches the preset training count, if the ratio of the counted failure count to the preset training count equals the success target ratio, the preset command is corrected according to the failed training action, the corrected preset command is then associated with that home device, the working state to be reached by the home device under control of the corrected preset command is set, and the setting of the preset command for controlling the same home device is completed.
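The failure-driven correction described here can be pictured as the following variant of the training loop, in which a failure counter aborts training early and triggers correction of the preset command; all names are assumptions and the correction itself is left abstract.

```python
# Hypothetical sketch of the early-correction branch of step S5; callables and
# constants are illustrative assumptions.
PRESET_TRAINING_COUNT = 30
SUCCESS_TARGET_RATIO = 0.6

def train_with_correction(issue_command, capture_and_judge, correct_command):
    failures = 0
    for _ in range(PRESET_TRAINING_COUNT):
        issue_command()
        if not capture_and_judge():
            failures += 1
        # Before the training count is exhausted, a failure ratio reaching the
        # success target ratio triggers correction of the preset command.
        if failures / PRESET_TRAINING_COUNT >= SUCCESS_TARGET_RATIO:
            correct_command()   # e.g. lower the required number of jumps
            return False        # training restarts with the corrected command
    return True
```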
Further, the home devices include, but are not limited to, self-moving cleaning robots and fixedly installed home appliances; the preset command includes a control command associated with the same home device.
Further, the pet robot is a spherical robot equipped with a wireless communication module, a camera and a sound sensor; the geometric shape of the spherical robot's driving wheel is a hemisphere or a semi-ellipsoid, and the driving wheel drives the spherical robot to move on the plane on which the pet moves without touching the pet; the camera is arranged in the upper shell of the spherical robot's housing.
The application brings the following beneficial technical effects:
While the pet robot moves autonomously outside the pet's body, it determines the pet's real intention to control a home device from multiple dimensions. The pet is first confined to the effective control range corresponding to the home device (i.e. the target area corresponding to that home device); the owner guidance instruction associated with the home device is then used to guide the pet to execute a preset command repeatedly within that range until the number of successful executions meets the success target ratio, whereupon it is determined that the pet effectively controls the home device. The pet robot then sends a control command, according to the preset command, to the home device effectively controlled by the pet, so that, with the pet robot acting as an intermediary, the pet indirectly switches the home device on and off by performing actions or making sounds within the effective control range. The pet thus effectively controls the home device in place of manual operation, and the accuracy of controlling the home device is further improved.
In this application, the pet robot located outside the pet's body can judge the pet's position without relying on devices such as a camera on the home device, and then, combining the action command and sound command matched in real time, triggers the pet's corresponding combined control of the home device so as to set its working state. The pet robot can effectively and accurately control the home device associated with the owner guidance instruction, the pet's enjoyment and comfort are improved, and the user is helped to coordinate the management of home devices and pets.
While the preset command is executed repeatedly for the preset training count, the pet robot detects the actions made by the pet in real time from outside the pet's body and promptly judges whether the pet has controlled the home device successfully; once the proportion of successful executions (successful controls) meets the success target ratio, the training action is deemed effective and need not be repeated further, which speeds up the pet's control of the home device through the pet robot.
After it is determined that the pet effectively controls the home device, the time for which the home device runs according to the function corresponding to the preset command is set longer than the time for which the preset command is issued to the pet once, so that the working state of the home device during the repeated training executions can be distinguished from the working state of a home device that is effectively controlled by the pet and running normally.
If the counted failure count is too large, exceeding at least half of the preset training count, the action prescribed by the preset command is probably difficult for the pet. The pet's attention can be drawn away from the home device by repeatedly adjusting the pet robot's position and, while the pet's attention is shifted, the preset command is corrected according to the failed actions. After the corrected action command is determined, the corrected preset command is issued to the pet, and the pet still has to execute it (i.e. perform the training action) multiple times under the pet robot's guidance, which improves the pet's proficiency and accuracy with the preset command and, in turn, the accuracy of controlling the home device.
Drawings
Fig. 1 is a schematic flow chart of a pet robot control method for controlling home devices by a pet according to an embodiment of the present application.
Fig. 2 is a flowchart of a method for guiding a pet to execute a preset command in step S4 provided in fig. 1.
Fig. 3 is a flowchart of a method for changing a preset command in step S5 provided in fig. 1.
Detailed Description
The technical solutions in the embodiments of the invention are described in detail below with reference to the drawings. The embodiments, and features of the embodiments, may be combined with one another provided they do not conflict.
As can be seen from the background above, the prior art lacks a way of guiding and monitoring the pet's actions from outside its body by means of autonomous movement; a non-wearable pet device is therefore needed to reliably trigger the operation of the home devices around the pet.
Referring to fig. 1, the pet robot control method includes:
step S1, the pet robot establishes network connection with at least one household device, and then step S2 can be executed. In step S1, the pet robot supports networking connection of a plurality of home devices, including but not limited to a self-moving cleaning robot including a sweeping robot, a window cleaning robot, a washing robot, etc., and a fixedly set home appliance including: sound, electronic entrance guard, television, air conditioner, lighting lamp, etc. After a user starts the pet robot and the household equipment to be networked, operating keys or a touch screen on the pet equipment or the household equipment to be networked so as to enable the pet robot and the household equipment to establish a binding relation according to a proper binding mode; whether binding is completed can be confirmed by an indicator light in the pet robot or the home equipment. Based on the network connection relationship, each time one home device is added to the same pet robot, an owner guiding instruction associated with the home device is set for the newly added home device, so as to determine an action command and/or a sound command capable of directly or indirectly controlling the home device.
Step S2, the position of the pet is detected while the pet robot moves autonomously, achieving real-time tracking of the pet and, to a certain extent, attracting the pet to move within a given area along a walking path trained in advance; step S3 is then executed. Specifically, because a pet moves about indoors at random, the effective control range granted to the pet can be delimited according to the home device in order to constrain the pet's movement. For example, the pet can be triggered to train within the area in front of a sweeping robot that is performing a charging operation (within 2 metres of the charging dock), and when the pet issues a specific command within that area, the charging and cleaning operations of the sweeping robot can be controlled.
Step S3, if the pet robot detects that the pet is in the target area corresponding to one of the home devices (i.e. the pet is in the vicinity of the home device to be controlled), an owner guidance instruction associated with that home device is selected from a plurality of owner guidance instructions set in advance, and step S4 is then executed. Each owner guidance instruction can be preset in a memory of the pet robot and includes guiding the pet to make a sound or perform an action, thereby meeting the owner's training requirements for the pet; the pet robot must, however, convert the stored owner guidance instruction into a command signal the pet can accept, and the converted command is associated with that home device on the basis of the established network connection.
Step S4, according to the owner guidance instruction associated with that home device, the pet robot guides the pet to repeatedly execute, within the target area, a preset command for controlling that home device until the number of repetitions reaches the preset training count, and then determines, from the proportion of successful executions of the preset command, whether the pet effectively controls that home device. When the pet robot determines that the pet does not effectively control that home device, step S5 is executed; when the pet robot determines that the pet effectively controls that home device, step S6 is executed.
In this embodiment, the user may guide the pet to make a corresponding sound command and/or action command based on actions the pet has already mastered, and the pet robot captures the sound and/or action from the pet; this guidance is stored in the pet robot's memory in the form of an owner guidance instruction associated with the home device, and when invoked it guides the pet to execute the preset command associated with the currently invoked owner guidance instruction. Each time the pet robot guides the pet to execute the preset command for controlling that home device in the target area, it keeps detecting the pet, and the time for which each execution of the preset command is maintained is set as the first preset control time; this time can also be regarded as the period during which the preset command continues to act on the pet, giving the pet enough reaction time to execute it. After the pet executes the preset command and the pet robot captures the corresponding image and/or sound, the function information corresponding to the preset command is sent to the home device over the network connection so as to operate the home device. For example, the pet is guided to bark five times in succession in the target area corresponding to a speaker, and the pet robot captures the sound command (including position information and sound information); the pet robot may associate the sound command with the speaker in advance, or associate it in real time with the target area in which the pet is currently located. For the pet robot, this sound command is configured as the preset command for switching the speaker on, i.e. the pet is triggered to complete execution of the preset command named "five barks switch on the speaker"; for the pet, this means it is guided to emit a specific sound signal to switch on the speaker, and the pet thereby carries out the training action.
In some embodiments, the preset command is a pet-training command set according to the owner guidance instruction and capable of being associated with a home device. For example, if the owner wants the sweeping robot charging in the target area to enter the planned sweeping state when the pet jumps in place a specified number of times, that specified number of jumps is the pet-training command set by the owner guidance instruction. On the basis of the owner guidance instruction, the pet robot can be controlled to walk along a specific route in the target area, walk to a specific position, or play a specific sound signal at a specific position, thereby triggering the pet to perform the training action corresponding to the preset command. Each such behavior of the pet robot can be trained repeatedly so that the pet remembers that the behavior matches a certain function of the home device, establishing a corresponding conditioned reflex: the pet comes to associate a mastered action with a certain running function of a certain home device. Each behavior of the pet robot is determined by the owner guidance instruction, and the preset command includes the behavior-state parameters of the mastered action produced by the pet's conditioned reflex together with its position information relative to the home device.
Step S5, when it is determined that the pet does not effectively control that home device, the pet robot rotates at least one full turn or moves linearly back and forth over a preset distance and then changes the preset command, so that the pet's attention is drawn away from the home device by repeatedly adjusting the pet robot's position; this can also be understood as the pet robot performing the back-and-forth and spinning behaviors that act as conditioned stimuli for the pet. While shifting the pet's attention, the pet robot corrects the preset command according to the failed actions; steps S4 and S5 are then repeated until it is determined, through the judgment of step S4, that the pet effectively controls that home device, after which step S6 is executed.
Step S6, when it is determined that the pet effectively controls that home device, control of the home device can be confirmed to have been obtained without any additional verification; the pet robot then sends a control command to that home device according to the preset command, and that home device is started so as to run according to the function corresponding to the control command. The control command is associated with the same home device, and the preset command includes the control command. The control command contained in the preset command is applied directly to, and bound to, the home device; it is regarded as the product of the accuracy and familiarity the pet develops by executing the preset command repeatedly and does not change when the home device's position changes, whereas the position information in the preset command does change with the home device's position, which strengthens the guided pet's adaptability to its environment.
Building on step S6, when the control command is a switch command, the corresponding home device is switched on according to the preset command detected by the pet robot; for example, when the pet robot detects that the pet has jumped in place three times in the target area of an air conditioner, the pet robot controls the air conditioner to switch on. On the premise that the air conditioner is already on, the corresponding air conditioner is switched off according to the detected preset command: when the pet robot again detects the same preset command (three jumps in place) in the target area of the air conditioner, the command is interpreted as a switch-off command and the air conditioner is switched off. When the control command is an adjustment command, the corresponding home device is adjusted according to the preset command detected by the pet robot; for example, with the air conditioner already on, each time the pet jumps once in the target area of the air conditioner and the pet robot detects this action command (which belongs to the preset command), the pet robot controls the air conditioner to raise the temperature by 1 degree.
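The switch-command and adjustment-command behaviour just described can be illustrated with a small state holder; the class, the command labels and the 1-degree step below are assumptions based only on the example in this paragraph.

```python
# Illustrative handling of a switch command versus an adjustment command for an
# air conditioner; names and the 1-degree step are assumptions from the example.
class AirConditioner:
    def __init__(self):
        self.on = False
        self.temperature = 26

    def handle(self, command):
        if command == "three_jumps_in_place":      # switch command: toggles on/off
            self.on = not self.on
        elif command == "one_jump" and self.on:     # adjustment command: +1 degree
            self.temperature += 1

ac = AirConditioner()
ac.handle("three_jumps_in_place")   # pet jumps three times: air conditioner on
ac.handle("one_jump")               # each further jump raises the temperature by 1 degree
print(ac.on, ac.temperature)        # True 27
```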
Compared with the prior art, while the pet robot moves autonomously outside the pet's body it determines the pet's real intention to control the home device from multiple dimensions. The pet is first confined to the effective control range corresponding to the home device (i.e. the target area corresponding to that home device), and the owner guidance instruction associated with the home device is then used to guide the pet to execute the preset command repeatedly within that range. When the number of successful controls (successful executions of the preset command) meets the success target ratio (for example, 95 percent), the pet's control of the home device has reached the level of a conditioned reflex; with the pet robot acting as an intermediary, the pet indirectly switches the home device on and off by performing actions or making sounds within the effective control range, so that the pet effectively controls the home device in place of manual operation and the accuracy of control is further improved.
In addition, the pet robot located outside the pet's body can judge the pet's position without relying on devices such as a camera on the home device and packages this position into the preset command, so that, combined with the action command and sound command matched in real time, the pet is triggered to apply the corresponding combined control to the home device and set its working state. The pet robot can effectively control the home device associated with the owner guidance instruction, the pet's enjoyment and comfort are improved, and the user is helped to coordinate the management of home devices and pets.
As an embodiment, each time it is determined that the pet effectively controls one of the home devices, the time for which that home device runs according to the function corresponding to the control command is a second preset control time. In this embodiment the second preset control time is greater than the first preset control time; it can be understood as the time for which that home device formally runs and is maintained, and the running state of the home device during this second preset control time is the working state that satisfies the expected effect of the owner guiding the pet to control the home device proficiently.
Further, when the running time of that home device exceeds the second preset control time, that home device is controlled to switch off, and steps S2 to S6 are then executed again. Preferably, the second preset control time is set greater than the product of the preset training count and the first preset control time. In one control scenario, the pet robot issues a preset command for switching on the water heater so that the pet performs the corresponding actions and/or sounds; after the pet effectively controls the water heater, the pet robot sends a control command to the water heater, which switches on and runs for the second preset control time according to the function corresponding to the control command, for example running for 20 minutes (the second preset control time) to keep hot water ready and then switching off. After sending the control command to the water heater, the pet robot moves to another area and guides the pet into the target area of the television, then issues a preset command for controlling the television so that the pet performs the corresponding actions and/or sounds; after the pet effectively controls the television, the pet robot sends a control command to the television, for example switching from the current programme channel to a target channel after the second preset control time (which may be 16 minutes and serves as a delay interval between different control tasks). When the owner returns home the television is on the desired channel and the hot water is fully ready. The pet's control of home devices within their target areas thus makes the owner's use of those devices more convenient.
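The preferred timing relationship (second preset control time greater than the preset training count times the first preset control time) is easy to check numerically; the 20-minute figure comes from this scenario, while the 30-second reaction window is an assumed example value.

```python
# Example check of the preferred timing constraint using illustrative figures.
first_preset_control_time_s = 30          # assumed reaction window per command issue
preset_training_count = 30
second_preset_control_time_s = 20 * 60    # e.g. the water heater runs for 20 minutes

# 20 min > 30 * 30 s = 15 min, so the constraint holds for these figures.
assert second_preset_control_time_s > preset_training_count * first_preset_control_time_s
```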
As an embodiment, the target area corresponding to that home device is a regular geometric region surrounding it, where the regular region includes, but is not limited to, a circle, square, rectangle or triangle. The projection onto the horizontal floor of the sensor device through which that home device communicates with the pet robot occupies the centre of the region, so that all of the target area, or a symmetric part of it, lies within the coverage of the transmit/receive signal generated by the home device (the signal used for networking between the pet robot and the home device); the pet can therefore execute the preset command for controlling that home device accurately and effectively within the target area, and the accuracy of command interaction between the pet robot and the home device in the target area is improved.
In this embodiment, the farthest distance between the edge of the regular region (for example, a corner point) and its centre is twice the pet's body length, so that while tracking the pet the distance between the pet robot and that home device is less than or equal to twice the pet's body length, and the pet robot builds a map of the regular region without touching the pet. The pet robot and that home device are configured to be bound together so that a preset command issued by the pet robot is associated with that home device; the home device is therefore marked in the constructed map as a landmark position, which can be understood as the origin of the map coordinate system.
On the basis of the above embodiment, while steps S2 to S6 are executed, the pet robot moves in such a way that it keeps collecting images of the pet and keeps collecting the pet's sound signals without touching the pet, so that it captures the action commands and/or sound commands issued by the pet in real time as it moves; preferably, the pet robot's route is set to circle the pet without crossing the target area. The sound sensor and the camera are mounted in the pet robot's housing and collect the pet's images and sound signals, which correspond to the captured preset commands issued by the pet.
When the pet robot detects from the images that the pet is in the target area corresponding to that home device, then each time the pet is guided in the target area to execute a preset command for controlling that home device, the image of the pet collected in real time is matched against the pre-stored pet-behavior digital image associated with the owner guidance instruction selected in step S3. A matched pet-behavior digital image is parsed into the action command included in the preset command, the action command is stored so as to update the preset command once, and it is determined that the pet has executed the action command successfully, which can be recorded as one successful execution of the preset command and hence one successful control of that home device; it is likewise determined that the pet robot has captured the action command included in the preset command and that the action command is associated with that home device. The degree of matching between the pet's image and the pet-behavior digital image is determined from the correspondence of grey values and from the spatial relationship of the pet's action within the image region, the specific matching algorithm being a template-matching or feature-matching algorithm common in the prior art. The position information of the target area corresponding to that home device is preset in the pet robot and acquired in real time in the form of map coordinates, and the image of that home device is preconfigured as a landmark image. Step S6 is then executed: the pet robot sends a control command to that home device according to the preset command and starts it so that it runs according to the function corresponding to the control command. The pet-behavior digital image associated with the owner guidance instruction reflects an image of the standard action the pet should perform for the preset command. The preset command parsed from a successfully matched pet-behavior digital image can be reused the next time that home device is controlled, updating the way that home device is controlled.
Likewise, when the pet robot detects from the images that the pet is in the target area corresponding to that home device, then each time the pet is guided in the target area to execute a preset command for controlling that home device, the pet's sound signal collected in real time is matched against the pre-stored pet audio data associated with the owner guidance instruction selected in step S3. Matched pet audio data are parsed into the sound command included in the preset command, the sound command is stored so as to update the preset command once, it is determined that the pet has executed the sound command successfully, and this is recorded as one successful execution of the preset command and hence one successful control of that home device; it is likewise determined that the pet robot has captured the sound command included in the preset command and that the sound command is associated with that home device. The degree of matching between the pet's sound signal and the pre-stored pet audio data associated with the owner guidance instruction selected in step S3 is determined from the correspondence of the audio data and from the distribution of the pet's sound over a specific frequency spectrum; specifically, a feature-extraction algorithm common in the prior art is used to extract the sound signal and compare amplitude-frequency characteristics, and the degree of matching between the audio data is obtained from the comparison result.
The working state of that home device is then set according to the sound command and/or the action command, achieving the goal of the pet controlling that home device, which is equivalent to the pet operating the home device indirectly in place of its owner. The action command includes the pet's activity-position parameters, and the sound command includes sound-source position parameters used to recover the pet's specific position coordinates.
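As a very rough illustration of the amplitude-frequency comparison mentioned above, the following sketch compares the magnitude spectra of a captured sound signal and a stored reference using cosine similarity; the FFT-based features, the threshold and the function names are assumptions, not the algorithm prescribed by the patent.

```python
import numpy as np

def sound_match_score(captured, reference):
    """Cosine similarity between the magnitude spectra of two equal-length signals."""
    a = np.abs(np.fft.rfft(captured))
    b = np.abs(np.fft.rfft(reference))
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def is_sound_command(captured, reference, threshold=0.9):
    # The threshold is an assumed value; the description only requires a degree of matching.
    return sound_match_score(captured, reference) >= threshold
```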
In some embodiments, the pet's posture and motion signals are not collected directly by a tri-axial accelerometer or gyroscope in an inertial sensing device, as they are with wearable devices; instead, images are collected from outside the pet's body and sound signals are collected through a microphone. The preset command may of course combine a sound command and an action command, such as waving a paw while barking twice.
As for setting the sound command and/or action command: for example, the pet is guided to bark five times in succession, and the pet robot captures the sound signal; the sound signal is then associated with the speaker on the pet robot, which analyses and converts it, extracts a sound command for controlling the speaker, and sends the sound command to the speaker, the sound command switching the speaker on. This completes the setting of the pair of preset commands "five barks" and "speaker on". As another example, the pet is guided to jump in place and the action command is captured by the pet robot; the action command is then associated with the air conditioner in the pet robot, which analyses and converts the action image and extracts the command for switching on the air conditioner, so that the command switches the air conditioner on, completing the setting of the pair of preset commands "jump in place" and "air conditioner on". Similarly, after the air conditioner is on, the pet robot captures the pet's "jump up" action command and sets it as the command for raising the air-conditioner temperature once, i.e. the pair of preset commands "jump up once" and "raise the temperature once". Further preset commands can be set according to actual needs, such as "lie down and stay for 3 seconds to lower the temperature once", "turn three circles in place to switch on the television", "roll over once to change the channel", or "bark three times towards a certain direction to switch on the radio".
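These example pairings of pet behaviours and device functions read naturally as a lookup table; the sketch below records them as a dictionary, with the entries taken from the examples in this paragraph and the identifiers themselves being assumptions.

```python
# Illustrative lookup table pairing preset commands (pet behaviours) with the
# home-device function each one triggers, taken from the examples above.
PRESET_COMMAND_TABLE = {
    ("speaker", "bark_five_times"): "power_on",
    ("air_conditioner", "jump_in_place"): "power_on",
    ("air_conditioner", "jump_up_once"): "temperature_up_1",
    ("air_conditioner", "lie_down_3_seconds"): "temperature_down_1",
    ("television", "turn_three_circles"): "power_on",
    ("television", "roll_over_once"): "next_channel",
    ("radio", "bark_three_times_towards"): "power_on",
}

def resolve(device, behaviour):
    """Return the device function bound to a captured pet behaviour, if any."""
    return PRESET_COMMAND_TABLE.get((device, behaviour))

print(resolve("air_conditioner", "jump_in_place"))  # power_on
```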
As an embodiment, in step S4, the method by which the pet robot guides the pet, according to the owner guidance instruction associated with that home device, to repeatedly execute within the target area a preset command for controlling that home device until the number of repetitions reaches the preset training count, and then determines from the proportion of successful executions of the preset command whether the pet effectively controls that home device, as shown in fig. 2, specifically comprises:
step S401, judging whether the times of repeatedly executing the steps S402 to S405 reach the preset training times, if yes, executing the step S406, otherwise, executing the step S402; in step S4, the preset command is a training command applied to the pet before formally controlling the home device, and the training command needs to be repeatedly executed to determine whether the pet can actually control the home device to start, so that step S401 needs to be executed to determine the number of times of cycle execution.
Step S402, when the pet robot receives the owner guidance instruction associated with that home device, it issues to the pet a preset command for controlling that home device so that the pet performs the training action for the preset command; specifically, when the pet robot walks along a specific route in the target area (it may also spin in place several turns), walks to a specific position, or plays a specific sound signal at a specific position (all of which can be controlled by the owner guidance instruction), the pet is triggered to perform the training action corresponding to the preset command. Step S403 is then executed. The owner guidance instruction associated with that home device is pre-recorded or issued in real time by the pet's owner. As an example of the effect of the issued preset command, the pet robot emits a sound telling the pet to jump in place; once the pet robot captures that action, it instructs the air conditioner to switch on.
Step S403, the pet robot captures the pet's training action for the preset command in the target area and judges whether the training action is executed successfully; if yes, step S404 is executed, otherwise step S405. In step S403, whether the training action is executed successfully is judged as follows: the pet robot judges whether the captured training action matches the preset command; if yes, it is determined that the pet executed the training action successfully (counted as one successful training action) and executed the preset command successfully (counted as one successful execution of the preset command), achieving one successful control of the home device; otherwise it is determined that the pet did not execute the training action successfully (counted as one unsuccessful training action) and did not execute the preset command successfully. Matching between the training action and the preset command includes matching of the pet's behavior-state parameters, which include but are not limited to jumping a specified number of times, waving a paw, running for a set duration, running a set distance, or turning a specified number of circles. The matching may be performed on posture-action images in the same way as the matching, described in the foregoing embodiment, between the image of the pet collected in real time and the pre-stored pet-behavior digital image associated with the owner guidance instruction selected in step S3, the specific algorithm being a template-matching or feature-matching algorithm common in the prior art, which is not repeated here.
Step S404, the pet robot judges that the pet executed the training action successfully, so one successful execution of the preset command, and hence one control of the home device, can be established; a first preset voice is played to indicate to the pet that it executed the preset command successfully, and the number of successful controls is counted as the number of successful executions of the preset command. Execution then returns to step S401 to determine the number of repetitions of steps S402 to S405 (which can be understood as the number of repetitions of step S402). In some embodiments, playing the first preset voice in fact plays a preset voice reward, which may be factory-configured in the pet robot or recorded by the user.
Step S405, if the pet robot judges that the pet did not execute the training action successfully, a second preset voice is played to indicate that the pet did not execute the preset command successfully this time; execution then returns to step S401 to determine the number of repetitions of steps S402 to S405 (which can be understood as the number of repetitions of step S402). A case in which the pet does not execute the training action successfully might be: the preset command specifies that the pet jump in place three times, but the pet jumps only twice; the pet robot cannot then confirm that the pet performed the training action (or the preset command) successfully and does not send the control command to start the associated home device. At this point the pet robot guides the pet to perform the training action again by playing the second preset voice, for example encouraging audio the pet has learned, such as "you need to jump three times", to help the pet perform the correct training action.
Step S406, when the number of repetitions of steps S402 to S405 reaches the preset training count and the ratio of successful executions to the preset training count is greater than or equal to the success target ratio, it is determined that the pet effectively controls that home device, so that the owner's manual control of the home device associated with the owner guidance instruction is replaced and the home device runs according to the function corresponding to the control command. Conversely, when the number of repetitions of steps S402 to S405 reaches the preset training count but the ratio of successful executions to the preset training count is less than the success target ratio, it is determined that the pet does not effectively control that home device, and step S5 must be executed to correct the preset command. Preferably, to improve the effectiveness of the training actions in steps S402 to S405, the preset training count may initially be set to 30 and the required number of successful training actions to 20. Steps S402 to S405 are repeated to drive the pet to perform the training action repeatedly in the same target area until the proportion of successes reaches a certain value (for example, more than 60%), so that the pet understands the preset command for controlling the home device and performs the training action accurately and proficiently.
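With the example figures given here, the effectiveness test of step S406 works out as follows; the 30/20/60% values come from this paragraph and are example settings, not fixed requirements.

```python
# Example arithmetic for the step S406 decision using the figures in this embodiment.
preset_training_count = 30
successful_executions = 20
success_target_ratio = 0.6

ratio = successful_executions / preset_training_count   # about 0.667
print(ratio >= success_target_ratio)                     # True: effective control
```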
It should be noted that there is a correspondence between the training action performed by the pet and the function the home device runs after being effectively controlled by the pet. Typically the pet needs repeated training to remember that each preset command triggers the target function of the associated home device under the control command, establishing the corresponding conditioned reflex. Step S402 corresponds to step S41 of the disclosure, step S403 to step S42, step S404 to step S43, step S405 to step S44, and step S406 to step S45.
In summary, while the preset command is executed repeatedly on the pet for the preset training count, the pet robot detects the actions the pet makes in real time from outside the pet's body and promptly judges whether the pet controls the home device successfully; once the proportion of successful executions meets the success target ratio, the training action is deemed effective and need not be repeated further, which speeds up the pet's control of home devices through the pet robot.
Based on the above embodiment, in step S5, after the pet robot rotates at least one full turn or moves linearly back and forth over the preset distance, the method of changing the preset command, as shown in fig. 3, specifically comprises:
Step S501, repeat steps S402 to S405; while repeating steps S402 to S405 the pet robot captures failed training actions in the target area, a training action captured by the pet robot being a failed training action whenever step S403 determines that it does not match the preset command. The number of repeated occurrences of the same failed training action, or the number of repeated occurrences of the pet failing to execute the preset command, is then counted, and the currently counted number of repeated occurrences is recorded as the failure count; step S502 is then executed.
Step S502, before the number of repetitions of steps S402 to S405 reaches the preset training count, if the ratio of the counted failure count to the preset training count equals the success target ratio, this indicates that the training may be too difficult for the pet and that another, more suitable action should be considered; the preset command is therefore corrected according to the failed training action. The correction is made along quantifiable dimensions such as the specified number of jumps, the set running duration, the set running distance or the specified number of turns, adjusting the preset command to a command adapted to the failed training action; the action command under the same home device and the same associated control command is thus adjusted so that the pet performs the training action with an action it is more willing to make. Step S503 is then executed.
Step S503, the corrected preset command is associated with that home device, and the working state to be reached by the home device under control of the corrected preset command is set, completing the setting of the preset command for controlling the same home device. Steps S4 to S5 are then repeated, continually verifying whether the pet can effectively control that home device according to the corrected preset command, so that the probability of the pet successfully controlling the home device within the target area keeps increasing. The pet robot is thereby triggered accurately to send the control command, achieving precise control of the home device so that it runs in accordance with the owner guidance instruction or the requirements of the preset command.
It should be noted that the preset distance is less than or equal to the body length of the pet, so that the posture used to attract the pet does not change significantly before step S501 is executed. The number of rotations of the pet robot is preset, and the accumulated rotation time before executing step S501 is less than the first preset control time, which prevents the pet robot from rotating so long that it excessively attracts the pet's attention; this accumulated rotation time is generally set to be less than the time the pet spends executing the preset command once, which can be understood as the time taken to make the training action once, including the time taken by the pet robot to issue the preset command once. Preferably, when determining the ratio of the failure count to the preset training number, the successful target ratio is set to be greater than 50%; otherwise, measures to correct the original preset command can be taken before the number of repetitions of steps S402 to S405 reaches the preset training number, so as to speed up the correction.
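The numeric constraints above can be expressed as a small validation step. The parameter names and units in this sketch (metres for distances, seconds for times) are assumptions introduced only to make the relationships concrete.

def check_training_parameters(preset_distance, pet_body_length,
                              accumulated_rotation_time, first_preset_control_time,
                              success_target_ratio):
    """Sanity-check the constraints described in this embodiment (a sketch)."""
    assert preset_distance <= pet_body_length, \
        "back-and-forth distance must not exceed the pet's body length"
    assert accumulated_rotation_time < first_preset_control_time, \
        "accumulated rotation time must stay below the first preset control time"
    assert success_target_ratio > 0.5, \
        "the successful target ratio is preferably greater than 50%"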
In summary, when the counted failure count in this embodiment is too large, exceeding at least half of the preset training number, this indicates that the pet may find it difficult to act according to the preset command. By moving the pet robot back and forth, the pet's attention can be drawn away from the household device; while the pet's attention is redirected, the preset command is corrected according to the failed actions. After the corrected action command is determined, the corrected preset command is issued to the pet, and the pet executes it (i.e., performs the training action) multiple times, improving the pet's proficiency and accuracy with the preset command and, in turn, the accuracy of controlling the household device.
Preferably, the household devices include, but are not limited to, self-moving cleaning robots and fixedly installed household appliances. In the process of networking the pet robot with a household device, the pet robot can acquire information about the communication protocol supported by that device, including a protocol format or a protocol identifier; the preset command must follow the relevant communication protocol during transmission and conversion, so that sound and image information the pet can recognize (or that triggers the conditioned reflex) can be restored.
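How a command might be wrapped so that it follows the protocol information reported by the device can be illustrated as below. The JSON field names, the protocol identifier, and the example payload are invented for illustration and are not defined by the disclosure.

import json

def wrap_control_command(control_command, protocol_info):
    """Package a control command according to the protocol format or protocol
    identifier reported by the household device during networking (a sketch)."""
    envelope = {
        "protocol_id": protocol_info.get("protocol_id", "unknown"),  # assumed field name
        "format": protocol_info.get("format", "json"),               # assumed field name
        "payload": control_command,
    }
    return json.dumps(envelope)

# usage sketch: payload and identifiers are illustrative only
message = wrap_control_command({"function": "start_cleaning"},
                               {"protocol_id": "vendor-x-v1", "format": "json"})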
In the foregoing embodiment, the pet robot is a spherical robot equipped with a wireless communication module, a camera, and a sound sensor (including a microphone array and a voice player). The driving wheels of the spherical robot are hemispherical or semi-ellipsoidal and drive the spherical robot to move on the plane where the pet is located without contacting the pet; the camera is arranged in the upper shell of the spherical robot's housing. The gesture and motion images produced by the pet are generally collected from outside the pet's body through the camera, and the sound signals are collected through the sound sensor. Of course, the preset command may combine both kinds of information, the gesture motion image and the sound signal, such as waving a paw while making two sounds.
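A preset command that combines a gesture and a sound could be modeled with a simple data structure such as the one below; the class name, field names, and example values are assumptions used only for illustration.

from dataclasses import dataclass
from typing import Optional

@dataclass
class PresetCommand:
    """A preset command may carry a gesture pattern, a sound pattern, or both,
    e.g. waving a paw while making two sounds."""
    device_id: str                          # the associated household device
    gesture_pattern: Optional[str] = None   # e.g. "wave_paw", recognised from camera images
    sound_pattern: Optional[str] = None     # e.g. "two_sounds", recognised by the microphone array

    def is_combined(self) -> bool:
        return self.gesture_pattern is not None and self.sound_pattern is not None

# example combining both modalities, as mentioned in the text
cmd = PresetCommand(device_id="cleaning_robot_01",
                    gesture_pattern="wave_paw", sound_pattern="two_sounds")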
Specifically, the spherical robot includes a housing and driving wheels mounted on both sides of the housing. Each driving wheel is hemispherical or semi-ellipsoidal and can be regarded as a hemispherical wheel; the hemispherical shells of the two symmetrically arranged driving wheels, together with the cover surfaces of the upper and lower shells of the housing, form a spherical structure. Alternatively, the "hemisphere" need not be an exact half of a complete sphere; a slightly smaller portion still counts as a hemisphere, for example a portion with a volume ratio of about 0.8. Each driving wheel is driven by a motor accommodated in the housing to rotate relative to the housing, which is equivalent to rotating relative to the main body of the machine, thereby driving the spherical robot to move.
Specifically, when the spherical robot tips forward, backward, left, or right, or is pushed over, it can restore its balanced state by itself, forming a tumbler structure. The driving wheels drive the intelligent spherical robot to move, including moving forward, moving backward, turning left, or turning right. Optionally, the two driving wheels are arranged on two sides outside the housing, on the left and right sides respectively, which avoids the small wheel spacing that would result from placing the driving wheels inside the housing and enhances the stability of the spherical robot; in some embodiments, the two driving wheels and the housing form a sphere. The upper and lower surfaces of the housing and the outer side surfaces (side cover surfaces) of the driving wheels may be strictly curved surfaces, or approximately hemispherical, for example a shape formed by smoothly joining part of a hemisphere with other shapes, which reduces friction and helps the spherical robot return to its original upright state.
In some embodiments, the battery and its battery counterweight device form a counterweight unit of the spherical robot that adjusts the robot's center of gravity. No matter how large a moment arises between the robot's weight and its contact point with the ground, the battery accommodated in the counterweight device always produces a restoring moment that forces the spherical robot back to its original state; the shape of the battery counterweight device and its specific assembly position on the bottom end face of the housing are not limited. The balance is therefore stable: the spherical robot does not fall over no matter how it swings, giving it strong posture-recovery capability and terrain adaptability, together with outstanding characteristics such as high maneuverability, low energy consumption, and a small turning radius. Owing to its shape, the spherical robot has no falling-over problem, can turn in all directions, and offers sustained movement and high reliability.
When an instruction triggered by the pet owner is received, the captured action image or sound signal produced by the pet is set as the preset command corresponding to a target function that the household device is required to perform. Through the mediation of the pet robot, the pet can indirectly switch on, switch off, and adjust the household devices within the effective control range by performing actions or making sounds, so that the pet effectively controls the household device in place of a person. In this embodiment, the pet robot located outside the pet's body can determine the pet's position without relying on devices such as a camera of the household device; this position is encapsulated in the preset command, so that the matched motion command and sound command, combined in real time, trigger the corresponding combined control of the household device and set its working state. The pet robot can thus effectively control the household device associated with the owner's guiding instruction, enhancing the interaction between the pet and the household devices, improving the pet's enjoyment and comfort, and helping the user coordinate the management of household devices and pets.
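The registration flow just described, capturing the pet's action or sound when the owner triggers an instruction, binding it to a target function, and encapsulating the pet's position, might be sketched as follows. Every method and field name here is hypothetical and only illustrates the data that the text says the preset command carries.

def register_preset_command(robot, owner_instruction, device):
    """On an owner-triggered instruction, capture the pet's current action image
    or sound signal and store it as the preset command for a target function,
    with the pet's position encapsulated in the command (a sketch)."""
    observation = robot.capture_pet_observation()     # image and/or sound, collected outside the pet
    pet_position = robot.estimate_pet_position()      # no camera on the household device is needed
    preset_command = {
        "device_id": device.device_id,
        "target_function": owner_instruction.target_function,  # e.g. "start_cleaning"
        "gesture": observation.get("gesture"),
        "sound": observation.get("sound"),
        "pet_position": pet_position,
    }
    robot.store_preset_command(preset_command)
    return preset_command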
As a preferred embodiment, when the spherical robot has a laterally symmetrical structure, the horizontal symmetry axis of the spherical robot is designated as a central axis passing through the front and rear sides of its housing. The spherical robot has an orientation: the direction the camera faces is taken as the forward direction. Specifically, the camera is arranged in the upper shell of the housing, where the housing can be divided into an upper shell and a lower shell along the central axis, and the microphone array can be distributed inside or outside the housing around the camera. The spherical robot walks forward in the forward direction and walks backward in the opposite direction. The end toward which the camera lens points is referred to as the front end or front surface of the robot, and the opposite end as the rear end or back surface.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solution of the present invention and not to limit it. Although the invention has been described in detail with reference to the preferred embodiments, those skilled in the art will appreciate that modifications may be made to the specific embodiments of the present invention, or equivalents may be substituted for some of its technical features, without departing from the spirit of the invention; all such modifications and substitutions are intended to fall within the scope of the invention as claimed.

Claims (9)

1. A pet robot control method for controlling home appliances by pets, characterized in that the pet robot control method comprises:
step S1, a pet robot establishes network connection with at least one household device;
s2, detecting the position of the pet in a mode that the pet robot moves autonomously;
step S3, if the pet robot detects that the pet is in a target area corresponding to one of the household devices, selecting an owner guiding instruction associated with the one of the household devices from a plurality of preset owner guiding instructions;
step S4, the pet robot guides the pet to repeatedly execute a preset command for controlling one of the household devices in the target area according to the host guiding command associated with the one of the household devices until the number of times of repeated execution reaches a preset training number, and then determines whether the pet effectively controls the one of the household devices according to the proportion of the number of times of successful execution of the preset command; wherein the pet robot keeps detecting the pet and sets the time maintained by executing the preset command each time as a first preset control time;
Step S5, when the pet is determined not to effectively control one of the household devices, the pet robot rotates at least one circle or linearly moves back and forth for a preset distance, then changes the preset command, repeatedly executes the step S4 and the step S5 until the pet is determined to effectively control one of the household devices, and then executes the step S6;
and step S6, when the pet is determined to effectively control the one of the household devices, the pet robot sends a control command to the one of the household devices according to the preset command, and the one of the household devices is started to operate according to the function corresponding to the control command.
2. The pet robot control method according to claim 1, wherein, whenever it is determined that the pet effectively controls the one of the household devices, the time during which the one of the household devices operates according to the function corresponding to the control command is a second preset control time; when the running time of the one of the household devices exceeds the second preset control time, the one of the household devices is controlled to be turned off, and then steps S2 to S6 are executed; the second preset control time is longer than the first preset control time.
3. The pet robot control method according to claim 1, wherein the target area corresponding to the one of the household devices is a regular graphic area surrounding the one of the household devices, and the projection, onto the horizontal ground, of the sensor device of the one of the household devices that is in networked communication with the pet robot occupies the central position of the regular graphic area; wherein the furthest distance between the edge of the regular graphic area and its center is twice the body length of the pet; the pet robot and the one of the household devices are configured to be bound together such that a preset command is associated with the one of the household devices.
4. A pet robot control method according to claim 3, wherein in the process of performing step S2 to step S6, there is:
the moving mode of the pet robot is configured to keep collecting the image of the pet and keep collecting the sound signal of the pet without the pet robot contacting the pet, so that the pet robot captures, in real time while moving, action commands and/or sound commands issued by the pet;
Under the condition that the pet robot detects that a pet is in a target area corresponding to one of the household devices through an image, in the process of guiding the pet to execute a preset command for controlling the one of the household devices in the target area, matching the image of the pet acquired in real time with a pre-stored pet behavior digital image associated with the host guiding command selected in the step S3, analyzing the matched pet behavior digital image into an action command, and determining that the pet robot captures the action command and the pet successfully executes the action command;
under the condition that the pet robot detects that a pet is in a target area corresponding to one of the household devices through an image, in the process of guiding the pet to execute a preset command for controlling the one of the household devices in the target area each time, matching a sound signal of the pet acquired in real time with pre-stored pet audio data associated with the host guiding command selected in the step S3, analyzing the matched pet audio data into a sound command, and determining that the pet robot captures the sound command and the pet successfully executes the sound command;
Then setting the working state of one of the household devices according to the voice command and/or the action command so as to fulfill the aim of controlling the one of the household devices by the pet; wherein, the action command comprises the activity position parameters of the pet; the sound command comprises sound source position parameters;
the position information of the target area corresponding to one of the household devices is preset in the pet robot and is acquired in real time in the form of map coordinates, and the image of the one of the household devices is preconfigured as a road sign image;
wherein the preset command comprises an action command and/or a sound command.
5. The pet robot control method according to claim 3, wherein in the step S4, the pet robot guides the pet to repeatedly execute the preset command for controlling the one of the home devices in the target area according to the host guiding command associated with the one of the home devices until the number of repeated execution reaches the preset training number, and the method for determining whether the pet effectively controls the one of the home devices according to the proportion of the number of successful execution of the preset command specifically includes:
Step S41, when the pet robot receives an owner guiding instruction associated with one of the household devices, the pet robot sends a preset command for controlling the one of the household devices to the pet, so that the pet is triggered to make a training action corresponding to the preset command when the pet robot walks along a specific route in the target area, walks to a specific position, or plays a specific sound signal at that specific position;
step S42, the pet robot captures training actions of the pet on the preset command in the target area, and judges whether the training actions are successfully executed;
step S43, if the pet robot judges that the pet successfully executes the training action, playing a first preset voice to prompt the successful execution of the preset command once, and counting the times of successful execution;
step S44, if the pet robot judges that the pet did not execute the training action, a second preset voice is played to prompt that the preset command was not successfully executed this time;
step S45, when the number of times of repeatedly executing the steps S41 to S44 reaches the preset training number, determining that the pet effectively controls one of the household devices when the ratio of the number of times of successfully executing to the preset training number is greater than or equal to the successful target ratio, so as to realize that a master is replaced to manually control the household device associated with the master guiding instruction, and the household device is enabled to operate according to the function corresponding to the control command; and when the number of times of repeatedly executing the steps S41 to S44 reaches the preset training number, and the ratio of the number of times of successfully executing to the preset training number is smaller than the successful target ratio, determining that the pet does not effectively control one of the household devices.
6. The pet robot control method according to claim 5, wherein in the step S42, the judgment as to whether the training action is successfully performed is as follows: judging whether the training action captured by the pet robot is matched with the preset command or not, if yes, determining that the pet successfully executes the training action and successfully executes the preset command to successfully control the household equipment once, otherwise, determining that the pet does not successfully execute the training action and determines that the preset command is not successfully executed;
wherein the matching between the training actions and the preset commands includes matching of behavior state parameters of the pet.
7. The pet robot control method according to claim 6, wherein, in step S5, the manner of changing the preset command after the pet robot rotates at least one turn or moves linearly back and forth by a preset distance comprises:
repeatedly executing the steps S41 to S44, capturing failure training actions in the target area by the pet robot in the process of repeatedly executing the steps S41 to S44, counting the repeated occurrence times of the same failure training actions or the repeated occurrence times of the preset command which is not successfully executed by the pet, and recording the repeated occurrence times counted currently as failure times; when the training action captured by the pet robot is not matched with the preset command, the training action captured by the pet robot is a failed training action;
Before the number of times of repeatedly executing the steps S41 to S44 reaches the preset training number, if the ratio of the counted number of failures to the preset training number is equal to the successful target ratio, correcting the preset command according to the failed training action, then associating the corrected preset command to one of the household devices, setting the working state reached by the household device controlled by the corrected preset command, and finishing the setting of the preset command for controlling the same household device.
8. The pet robot control method of claim 7, wherein the home appliances include, but are not limited to, self-moving cleaning robots, and fixedly-arranged home appliances;
wherein, the preset command includes a control command associated to the same home equipment.
9. The pet robot control method according to claim 7, wherein the pet robot is a spherical robot equipped with a wireless communication module, a camera, and a sound sensor;
wherein the geometric shape of the driving wheel of the spherical robot is a hemisphere or a semi-ellipsoid, and the driving wheel is used for driving the spherical robot to move on a movable plane where the pet is located but not contact with the pet;
Wherein the camera is arranged in an upper shell of the spherical robot shell.
CN202211593146.2A 2022-12-13 2022-12-13 Pet robot control method for controlling household equipment by pets Pending CN116088326A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211593146.2A CN116088326A (en) 2022-12-13 2022-12-13 Pet robot control method for controlling household equipment by pets

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211593146.2A CN116088326A (en) 2022-12-13 2022-12-13 Pet robot control method for controlling household equipment by pets

Publications (1)

Publication Number Publication Date
CN116088326A true CN116088326A (en) 2023-05-09

Family

ID=86212842

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211593146.2A Pending CN116088326A (en) 2022-12-13 2022-12-13 Pet robot control method for controlling household equipment by pets

Country Status (1)

Country Link
CN (1) CN116088326A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116482987A (en) * 2023-06-19 2023-07-25 贵州大学 Automatic induction method and device for realizing intelligent furniture based on user behaviors
CN116482987B (en) * 2023-06-19 2023-08-22 贵州大学 Automatic induction method and device for realizing intelligent furniture based on user behaviors
CN117731183A (en) * 2024-02-19 2024-03-22 杭州万向职业技术学院 Shield door cleaning robot and cleaning method thereof
CN117731183B (en) * 2024-02-19 2024-05-10 杭州万向职业技术学院 Shield door cleaning robot and cleaning method thereof

Similar Documents

Publication Publication Date Title
CN116088326A (en) Pet robot control method for controlling household equipment by pets
EP3508938B1 (en) Mobile cleaning robot teaming and persistent mapping
CN112739244B (en) Mobile robot cleaning system
CN106406119A (en) Service robot based on voice interaction, cloud technology and integrated intelligent home monitoring
JP7259015B2 (en) Mobile robot and its control method
KR102048992B1 (en) Artificial intelligence cleaner and controlling method thereof
EP3787458B1 (en) A plurality of robot cleaners
CN109514582B (en) Pet teasing control device for robot and mobile robot
KR20190106891A (en) Artificial intelligence monitoring device and operating method thereof
TW201733512A (en) Autonomous traveling device
KR101984516B1 (en) Cleaner and controlling method thereof
US20220032450A1 (en) Mobile robot, and control method of mobile robot
JP7375748B2 (en) Information processing device, information processing method, and program
JP7002744B2 (en) Mobile platform system
KR20190105530A (en) An artificial intelligence robot for cleaning using zoned pollution information and method for the same
EP3493013B1 (en) Moving robot and associated control method
JP2020014845A (en) Autonomous action type robot for receiving guest
KR20200027072A (en) Controlling method for Moving robot
JP2018075192A (en) Vacuum cleaner
KR20200010635A (en) A plurality of autonomous cleaner and a controlling method for the same
KR102081340B1 (en) A plurality of autonomous cleaner and a controlling method for the same
TWI808480B (en) Moving robot, moving robot system and method of performing collaborative driving in moving robot system
CN109717796A (en) Intelligent cleaning equipment
JP5552710B2 (en) Robot movement control system, robot movement control program, and robot movement control method
KR102423572B1 (en) Controlling method for Artificial intelligence Moving robot

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination