CN110506661B - Method for preventing pet fighting based on machine learning - Google Patents


Info

Publication number
CN110506661B
CN110506661B (application CN201910820242.8A)
Authority
CN
China
Prior art keywords
pet
command
wearable device
identity information
wearable
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201910820242.8A
Other languages
Chinese (zh)
Other versions
CN110506661A (en)
Inventor
沈泳龙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wu Shukuan
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by Individual
Priority to CN201910820242.8A
Publication of CN110506661A
Application granted
Publication of CN110506661B

Classifications

    • A - HUMAN NECESSITIES
    • A01 - AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01K - ANIMAL HUSBANDRY; AVICULTURE; APICULTURE; PISCICULTURE; FISHING; REARING OR BREEDING ANIMALS, NOT OTHERWISE PROVIDED FOR; NEW BREEDS OF ANIMALS
    • A01K15/00 - Devices for taming animals, e.g. nose-rings or hobbles; Devices for overturning animals in general; Training or exercising equipment; Covering boxes
    • A01K15/02 - Training or exercising equipment, e.g. mazes or labyrinths for animals; Electric shock devices; Toys specially adapted for animals
    • A01K15/021 - Electronic training devices specially adapted for dogs or cats

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Environmental Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Animal Behavior & Ethology (AREA)
  • Zoology (AREA)
  • Animal Husbandry (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Toys (AREA)

Abstract

The invention discloses a method for preventing pet fighting based on machine learning, comprising the following steps: a first pet wearable device detects whether a second pet wearable device is present within a preset distance; the first and second pet wearable devices establish a communication connection; both devices start video and audio recording; at preset intervals, the emotional state, behavioral intention and physiological data of the first pet and of the second pet are acquired, and a preset command is executed in combination with preset rules; both devices then stop video and audio recording; finally, training data for the pet command machine learning model are obtained and the model is retrained. By combining machine learning with pet physiological data, the method effectively prevents pet fighting events.

Description

Method for preventing pet fighting based on machine learning
Technical Field
The invention belongs to the technical field of pet control, and particularly relates to a method for preventing pets from fighting based on machine learning.
Background
With continuous social progress and rising living standards, more and more families keep pets. Raising a pet, and especially training one, is laborious work, and many owners are too busy to accompany their pets on outdoor exercise. Owners therefore sometimes let their pets go outdoors alone, but with no one nearby they worry about what may happen. When a pet is out alone, its mood often becomes restless: it may bite objects, jump about, make noise at will, and even fight other people's pets. At best this disturbs the neighbors; at worst it damages public facilities or causes accidents in which the pet is injured, threatening both property and life.
Therefore, how to prevent a pet from fighting other pets when it goes out alone, while allowing the owner to monitor the situation and issue commands to the pet when necessary, is a problem to be solved.
Disclosure of Invention
The invention aims to remedy the defects of the prior art by providing a machine-learning-based method for preventing pet fighting.
It is a second object of the present invention to provide a wearable device for pets.
Another object of the present invention is to provide a client for issuing simulation commands.
The purpose of the invention can be achieved by adopting the following technical scheme:
a pet fighting prevention method based on machine learning is characterized by comprising the following steps:
the first wearable pet device detects whether a second wearable pet device exists within a preset distance.
The first pet wearable device establishes a communication connection with the second pet wearable device.
The first pet wearable device and the second pet wearable device start video recording and sound recording.
Acquiring, at preset intervals, the emotional state, behavioral intention and physiological data of the first pet and of the second pet.
Executing a preset command in combination with the preset rules.
The first pet wearable device and the second pet wearable device stop video and audio recording.
Preferably, the step in which the first pet wearable device detects whether a second pet wearable device exists within the preset distance includes:
If, within the preset distance, the first pet wearable device receives identity information in the preset format returned by a second pet wearable device, it judges that a second pet wearable device exists within the preset distance; if it receives no such identity information, it judges that no second pet wearable device exists within the preset distance.
The identity information in the preset format comprises at least a pet ID capable of identifying the pet.
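As a hedged sketch of this detection step (not the patent's implementation; the dict layout and the `pet_id` key are illustrative assumptions), a nearby reply counts as a second pet wearable device only when it carries identity information in the preset format:

```python
# Sketch of the proximity-detection step. A reply is recognized as coming
# from a second pet wearable device only if it matches the preset identity
# format, which must at least contain a pet ID. Field names are assumptions.

def is_preset_format(payload: dict) -> bool:
    """True if the payload matches the preset identity format (has a pet ID)."""
    pet_id = payload.get("pet_id")
    return isinstance(pet_id, str) and pet_id != ""

def second_device_present(replies: list) -> bool:
    """Judge that a second pet wearable device exists within the preset
    distance when at least one reply is in the preset identity format."""
    return any(is_preset_format(r) for r in replies)
```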
Preferably, the step in which the first pet wearable device and the second pet wearable device establish a communication connection includes:
The first pet wearable device sends first identity information to the second pet wearable device, and the second pet wearable device sends second identity information to the first pet wearable device.
The first identity information is the data stored by the first pet wearable device, specifically including: pet ID, bacterial content, whether the pet carries a virus, weight, sex, whether a fighting record exists, age, character, recent exercise amount, recent sleep status, biting force, intelligence level, a pet photo, a sound the pet fears, and a vibration frequency the pet finds comfortable.
The second identity information is the data stored by the second pet wearable device, specifically including: pet ID, bacterial content, whether the pet carries a virus, weight, sex, whether a fighting record exists, age, character, recent exercise amount, recent sleep status, biting force, intelligence level, a pet photo, a sound the pet fears, and a vibration frequency the pet finds comfortable.
The first pet wearable device uses image recognition to detect whether the pet contained in the second pet image is the same pet as the one in the pet photo of the second identity information; if not, it sets every field of the second identity information (pet ID, bacterial content, virus-carrying status, weight, sex, fighting record, age, character, recent exercise amount, recent sleep status, biting force, intelligence level, pet photo, feared sound, and comfortable vibration frequency) to unknown.
The second pet wearable device likewise uses image recognition to detect whether the pet contained in the first pet image is the same pet as the one in the pet photo of the first identity information; if not, it sets every field of the first identity information to unknown.
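This photo-mismatch safeguard can be sketched as follows; the field names are illustrative assumptions, not the patent's actual schema:

```python
# Sketch of the photo-mismatch safeguard: when image recognition decides the
# photographed pet is not the pet in the stored photo, every identity field
# is set to "unknown". Field names are illustrative assumptions.

IDENTITY_FIELDS = [
    "pet_id", "bacterial_content", "carries_virus", "weight", "sex",
    "fight_record", "age", "character", "recent_exercise", "recent_sleep",
    "biting_force", "intelligence", "photo", "feared_sound",
    "comfort_vibration_hz",
]

def validate_identity(identity: dict, photo_matches: bool) -> dict:
    """Return the identity unchanged if the photo matches, otherwise a new
    record with every field set to 'unknown'."""
    if photo_matches:
        return identity
    return {field: "unknown" for field in IDENTITY_FIELDS}
```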
Preferably, the step of acquiring, at preset intervals, the emotional state, behavioral intention and physiological data of the first pet and of the second pet includes:
The first pet wearable device acquires the heart rate, blood pressure, body temperature, respiratory rate and sound information of the first pet.
The emotional state of the first pet is obtained from its sound information using speech emotion recognition.
The behavioral intention of the first pet is obtained from its sound information using speech behavioral-intention recognition.
The second pet wearable device acquires the heart rate, blood pressure, body temperature, respiratory rate and sound information of the second pet.
The emotional state of the second pet is obtained from its sound information using speech emotion recognition.
The behavioral intention of the second pet is obtained from its sound information using speech behavioral-intention recognition.
The emotional state includes: anger, joy, calmness, fear.
The behavioral intent includes: warning, attack, and sadness.
Preferably, executing the preset command in combination with the preset rules includes:
Taking the first identity information, the second identity information, and the emotional state, behavioral intention and physiological data of both pets as command judgment factor data, and executing a preset command in combination with the preset rules, specifically:
a) Judge whether the pet ID in the second identity information is on the blacklist of the first pet wearable device; if so, play the 'escape' voice message, connect to the owner client, and request and execute an owner remote command.
b) When the emotional state of the first pet is anger, the sex in the first identity information is the same as that in the second identity information, the second identity information contains a fighting record, and the second pet's behavioral intention is attack, play the 'escape' voice message.
c) When the emotional state of both pets is joy and the sexes in the first and second identity information differ, play the 'go and play with the other side' voice message.
d) When the emotional state of both pets is anger and the behavioral intention of both pets is attack, execute the action-blocking command to limit the first pet's mobility; add the pet ID in the second identity information to the blacklist and store the blacklist in the first pet wearable device.
e) When the behavioral intention of the first pet is attack and that of the second pet is sadness, execute the electric shock command.
f) When the first pet's heart rate exceeds a first threshold or its respiratory rate exceeds a second threshold, obtain from the first identity information the vibration frequency the first pet finds comfortable and execute the vibration command at that frequency, stopping once the heart rate falls below the first threshold or the respiratory rate falls below the second threshold.
g) Judge whether the second identity information indicates that the second pet carries a virus; if so, play the 'escape' voice message and, at the same time, connect to the owner client and request and execute an owner remote command.
h) When the behavioral intention of the first pet is warning, execute the command that plays the sound feared by the other pet.
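Rules a) through h) above can be sketched as an ordered decision function. The evaluation order chosen here (blacklist and virus checks first, then mutual-aggression checks) is an assumption, since the patent does not state rule precedence; all field and command names are illustrative, and the physiological rule f) is driven by sensor thresholds and handled separately:

```python
# Sketch of rules a)-h) as an ordered rule table. Rule precedence and all
# names are assumptions; rule f) (vibration on abnormal vitals) is omitted
# here because it depends on live sensor readings rather than this snapshot.

def decide_commands(first_id, second_id, first_pet, second_pet, blacklist):
    """first_pet/second_pet are dicts with 'emotion' and 'intent' keys."""
    # a) blacklisted pet: flee and hand control to the owner
    if second_id["pet_id"] in blacklist:
        return ["play_escape", "connect_owner_client"]
    # g) virus carrier: same response as a blacklisted pet
    if second_id.get("carries_virus"):
        return ["play_escape", "connect_owner_client"]
    # d) mutual anger and attack intent: block action, blacklist the other pet
    if (first_pet["emotion"] == "anger" and second_pet["emotion"] == "anger"
            and first_pet["intent"] == "attack"
            and second_pet["intent"] == "attack"):
        blacklist.add(second_id["pet_id"])
        return ["block_action"]
    # e) first pet attacks a sad pet: electric shock penalty
    if first_pet["intent"] == "attack" and second_pet["intent"] == "sadness":
        return ["electric_shock"]
    # b) angry same-sex encounter with a known fighter intending to attack
    if (first_pet["emotion"] == "anger"
            and first_id["sex"] == second_id["sex"]
            and second_id.get("fight_record")
            and second_pet["intent"] == "attack"):
        return ["play_escape"]
    # c) mutual joy, different sexes: encourage play
    if (first_pet["emotion"] == "joy" and second_pet["emotion"] == "joy"
            and first_id["sex"] != second_id["sex"]):
        return ["play_go_play"]
    # h) warning intent: play the sound the other pet fears
    if first_pet["intent"] == "warning":
        return ["play_feared_sound"]
    return []
```

Note that rule d) mutates the blacklist, so a later encounter with the same pet triggers rule a) immediately.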
Preferably, executing the preset command in combination with the preset rules further includes:
Acquiring a sample set of command judgment factor data and a corresponding sample set of simulation commands; training the pet command machine learning model on these sample sets; then inputting command judgment factor data into the model, determining the output command corresponding to that data, and having the first pet wearable device execute the command output by the model.
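A minimal stand-in for this train-then-predict flow is sketched below. The patent does not specify the model class, so the "pet command machine learning model" is replaced here by a majority-vote lookup over factor tuples, purely to illustrate the loop; a real model would generalize across unseen situations:

```python
from collections import Counter, defaultdict

class PetCommandModel:
    """Illustrative stand-in for the pet command machine learning model:
    it memorizes, per command-judgment-factor tuple, the simulation command
    the owner issued most often."""

    def __init__(self):
        self._votes = defaultdict(Counter)

    def train(self, factor_samples, command_samples):
        """factor_samples: iterable of factor tuples; command_samples: the
        owner's simulation command for each sample."""
        for factors, command in zip(factor_samples, command_samples):
            self._votes[tuple(factors)][command] += 1

    def predict(self, factors, default="request_owner_command"):
        """Output the majority command for this factor tuple, falling back
        to asking the owner when the situation was never seen."""
        votes = self._votes.get(tuple(factors))
        return votes.most_common(1)[0][0] if votes else default
```

Retraining after each outing is then just another call to `train` with the newly collected samples.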
Preferably, acquiring the command judgment factor data sample set and the corresponding simulation command sample set includes:
The first owner client plays the video and audio data of the first pet wearable device; displays the first identity information, the second identity information, and the emotional state, behavioral intention and physiological data of both pets at each time point; and receives simulation command operations from the owner of the first pet wearable device, thereby obtaining the command judgment factor data sample set and the corresponding simulation command sample set.
Preferably, the step in which the first pet wearable device and the second pet wearable device stop video and audio recording includes:
Detecting whether the distance between the first pet wearable device and the second pet wearable device exceeds the preset distance; if so, both devices stop video and audio recording.
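This teardown condition can be sketched as a small session check; the sampling loop and units are illustrative assumptions:

```python
# Sketch of the recording-session teardown: recording continues only while
# the two wearables stay within the preset distance, and ends permanently
# at the first reading beyond it. Units (meters) are an assumption.

def recording_should_stop(distance_m: float, preset_distance_m: float) -> bool:
    """End video and audio recording once the devices move apart."""
    return distance_m > preset_distance_m

def recording_states(distance_readings, preset_distance_m):
    """Map periodic distance readings to True while recording is active."""
    active, states = True, []
    for d in distance_readings:
        if recording_should_stop(d, preset_distance_m):
            active = False
        states.append(active)
    return states
```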
The second purpose of the invention can be achieved by adopting the following technical scheme:
a wearable device for pets, characterized in that the wearable device comprises:
and the wireless communication device is used for establishing wireless connection with other electronic equipment.
The parameter acquisition device is internally provided with a motion sensor and a pressure sensor and is mainly used for detecting the heart rate, the blood pressure, the body temperature and the respiratory rate of a pet wearing the wearable equipment.
And the video recording device is used for recording the image data in front of the pet wearing the wearable equipment.
And the recording device is used for recording the audio data around the pet wearing the wearable equipment.
And the processor is mainly used for executing image recognition operation, voice emotion recognition operation, voice behavior intention recognition operation and logic operation for issuing commands to the pets by combining preset rules.
The audio playing device is internally provided with a loudspeaker and is mainly used for playing audio information.
And the vibration device is internally provided with a vibrator and is mainly used for executing the vibration command, and the vibration frequency of the vibration device can be adjusted according to the parameters.
And the electric shock device is internally provided with an electric shock device and is mainly used for executing the electric shock command.
The action stopping device is internally provided with the knee joint inflatable sheath and is mainly used for limiting the action ability of the pet dog.
The other purpose of the invention can be achieved by adopting the following technical scheme:
a client for issuing simulation commands, the client comprising:
and the data display device is used for playing the video data and the sound recording data of the wearable pet equipment and displaying the first identity information, the second identity information, the emotional state, the behavior intention and the physiological data of the first pet at each time point and the emotional state, the behavior intention and the physiological data of the second pet.
And the simulation command device is used for receiving simulation command operation of the owner of the first pet wearable device and obtaining machine learning model training data.
A remote command device for sending a remote command to the first pet wearable device.
The technical scheme provided by the disclosure has the following beneficial effects:
according to the pet meeting judgment method, the communication connection is automatically established through the Bluetooth or wifi technology, the identity information is transmitted, the judgment of pet meeting is timely, and the interaction command is more accurate through the identity information; through the image recognition technology, the situation that identity information stored in the wearable device is not matched with a pet wearing the wearable device at present is avoided, and the error rate is reduced. Through the preset command, the pet is automatically given an instruction, so that the number of times of intervention on the owner is reduced, the time of the owner is saved, compared with the situation that the owner is directly contacted, the pet can be given the instruction more timely, and the occurrence of conflict is avoided; continuously enhancing the accuracy of the machine learning model through a simulation command; the automatic video recording is automatically disconnected, and the video recording is only carried out at key necessary moments, so that the power and the storage space of equipment are saved; through the blacklist mechanism, in time remind owner, give the pet and assign and avoid the order, reduce the conflict and take place to owner can oneself choose to remove the blacklist, give the pet more improve the chance of being in touch with, and is more nimble.
Drawings
Fig. 1 is a flowchart of a method for preventing pet fighting based on machine learning according to an embodiment of the present invention.
Detailed Description
Step S101: the first pet wearable device uses Bluetooth or Wi-Fi to detect whether a second pet wearable device exists within the preset distance. If so, the two devices establish a communication connection: the first pet wearable device sends first identity information to the second pet wearable device, and the second pet wearable device sends second identity information to the first pet wearable device; proceed to step S201.
Detecting whether a second pet wearable device exists within the preset distance specifically includes: if, within the preset distance, the first pet wearable device receives identity information in the preset format returned by a second pet wearable device, it judges that such a device exists; if it receives no such identity information, it judges that no second pet wearable device exists within the preset distance.
The identity information in the preset format at least comprises: a pet ID capable of identifying the identity of the pet.
The first identity information is the pet data stored by the first pet wearable device, such as: pet ID, bacterial content, whether the pet carries a virus, weight, sex, whether a fighting record exists, age, character, recent exercise amount, recent sleep status, biting force, intelligence level, a pet photo, a sound the pet fears, a vibration frequency the pet finds comfortable, etc.
The second identity information is the pet data stored by the second pet wearable device, such as: pet ID, bacterial content, whether the pet carries a virus, weight, sex, whether a fighting record exists, age, character, recent exercise amount, recent sleep status, biting force, intelligence level, a pet photo, a sound the pet fears, a vibration frequency the pet finds comfortable, etc.
Step S201: the first pet wearable device and the second pet wearable device start video recording and sound recording.
Step S301: according to the second pet image obtained in step S201, the first pet wearable device uses image recognition to detect whether the pet in the second pet image is the same pet as the one in the pet photo of the second identity information; if not, it sets every field of the second identity information (pet ID, bacterial content, virus-carrying status, weight, sex, fighting record, age, character, recent exercise amount, recent sleep status, biting force, intelligence level, pet photo, feared sound, and comfortable vibration frequency) to unknown.
According to the first pet image obtained in step S201, the second pet wearable device likewise uses image recognition to detect whether the pet in the first pet image is the same pet as the one in the pet photo of the first identity information; if not, it sets every field of the first identity information to unknown.
The first pet image is a picture extracted by the second pet wearable device from the video of step S201 and contains the first pet.
The second pet image is a picture extracted by the first pet wearable device from the video of step S201 and contains the second pet.
Step S401: at every preset interval, acquire the emotional state, behavioral intention and physiological data of the first pet and of the second pet, and execute a preset command in combination with the preset rules, specifically including:
The first pet wearable device acquires physiological data of the first pet such as heart rate, blood pressure, body temperature and respiratory rate, together with its sound information.
The emotional state of the first pet is obtained from its sound information using speech emotion recognition.
The behavioral intention of the first pet is obtained from its sound information using speech behavioral-intention recognition.
The second pet wearable device acquires physiological data of the second pet such as heart rate, blood pressure, body temperature and respiratory rate, together with its sound information.
The emotional state of the second pet is obtained from its sound information using speech emotion recognition.
The behavioral intention of the second pet is obtained from its sound information using speech behavioral-intention recognition.
The first pet is a pet wearing the first pet wearable device.
The second pet is a pet wearing the second pet wearable device.
The emotional state includes: anger, joy, calmness, fear, etc.
The behavioral intent includes: warning, attack, sadness, etc.
The first identity information, the second identity information, and the emotional state, behavioral intention and physiological data of both pets are taken as command judgment factor data, and a preset command is executed in combination with the preset rules.
The preset commands specifically include: vibration, playing the sound feared by the other pet, playing the 'escape' voice message, playing the 'stand up' voice message, playing the 'low growl' voice message, action blocking, electric shock, playing the 'go and play with the other side' voice message, and connecting to the owner client to request and execute an owner remote command.
The vibration specifically includes: starting the vibration device of the first pet wearable device so that the first pet feels vibration; the vibration frequency of the device can be adjusted by parameter.
Playing the sound feared by the other pet specifically includes: obtaining the feared sound from the second identity information, starting the audio playing device of the first pet wearable device, and playing that sound.
Playing the 'escape' voice message specifically includes: starting the audio playing device of the first pet wearable device and playing the 'escape' voice message.
Playing the 'stand up' voice message specifically includes: starting the audio playing device of the first pet wearable device and playing the 'stand up' voice message.
Playing the 'low growl' voice message specifically includes: starting the audio playing device of the first pet wearable device and playing the 'low growl' voice message.
The action blocking specifically includes: starting the action stopping device of the first pet wearable device to limit the first pet's mobility.
The electric shock specifically includes: the first pet wearable device obtains the weather conditions at the current place and time; if it is raining, the electric shock device is not started, otherwise it is started so that the first pet feels the shock. Judging by weather avoids leakage current accidentally injuring the first pet in the rain.
Playing the 'go and play with the other side' voice message specifically includes: starting the audio playing device of the first pet wearable device and playing the 'go and play with the other side' voice message.
Connecting to the owner client to request and execute an owner remote command specifically includes: the first pet wearable device sends the first identity information, the second identity information, the emotional state, behavioral intention and physiological data of both pets at each time point, and the video and audio data obtained in step S201 to the first owner client; it then receives and executes the remote command returned by that client, which may be any of: vibration, playing the sound feared by the other pet, playing the 'escape' voice message, playing the 'stand up' voice message, playing the 'low growl' voice message, action blocking, electric shock, or playing the 'go and play with the other side' voice message.
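The weather gate on the electric shock command above can be sketched as a small guard; how the device actually obtains the weather, and the weather string values, are assumptions:

```python
# Sketch of the weather-gated electric shock command. Per the description,
# in rainy weather the shock device is not started, to avoid leakage current
# injuring the first pet. Weather string values are assumptions.

def shock_device_may_start(weather: str) -> bool:
    """Allow the electric shock device to start only when it is not raining."""
    return weather.lower() not in ("rain", "rainy")

def execute_electric_shock(weather: str) -> str:
    """Return which action the wearable takes for the shock command."""
    return "shock_started" if shock_device_may_start(weather) else "shock_skipped"
```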
And executing a preset command by combining a preset rule, wherein the preset command comprises the following steps:
a) Judge whether the pet ID in the second identity information is on the blacklist of the first pet wearable device; if so, play the 'escape' voice message, connect to the owner client, and request and execute an owner remote command.
In this embodiment, the device connects to the owner client and requests an owner remote command at the same time as it plays the 'escape' voice message; this prevents accidents in case the pet ignores the escape command and increases the owner's degree of control.
b) When the emotional state of the first pet is anger, the sex in the first identity information is the same as that in the second identity information, the second identity information contains a fighting record, and the second pet's behavioral intention is attack, play the 'escape' voice message.
In this embodiment, when the second pet shows an attacking behavioral intention, the emotional state of the first pet, the sexes in the two sets of identity information, and the fighting record in the second identity information are checked at the same time; together they judge the danger more accurately, at which point it is necessary to issue the 'escape' command.
c) When the emotional state of both pets is joy and the sexes in the first and second identity information differ, play the 'go and play with the other side' voice message.
In this embodiment, combining the emotional states and sexes of the two pets and playing the 'go and play with the other side' voice message under suitable conditions promotes the possibility of pets making friends.
d) When the emotional state of both pets is anger and the behavioral intention of both pets is attack, execute the action-blocking command to limit the first pet's mobility; add the pet ID in the second identity information to the blacklist and store the blacklist in the first pet wearable device.
In this embodiment, by detecting the emotional states and behavioral intentions of both pets, mobility is limited in time when both tend toward attack and anger, avoiding accidents.
e) When the behavioral intention of the first pet is attack and the behavioral intention of the second pet is sadness, the electric shock command is executed.
In this embodiment, the behavioral intentions of the first and second pets are detected, and an electric shock penalty is imposed on the first pet's tendency to attack.
f) When the heart rate of the first pet exceeds a first threshold, or its respiratory rate exceeds a second threshold, the vibration frequency that makes the first pet feel comfortable is obtained from the first identity information and the vibration command is executed at that frequency, stopping once the heart rate falls back below the first threshold or the respiratory rate falls back below the second threshold.
In this embodiment, when the first pet's physiological data are abnormal, for example the heart rate exceeds a first threshold of 150 beats/min or the respiratory rate exceeds a second threshold of 15 breaths/min, the vibration device is adjusted to the frequency that makes the first pet feel comfortable and vibrates, soothing the pet until its normal physiological state is recovered.
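Rule f) above amounts to a small control loop. The sketch below illustrates it; the sensor callbacks, the vibrator object, and the polling style are hypothetical placeholders for the wearable's real firmware interfaces, while the two thresholds come from the text.

```python
# Sketch of rule f): vibrate at the pet's comfortable frequency until
# its heart rate and respiratory rate return below the thresholds.
# The sensor callbacks and vibrator object are hypothetical placeholders.

HEART_RATE_THRESHOLD = 150  # first threshold, beats/min (from the text)
RESP_RATE_THRESHOLD = 15    # second threshold, breaths/min (from the text)

def soothe_if_stressed(read_heart_rate, read_resp_rate,
                       comfortable_freq, vibrator):
    """Run the vibration command while either vital sign is abnormal."""
    vibrating = False
    while True:
        hr, rr = read_heart_rate(), read_resp_rate()
        stressed = hr > HEART_RATE_THRESHOLD or rr > RESP_RATE_THRESHOLD
        if stressed and not vibrating:
            # frequency comes from the first identity information
            vibrator.start(comfortable_freq)
            vibrating = True
        elif not stressed:
            if vibrating:
                vibrator.stop()
            return hr, rr  # vitals have recovered; vibration command stops
```

A real device would poll the sensors at a fixed interval rather than in a tight loop; the structure of the rule is otherwise the same.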
g) Whether the pet in the second identity information carries a virus is judged; if so, the "escape" voice information is played and, at the same time, the owner client is connected so that an owner remote command can be requested and executed.
In this embodiment, when the pet in the second identity information is determined to carry a virus, the first pet wearable device plays the "escape" voice information and promptly connects to its owner client to request and execute an owner remote command. This reduces the probability that the first pet is infected by the virus carried by the second pet, lets the owner monitor the first pet's contact with a virus source, and provides a basis for the owner to diagnose and treat any infection afterwards.
h) When the behavioral intention of the first pet is warning, a sound-scare command is executed: the sound that frightens the other pet is obtained and played.
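Taken together, rules a) through h) form a simple decision table over the command judgment factor data. A minimal sketch of such a dispatcher follows; the dict layout, field names, and command names are assumptions paraphrased from the text, and rule f) (physiological soothing) runs as a separate monitoring loop, so it is omitted here.

```python
# Hypothetical sketch of the preset-rule dispatcher for rules a)-h).
# `e` is an assumed dict holding the command-judgment factor data.

def preset_commands(e):
    cmds = []
    # a) other pet is blacklisted
    if e["second_id"]["pet_id"] in e["blacklist"]:
        cmds += ["play_escape", "connect_owner_client"]
    # b) first pet angry, same sex, other pet has fight record and attack intent
    if (e["first_emotion"] == "anger"
            and e["first_id"]["sex"] == e["second_id"]["sex"]
            and e["second_id"]["fight_record"]
            and e["second_intent"] == "attack"):
        cmds.append("play_escape")
    # c) both joyful and of different sexes
    if (e["first_emotion"] == "joy" and e["second_emotion"] == "joy"
            and e["first_id"]["sex"] != e["second_id"]["sex"]):
        cmds.append("play_go_play")
    # d) both angry with attack intent
    if (e["first_emotion"] == "anger" and e["second_emotion"] == "anger"
            and e["first_intent"] == "attack"
            and e["second_intent"] == "attack"):
        cmds += ["block_action", "blacklist_other"]
    # e) first pet attacks a sad pet
    if e["first_intent"] == "attack" and e["second_intent"] == "sadness":
        cmds.append("electric_shock")
    # g) other pet carries a virus
    if e["second_id"]["carries_virus"]:
        cmds += ["play_escape", "connect_owner_client"]
    # h) first pet is warning
    if e["first_intent"] == "warning":
        cmds.append("play_scare_sound")
    return cmds
```

Several rules can fire on the same encounter; the device would then execute each resulting command in order.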
Executing the preset command by combining the preset rule, and further comprising:
inputting the command judgment factor data into a pet command machine learning model, determining the output command corresponding to the command judgment factor data according to the model, and executing the output command.
The pet command machine learning model is trained in advance based on a command judgment factor data sample set and a corresponding simulation command sample set.
The training algorithm of the pet command machine learning model can be an SVM, deep learning, naive Bayes, or the like; the training process is well known to those skilled in the art and is not described here again.
The commands output by the pet command machine learning model include: vibrating; obtaining the sound the other pet fears; playing the "escape" voice information; playing the "stand" voice information; playing the low-growl voice information; blocking action; electric shock; playing the "go play with the other pet" voice information; and connecting to the owner client to request and execute an owner remote command.
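As a concrete illustration of the train/predict cycle, the text names SVM, deep learning, and naive Bayes as candidate training algorithms; in the sketch below a 1-nearest-neighbour classifier stands in for them so the example needs no external libraries, and the feature encoding is an assumed simplification of the command judgment factor data.

```python
# Minimal sketch of the pet command model's train/predict cycle.
# A 1-NN classifier is a stand-in for the SVM / deep learning / naive
# Bayes algorithms the text names; the feature encoding is assumed.
import math

EMOTIONS = {"anger": 0, "joy": 1, "calm": 2, "fear": 3}
INTENTS = {"warning": 0, "attack": 1, "sadness": 2}

def encode(first_emotion, first_intent, second_emotion, second_intent, heart_rate):
    """Turn one set of command judgment factors into a feature vector."""
    return [EMOTIONS[first_emotion], INTENTS[first_intent],
            EMOTIONS[second_emotion], INTENTS[second_intent],
            heart_rate / 200.0]  # scale heart rate near the other features

class PetCommandModel:
    """Stores (factor vector, simulation command) pairs; predicts by 1-NN."""
    def fit(self, samples, commands):
        self.samples, self.commands = samples, commands
        return self

    def predict(self, x):
        dists = [math.dist(x, s) for s in self.samples]
        return self.commands[dists.index(min(dists))]

# Hypothetical training pairs (real ones come from the owner simulation
# commands recorded at the client, per step S601).
X = [encode("anger", "attack", "anger", "attack", 150),
     encode("joy", "warning", "joy", "warning", 90),
     encode("anger", "attack", "fear", "sadness", 140)]
y = ["block_action", "play_go_play", "electric_shock"]

model = PetCommandModel().fit(X, y)
command = model.predict(encode("joy", "warning", "joy", "warning", 95))
```

With this tiny sample set the query above lands nearest the joyful/warning training pair, so the predicted command is "play_go_play"; a production model would be trained on the full owner-labelled sample set instead.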
Step S501: detecting whether the distance between the first pet wearable device and the second pet wearable device exceeds the preset distance, and if so, finishing video recording and sound recording of the first pet wearable device and the second pet wearable device.
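Step S501 can be sketched as a single distance check; the preset distance value and the device interface below are hypothetical placeholders.

```python
# Sketch of step S501: finish recording on both wearables once the
# distance between them exceeds the preset distance. The threshold
# value and device interface are hypothetical placeholders.

PRESET_DISTANCE_M = 10.0  # assumed preset distance in metres

def maybe_finish_recording(distance_m, first_device, second_device):
    """Stop video and sound recording when the pets move out of range."""
    if distance_m > PRESET_DISTANCE_M:
        first_device.finish_recording()
        second_device.finish_recording()
        return True   # both devices have finished recording
    return False      # still in range; keep recording
```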
Step S601: obtaining pet command machine learning model training data and retraining the pet command machine learning model:
the pet command machine learning model training data acquisition process is as follows:
acquiring a plurality of training data, wherein each training data at least comprises a command judgment factor data sample set and a simulation command sample set corresponding to each command judgment factor data sample set.
When the plurality of training data are acquired, a command judgment factor data sample set is collected first: at each time point, the first identity information, the second identity information, the emotional state, behavioral intention and physiological data of the first pet, and the emotional state, behavioral intention and physiological data of the second pet are taken as command judgment factor data samples. The simulation command corresponding to each command judgment factor data sample is then acquired, yielding the simulation command sample set corresponding to the command judgment factor data sample set. Specifically, the video data and audio data of the first pet wearable device obtained in step S201 are played at the first owner client, and the command judgment factor data are displayed. The simulation command operations of the first owner client are received at the same time, so each command judgment factor data sample is paired with its corresponding simulation command, giving the simulation command sample set. The simulation commands include vibrating, obtaining the sound the other pet fears, playing the "escape" voice information, playing the "stand" voice information, playing the low-growl voice information, blocking action, electric shock, playing the "go play with the other pet" voice information, and connecting to the owner client to request and execute an owner remote command.
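The pairing of factor samples with owner simulation commands described above can be sketched as a small recorder object; the class, method, and field names are illustrative, not from the patent.

```python
# Sketch of the training-data acquisition of step S601: each time the
# owner issues a simulation command at the client while reviewing the
# displayed factors, one (factors, command) training pair is stored.

class TrainingRecorder:
    def __init__(self):
        self.factor_samples = []   # command judgment factor data sample set
        self.command_samples = []  # corresponding simulation command sample set

    def on_owner_simulation_command(self, factors, command):
        """Called when the first owner client sends a simulation command."""
        self.factor_samples.append(factors)
        self.command_samples.append(command)

    def dataset(self):
        """Return paired samples for (re)training the command model."""
        return list(zip(self.factor_samples, self.command_samples))
```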
The first owner client is the client of the owner of the first pet wearable device. It is mainly used to receive and present the data sent by the first pet wearable device, send remote commands to the first pet wearable device, and receive the simulation command operations of the owner, thereby obtaining machine learning model training data.
The first owner client is further configured to edit the blacklist stored on the first pet wearable device, specifically: adding a pet ID to the blacklist stored on the first pet wearable device; and deleting a pet ID from the blacklist stored on the first pet wearable device.
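The two blacklist-editing operations above are simple set updates; the sketch below illustrates them with illustrative function names (the real client/device protocol is not specified in the text).

```python
# Sketch of the owner client's blacklist editing operations on the
# first pet wearable device; function names are illustrative.

def add_to_blacklist(blacklist, pet_id):
    """Add a pet ID to the blacklist stored on the wearable device."""
    blacklist.add(pet_id)
    return blacklist

def remove_from_blacklist(blacklist, pet_id):
    """Delete a pet ID from the blacklist; absent IDs are ignored."""
    blacklist.discard(pet_id)
    return blacklist
```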
The present disclosure also provides a pet wearable device, comprising: a wireless communication device, mainly used to establish communication connections with other electronic equipment and to receive or transmit data; a processor, mainly used to perform image recognition, voice emotion recognition, voice behavioral-intention recognition, and the logic of issuing commands to pets in combination with the preset rules; a recording device with a built-in recorder, mainly used to record audio information; a video device with a built-in camera, mainly used to record video information; an audio playing device with a built-in loudspeaker, mainly used to play audio information; a vibration device with a built-in vibrator whose vibration frequency can be adjusted by parameter, mainly used to execute the vibration command; an electric shock device with a built-in shocker, mainly used to execute the electric shock command; an action-blocking device with a built-in inflatable knee-joint sheath, mainly used to limit the pet dog's ability to act; and a parameter acquisition device with built-in motion and pressure sensors, mainly used to detect the pet's physiological data, such as heart rate, blood pressure, body temperature, and respiratory rate.

Claims (7)

1. A pet fighting prevention method based on machine learning is characterized by comprising the following steps:
the method comprises the steps that a first pet wearable device detects whether a second pet wearable device exists within a preset distance;
the first pet wearable device and the second pet wearable device establish communication connection;
the first pet wearable device and the second pet wearable device start video recording and sound recording;
acquiring the emotional state, behavior intention and physiological data of a first pet and the emotional state, behavior intention and physiological data of a second pet at preset intervals;
the first wearable pet device sends first identity information to the second wearable pet device, and the second wearable pet device sends second identity information to the first wearable pet device;
the first identity information is data stored by the first pet wearable device, and comprises: pet ID, bacterial content, whether the pet carries a virus, weight, sex, whether there is a fighting record, age, character, recent exercise amount, recent sleep condition, biting force, intelligence level, pet photo, the sound the pet fears, and the vibration frequency that makes the pet feel comfortable;
the second identity information is data stored by the second pet wearable device, and comprises: pet ID, bacterial content, whether the pet carries a virus, weight, sex, whether there is a fighting record, age, character, recent exercise amount, recent sleep condition, biting force, intelligence level, pet photo, the sound the pet fears, and the vibration frequency that makes the pet feel comfortable;
acquiring heart rate, blood pressure, body temperature, respiratory rate and sound information of the first pet through the first pet wearable device;
acquiring heart rate, blood pressure, body temperature, respiratory rate and sound information of the second pet through the second pet wearable device;
the emotional state includes: anger, joy, calmness, fear;
the behavioral intentions include: warning, attack, sadness;
taking the first identity information, the second identity information, the emotional state, the behavioral intention and the physiological data of the first pet, and the emotional state, the behavioral intention and the physiological data of the second pet as command judgment factor data, and executing a preset command in combination with a preset rule, wherein the executing of the preset command in combination with the preset rule comprises:
a) judging whether the pet ID in the second identity information exists in a blacklist of the first pet wearable device, if so, playing escape voice information, connecting with an owner client, and requesting and executing an owner remote command;
b) when the emotional state of the first pet is anger, the gender in the first identity information is the same as that in the second identity information, a fighting record exists in the second identity information, and the second pet has an attacking behavioral intention, playing the voice information of 'escape';
c) when the emotional state of the first pet is joy, the emotional state of the second pet is joy, and the gender in the first identity information differs from that in the second identity information, playing the voice information of "go play with the other pet";
d) when the emotional state of the first pet is anger, the emotional state of the second pet is anger, the first pet has an attacking behavioral intention, and the second pet has an attacking behavioral intention, executing an action-blocking command, limiting the first pet's ability to act, adding the pet ID in the second identity information to a blacklist, and storing the blacklist in the first pet wearable device;
e) when the behavior intention of the first pet is attack and the behavior intention of the second pet is sadness, executing a shock command;
f) when the heart rate of the first pet exceeds a first threshold or the respiratory frequency of the first pet exceeds a second threshold, obtaining vibration frequency which makes the first pet feel comfortable through the first identity information, executing a vibration command according to the vibration frequency which makes the first pet feel comfortable, and stopping the vibration command until the heart rate of the first pet is recovered to be lower than the first threshold or the respiratory frequency of the first pet is recovered to be lower than the second threshold;
g) judging whether the pet in the second identity information carries virus information or not, if so, playing escaping voice information, and simultaneously connecting with an owner client to request and execute an owner remote command;
h) when the behavioral intention of the first pet is warning, executing a sound-scare command that obtains and plays the sound the other pet fears;
the first pet wearable device and the second pet wearable device finish video recording and sound recording.
2. The pet fighting prevention method based on machine learning according to claim 1, characterized in that: the first pet wearable device detecting whether a second pet wearable device exists within the preset distance comprises:
within a preset distance, if the first pet wearable device can receive the identity information in the preset format returned by the second pet wearable device, judging that the second pet wearable device exists within the preset distance, and if the first pet wearable device cannot receive the identity information in the preset format returned by the second pet wearable device, judging that the second pet wearable device does not exist within the preset distance;
the identity information in the preset format at least comprises: a pet ID capable of identifying the identity of the pet.
3. The pet fighting prevention method based on machine learning according to claim 1, characterized in that: the first pet wearable device and the second pet wearable device establishing a communication connection comprises:
the first pet wearable device detects, from the second pet image in combination with image recognition technology, whether the pet in the second pet image and the pet in the pet photo of the second identity information are the same pet; if not, it sets the pet ID, bacterial content, whether the pet carries a virus, weight, sex, whether there is a fighting record, age, character, recent exercise amount, recent sleep condition, biting force, intelligence level, pet photo, sound the pet fears, and vibration frequency that makes the pet feel comfortable in the second identity information to unknown;
the second pet wearable device detects, from the first pet image in combination with image recognition technology, whether the pet in the first pet image and the pet in the pet photo of the first identity information are the same pet; if not, it sets the pet ID, bacterial content, whether the pet carries a virus, weight, sex, whether there is a fighting record, age, character, recent exercise amount, recent sleep condition, biting force, intelligence level, pet photo, sound the pet fears, and vibration frequency that makes the pet feel comfortable in the first identity information to unknown.
4. The pet fighting prevention method based on machine learning according to claim 1, characterized in that: acquiring, at preset intervals, the emotional state, behavioral intention and physiological data of the first pet and the emotional state, behavioral intention and physiological data of the second pet comprises:
obtaining the emotional state of the first pet from the sound information of the first pet in combination with voice emotion recognition technology;
obtaining the behavioral intention of the first pet from the sound information of the first pet in combination with voice behavioral-intention recognition technology;
obtaining the emotional state of the second pet from the sound information of the second pet in combination with voice emotion recognition technology;
and obtaining the behavioral intention of the second pet from the sound information of the second pet in combination with voice behavioral-intention recognition technology.
5. The pet fighting prevention method based on machine learning according to claim 1, characterized in that: the executing of the preset command in combination with the preset rule further comprises:
acquiring a command judgment factor data sample set and a corresponding simulation command sample set; training a pet command machine learning model based on the command judgment factor data sample set and the corresponding simulation command sample set; and inputting the command judgment factor data into a pet command machine learning model, determining an output command corresponding to the command judgment factor data, and executing the pet command machine learning model output command by the first pet wearable device.
6. The pet fighting prevention method based on machine learning according to claim 5, characterized in that: acquiring the command judgment factor data sample set and the corresponding simulation command sample set comprises:
the first owner client plays the video data and the audio data of the first pet wearable device; displays, for each time point, the first identity information, the second identity information, the emotional state, behavioral intention and physiological data of the first pet, and the emotional state, behavioral intention and physiological data of the second pet; and receives the simulation command operations of the owner of the first pet wearable device, thereby obtaining the command judgment factor data sample set and the corresponding simulation command sample set.
7. The pet fighting prevention method based on machine learning according to claim 1, characterized in that: the first pet wearable device and the second pet wearable device finishing video recording and sound recording comprises:
detecting whether the distance between the first pet wearable device and the second pet wearable device exceeds the preset distance, and if so, finishing video recording and sound recording of the first pet wearable device and the second pet wearable device.
CN201910820242.8A 2019-09-01 2019-09-01 Method for preventing pet fighting based on machine learning Expired - Fee Related CN110506661B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910820242.8A CN110506661B (en) 2019-09-01 2019-09-01 Method for preventing pet fighting based on machine learning

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910820242.8A CN110506661B (en) 2019-09-01 2019-09-01 Method for preventing pet fighting based on machine learning

Publications (2)

Publication Number Publication Date
CN110506661A CN110506661A (en) 2019-11-29
CN110506661B true CN110506661B (en) 2021-12-21

Family

ID=68630049

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910820242.8A Expired - Fee Related CN110506661B (en) 2019-09-01 2019-09-01 Method for preventing pet fighting based on machine learning

Country Status (1)

Country Link
CN (1) CN110506661B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112450116A (en) * 2020-12-14 2021-03-09 深圳科帮电子有限公司 Pet management method, device, system, equipment and storage medium
CN116391630A (en) * 2023-04-24 2023-07-07 重庆长安汽车股份有限公司 In-vehicle pet management method, system, electronic equipment and storage medium

Citations (8)

Publication number Priority date Publication date Assignee Title
WO2004042670A3 (en) * 2002-11-08 2004-06-24 Eyal Zehavi Canine security system
CN105123551A (en) * 2015-08-28 2015-12-09 广州中胜物联网络科技有限公司 Intelligent pet wearable device and control method thereof
CN106534477A (en) * 2016-08-30 2017-03-22 深圳市沃特沃德股份有限公司 Method, device and system for managing living habits of pet
CN107926747A (en) * 2018-01-02 2018-04-20 合肥淘云科技有限公司 A kind of Combined pet instructs and guides system
CN109064012A (en) * 2018-07-30 2018-12-21 合肥东恒锐电子科技有限公司 A kind of pet continues tracing management monitoring method and system
CN109360115A (en) * 2018-11-07 2019-02-19 中山乐心电子有限公司 Pet friend-making control method, device and wearable device
CN109757395A (en) * 2018-09-24 2019-05-17 天津大学 A kind of pet behavioral value monitoring system and method
CN110063267A (en) * 2019-06-11 2019-07-30 深圳派特科技有限公司 A kind of system and method for automatic detection and correction animal behavior

Family Cites Families (4)

Publication number Priority date Publication date Assignee Title
WO2017193256A1 (en) * 2016-05-09 2017-11-16 深圳市欸阿技术有限公司 Pet wearable device and pet monitoring method therefor
US10375930B1 (en) * 2017-07-07 2019-08-13 Chad R. James Animal training device that controls stimulus using proportional pressure-based input
KR102154081B1 (en) * 2017-12-29 2020-09-10 (주)씽크웨이브 Companion dog management apparatus
CN110122364A (en) * 2019-05-13 2019-08-16 安徽三品技术服务有限公司 Multifunctional pet clothes based on smart machine interconnection

Patent Citations (8)

Publication number Priority date Publication date Assignee Title
WO2004042670A3 (en) * 2002-11-08 2004-06-24 Eyal Zehavi Canine security system
CN105123551A (en) * 2015-08-28 2015-12-09 广州中胜物联网络科技有限公司 Intelligent pet wearable device and control method thereof
CN106534477A (en) * 2016-08-30 2017-03-22 深圳市沃特沃德股份有限公司 Method, device and system for managing living habits of pet
CN107926747A (en) * 2018-01-02 2018-04-20 合肥淘云科技有限公司 A kind of Combined pet instructs and guides system
CN109064012A (en) * 2018-07-30 2018-12-21 合肥东恒锐电子科技有限公司 A kind of pet continues tracing management monitoring method and system
CN109757395A (en) * 2018-09-24 2019-05-17 天津大学 A kind of pet behavioral value monitoring system and method
CN109360115A (en) * 2018-11-07 2019-02-19 中山乐心电子有限公司 Pet friend-making control method, device and wearable device
CN110063267A (en) * 2019-06-11 2019-07-30 深圳派特科技有限公司 A kind of system and method for automatic detection and correction animal behavior

Also Published As

Publication number Publication date
CN110506661A (en) 2019-11-29

Similar Documents

Publication Publication Date Title
JP6402320B2 (en) An autonomous behavioral robot
JP6495486B2 (en) Autonomous behavior robot and computer program
US20180177451A1 (en) Systems, environment and methods for identification and analysis of recurring transitory physiological states and events using a portable data collection device
RU2732433C2 (en) Device and method for treating gait disorder in a subject
US20070167689A1 (en) Method and system for enhancing a user experience using a user's physiological state
CN110506661B (en) Method for preventing pet fighting based on machine learning
CN108724205B (en) Interaction device, interaction method, interaction program, and robot
CN101198277A (en) Methods and systems for physiological and psycho-physiological monitoring and uses thereof
JPWO2016021236A1 (en) Information processing system, information processing apparatus, information processing program, and information processing method
CN111975772B (en) Robot control method, device, electronic device and storage medium
US11869666B2 (en) Computer system for crisis state detection and intervention
JP2014528806A (en) Emotion control method and apparatus
JP7107302B2 (en) Information processing device, information processing method, and program
JP2016062277A (en) Virtual reality presentation system, virtual reality presentation device, and virtual reality presentation method
CN112236203A (en) Allocating contextual gameplay assistance to player responses
JP2016032501A (en) Nursing and care support device, and nursing and care support method
KR20060133607A (en) Mobile communication terminal for self-checking user's health, system and method for offering multimedia contents using mobile communication terminal for self-checking user's health
CN110598612B (en) Patient nursing method based on mobile terminal, mobile terminal and readable storage medium
CN112188296A (en) Interaction method, device, terminal and television
WO2020116233A1 (en) Information processing device, information processing method, and program
CN108724206B (en) Interaction device, interaction method, interaction program, and robot
CN113934295A (en) Pet robot growing method, system and storage medium
CN110587621A (en) Robot, robot-based patient care method and readable storage medium
US20160293043A1 (en) Device, system and method for providing feedback to a user relating to a behavior of the user
JP2005074107A (en) Life management system

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20211112

Address after: 453000 No. 51, South District, Taoyuan Village, Doumen Township, Yuanyang County, Xinxiang City, Henan Province

Applicant after: Wu Shukuan

Address before: 512026 room f101-12, incubation production building 1, guanshao innovation and innovation (equipment) center, Huake City, No. 42, Baiwang Avenue, Wujiang District, Shaoguan City, Guangdong Province

Applicant before: Shaoguan Qizhi Information Technology Co.,Ltd.

TA01 Transfer of patent application right
GR01 Patent grant
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20211221

CF01 Termination of patent right due to non-payment of annual fee