CN107006389B - Terminal and pet action signal identification method and device - Google Patents


Info

Publication number
CN107006389B
CN107006389B CN201610977396.4A CN201610977396A CN107006389B
Authority
CN
China
Prior art keywords
information
pet
preset
action
identification information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201610977396.4A
Other languages
Chinese (zh)
Other versions
CN107006389A (en)
Inventor
王忠山
周毕兴
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Waterward Information Co Ltd
Original Assignee
Shenzhen Water World Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Water World Co Ltd
Priority to CN201610977396.4A
Priority to PCT/CN2017/074456
Publication of CN107006389A
Application granted
Publication of CN107006389B
Legal status: Active

Classifications

    • A — HUMAN NECESSITIES
    • A01 — AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01K — ANIMAL HUSBANDRY; CARE OF BIRDS, FISHES, INSECTS; FISHING; REARING OR BREEDING ANIMALS, NOT OTHERWISE PROVIDED FOR; NEW BREEDS OF ANIMALS
    • A01K29/00Other apparatus for animal husbandry
    • A01K29/005Monitoring or measuring activity, e.g. detecting heat or mating

Abstract

The invention discloses a terminal and a pet action signal identification method and device, wherein the method comprises the following steps: acquiring an action signal of the pet; searching a preset list for preset action information matching the action signal; and, if matching preset action information is found, calling the identification information corresponding to that preset action information in the preset list and delivering it to the user. The terminal, method, and device of the invention store the pet's actions and the information each action represents in a list in advance, forming preset action information and identification information. When an acquired pet action matches preset action information in the list, the corresponding identification information is called and sent to the user, so that the user can accurately understand the information the pet is conveying and communicate with and care for the pet better.

Description

Terminal and pet action signal identification method and device
Technical Field
The invention relates to the field of pet language identification, in particular to a terminal and a pet action signal identification method and device.
Background
Pets are companions to humans. After living with a pet for a long time, an owner can judge the information the pet wants to express from its behavior. However, people who have only recently encountered a pet cannot accurately recognize its emotions, which hinders communication between pet and owner. How to help people unfamiliar with pets accurately identify the information pets convey is therefore a problem that urgently needs to be solved.
Disclosure of Invention
The invention mainly aims to provide a terminal that helps a user accurately identify the information conveyed by a pet, together with a pet action signal identification method and device.
In order to achieve the above object, the present invention provides a pet action signal identification method, comprising:
acquiring action signals of the pet;
searching preset action information matched with the action signal in a preset list;
and if the matched preset action information is found, calling identification information corresponding to the preset action information in the preset list to transmit to the user.
Further, before the step of searching the preset action information matched with the action signal in the preset list, the method includes:
receiving, in batches, preset action information common to pets of the same species and the identification information corresponding to that preset action information, and adding them to the preset list.
Further, before the step of searching the preset action information matched with the action signal in the preset list, the method includes:
and receiving identification information input by a user, associating the identification information with the acquired preset action information, and adding them to the preset list.
Further, the invoking identification information corresponding to the preset action information in the preset list to be transmitted to the user includes:
calling the identification information corresponding to the preset action information in the preset list, converting it into user-readable information, and displaying it; or,
and calling identification information corresponding to the preset action information in the preset list, converting the identification information into voice information and playing the voice information.
Further, after the step of calling the identification information corresponding to the preset action information in the preset list to transmit to the user if the matched preset action information is found, the method includes:
and if the identification information is preset danger information, sending corresponding alarm information.
The embodiment of the invention also provides a pet action signal recognition device, which comprises:
the acquisition unit is used for acquiring action signals of the pet;
the searching unit is used for searching preset action information matched with the action signal in a preset list;
and the calling and transmitting unit is used for calling the identification information corresponding to the preset action information in the preset list to transmit to the user if the matched preset action information is found by the searching unit.
Further, the pet action signal recognition device further comprises:
the batch import unit is used for receiving, in batches, preset action information common to pets of the same species and the identification information corresponding to that preset action information, and adding them to the preset list.
Further, the pet action signal recognition device further comprises:
and the input unit is used for receiving the identification information input by the user, associating the identification information with the acquired preset action information and adding the identification information to the preset list.
Further, the pet action signal recognition device further comprises:
the display module is used for calling the identification information corresponding to the preset action information in the preset list, converting it into user-readable information, and displaying it; or,
and the playing module is used for calling the identification information corresponding to the preset action information in the preset list, converting the identification information into voice information and playing the voice information.
The invention also provides a terminal, which comprises a processor and a memory;
the memory is used for storing a program that executes the pet action signal identification method by means of any of the above pet action signal identification devices;
the processor is configured to execute programs stored in the memory.
The terminal and the pet action signal identification method and device of the invention preset the action of the pet and the information represented by the action into the list to form the preset action information and the identification information, when the obtained action of the pet is the same as the preset action information in the list, the identification information corresponding to the preset action information is called and then sent to the user, so that the user can accurately know the information transmitted by the pet, and better communicate with the pet to treat the pet.
Drawings
FIG. 1 is a flow chart illustrating a pet action signal recognition method according to an embodiment of the present invention;
FIG. 2 is a block diagram of a pet action signal recognition device according to an embodiment of the present invention;
FIG. 3 is a block diagram of a pet action signal recognition device according to an embodiment of the present invention;
FIG. 4 is a block diagram illustrating a call forwarding unit according to an embodiment of the present invention;
FIG. 5 is a block diagram illustrating a schematic structure of an obtaining unit according to an embodiment of the present invention;
fig. 6 is a block diagram illustrating a structure of a terminal according to an embodiment of the present invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are illustrative only and should not be construed as limiting the invention.
As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It will be understood by those skilled in the art that, unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the prior art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
As will be appreciated by those skilled in the art, a "terminal" as used herein includes devices that are only wireless signal receivers, devices that have receive hardware but no transmit capability, and devices with both receive and transmit hardware capable of two-way communication over a two-way communication link. Such a device may include: a cellular or other communication device with or without a multi-line display; a PCS (Personal Communications Service) device, which may combine voice, data processing, facsimile and/or data communication capabilities; a PDA (Personal Digital Assistant), which may include a radio frequency receiver, a pager, internet/intranet access, a web browser, a notepad, a calendar and/or a GPS (Global Positioning System) receiver; and a conventional laptop and/or palmtop computer or other device having and/or including a radio frequency receiver. As used herein, a "terminal" or "terminal device" may be portable, transportable, installed in a vehicle (aeronautical, maritime and/or land-based), or situated and/or configured to operate locally and/or in a distributed fashion at any other location on earth and/or in space. The "terminal" or "terminal device" may also be a communication terminal, a web terminal, or a music/video playing terminal, such as a PDA, an MID (Mobile Internet Device) and/or a mobile phone with music/video playing functions, or a smart TV, a pet smart wearable device, a set-top box, and the like.
Referring to fig. 1, an embodiment of the present invention provides a pet action signal identification method, including:
s1, acquiring action signals of the pet;
s2, searching preset action information matched with the action signal in a preset list;
and S3, if the matched preset action information is found, calling the identification information corresponding to the preset action information in the preset list to transmit to the user.
As described in the above steps S1 to S3, the action signals of the pet include its body movements, motion state, travel route, and so on, and these signals reflect, to a certain extent, the information the pet wants to convey to the outside. For example, when the pet is a dog, changes of mood can be read from its eyes: when angry, the pupils dilate and the gaze turns fierce; when sad or lonely, the eyes are moist; when happy, the eyes are bright; when the dog is confident or seeks trust, its gaze stays fixed; when it is stressed or has done something wrong, it averts its gaze slightly; when it is distrustful, its eyes flicker. A dog's ears also express emotion: ears held stiffly back signal an intent to attack the other party; relaxed ears indicate contentment. The tail expresses a dog's emotion most accurately: brisk wagging indicates pleasure, a lowered tail signals danger, and a tucked tail indicates fear. Dogs show anger through whole-body tension, with a fierce stare, bared teeth, growls from the throat, and raised hackles; the tail is held straight and stiff at a distance from the object of the anger, and if the tail then drops while the back arches, an attack may follow. Dogs show grief through silence, drooping the head, listlessly ignoring the owner, or hiding in a corner and lying still. Dogs express pleasure by jumping. Dogs can also "smile": the mouth stretches slightly, the teeth show, the nose wrinkles, the eyes soften, the ears droop, the dog hums, and its body and tail wag continuously.
Dogs express fear by trembling: when frightened, the whole body stiffens and stays on guard, the body shakes continuously, and the tail droops or is clamped between the legs. By acquiring a dog's action signals, the information the dog conveys to the outside can be judged with reasonable accuracy. The preset list is an electronic list stored on a designated storage medium; the preset action information in it is compiled in advance from the pet's habitual actions, and the identification information corresponding to each habitual action is stored in the list as a mapping. After a pet's action signal is acquired, it is compared one by one with the preset action information in the preset list. Once matching preset action information is found (the action signal is essentially the same as the preset action information, or within a threshold range), the information conveyed by the pet's current action is taken to be the identification information corresponding to that preset action information in the preset list; the identification information is then extracted and delivered in a preset manner to a user such as the pet's owner. The user can thus accurately learn what the pet is conveying in its current state and know the pet's condition in time, improving communication between user and pet.
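The matching step described above (compare the acquired signal one by one against the preset list, and accept a match when it falls within a threshold range) can be sketched as follows. This is a minimal illustration in Python; the feature vectors, the threshold value, and the preset entries are assumptions for demonstration, not details from the patent.

```python
def match_action(signal, preset_list, threshold=0.1):
    """Return the identification info whose preset action vector lies
    within `threshold` of the observed signal, or None if no entry matches."""
    for preset_vector, identification in preset_list:
        # "basically the same ... or within a threshold range"
        distance = max(abs(a - b) for a, b in zip(signal, preset_vector))
        if distance <= threshold:
            return identification
    return None

# Example preset list: (feature vector, identification info).
PRESET_LIST = [
    ((0.9, 0.1), "happy (tail wagging)"),
    ((0.1, 0.9), "fear (tail tucked)"),
]

print(match_action((0.85, 0.12), PRESET_LIST))  # happy (tail wagging)
```

In a real device the feature vectors would come from image analysis or a wearable motion sensor, and the matched identification info would then be delivered to the user as in step S3.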
In this embodiment, before the step S2 of searching for the preset action information matching the action signal in the preset list, the method includes the steps of:
s201, receiving preset action information with commonality of pets of the same species in batch, and identification information corresponding to the preset action information, and adding the identification information into the preset list.
As mentioned in step S201, pets of the same species include, among canines for example, Pekingese, Tibetan Mastiffs, and shepherd dogs. Pets of the same species generally have the same or similar behavioral habits; for canines, for example, tail wagging expresses happiness. The preset action information corresponding to such common actions, together with the identification information it represents, can therefore be imported into the preset list in batches, making the device convenient to use across pets of the same species. For example, the same preset data can be shared across devices of the same set used for pets of the same species, so that one device may serve a Pekingese and another a Tibetan Mastiff.
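The batch import of step S201 amounts to extending the preset list with a species' shared entries in one call. A minimal Python sketch, in which the entry format and the example actions are illustrative assumptions:

```python
def batch_import(preset_list, common_entries):
    """Append a species' common (action, identification) pairs in one batch."""
    preset_list.extend(common_entries)
    return preset_list

# Common canine actions, shared by e.g. Pekingese, Tibetan Mastiffs, shepherds.
CANINE_COMMON = [
    ("tail_wagging", "happy"),
    ("tail_tucked", "afraid"),
    ("ears_pinned_back", "about to attack"),
]

presets = batch_import([], CANINE_COMMON)
print(len(presets))  # 3
```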
In this embodiment, before the step S2 of searching for the preset action information matching the action signal in the preset list, the method includes the steps of:
s202, receiving identification information input by a user, associating the identification information with the acquired preset action information, and adding the identification information to the preset list.
As described in step S202, different pets have different living habits, and the information expressed by some actions is unique to an individual pet. Through observation, the user can record into the preset list the preset action information corresponding to such a unique action signal, together with the identification information it expresses; the action can then be identified whenever the pet performs it again. For example, if a pet has a unique action for asking to be fed, the preset action information corresponding to that action can be recorded and added to the preset list, with the identification information "needs to eat" entered alongside it. When the pet performs the action, the message "needs to eat" is delivered to the user. Suppose, for instance, that the user is away on a business trip and has entrusted the pet to a friend who cannot read the pet's actions and so cannot feed it in time. The device that acquires the pet's action signals uploads the collected signals to a server; the user connects to the server through a designated terminal such as a mobile phone, obtains the identification information "needs to eat" via the server or the phone, and can then remind the friend by telephone, e-mail, or other instant communication means that the pet needs to be fed.
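Recording a pet-specific action with the meaning the user assigns to it (step S202), then recognizing that action later, might look like the following sketch; the action signatures and helper names are illustrative assumptions:

```python
def add_user_entry(preset_list, action_signature, identification):
    """Record a pet-specific action and the meaning the user assigns to it."""
    preset_list.append((action_signature, identification))

def recognize(preset_list, action_signature):
    """Look up a previously recorded action; None if the action is unknown."""
    for signature, identification in preset_list:
        if signature == action_signature:
            return identification
    return None

presets = []
add_user_entry(presets, "paws_at_food_bowl", "needs to eat")
print(recognize(presets, "paws_at_food_bowl"))  # needs to eat
```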
In this embodiment, the step S3 of calling the identification information corresponding to the preset action information in the preset list to transmit to the user if the matched preset action information is found includes:
s301, calling identification information corresponding to the preset action information in the preset list, and converting the identification information into user readable information and displaying the user readable information.
As described in step S301, the identification information is generally stored as text, so it needs to be converted into a form the user can readily understand. The readable information can be displayed directly by an APP installed on a terminal such as a mobile phone, or sent to the user as an instant message such as an SMS, an e-mail, or a WeChat message.
In another embodiment, the step S3 of invoking the identification information corresponding to the preset action information in the preset list to be delivered to the user if the matching preset action information is found includes:
s302, calling identification information corresponding to the preset action information in the preset list, converting the identification information into voice information and playing the voice information.
As described in step S302, the identification information is converted into audible speech. For example, the pet may wear a smart collar equipped with a memory, a processor, a player, and so on; the collar receives the pet's action signal and calls the corresponding identification information. When the pet performs its "needs to eat" action, the message "needs to eat" is extracted, converted into voice information, and played. If the user is near the pet, the user hears the announcement and can prepare food in time.
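Steps S301 and S302 differ only in the output channel. The following hedged sketch dispatches between the two; the text and voice renderings are stand-ins for a real APP notification and a real text-to-speech player on the collar (both assumptions for illustration):

```python
def deliver(identification, mode="text"):
    """Render identification info for delivery: readable text or speech markup."""
    if mode == "text":
        # A real system would push this to an APP, SMS, e-mail, or WeChat.
        return f"[APP notification] Your pet: {identification}"
    elif mode == "voice":
        # A real collar would hand this string to a TTS engine and play it.
        return f"<speak>{identification}</speak>"
    raise ValueError(f"unknown delivery mode: {mode}")

print(deliver("needs to eat", mode="text"))
print(deliver("needs to eat", mode="voice"))
```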
In this embodiment, the step S1 of acquiring the motion signal of the pet includes the steps of:
s101, shooting an image of the pet through an image shooting device, analyzing the image and acquiring an action signal of the pet.
As described in step S101, the image capturing device may be a camera, a video camera, a monitor, or the like, and the image may be a photograph or a video. The image is analyzed with image analysis techniques to judge the pet's motion state and other information. In a specific embodiment, a plurality of cameras are arranged in the pet's activity area, each shooting images of a designated zone. When the pet enters the first camera's zone, the first camera senses it, its images are analyzed, and the pet's action signal is acquired. When the pet leaves the first zone and enters the second camera's zone, the first camera stops shooting; the second camera senses the pet, begins shooting, and its images are analyzed in real time to acquire the pet's action signal. The cameras are generally connected to the same host, which performs the image analysis; each camera can sense that a pet has entered its zone by means of image analysis, infrared sensing, or similar techniques.
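The camera handoff described above (only the camera whose zone currently contains the pet records and analyzes images) reduces to a zone lookup. A minimal sketch, where the rectangular zone coordinates are illustrative assumptions:

```python
def active_camera(pet_position, camera_zones):
    """Return the id of the camera whose rectangular zone contains the pet;
    all other cameras stay idle. None means the pet is outside every zone."""
    x, y = pet_position
    for cam_id, (x0, y0, x1, y1) in camera_zones.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return cam_id
    return None

# Two adjacent zones covered by two cameras (coordinates are assumptions).
ZONES = {"cam1": (0, 0, 5, 5), "cam2": (5, 0, 10, 5)}
print(active_camera((2, 3), ZONES))  # cam1
print(active_camera((7, 1), ZONES))  # cam2
```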
In another embodiment, the step S1 of obtaining the motion signal of the pet includes the steps of:
s102, collecting motion signals of the pet through a motion sensor arranged on the body of the pet.
As described in step S102, the motion sensor may be a multi-axis acceleration sensor that captures the pet's movements, such as walking and running, and may also include a micro-motion signal sensor, such as a piezoelectric film sensor, that measures heart rate and respiration rate. The motion sensor can be mounted on equipment such as the pet's collar or chest strap and collects the pet's action signals at any time.
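A coarse sketch of classifying motion state from the wearable multi-axis accelerometer of step S102; the thresholds below are illustrative assumptions, not calibrated values:

```python
import math

def motion_state(ax, ay, az):
    """Classify coarse motion from acceleration magnitude (in g, gravity
    removed): resting, walking, or running."""
    magnitude = math.sqrt(ax * ax + ay * ay + az * az)
    if magnitude < 0.1:
        return "resting"
    elif magnitude < 0.8:
        return "walking"
    return "running"

print(motion_state(0.02, 0.01, 0.03))  # resting
print(motion_state(0.3, 0.2, 0.1))     # walking
```

A production device would of course use windowed features and a trained classifier rather than a single-sample magnitude threshold.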
In this embodiment, after the step S3 of calling the identification information corresponding to the preset action information in the preset list to transmit to the user if the matched preset action information is found, the method includes the steps of:
and S303, if the identification information is preset danger information, sending corresponding alarm information.
As described in step S303, a pet generally lives healthily day to day, but accidents are sometimes unavoidable, such as falling from a height, being struck by an object, or being injured. The signals produced by such accidents can also serve as action signals, with identification information denoting danger assigned to them. When the identification information is preset danger information, alarm information can be sent to the user, raising the user's attention so that the pet's condition is learned in time and the pet can be treated.
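The escalation of step S303 is a simple check on the identification information before delivery; the danger set and message formats below are illustrative assumptions:

```python
# Identification info the user has designated as danger info (assumption).
PRESET_DANGER = {"fell from height", "hit by object", "injured"}

def deliver_with_alarm(identification):
    """Deliver identification info; append an alarm when it is danger info."""
    messages = [f"info: {identification}"]
    if identification in PRESET_DANGER:
        messages.append(f"ALARM: {identification} - check your pet now!")
    return messages

print(deliver_with_alarm("happy"))
print(deliver_with_alarm("injured"))
```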
In this embodiment, after the step S3 of calling the identification information corresponding to the preset action information in the preset list to transmit to the user if the matched preset action information is found, the method includes:
s304, entering a related online pet mall, and retrieving the sold commodity corresponding to the identification information according to the content of the identification information.
As described in step S304, the online pet mall is a shopping mall on a network, such as the JD.com mall or the Tmall mall. Using the identification information as a search keyword, the corresponding goods for sale are found in the mall, improving the user experience and saving the user's time. In one embodiment, the identification information corresponding to the pet's action signal is "cold, heating needed"; after logging into the online pet mall, relevant goods such as heaters and pet tents are retrieved automatically.
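The retrieval of step S304 amounts to using the identification information as search keywords against the mall's catalog. The catalog entries and tagging scheme below are illustrative assumptions:

```python
# A toy catalog; real malls would expose a search API instead (assumption).
CATALOG = [
    {"name": "pet heater", "tags": {"cold", "heating"}},
    {"name": "pet tent", "tags": {"cold", "shelter"}},
    {"name": "chew toy", "tags": {"play"}},
]

def retrieve_goods(identification, catalog=CATALOG):
    """Return names of goods whose tags overlap the identification keywords."""
    keywords = set(identification.replace(",", " ").split())
    return [item["name"] for item in catalog if item["tags"] & keywords]

print(retrieve_goods("cold and heating needed"))  # ['pet heater', 'pet tent']
```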
In this embodiment, after the step S3 of calling the identification information corresponding to the preset action information in the preset list to transmit to the user if the matched preset action information is found, the method includes the steps of:
s305, sending a feedback action corresponding to the action signal to the pet.
As described in step S305, the feedback action is an action directed at the pet, such as calling out to it or soothing it through a sensor it wears. In a specific embodiment, when the identification information is "the pet is panicking", a recording of the owner's voice can be played to soothe the pet, or the pet can be calmed by regular vibration of a vibration sensor on its body.
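The feedback of step S305 can be sketched as a mapping from identification information to a soothing action; the mapping and the action names are illustrative assumptions:

```python
# Identification info -> feedback action on the wearable (assumptions).
FEEDBACK = {
    "panicking": "play_owner_voice",
    "lonely": "vibrate_collar_gently",
}

def feedback_action(identification):
    """Return the feedback action for the pet, or 'none' if not configured."""
    return FEEDBACK.get(identification, "none")

print(feedback_action("panicking"))  # play_owner_voice
```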
In this embodiment, before the step S2 of searching for the preset action information matching the action signal in the preset list, the method includes:
s203, obtaining the pet type, and calling the preset list corresponding to the pet type.
As described in step S203, pets belong to different species, so a plurality of preset lists can be stored, for example a canine preset list and a feline preset list; the preset list corresponding to the pet's species is then selected so that the information conveyed by the pet is identified accurately. In a specific embodiment, the pet's species is obtained by taking a picture of the pet, automatically analyzing the picture for species information and the like, and selecting the most suitable preset list.
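Keeping one preset list per species and selecting it once the species is identified (step S203, e.g. from a photo) might look like this sketch, with illustrative example lists:

```python
# One preset list per species (entries are illustrative assumptions).
PRESET_LISTS = {
    "canine": [("tail_wagging", "happy"), ("tail_tucked", "afraid")],
    "feline": [("purring", "content"), ("ears_flat", "angry")],
}

def select_preset_list(species):
    """Return the preset list for an identified species; empty if unsupported."""
    return PRESET_LISTS.get(species, [])

print(len(select_preset_list("canine")))  # 2
```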
In one embodiment, the pet action signal identification process comprises:
installing a pet action signal recognition APP on a smart phone, wherein preset action information of multiple species and corresponding recognition information are imported into the APP in advance;
when the pet action signal recognition APP is used for the first time, a picture of a pet at a specified angle is shot through the shooting function of the smart phone, then information such as species of the pet is automatically recognized, for example, the pet is recognized as a canine, then a Beijing barnyard dog and the like in the canine are specifically recognized, and then a preset list corresponding to the Beijing barnyard dog is called.
Then, the pet action signal recognition APP receives action signals collected by a device for collecting pet action signals, such as a motion sensor arranged on a pet body or a camera device arranged in a pet activity area, through a wireless connection device of the smart phone;
then, the pet action signal recognition APP searches the preset list for preset action information matching the action signal; if matching preset action information is found, the corresponding recognition information in the preset list is called and displayed on the phone screen, played directly as voice, or the like;
and finally, entering a related online pet mall, and retrieving the corresponding sold commodity according to the content of the identification information.
In the using process, the identification information input by the user can be received, and the identification information is associated with the acquired action information and added into the preset list.
The pet action signal identification method of this embodiment stores the pet's actions and the information each action represents in a list in advance, forming preset action information and identification information. When an acquired pet action matches preset action information in the list, the corresponding identification information is called and sent to the user, so that the user can accurately understand the information the pet is conveying and communicate with and care for the pet better.
Referring to fig. 2, an embodiment of the present invention further provides a pet motion signal recognition apparatus, including:
the pet monitoring device comprises an acquisition unit 1, a control unit and a control unit, wherein the acquisition unit is used for acquiring action signals of pets;
the searching unit 2 is used for searching preset action information matched with the action signal in a preset list;
and the calling and transmitting unit 3 is used for calling the identification information corresponding to the preset action information in the preset list to transmit to the user if the searching unit 2 searches the matched preset action information.
As implemented by the acquisition unit 1, the searching unit 2, and the calling and transmitting unit 3, the action signals of the pet include its body movements, motion state, travel route, and so on, and these signals reflect, to a certain extent, the information the pet wants to convey to the outside. For example, when the pet is a dog, changes of mood can be read from its eyes: when angry, the pupils dilate and the gaze turns fierce; when sad or lonely, the eyes are moist; when happy, the eyes are bright; when the dog is confident or seeks trust, its gaze stays fixed; when it is stressed or has done something wrong, it averts its gaze slightly; when it is distrustful, its eyes flicker. A dog's ears also express emotion: ears held stiffly back signal an intent to attack the other party; relaxed ears indicate contentment. The tail expresses a dog's emotion most accurately: brisk wagging indicates pleasure, a lowered tail signals danger, and a tucked tail indicates fear. Dogs show anger through whole-body tension, with a fierce stare, bared teeth, growls from the throat, and raised hackles; the tail is held straight and stiff at a distance from the object of the anger, and if the tail then drops while the back arches, an attack may follow. Dogs show grief through silence, drooping the head, listlessly ignoring the owner, or hiding in a corner and lying still. Dogs express pleasure by jumping. Dogs can also "smile": the mouth stretches slightly, the teeth show, the nose wrinkles, the eyes soften, the ears droop, the dog hums, and its body and tail wag continuously.
A dog expresses fear by trembling: when frightened, the hair over its whole body stands up, its body shakes ceaselessly, and its tail drops or is clamped between its hind legs. By acquiring a dog's action signals, the information it conveys to the outside can therefore be judged with reasonable accuracy. The preset list is an electronic list stored on a designated storage medium; the preset action information stored in the list is compiled in advance from the habitual actions of the pet, and the identification information corresponding to each habitual action is stored in the preset list as a mapping. After an action signal of the pet is acquired, it is compared one by one with the preset action information in the preset list. When matching preset action information is found (the action signal is essentially the same as the preset action information, or within a threshold range), the information conveyed by the pet's current action is the identification information corresponding to that preset action information in the preset list, so the identification information is extracted and transmitted in a preset manner to a user such as the pet's owner. The user can thus accurately learn what the pet is conveying in its current state and know the pet's condition in time, improving communication between the user and the pet.
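The matching step described above, where an acquired signal counts as a match when it is essentially the same as a preset entry or within a threshold range, can be sketched as follows. The feature-vector representation of an action signal, the entries and the threshold value are all illustrative assumptions, not details from the patent:

```python
import math

# Hypothetical preset list: each entry maps a feature vector describing a
# known action to the identification information that action conveys.
PRESET_LIST = [
    ([0.9, 0.1, 0.0], "pleasure (tail wagging)"),
    ([0.0, 0.8, 0.7], "fear (tail tucked, trembling)"),
]

def find_identification(signal, threshold=0.3):
    """Compare the acquired action signal with each preset entry one by one
    and return the identification information of the first entry within the
    Euclidean distance threshold, or None when nothing matches."""
    for preset, identification in PRESET_LIST:
        if math.dist(signal, preset) <= threshold:
            return identification
    return None
```

A signal close to the stored tail-wagging vector, such as `[0.85, 0.15, 0.05]`, matches the first entry, while a vector far from both entries returns `None`.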
Referring to fig. 3, in this embodiment, the pet motion signal recognition device further includes:
the batch import unit 201 is configured to receive, in batches, preset action information common to pets of the same species together with the identification information corresponding to that preset action information, and add them to the preset list.
As described above for the batch import unit 201, pets of the same species are, for example, dogs such as the Pekingese, the Tibetan Mastiff and the shepherd dog. Pets of the same species generally share the same or similar action habits (for example, tail wagging in canines indicates pleasure), so the preset action information corresponding to these common actions, together with the identification information it represents, can be imported into the preset list in batches for the user's convenience. The same set of pet action signal recognition devices can then serve any pet of that species: one device may be used for a Pekingese, another for a Tibetan Mastiff, and so on.
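A minimal sketch of the batch import step, assuming the preset list is keyed by an action name; the canine entries shown are illustrative, not taken from the patent:

```python
def batch_import(preset_list, species_entries):
    """Add species-wide common entries to the preset list in one pass;
    entries the user has already customised are not overwritten."""
    for action, identification in species_entries:
        preset_list.setdefault(action, identification)
    return preset_list

# Illustrative common entries shared by canines.
CANINE_COMMON = [("tail_wag", "pleasure"), ("tail_tucked", "fear")]
```

The same `CANINE_COMMON` batch could then initialise the preset list for a Pekingese, a Tibetan Mastiff, or any other canine.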
Referring to fig. 3, in this embodiment, the pet motion signal recognition device further includes:
and the input unit 202 is configured to receive identification information entered by the user, associate it with the acquired preset action information, and add it to the preset list.
As described above for the input unit 202, different pets may have different living habits, and the information expressed by some actions is unique to an individual pet. By observation, the user can record into the preset list the preset action information corresponding to such a unique action signal together with the identification information it expresses, so that the action can be recognised the next time the pet performs it. For example, if the action a pet makes when it needs food is unique to it, the preset action information for that action can be recorded and added to the preset list, with the identification information "needs food" entered against it. When the pet then performs that action, the information "needs food" is transmitted to the user. Suppose the user is away on a business trip and has entrusted the pet to a friend who cannot read the pet's actions: when the pet performs the feeding action, the friend would not feed it in time. The user, however, can connect to the server through a designated terminal such as a mobile phone, receive the action signal uploaded to the server by the device that acquires the pet's action signals, obtain the identification information "needs food" through the server or the mobile phone, and remind the friend by telephone, mail or other instant communication that the pet needs to be fed.
Referring to fig. 4, in this embodiment, the invoking and transmitting unit 3 includes:
the display module 301 is configured to call identification information corresponding to the preset action information in the preset list, convert the identification information into user-readable information, and display the user-readable information.
As described above for the display module 301, the identification information is generally a string of characters, so it needs to be converted into information the user can accurately understand. The readable information can be displayed directly by an APP installed on a terminal such as a mobile phone, or sent to the user as an instant message such as an SMS, an e-mail or a WeChat message.
Referring to fig. 4, in another embodiment, the invoking and transmitting unit 3 includes:
The playing module 302 is configured to call the identification information corresponding to the preset action information in the preset list, convert the identification information into voice information, and play the voice information.
As described above for the playing module 302, the identification information is converted into voice. For example, the pet wears a smart collar equipped with a memory, a processor and a player; the collar receives the pet's action signal and invokes the corresponding identification information. When the pet performs the action indicating it needs food, the information "needs food" is extracted, converted into voice information and played; if the user is near the pet, the user hears the playback and can prepare food for the pet in time.
Referring to fig. 5, in this embodiment, the obtaining unit 1 includes:
the image analysis module 101 is configured to capture an image of a pet by an image capture device, analyze the image, and acquire an action signal of the pet.
As described above for the image analysis module 101, the image capture device may be a camera, a video camera, a monitor or the like, and the image may be a photograph or a video. The image is analysed using image analysis technology to determine information such as the pet's motion state. In a specific embodiment, several cameras are arranged in the pet's activity area, each shooting its own designated region. When the pet enters the shooting region of the first camera, the first camera senses it, captures images, analyses them and acquires the pet's action signals. When the pet leaves the first camera's region and enters the second camera's region, the first camera stops shooting, and the second camera senses the pet's arrival, begins capturing images, analyses them in real time and acquires the pet's action signals. The cameras are generally connected to the same host, which performs the image analysis; each camera's sensing of a pet entering its shooting region can be implemented with image analysis, infrared sensing and similar technologies.
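The camera handoff described above, where only the camera whose region currently contains the pet captures and analyses images, can be sketched as a small state machine. The zone identifiers and event names are assumptions for illustration:

```python
class CameraArray:
    """Track which camera is active as the pet moves between shooting
    regions, recording a stop event for the previous camera and a start
    event for the new one on each handoff."""

    def __init__(self):
        self.active = None
        self.events = []

    def on_pet_detected(self, zone):
        if zone == self.active:
            return  # pet is still inside the active camera's region
        if self.active is not None:
            self.events.append(("stop", self.active))  # previous camera stops
        self.events.append(("start", zone))            # new camera starts
        self.active = zone
```

Feeding detections from two zones in sequence produces a stop for the first camera and a start for the second, mirroring the handoff in the text.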
Referring to fig. 5, in another embodiment, the obtaining unit 1 includes:
and the action acquisition module 102 is used for acquiring the motion signal of the pet through a motion sensor arranged on the body of the pet.
As described above for the action acquisition module 102, the motion sensor may be a multi-axis acceleration sensor that captures actions of the pet such as walking and running, and may also include a micro-motion sensor such as a piezoelectric film sensor that measures heart rate and respiration rate. The motion sensor can be mounted on equipment such as the pet's collar or chest strap, so that the pet's action signals are collected at all times.
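A coarse sketch of deriving a movement state from multi-axis accelerometer readings; the magnitude thresholds are illustrative assumptions, and a real device would use richer features than the peak magnitude:

```python
def classify_movement(magnitudes, run_threshold=14.0, walk_threshold=10.5):
    """Classify a window of acceleration magnitudes (m/s^2) into a coarse
    movement state such as resting, walking or running, based on the peak
    value in the window."""
    peak = max(magnitudes)
    if peak >= run_threshold:
        return "running"
    if peak >= walk_threshold:
        return "walking"
    return "resting"
```

Readings hovering around gravity (about 9.8 m/s^2) classify as resting, while larger peaks classify as walking or running.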
Referring to fig. 3, in this embodiment, the pet motion signal recognition device further includes:
and an alarm unit 303, configured to send out corresponding alarm information if the identification information is preset danger information.
As described above for the alarm unit 303, pets generally grow up healthily in daily life, but accidents, such as falling from a height or being struck and injured by another object, are sometimes unavoidable. The motion produced in such accidents can also serve as an action signal, with identification information corresponding to danger assigned to it. When the identification information is danger information, alarm information can be sent to the user, drawing the user's attention so that the pet's condition is known in time and the pet can be treated.
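The escalation path of the alarm unit can be sketched as follows; the set of danger entries and the callback names are assumptions, not patent details:

```python
# Illustrative preset danger entries.
DANGER_INFO = {"fell from a height", "struck by an object"}

def transmit_with_alarm(identification, notify, alarm):
    """Send the identification information to the user, and additionally
    raise an alarm when it is one of the preset danger entries."""
    if identification in DANGER_INFO:
        alarm("ALERT: " + identification)
    notify(identification)
```

Ordinary identification information is simply transmitted, while danger information also triggers the alarm channel.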
Referring to fig. 3, in this embodiment, the pet motion signal recognition device further includes:
and the retrieval unit 304 is configured to enter the associated online pet mall, and retrieve the sold product corresponding to the identification information according to the content of the identification information.
As described above for the retrieval unit 304, the pet online mall is a shopping mall on the network, such as the JD.com mall or the Tmall mall. The identification information is used as the search keyword to find the corresponding goods on sale in the mall, which improves the user experience and saves the user's time. In one embodiment, the identification information corresponding to the pet's action signal is "cold, needs warming"; after logging in to the pet online mall, related goods for "cold, needs warming", such as a heater or a pet tent, are automatically retrieved.
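The retrieval step, using the identification information as the search keyword, can be sketched against a toy catalog; the catalog, the tags and the keyword handling are all assumptions for illustration:

```python
def retrieve_goods(catalog, identification):
    """Split the identification information into keywords and return the
    names of goods whose tags match any keyword."""
    keywords = {word.lower() for word in identification.split()}
    return [item["name"] for item in catalog
            if keywords & set(item["tags"])]

# Illustrative catalog of goods on sale.
CATALOG = [
    {"name": "pet heater", "tags": ["heating", "winter"]},
    {"name": "pet tent", "tags": ["heating", "shelter"]},
    {"name": "chew toy", "tags": ["play"]},
]
```

Identification information containing "heating" retrieves the heater and the tent but not the chew toy.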
Referring to fig. 3, in this embodiment, the pet motion signal recognition device further includes:
a feedback unit 305, configured to send a feedback action corresponding to the action signal to the pet.
As described above for the feedback unit 305, the feedback action is an action directed at the pet, such as playing a call to the pet or soothing the pet through a sensor it wears. In a specific embodiment, when the identification information is "the pet is in panic", the owner's voice can be played to the pet to soothe it, or the pet can be soothed by regular vibration of a vibration sensor on its body.
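A minimal mapping from identification information to feedback actions; both the keys and the action names are illustrative assumptions:

```python
# Illustrative mapping from identification information to soothing actions.
FEEDBACK_ACTIONS = {
    "the pet is in panic": ("play_owner_voice", "vibrate_collar"),
    "needs food": ("notify_owner",),
}

def feedback_for(identification):
    """Return the feedback actions to send to the pet, or an empty tuple
    when no feedback is configured for this identification information."""
    return FEEDBACK_ACTIONS.get(identification, ())
```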
In this embodiment, the pet motion signal recognition device further includes:
the invoking unit 203 is configured to obtain the category of the pet, and invoke the preset list corresponding to the category of the pet.
As described above for the invoking unit 203, pets may be of different species, so several preset lists may be stored for different species, for example a canine preset list and a feline preset list, and the preset list corresponding to the pet's species is selected so that the information conveyed by the pet is identified accurately. In a specific embodiment, the pet's species is obtained by acquiring a picture of the pet, automatically analysing from the picture information such as the species, and selecting the most suitable preset list.
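Selecting the preset list by species can be sketched as a lookup. The lists below illustrate why per-species lists matter (the same action can carry different meanings for different species); the entries themselves are assumptions:

```python
# Illustrative per-species preset lists.
PRESET_LISTS = {
    "canine": {"tail_wag": "pleasure"},
    "feline": {"tail_wag": "agitation"},  # same action, different meaning
}

def preset_list_for(species):
    """Invoke the preset list matching the recognised species, falling
    back to an empty list for unrecognised species."""
    return PRESET_LISTS.get(species, {})
```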
In one embodiment, the pet action signal identification process comprises:
installing a pet action signal recognition APP on a smart phone, wherein preset action information of multiple species and corresponding recognition information are imported into the APP in advance;
when the pet action signal recognition APP is used for the first time, a picture of the pet is taken at a specified angle with the smartphone's camera, and information such as the pet's species is recognised automatically: the pet is first recognised as a canine, then identified specifically as, say, a Pekingese, and the preset list corresponding to the Pekingese is invoked.
Then, the pet action signal recognition APP receives action signals collected by a device for collecting pet action signals, such as a motion sensor arranged on a pet body or a camera device arranged in a pet activity area, through a wireless connection device of the smart phone;
then, the pet action signal recognition APP searches the preset list for preset action information matching the action signal; if matching preset action information is found, the identification information corresponding to it in the preset list is invoked and displayed on the mobile phone screen, played directly as voice, and so on;
and finally, entering a related online pet mall, and retrieving the corresponding sold commodity according to the content of the identification information.
In the using process, the identification information input by the user can be received, and the identification information is associated with the acquired action information and added into the preset list.
With the pet action signal recognition device of this embodiment, a pet's actions and the information each action represents are preset into the list as preset action information and identification information. When an acquired action of the pet matches preset action information in the list, the identification information corresponding to that preset action information is invoked and sent to the user, so that the user can accurately know the information the pet conveys, communicate with the pet better, and care for the pet well.
The embodiment of the invention also provides a terminal, which comprises a processor and a memory; the memory is used for storing a program for executing the pet action signal identification method in any embodiment by the pet action signal identification device in any embodiment; the processor is configured to execute programs stored in the memory. The terminal can be a mobile phone, a tablet computer, a smart watch, a smart bracelet or a smart pet collar worn on a pet.
Fig. 6 is a block diagram illustrating a partial structure of a mobile phone related to a terminal provided in an embodiment of the present invention. Referring to fig. 6, the handset includes: radio Frequency (RF) circuit 510, memory 520, input unit 530, display unit 540, sensor 550, audio circuit 560, wireless fidelity (WiFi) module 570, processor 580, and power supply 590. Those skilled in the art will appreciate that the handset configuration shown in fig. 6 is not intended to be limiting and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
The following describes each component of the mobile phone in detail with reference to fig. 6:
RF circuit 510 may be used for receiving and transmitting signals during information transmission and reception or during a call; in particular, it delivers downlink information received from a base station to the processor 580 for processing, and transmits uplink data to the base station. In general, RF circuit 510 includes, but is not limited to, an antenna, at least one Amplifier, a transceiver, a coupler, a Low Noise Amplifier (LNA), a duplexer, and the like. In addition, RF circuit 510 may also communicate with networks and other devices via wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile communication (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), email, Short Messaging Service (SMS), and the like.
The memory 520 may be used to store software programs and modules, and the processor 580 executes various functional applications and data processing of the mobile phone by operating the software programs and modules stored in the memory 520. The memory 520 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 520 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The input unit 530 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the cellular phone. Specifically, the input unit 530 may include a touch panel 531 and other input devices 532. The touch panel 531, also called a touch screen, can collect touch operations of a user on or near the touch panel 531 (for example, operations of the user on or near the touch panel 531 by using any suitable object or accessory such as a finger or a stylus pen), and drive the corresponding connection device according to a preset program. Alternatively, the touch panel 531 may include two parts, a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, and sends the touch point coordinates to the processor 580, and can receive and execute commands sent by the processor 580. In addition, the touch panel 531 may be implemented by various types such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. The input unit 530 may include other input devices 532 in addition to the touch panel 531. In particular, other input devices 532 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like.
The display unit 540 may be used to display information input by the user or information provided to the user and various menus of the mobile phone. The Display unit 540 may include a Display panel 541, and optionally, the Display panel 541 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like. Further, the touch panel 531 may cover the display panel 541, and when the touch panel 531 detects a touch operation on or near the touch panel 531, the touch panel is transmitted to the processor 580 to determine the type of the touch event, and then the processor 580 provides a corresponding visual output on the display panel 541 according to the type of the touch event. Although the touch panel 531 and the display panel 541 are shown as two separate components in fig. 6 to implement the input and output functions of the mobile phone, in some embodiments, the touch panel 531 and the display panel 541 may be integrated to implement the input and output functions of the mobile phone.
The handset may also include at least one sensor 550, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor and a proximity sensor, wherein the ambient light sensor may adjust the brightness of the display panel 541 according to the brightness of ambient light, and the proximity sensor may turn off the display panel 541 and/or the backlight when the mobile phone is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally, three axes), can detect the magnitude and direction of gravity when stationary, and can be used for applications of recognizing the posture of a mobile phone (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; as for other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured on the mobile phone, further description is omitted here.
Audio circuitry 560, speaker 561, and microphone 562 may provide an audio interface between a user and the mobile phone. The audio circuit 560 may transmit the electrical signal converted from received audio data to the speaker 561, which converts it into a sound signal for output; conversely, the microphone 562 converts collected sound signals into electrical signals, which the audio circuit 560 receives and converts into audio data; the audio data is output to the processor 580 for processing and then sent through the RF circuit 510 to, for example, another mobile phone, or output to the memory 520 for further processing.
WiFi belongs to short distance wireless transmission technology, and the mobile phone can help the user to send and receive e-mail, browse web pages, access streaming media, etc. through the WiFi module 570, which provides wireless broadband internet access for the user. Although fig. 6 shows the WiFi module 570, it is understood that it does not belong to the essential constitution of the handset, and can be omitted entirely as needed within the scope not changing the essence of the invention.
The processor 580 is a control center of the mobile phone, connects various parts of the entire mobile phone by using various interfaces and lines, and performs various functions of the mobile phone and processes data by operating or executing software programs and/or modules stored in the memory 520 and calling data stored in the memory 520, thereby performing overall monitoring of the mobile phone. Alternatively, processor 580 may include one or more processing units; preferably, the processor 580 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 580.
The handset also includes a power supply 590 (e.g., a battery) for powering the various components, which may preferably be logically coupled to the processor 580 via a power management system, such that the power management system may be used to manage charging, discharging, and power consumption.
Although not shown, the mobile phone may further include a camera, a bluetooth module, etc., which are not described herein.
Referring to fig. 6, in the embodiment of the present invention, the processor 580 included in the terminal further has the following functions:
acquiring action signals of the pet;
searching preset action information matched with the action signal in a preset list;
and if the matched preset action information is found, calling identification information corresponding to the preset action information in the preset list to transmit to the user.
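The three processor steps above (acquire the action signal, search the preset list, transmit the matched identification information) can be combined into one sketch; the signal representation, the threshold and the transmit callback are illustrative assumptions:

```python
import math

def recognise_and_transmit(signal, preset_list, transmit, threshold=0.3):
    """Search the preset list for preset action information matching the
    acquired signal and, on a match, invoke and transmit the corresponding
    identification information to the user."""
    for preset_action, identification in preset_list:
        if math.dist(signal, preset_action) <= threshold:
            transmit(identification)
            return identification
    return None
```

A matching signal results in exactly one transmission to the user; a non-matching signal transmits nothing and returns `None`.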
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
It will be understood by those skilled in the art that all or part of the steps in the method for implementing the above embodiments may be implemented by hardware that is instructed to implement by a program, and the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (7)

1. A pet action signal identification method is characterized by comprising the following steps:
acquiring action signals of the pet;
searching preset action information matched with the action signal in a preset list;
if the matched preset action information is found, calling identification information corresponding to the preset action information in the preset list to transmit to a user;
entering a related pet online shopping mall, and retrieving the sold commodity corresponding to the identification information according to the content of the identification information;
if the matched preset action information is found, after the step of calling the identification information corresponding to the preset action information in the preset list and transmitting the identification information to the user, the method comprises the following steps:
the preset action information comprises danger information, and if the identification information is preset danger information, corresponding alarm information is sent out;
before the step of searching the preset action information matched with the action signal in the preset list, the method comprises the following steps:
and receiving identification information input by a user, associating the identification information with the acquired preset action information, and adding the identification information to the preset list.
2. The pet motion signal identification method according to claim 1, wherein the step of searching the preset motion information matched with the motion signal in a preset list is preceded by:
receiving preset action information with commonality of pets of the same species in batches and identification information corresponding to the preset action information, and adding the identification information into the preset list.
3. The pet motion signal identification method according to claim 1, wherein the invoking of the identification information corresponding to the preset motion information in the preset list to a user comprises:
calling identification information corresponding to the preset action information in the preset list, converting the identification information into user readable information and displaying the user readable information; or,
and calling identification information corresponding to the preset action information in the preset list, converting the identification information into voice information and playing the voice information.
4. A pet action signal recognition device, comprising:
the acquisition unit is used for acquiring action signals of the pet; the searching unit is used for searching preset action information matched with the action signal in a preset list;
the invoking and transmitting unit is configured to, if the searching unit finds matching preset action information, invoke the identification information corresponding to the preset action information in the preset list and transmit it to the user; the preset action information comprises danger information;
the retrieval unit is used for entering a related pet online shopping mall and retrieving the sold commodity corresponding to the identification information according to the content of the identification information;
the alarm unit is used for sending out corresponding alarm information if the identification information is preset danger information;
and the input unit is used for receiving the identification information input by the user, associating the identification information with the acquired preset action information and adding the identification information to the preset list.
5. The pet motion signal recognition device of claim 4, further comprising:
the batch import unit is used for receiving the preset action information with the commonality of the pets of the same species in batches and the identification information corresponding to the preset action information, and adding the identification information into the preset list.
6. The pet motion signal recognition device of claim 4, wherein the invoking and transmitting unit comprises:
the display module is used for calling identification information corresponding to the preset action information in the preset list, converting the identification information into user readable information and displaying the user readable information; or,
and the playing module is used for calling the identification information corresponding to the preset action information in the preset list, converting the identification information into voice information and playing the voice information.
7. A terminal comprising a processor and a memory;
the memory is used for storing a program of the pet action signal recognition device of any one of claims 4 to 6 for executing the pet action signal recognition method of any one of claims 1 to 3;
the processor is configured to execute programs stored in the memory.
CN201610977396.4A 2016-11-04 2016-11-04 Terminal and pet action signal identification method and device Active CN107006389B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201610977396.4A CN107006389B (en) 2016-11-04 2016-11-04 Terminal and pet action signal identification method and device
PCT/CN2017/074456 WO2018082225A1 (en) 2016-11-04 2017-02-22 Terminal, and pet motion signal identification method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610977396.4A CN107006389B (en) 2016-11-04 2016-11-04 Terminal and pet action signal identification method and device

Publications (2)

Publication Number Publication Date
CN107006389A CN107006389A (en) 2017-08-04
CN107006389B true CN107006389B (en) 2021-06-22

Family

ID=59438754

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610977396.4A Active CN107006389B (en) 2016-11-04 2016-11-04 Terminal and pet action signal identification method and device

Country Status (2)

Country Link
CN (1) CN107006389B (en)
WO (1) WO2018082225A1 (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108573370B (en) * 2018-04-11 2022-05-27 中国疾病预防控制中心寄生虫病预防控制所 Management method for dog registration and insect expelling based on biological identification technology
CN108925456A (en) * 2018-05-31 2018-12-04 广州粤创富科技有限公司 A kind of method, apparatus interacted with pet and wearable device
EP3586618A1 (en) * 2018-06-22 2020-01-01 United Pet Brands NV Determining a condition of an animal
CN109362596A (en) * 2018-09-30 2019-02-22 中山乐心电子有限公司 The exchange method of pet and equipment, device and electronic equipment
CN109315310A (en) * 2018-11-30 2019-02-12 成都普连众通科技有限公司 A kind of pet apparatus for managing and monitoring based on narrowband Internet of Things
CN109745017B (en) * 2019-01-30 2021-04-06 中国科学院电子学研究所 Real-time monitoring system, device and method for animal physiological information and state
CN113392671A (en) * 2020-02-26 2021-09-14 上海依图信息技术有限公司 Commodity retrieval method and device based on customer actions and electronic equipment
CN115176727B (en) * 2022-07-14 2023-06-27 宠步科技(武汉)有限公司 Short kiss dog identification method and pet collar for short kiss dog identification

Citations (5)

Publication number Priority date Publication date Assignee Title
CN101141747A (en) * 2007-10-26 2008-03-12 北京握奇数据系统有限公司 Position service based telecom smart card, data traffic system and method
CN101587522A (en) * 2009-06-17 2009-11-25 北京东方微点信息技术有限责任公司 Method and system for identifying script virus
CN102339438A (en) * 2010-07-22 2012-02-01 阿里巴巴集团控股有限公司 Commodity information website publishing method, system and device
CN103355237A (en) * 2013-07-25 2013-10-23 中国水产科学研究院淡水渔业研究中心 Culture method for promoting river crabs to shell and delaying gonad development
CN103914478A (en) * 2013-01-06 2014-07-09 阿里巴巴集团控股有限公司 Webpage training method and system and webpage prediction method and system

Family Cites Families (13)

Publication number Priority date Publication date Assignee Title
US20050257752A1 (en) * 2004-05-20 2005-11-24 Shirley Langer PET accessory with wireless telephonic voice transmitter
CN2781771Y (en) * 2005-01-20 2006-05-24 乐金电子(中国)研究开发中心有限公司 Pet seeking electronic necklace based on mobile telecom network SMS
NZ553146A (en) * 2007-02-09 2011-05-27 Say Systems Ltd Improvements relating to monitoring and displaying activities
US20130014706A1 (en) * 2011-07-14 2013-01-17 PatPace Ltd. Pet animal collar for health & vital signs monitoring, alert and diagnosis
CN103052027A (en) * 2012-12-31 2013-04-17 合肥寰景信息技术有限公司 Pet positioning system based on LBS (Location Based Service)
CN106455525A (en) * 2014-05-15 2017-02-22 奥格唯有限公司 System and method for pet behavioral identification
CN104932459B (en) * 2015-04-29 2018-04-13 海南大学 A kind of multifunctional pet management monitoring system based on Internet of Things
CN105528074A (en) * 2015-12-04 2016-04-27 小米科技有限责任公司 Intelligent information interaction method and apparatus, and user terminal
CN105494143A (en) * 2015-12-16 2016-04-20 惠州Tcl移动通信有限公司 Intelligent wearable equipment applied to pet
CN105554482B (en) * 2016-03-09 2018-12-21 北京宠小乐科技有限公司 It is a kind of to reinforce the comprehensive management apparatus and method that owner contacts with pet
CN105845144A (en) * 2016-03-21 2016-08-10 陈宁 Intelligent health management system for realizing animal sound and form translation function
CN105706951B (en) * 2016-04-18 2019-03-08 宁波力芯科信息科技有限公司 A kind of intelligent pet necklace and its implementation
CN205585060U (en) * 2016-04-29 2016-09-21 深圳市沃特沃德股份有限公司 Support to remove web's pet intelligence wearing equipment


Also Published As

Publication number Publication date
CN107006389A (en) 2017-08-04
WO2018082225A1 (en) 2018-05-11

Similar Documents

Publication Publication Date Title
CN107006389B (en) Terminal and pet action signal identification method and device
KR102022893B1 (en) Pet care method and system using the same
WO2018082227A1 (en) Terminal and pet posture detection method and apparatus
CN107122959A (en) A kind of office management method, computer equipment and storage medium
CN106875460A (en) A kind of picture countenance synthesis method and terminal
CN105264560A (en) Systems, apparatus, and methods for social graph based recommendation
CN108616448B (en) Information sharing path recommendation method and mobile terminal
CN108009288B (en) Recipe pushing method and device
CN107194732A (en) One kind application method for pushing, mobile terminal and computer-readable recording medium
WO2018149213A1 (en) Jigsaw puzzle type task execution control method and device
JPWO2016143404A1 (en) Information processing apparatus, information processing method, and program
CN108390998A (en) A kind of method and mobile terminal for sharing file
CN108833663A (en) Terminal setting method, terminal and computer storage medium based on reality scene
CN107705786A (en) A kind of method of speech processing, device and computer-readable recording medium
JP2010160783A (en) Information providing system, portable information terminal, and information management device
CN107273024B (en) A kind of method and apparatus realized using data processing
CN109660674B (en) Method for setting alarm clock and electronic equipment
CN107563316A (en) A kind of image pickup method, terminal and computer-readable recording medium
CN111753520B (en) Risk prediction method and device, electronic equipment and storage medium
CN111223166B (en) Image display method, device, electronic equipment and medium
CN110378798B (en) Heterogeneous social network construction method, group recommendation method, device and equipment
CN110045892B (en) Display method and terminal equipment
CN107864268A (en) Processing method, mobile terminal and the computer-readable recording medium of expression information
CN107844203B (en) Input method candidate word recommendation method and mobile terminal
CN114283453A (en) Method and device for acquiring information of wandering animal, storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20210908

Address after: 518000 201, No.26, yifenghua Innovation Industrial Park, Xinshi community, Dalang street, Longhua District, Shenzhen City, Guangdong Province

Patentee after: Shenzhen waterward Information Co.,Ltd.

Address before: 5 / F, block B, huayuancheng digital building, 1079 Nanhai Avenue, Nanshan District, Shenzhen City, Guangdong Province

Patentee before: SHENZHEN WATER WORLD Co.,Ltd.