CN106577350B - Pet type identification method and device - Google Patents


Info

Publication number
CN106577350B
CN106577350B (application CN201611032333.8A)
Authority
CN
China
Prior art keywords
pet
matching
type
image
features
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201611032333.8A
Other languages
Chinese (zh)
Other versions
CN106577350A (en)
Inventor
王忠山
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Waterward Information Co Ltd
Original Assignee
Shenzhen Water World Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Water World Co Ltd filed Critical Shenzhen Water World Co Ltd
Priority to CN201611032333.8A priority Critical patent/CN106577350B/en
Priority to PCT/CN2017/074840 priority patent/WO2018094892A1/en
Publication of CN106577350A publication Critical patent/CN106577350A/en
Application granted granted Critical
Publication of CN106577350B publication Critical patent/CN106577350B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • AHUMAN NECESSITIES
    • A01AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01KANIMAL HUSBANDRY; AVICULTURE; APICULTURE; PISCICULTURE; FISHING; REARING OR BREEDING ANIMALS, NOT OTHERWISE PROVIDED FOR; NEW BREEDS OF ANIMALS
    • A01K29/00Other apparatus for animal husbandry

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Environmental Sciences (AREA)
  • Animal Husbandry (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention provides a pet type identification method and device. In the method, an intelligent terminal acquires a pet image, extracts a plurality of body features of the pet from the pet image, and matches the plurality of body features against the data in a pet body feature database to obtain the matched pet type. This lets the user quickly learn the pet's type, improves the user's understanding of pets, and helps the user choose a favorite pet.

Description

Pet type identification method and device
Technical Field
The invention relates to the technical field of image recognition, in particular to a method and a device for recognizing pet types.
Background
Pets generally refer to animals that people keep in order to relieve loneliness or for recreation. A pet is an animal (or plant) kept for emotional rather than practical purposes. Pet animals are generally mammals or birds, which are kept to relieve loneliness or for amusement because their relatively developed brains make them easy to communicate with.
There are many pet species; common pets include cats, dogs, fish, birds and the like. Pet cats can be divided into roughly 42 breeds (under the American CFA standard), such as the Exotic Shorthair, Persian, Siamese, Bombay, Birman, Burmese, Egyptian Mau, European Shorthair, Ragdoll, Singapura, Norwegian Forest cat, and Somali, among others. According to the AKC registry of dog breeds, more than 300 breeds are bred worldwide, and the 149 breeds recognized by the AKC include the Affenpinscher, Afghan Hound, Airedale Terrier, Akita, and Alaskan Malamute.
Clearly, it is not easy to identify such a wide variety of pet breeds by personal experience alone. Yet for a user who loves small pets, seeing a lovely pet without being able to tell its breed is a pity.
Disclosure of Invention
The invention mainly aims to provide a method and a device for identifying pet types, which help a user identify the pet types.
The invention provides a method for identifying pet types, which comprises the following steps:
acquiring a pet image;
extracting a plurality of body features of the pet according to the pet image;
and matching the plurality of physical characteristics of the pet with the data in the pet physical characteristic database to obtain the matched pet type.
Preferably, the acquiring a pet image comprises:
shooting a pet by a camera to obtain an initial pet image;
recording displacement information of a camera when the initial pet image is shot;
and judging whether the displacement information is in a set displacement threshold range allowing identification, and taking the initial pet image as a pet image when judging that the displacement information is in the displacement threshold range.
Preferably, the extracting a plurality of physical features of the pet according to the pet image includes:
extracting the overall outline of the pet from the pet image;
locating the location areas of the face, torso, tail and limbs from the overall contour of the pet;
extracting face features, trunk features, tail features and limb features of the pet in the position areas of the face, the trunk, the tail and the limbs respectively, wherein the face features comprise eye features, mouth features, nose features, ear features and face contours;
combining the face feature, the trunk feature, the tail feature and the limb feature of the pet together to form a pet body feature set, wherein each feature comprises two parameters of color and shape.
Preferably, the data in the pet physical characteristic database comprises:
a plurality of pet type matching image sets, wherein each pet type matching image set comprises one or more preset pet matching images, one of which carries a pet type mark and is designated the representative pet image;
a reference pet body feature set extracted from each pet matching image, wherein the reference pet body feature set comprises a plurality of features of the pet in the pet matching image;
and a distinguishing feature set corresponding to the matching image set of each pet type, which represents one or more distinguishing features of a certain type of pet, a distinguishing feature being a feature used to distinguish that type of pet from other types of pets.
Preferably, the matching the pet body feature set with the data in the pet body feature database to obtain the matched pet type includes:
judging whether the pet body feature set contains a distinguishing feature set;
if the distinguishing feature set is contained, judging whether the contained distinguishing feature set is more than one;
if yes, recording the included distinguishing feature sets as screening distinguishing feature sets, and screening the screening distinguishing feature sets by adopting a matching mechanism to obtain matched pet types;
and if only one distinguishing feature set is contained, acquiring the pet type mark of the matching image set of the pet type corresponding to that distinguishing feature set, to obtain the matched pet type.
Preferably, the screening of the plurality of screening distinguishing feature sets by a matching mechanism to obtain the matched pet type comprises:
acquiring a screening matching image set of the pet type corresponding to the screening distinguishing feature set;
calling all pet images in the screened matching image set to obtain a reference pet body characteristic set corresponding to each pet image;
comparing the features in the pet body feature set with the features in the called reference pet body feature set one by one, and calculating the matching degree of the pet body feature set and each called reference pet body feature set;
calculating, from the matching degrees between the pet body feature set and each called reference pet body feature set in the screened matching image set, the average matching degree of the screened matching image set with the pet body feature set, to obtain the overall matching degree of the pet image with the pet type;
and taking the pet type corresponding to the screening matching image set with the highest overall matching degree and the reference pet body feature set as the matching pet type.
The invention also provides a device for identifying pet types, which comprises:
the acquisition module is used for acquiring a pet image;
the characteristic extraction module is used for extracting the body characteristics of the pet according to the pet image;
and the matching module is used for matching the plurality of physical characteristics of the pet with the data in the pet physical characteristic database to obtain the matched pet type.
Preferably, the feature extraction module includes:
the overall contour extraction unit is used for extracting an overall contour of the pet from the pet image;
a body position locating unit for locating the position areas of the face, the trunk, the tail and the limbs from the overall outline of the pet;
a feature extraction unit for extracting face features, trunk features, tail features and limb features of the pet in the position areas of the face, trunk, tail and limb respectively, wherein the face features comprise eye features, mouth features, nose features, ear features and face contour;
combining the face feature, the trunk feature, the tail feature and the limb feature of the pet together to form a pet body feature set, wherein each feature comprises two parameters of color and shape.
Preferably, the pet body feature database comprises:
a plurality of pet type matching image sets, wherein each pet type matching image set comprises one or more preset pet matching images, one of which carries a pet type mark and is designated the representative pet image;
a reference pet body feature set extracted from each pet matching image, wherein the reference pet body feature set comprises a plurality of features of the pet in the pet matching image;
and a distinguishing feature set corresponding to the matching image set of each pet type, which represents one or more distinguishing features of a certain type of pet, a distinguishing feature being a feature used to distinguish that type of pet from other types of pets.
Preferably, the matching module comprises:
the matching unit is used for judging whether the pet body feature set contains a distinguishing feature set or not;
the result analysis unit is used for analyzing whether more than one distinguishing feature set is contained or not if the distinguishing feature set is contained;
the multi-result processing unit is used for recording the included distinguishing feature sets as screening distinguishing feature sets when the result analysis unit generates a plurality of results, and screening the screening distinguishing feature sets by adopting a matching mechanism to obtain matched pet types;
and the single result processing unit is used for acquiring the pet type mark of the matching image set of the pet type corresponding to the distinguishing characteristic set when the result analysis unit has only one result, so as to obtain the matched pet type.
The invention provides a pet type identification method and device. In the method, an intelligent terminal acquires a pet image, extracts a plurality of body features of the pet from the pet image, and matches the plurality of body features against the data in a pet body feature database to obtain the matched pet type. This lets the user quickly learn the pet's type, improves the user's understanding of pets, and helps the user choose a favorite pet.
Drawings
FIG. 1 is a flowchart illustrating a first embodiment of a pet type identification method according to the present invention;
FIG. 2 is a diagram illustrating data types in a pet body characteristic database according to the method for identifying pet types of the present invention;
FIG. 3 is a schematic structural diagram of a pet type identification device according to a first embodiment of the present invention;
FIG. 4 is a schematic structural diagram of a pet type identification device according to a second embodiment of the present invention;
FIG. 5 is a schematic structural diagram of a pet type identification device according to a third embodiment of the present invention;
FIG. 6 is a schematic structural diagram of a pet type identification device according to a fifth embodiment of the present invention;
FIG. 7 is a schematic structural diagram of a pet type identification device according to a sixth embodiment of the present invention;
FIG. 8 is a conceptual analysis diagram of the pet type identification method and device according to the present invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
As used herein, the singular forms "a", "an" and "the" include plural referents unless the context clearly dictates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, units, modules and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, units, modules, components and/or groups thereof. It will be understood that when an element is referred to as being "connected" or "coupled" to another element, it can be directly connected or coupled to the other element, or intervening elements may be present. Further, "connected" or "coupled" as used herein may include wirelessly connected or wirelessly coupled. As used herein, the term "and/or" includes all or any element and all combinations of one or more of the associated listed items.
It will be understood by those skilled in the art that, unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the prior art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
As will be understood by those skilled in the art, "terminal", "smart terminal", "mobile terminal" and "terminal device" as used herein include both devices having only a wireless signal receiver without transmit capability and devices having receive and transmit hardware capable of two-way communication over a two-way communication link. Such a device may include: a cellular or other communication device with a single-line or multi-line display, or without a multi-line display; a PCS (Personal Communications Service) device, which may combine voice, data processing, facsimile and/or data communications capabilities; a PDA (Personal Digital Assistant), which may include a radio frequency receiver, a pager, internet/intranet access, a web browser, a notepad, a calendar and/or a GPS (Global Positioning System) receiver; or a conventional laptop and/or palmtop computer or other device having and/or including a radio frequency receiver. As used herein, a "terminal" or "terminal device" may be portable, transportable, installed in a vehicle (aeronautical, maritime and/or land-based), or situated and/or configured to operate locally and/or in a distributed fashion at any other location on earth and/or in space. The "terminal", "smart terminal" or "terminal device" used herein may also be a communication terminal, a web-enabled terminal or a music/video playing terminal, for example a PDA, an MID (Mobile Internet Device) and/or a mobile phone with a music/video playing function, or a smart television, a set-top box, and the like.
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
The invention provides a method for identifying pet types, which can be applied to a control terminal, wherein the control terminal can be but is not limited to a tablet computer, a mobile phone or other intelligent equipment. This embodiment and the following embodiments are described by taking a mobile phone as an example. In the image recognition process, the image is generally uploaded to a server through a mobile phone, and the server analyzes the image to obtain a result and then transmits the result back to the mobile phone. Referring to fig. 1, fig. 1 is a flowchart illustrating a pet type identification method according to a first embodiment of the present invention. The invention provides a pet type identification method, which comprises the following steps:
S10, acquiring a pet image;
S20, extracting a plurality of body features of the pet according to the pet image;
S30, matching the plurality of physical characteristics of the pet with the data in the pet physical characteristic database to obtain the matched pet type.
As described in step S10, the pet image is first obtained. The image may be obtained by shooting with the mobile phone, by importing an image stored on the phone, or by loading an image from the network.
As described in step S20, the physical features of the pet can be extracted by existing recognition technology.
In step S30, feature data for a large number of pet types is stored in the pet body feature database, and for each pet type a large amount of individual pet feature information is stored as well. The database keeps statistics on the distribution of each feature within a pet type: if a feature is widely dispersed, it is not a distinguishing feature of that type, and its weight in the matching calculation is negligible or small; if a feature is narrowly dispersed, it is a distinguishing feature of that type, and its weight in the matching calculation is large. The specific weights differ from one pet type to another. Suppose pet X exhibits two distinguishing features, one belonging to type A pets and one belonging to type B pets. The device can then analyze whether the other features of pet X fall into the ranges of type A or type B; if only type A matches and type B does not, the device preferentially returns the picture and name of the type A pet. In the matching process, the pet body feature set is matched against the pet body feature database by comparing every feature variable in the set with the feature data of every pet type in the database, one by one, to obtain the matching result.
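The weighting rule described above — a narrowly dispersed feature is distinguishing and gets a large weight, a widely dispersed one a negligible weight — can be sketched as follows. The function name and the inverse-variance scheme are illustrative assumptions, not the patent's exact formula.

```python
import statistics

def feature_weights(samples, eps=1e-6):
    """Weight each feature by the inverse of its spread within one pet
    type: low dispersion -> distinguishing feature -> high weight.
    `samples` is a list of feature vectors, one per reference image."""
    n_features = len(samples[0])
    raw = []
    for i in range(n_features):
        values = [s[i] for s in samples]
        # Population variance measures how dispersed the feature is.
        raw.append(1.0 / (statistics.pvariance(values) + eps))
    total = sum(raw)
    return [w / total for w in raw]  # normalise so weights sum to 1
```

For example, a feature that is identical across all reference images of a type receives almost all the weight, while a feature that varies widely receives almost none.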
When a corresponding pet type is matched, the user can be informed by outputting the matching result. The output may be a picture or a set of pictures that includes the name of the pet type. The selected pictures come from a pet database containing pet images with the corresponding features in the pet body feature database, and are generally chosen so that the pet's pose matches that in the image to be identified. The output is not limited to a single result: results may be output in order of matching degree, with the highest matching degree displayed first, and a user who is not satisfied can switch to results with lower matching degrees.
Further, the present invention also proposes a second embodiment of the method for pet type identification based on the first embodiment of the method for pet type identification of the present invention, and the step S10 includes, unlike the first embodiment of the method for pet type identification:
S101, shooting a pet through a camera to obtain an initial pet image;
S102, recording displacement information of the camera when the initial pet image is shot;
S103, judging whether the displacement information is within a set displacement threshold range allowing identification, and taking the initial pet image as the pet image when the displacement information is judged to be within the displacement threshold range.
In step S101, in this scenario, the user aims at the pet through the mobile phone to take a picture of the pet, and obtains an image of the pet. The recognition device can read the model of the camera and the imaging condition. The imaging conditions include pixel values, ISO sensitivity coefficients, white balance parameters, and the like. The identification device can optimize the image according to the camera model and the imaging condition of the specific model, so that the pet image data is closer to the actual value.
In steps S102 and S103, a user who sees a pet may be excited, and when taking a picture with the mobile phone, the excited user's hands may shake so that the pet is not captured steadily. The method of this embodiment can detect whether the user shook too much during photographing for the photographed pet image to be usable for identifying the pet type.
The specific method is to obtain, through a three-axis gyroscope or another position-sensing accessory in the mobile phone, the device position when the pet image is shot. The position before imaging is recorded as (x0, y0, z0) and the position after imaging as (x1, y1, z1); the displacement ΔX is then calculated as:

ΔX = √((x1 − x0)² + (y1 − y0)² + (z1 − z0)²)
Whether the displacement exceeds the displacement threshold set for allowing identification is then judged. If it does not exceed the threshold, the image can be used for pet type identification; if it exceeds the threshold, the user is reminded to shoot again.
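A minimal sketch of the shake check in steps S101–S103, assuming three-axis positions from the gyroscope; the function name and return convention are illustrative, not from the patent.

```python
import math

def displacement_ok(before, after, threshold):
    """Camera-shake check: `before` and `after` are (x, y, z) positions
    recorded around the moment of imaging. The image is usable only if
    the Euclidean displacement stays within the allowed threshold.
    Returns (usable, displacement)."""
    dx = math.dist(before, after)  # sqrt((x1-x0)^2 + (y1-y0)^2 + (z1-z0)^2)
    return dx <= threshold, dx
```

If the check fails, the application would discard the initial image and prompt the user to shoot again.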
Further, the present invention also proposes a third embodiment of the method for pet type identification based on the first embodiment of the method for pet type identification of the present invention, and the step S20 includes, unlike the first embodiment of the method for pet type identification:
S21, extracting the overall outline of the pet from the pet image;
S22, locating the position areas of the face, the trunk, the tail and the four limbs from the overall outline of the pet;
S23, extracting face features, trunk features, tail features and limb features of the pet in the position areas of the face, the trunk, the tail and the limbs respectively, wherein the face features comprise eye features, mouth features, nose features, ear features and face contours.
In step S21, the pet contour is identified by building a neural network model. Through training on a large number of pictures, the device learns to distinguish the pet from the background in the pet image and thereby find the pet's contour.
Generally, recognizing facial features is most important in terms of recognizing the pet type. The recognition device can recognize the face contour of the pet by utilizing a training mechanism of the neural network model. And judging the environmental parameters of the surrounding environment according to the difference between the color of the pet face and the environmental color. And identifying the overall contour of the pet by combining the color in the contour of the face and the environmental parameters. Such as identifying the environment in which a cat is located, whether on bed sheets or on the floor, on grass or on the road. After the background is subtracted, the overall pet contour can be obtained.
In more complicated backgrounds, the overall pet contour obtained after subtracting the background is not the real pet contour and still includes part of the background. In that case a pet body contour model can be used for positioning, and only the reasonable contour portion is retained.
As described in step S22, once the pet contour is identified, it is divided into four regions: head, torso, limbs and tail. Because of the imaging angle, the features of the limbs and the tail may not be prominent in the image; when collecting feature statistics, these two kinds of features can be ignored as the situation requires.
In step S23, it is most important to identify the features of the head, which specifically include the shape of the ear, the color of the ear, the size of the ear relative to the face, the position of the ear, etc.; eye shape, eye size relative to the face, eye color, eye position on the face, etc.; nose shape, size of nose relative to face, nose color, location of nose on face, etc.; mouth shape, size of the mouth relative to the face, mouth color, position of the mouth on the face; the size of the face relative to the torso, the face color, the face shape, etc.
The characteristics of the torso mainly include the size of the torso relative to the head, the color and color distribution of the torso, and the like.
The characteristics of the limbs mainly include the length of the legs, the color of the legs, the thickness of the legs, etc.
The characteristics of the tail mainly include tail length, tail color and tail thickness. In actual shooting, however, the tail often does not appear completely in the image, so the device screens out only those features that can actually be extracted.
Each of the above body characteristics of the pet is represented by a variable. All pet features are extracted from the pet image and combined into the pet body feature set. Not every feature can be extracted during the extraction process; in the pet body feature set, the variable for a feature that could not be extracted is assigned the value 0.
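The convention of storing un-extractable features as 0 can be sketched as a simple lookup; the feature names below are illustrative placeholders, not the patent's actual variable list.

```python
def build_feature_set(extracted):
    """Assemble the pet body feature set. Any feature that could not be
    extracted (e.g. a tail hidden by the imaging angle) is stored as 0.
    `extracted` maps feature names to extracted values."""
    all_features = [
        "ear_shape", "ear_color", "eye_shape", "nose_shape",
        "mouth_shape", "face_shape", "torso_color",
        "leg_length", "tail_length",
    ]
    return {name: extracted.get(name, 0) for name in all_features}
```

A downstream matcher can then skip or down-weight any variable whose value is 0.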
Further, the present invention also proposes a fourth embodiment of the method for pet type identification based on the third embodiment of the method for pet type identification of the present invention, and unlike the third embodiment of the method for pet type identification, the data in the pet body characteristic database in step S30 includes:
a plurality of pet type matching image sets 301, wherein each pet type matching image set comprises one or more preset pet matching images, one of which carries a pet type mark and is designated the representative pet image;
a reference pet body feature set 302 extracted from each pet matching image, wherein the reference pet body feature set comprises a plurality of features of the pet in the pet matching image;
and a distinguishing feature set 303 corresponding to the matching image set of each pet type, wherein the distinguishing feature set represents one or more distinguishing features of a certain type of pets, and the distinguishing features are features for distinguishing the type of pets from other types of pets.
As shown in fig. 2, the pet body characteristic database includes characteristic data of a plurality of pet types, and the characteristic data of each pet type is divided into three data types, which are respectively a matching image set, a reference pet body characteristic set and a distinguishing characteristic set.
The matching image set 301 is the image set of a particular pet type; for the Persian cat, for example, the pet body feature database stores an image set of Persian cats. So that the data does not over-fit any particular individual pet, at most three images of a single pet are stored.
The reference pet physical characteristic set 302 is a pet physical characteristic set corresponding to pictures stored in each database. The method of the second embodiment and the method of the third embodiment can be referred to for extracting the pet physical features.
The distinguishing feature set 303 is the set of features used to distinguish this pet type from other types. By collecting images of pets of this type, extracting the body features of each image, and computing the distribution of each feature, the distinguishing features can be identified. Within the distinguishing feature set 303, important distinguishing features can be separated from general distinguishing features: a feature present in more than 80% of the pet images of a type is an important distinguishing feature, while one present in more than 60% but less than 80% is a general distinguishing feature. Within a single distinguishing feature, several data ranges may be set and assigned different weights; for example, a feature may be divided into the three data ranges [0,1], (1,2] and (2,3] with weights 0.8, 0.6 and 0.4 respectively, where a weight of 0.8 means that more than 80% of the pet images of this type fall in that range.
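The per-range weights in the example above ([0,1] → 0.8, (1,2] → 0.6, (2,3] → 0.4) can be sketched as a lookup. Treating the first interval as closed on both ends and the rest as half-open is an assumption about where the boundary values fall, since the patent does not spell this out.

```python
def range_weight(value, ranges):
    """Map a feature value to the weight of the data range it falls in.
    `ranges` is a list of (low, high, weight) tuples; the first range
    includes its lower bound, the rest are (low, high] intervals."""
    low0, high0, w0 = ranges[0]
    if low0 <= value <= high0:
        return w0
    for low, high, weight in ranges[1:]:
        if low < value <= high:
            return weight
    return 0.0  # value matches no known range for this pet type
```

Example: with `ranges = [(0, 1, 0.8), (1, 2, 0.6), (2, 3, 0.4)]`, a value of 1.5 falls in (1,2] and maps to weight 0.6.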
Further, the present invention also proposes a fifth embodiment of the method for pet type identification based on the fourth embodiment of the method for pet type identification of the present invention, and unlike the fourth embodiment of the method for pet type identification, the step S30 further includes:
S31, judging whether the pet body feature set contains a distinguishing feature set;
S32, if a distinguishing feature set is contained, judging whether more than one distinguishing feature set is contained;
S33, if yes, recording the contained distinguishing feature sets as screening distinguishing feature sets, and screening the screening distinguishing feature sets by a matching mechanism to obtain the matched pet type;
S34, if only one distinguishing feature set is contained, acquiring the pet type mark of the matching image set of the pet type corresponding to that distinguishing feature set, to obtain the matched pet type.
As shown in step S31, assume the pet body feature set is {a1, a2, a3, …, an}, denoted set A. The distinguishing feature set of one type is {{X1}, {X2}, {X3}}, denoted set X; assume here that it contains only three distinguishing features. The distinguishing feature set Q of another type is {{Q1}, {Q2}}; assume here that it contains only two distinguishing features. Let a1, a2, a3 correspond to {X1}, {X2}, {X3} respectively, and a4, a5 correspond to {Q1}, {Q2} respectively. If a1 ∈ {X1}, a2 ∈ {X2} and a3 ∈ {X3}, it can be determined that set A contains the distinguishing feature set X. If a4 ∈ {Q1} and a5 ∈ {Q2}, it is determined that set A also contains the distinguishing feature set Q. If some subset Xn of the distinguishing feature set X does not contain the corresponding feature element an of set A, then it can be determined that set A does not contain the distinguishing feature set X.
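The containment test above — an ∈ Xn for every n — can be sketched with plain sets. Representing set A as a name-to-value mapping and set X as a name-to-admissible-values mapping is an illustrative assumption.

```python
def contains_distinguishing_set(feature_set, distinguishing_set):
    """Return True when every distinguishing feature range of a pet type
    covers the corresponding feature of the pet body feature set A,
    i.e. a_n is an element of X_n for all n. `feature_set` maps feature
    names to values; `distinguishing_set` maps names to admissible sets."""
    return all(
        feature_set.get(name) in admissible
        for name, admissible in distinguishing_set.items()
    )
```

A feature missing from set A (`get` returns `None`) simply fails the membership test, so the type is rejected rather than matched by accident.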
As described in steps S32, S33 and S34, this step handles the different possible judgment results. If set A contains no distinguishing feature set, the matching is unsuccessful and an unidentifiable result is output. If set A contains exactly one distinguishing feature set, the representative pet image of the corresponding pet type is output; clicking the representative pet image loads the corresponding matched pet image set, letting the user examine the characteristics of different individuals of the pet type and deepen their understanding of it. If set A contains several distinguishing feature sets, a screening mechanism further filters the results: the matching degree of each distinguishing feature set is obtained and the matching results are output in order of matching degree.
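A minimal sketch of the containment test in steps S31–S34, under the assumption that each distinguishing feature is modeled as the set of values it accepts and that set A maps feature names to extracted values; all feature names and values below are hypothetical:

```python
# Sketch of steps S31-S34: test whether the extracted feature set A
# contains each type's distinguishing feature set. A distinguishing
# feature is modeled as the set of values it accepts; feature names
# and values are hypothetical.

def contains(feature_set, distinguishing):
    """Set A contains distinguishing set X only if, for every
    distinguishing feature, A's corresponding value lies in it."""
    return all(
        name in feature_set and feature_set[name] in accepted
        for name, accepted in distinguishing.items()
    )

A = {"ear_shape": "pointed", "eye_color": "blue", "tail_len": "long"}
X = {"ear_shape": {"pointed"}, "eye_color": {"blue", "green"}}   # type X
Q = {"ear_shape": {"round"}, "tail_len": {"long"}}               # type Q

matched = [name for name, d in {"X": X, "Q": Q}.items() if contains(A, d)]
print(matched)  # A satisfies X but not Q -> ["X"]
```

With one match, the single-result branch (S34) applies; with several, the screening mechanism of S33 would be invoked.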
Further, based on the fifth embodiment of the method for identifying a pet type of the present invention, the present invention further provides a sixth embodiment of the method for identifying a pet type, and different from the fifth embodiment of the method for identifying a pet type, in step S33, the screening the plurality of screened distinguishing feature sets by using a matching mechanism to obtain a matched pet type, including:
s331, obtaining a screening matching image set of a plurality of corresponding pet types of the screening distinguishing feature set;
s332, calling all pet images in the screening matching image set of each pet type, and obtaining a reference pet body feature set corresponding to each pet image;
s333, comparing the features in the pet body feature set with the features in the called reference pet body feature set one by one, and calculating the matching degree of the pet body feature set and each called reference pet body feature set;
S334, for each pet type, calculating the average of the matching degrees between the pet body feature set and all the retrieved reference pet body feature sets in that type's screened matching image set, so as to obtain the overall matching degree between the pet image and that pet type;
S335, taking the pet type whose reference pet body feature sets yield the highest overall matching degree as the matched pet type.
In steps S331 and S332, assuming the pet types corresponding to the screened distinguishing feature sets are B, C and D, the pet image sets corresponding to pet types B, C and D are retrieved, along with the reference pet body feature sets corresponding to all pet images in those sets. Taking pet type B as an example: if pet type B has n pet images, there are n corresponding reference pet body feature sets, denoted B1, B2, …, Bn, where a reference pet body feature set Bt = {b1, b2, …, bn} and t is a positive integer in the range [1, n].
In step S333, set A is compared with B1, B2, …, Bn in turn, and the corresponding matching degrees η1, η2, …, ηn are calculated. The overall matching degree ηB for pet type B can be obtained as the average of η1, η2, …, ηn. By the same method, the overall matching degree ηC of pet type C and the overall matching degree ηD of pet type D can be obtained.
In steps S334 and S335, the corresponding representative pet images are output in order of overall matching degree, with the pet type of the highest overall matching degree displayed first; clicking a representative pet image loads the corresponding matched pet image set. Assuming ηB is the highest, the user can view multiple type-B pet images in pet image set B, visually compare them with the actual pet, and reach their own judgment.
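The per-image matching degree η and the per-type average of steps S333–S335 could be sketched as follows; the "fraction of equal features" similarity is an assumed stand-in, since the text does not specify how a single matching degree is computed, and all feature names are hypothetical:

```python
# Sketch of steps S333-S335: compare set A with every reference
# feature set of a candidate type, average the per-image matching
# degrees into an overall degree, and pick the best type. The
# fraction-of-matching-features similarity is an assumption.

def match_degree(a, ref):
    """Fraction of reference features whose value equals A's value."""
    if not ref:
        return 0.0
    hits = sum(1 for k, v in ref.items() if a.get(k) == v)
    return hits / len(ref)

def overall_degree(a, ref_sets):
    """Average matching degree over all images of one pet type."""
    return sum(match_degree(a, r) for r in ref_sets) / len(ref_sets)

A = {"ear": "pointed", "eye": "blue", "coat": "white"}
candidates = {
    "B": [{"ear": "pointed", "eye": "blue"}, {"ear": "pointed", "eye": "green"}],
    "C": [{"ear": "round", "eye": "blue"}],
}
best = max(candidates, key=lambda t: overall_degree(A, candidates[t]))
print(best)  # eta_B = (1.0 + 0.5) / 2 = 0.75 > eta_C = 0.5 -> "B"
```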
Referring to fig. 3, fig. 3 is a schematic structural view of a pet type identification device according to a first embodiment of the present invention. The invention provides a pet type recognition device, which comprises:
the acquisition module 10 is used for acquiring pet images;
the characteristic extraction module 20 is used for extracting the body characteristics of the pet according to the pet image;
and the matching module 30 is used for matching the plurality of physical characteristics of the pet with the data in the pet physical characteristic database to obtain the matched pet type.
The acquisition module 10 first acquires a pet image, which may be captured with a mobile phone camera, imported from images stored on the phone, or loaded from the network.
The feature extraction module 20 is implemented by establishing a neural network model. Through a large amount of picture training, the device can distinguish the difference between the pet and the background in the pet image, and then find out the outline of the pet.
On the basis of the distinguished pet contour, the contour of the pet is divided into four areas: head, torso, limbs and tail. Owing to the imaging angle, the features of the limbs and the tail may not be clearly visible in the image; when tallying features, these two can be ignored as the actual situation requires.
The most important is the identification of the characteristics of the head, which specifically include the shape of the ear, the color of the ear, the size of the ear relative to the face, and the position of the ear; eye shape, eye size relative to the face, eye color, eye position on the face; nose shape, size of nose relative to face, nose color, location of nose on face; mouth shape, size of the mouth relative to the face, mouth color, position of the mouth on the face; size of the face relative to the torso, face color, face shape.
The characteristics of the torso include mainly the size of the torso relative to the head, the color and color distribution of the torso.
The characteristics of the limbs mainly include the length of the legs, the color of the legs and the thickness of the legs.
The characteristics of the tail mainly include tail length, tail color and tail thickness. In actual shooting, however, the tail often does not appear completely in the image, so the device intelligently selects only those features that can actually be extracted.
In the matching module 30, the pet body feature database stores data for a large number of pet types, and for each type it also stores feature information for many individual pets. The database counts the distribution of each feature within a type: if a feature is highly dispersed, it is not a distinguishing feature of that type, and its weight in the matching calculation is small or negligible; if its dispersion is small, it is a distinguishing feature of that type, and its weight in the matching calculation is large. The specific weights differ from one pet type to another. Suppose pet X has both distinguishing feature a, which belongs to type-A pets, and distinguishing feature b, which belongs to type-B pets. The device then analyzes whether the other features of pet X fall within the ranges of type-A or type-B pets; if only type A matches and type B does not, the device preferentially returns the picture and name of the type-A pet.
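One way to realize the dispersion rule above can be sketched as follows; using the population standard deviation as the dispersion measure and this particular weight formula are both assumptions, since the text names neither a statistic nor a formula:

```python
# Sketch of the dispersion rule in matching module 30: a feature whose
# values are widely spread across a type's images gets a small weight,
# a tightly clustered feature gets a large one. The standard-deviation
# measure and the 1/(1 + spread) weight shape are assumptions.

from statistics import pstdev

def feature_weight(values, spread_cutoff=1.0):
    """Small dispersion -> distinguishing feature -> large weight;
    large dispersion -> weight shrinks toward zero."""
    spread = pstdev(values)
    return 1.0 / (1.0 + spread / spread_cutoff)

tight = [2.0, 2.1, 1.9, 2.0]    # consistent across the type's images
loose = [0.5, 3.0, 1.2, 2.8]    # highly dispersed across the images

print(feature_weight(tight) > feature_weight(loose))  # True
```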
The output is the pet type name together with a picture or set of pictures of that type. The pictures come from a pet database containing pet images that correspond to the features in the pet body feature database, and the selected picture generally shows the same pose as the pet in the identified image. The output may be, but is not limited to, a single result: results can be output in order of matching degree, with the highest-matching result displayed first, and a user dissatisfied with that result can switch to the next result with a lower matching degree.
Fig. 4 is a schematic structural view of a pet type identification device according to a second embodiment of the present invention, as shown in fig. 4. Further, based on the first embodiment of the pet type recognition device of the present invention, the present invention also provides a second embodiment of the pet type recognition device. Unlike the first embodiment of the device for pet type identification, the acquisition module 10 further includes:
the image acquisition unit 101 is used for shooting the pet by a camera to obtain an initial pet image.
A displacement recording unit 102 for recording device displacement information when the pet image is photographed;
an image sharpness judging unit 103, configured to judge whether the displacement information is within a set displacement threshold range that allows identification, and if it is judged that the displacement information is within the displacement threshold range, take the initial pet image as a pet image.
In the image obtaining unit 101, the user aims at the pet through the mobile phone to take a picture of the pet, and obtains an image of the pet. The recognition device can read the model of the camera and the imaging condition. The imaging conditions include pixel values, ISO sensitivity coefficients, white balance parameters, and the like. The identification device can optimize the image according to the camera model and the imaging condition of the specific model, so that the pet image data is closer to the actual value.
In the displacement recording unit 102, a user who sees a pet may be excited, and in that state may be unable to hold the phone steady while photographing, so the captured image may be blurred. The device of this embodiment detects whether the hand shook too much during shooting, which would make the captured pet image unusable for pet type identification.
In the image sharpness determining unit 103, the device position when the pet image is shot is obtained through the phone's three-axis gyroscope or another position-sensing accessory. The position before imaging is recorded as (x0, y0, z0) and the position after imaging as (x1, y1, z1); the displacement ΔX is then calculated as:
ΔX = √[(x1 − x0)² + (y1 − y0)² + (z1 − z0)²]
The unit then judges whether the displacement ΔX exceeds the preset displacement threshold range that allows identification. If it does not exceed the threshold, the image can be used for pet type identification; if it does, the user is reminded to shoot again.
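The displacement test in units 102–103 can be sketched directly from the formula; the threshold value 0.05 below is an assumed placeholder, since the text only says the range is preset:

```python
# Sketch of the blur check: compute the device displacement between
# the gyroscope readings before and after imaging and compare it with
# a preset threshold. The threshold value 0.05 is a placeholder.

import math

def displacement(p0, p1):
    """Euclidean distance between positions (x0,y0,z0) and (x1,y1,z1)."""
    return math.sqrt(sum((b - a) ** 2 for a, b in zip(p0, p1)))

def usable_for_recognition(p0, p1, threshold=0.05):
    """True if the shot was steady enough to identify the pet type."""
    return displacement(p0, p1) <= threshold

print(usable_for_recognition((0.0, 0.0, 0.0), (0.01, 0.02, 0.0)))  # steady -> True
print(usable_for_recognition((0.0, 0.0, 0.0), (0.3, 0.1, 0.2)))    # shaky -> False
```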
Fig. 5 is a schematic structural view of a third embodiment of the pet type identification device of the present invention, as shown in fig. 5. Further, based on the first embodiment of the pet type recognition apparatus of the present invention, the present invention also provides a third embodiment of the pet type recognition apparatus, and different from the first embodiment of the pet type recognition apparatus, the feature extraction module 20 includes:
an overall contour extraction unit 21 for extracting an overall contour of the pet from the pet image;
a body position locating unit 22 for locating the position areas of the face, torso, tail and limbs from the overall contour of the pet;
a feature extraction unit 23, configured to extract facial features, trunk features, tail features, and limb features of the pet in the position areas of the face, trunk, tail, and limb, respectively, the facial features including eye features, mouth features, nose features, ear features, and face contour.
The overall contour extraction unit 21 identifies the pet contour by establishing a neural network model. Through a large amount of picture training, the device can distinguish the difference between the pet and the background in the pet image, and then find out the outline of the pet.
Generally, recognizing facial features is most important in terms of recognizing the pet type. The recognition device can recognize the face contour of the pet by utilizing a training mechanism of the neural network model. And judging the environmental parameters of the surrounding environment according to the difference between the color of the pet face and the environmental color. And identifying the overall contour of the pet by combining the color in the contour of the face and the environmental parameters. Such as identifying the environment in which a cat is located, whether on bed sheets or on the floor, on grass or on the road. After the background is subtracted, the overall pet contour can be obtained.
In some more complicated backgrounds, the overall pet contour obtained after subtracting the background is not the true pet contour and still includes part of the background; in that case a pet body contour model can be used to locate and retain the reasonable contour portion.
In the body position locating unit 22, on the basis of the identified pet contour, the contour of the pet is divided into four regions: head, torso, limbs and tail. Owing to the imaging angle, the features of the limbs and the tail may not be clearly visible in the image; when tallying features, these two can be ignored as the actual situation requires.
In the feature extraction unit 23, it is most important to identify the features of the head, which specifically include the shape of the ear, the color of the ear, the size of the ear relative to the face, the position of the ear, and the like; eye shape, eye size relative to the face, eye color, eye position on the face, etc.; nose shape, size of nose relative to face, nose color, location of nose on face, etc.; mouth shape, size of the mouth relative to the face, mouth color, position of the mouth on the face; the size of the face relative to the torso, the face color, the face shape, etc.
The characteristics of the torso mainly include the size of the torso relative to the head, the color and color distribution of the torso, and the like.
The characteristics of the limbs mainly include the length of the legs, the color of the legs, the thickness of the legs, etc.
The characteristics of the tail mainly include tail length, tail color and tail thickness. In actual shooting, however, the tail often does not appear completely in the image, so the device intelligently selects only those features that can actually be extracted.
Each of the above pet body features is represented by a variable. All pet features extracted from the pet image are combined into a pet body feature set. Not every feature can be extracted; in the pet body feature set, the variable representing a feature that could not be extracted is set to 0.
Combining the face feature, the trunk feature, the tail feature and the limb feature of the pet together to form a pet body feature set, wherein each feature comprises two parameters of color and shape.
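A tiny sketch of assembling the pet body feature set as just described, with unextractable features recorded as 0; the fixed schema of feature names and the integer feature values are assumptions for illustration:

```python
# Sketch of building the pet body feature set from the per-region
# features. Features that could not be extracted (e.g. a tail hidden
# from the camera) are stored as 0, as the text specifies. The schema
# of feature names and the integer values are hypothetical.

SCHEMA = ["ear_shape", "eye_color", "torso_size", "leg_length", "tail_length"]

def build_feature_set(extracted):
    """Map every feature in the schema to its extracted value, or 0."""
    return {name: extracted.get(name, 0) for name in SCHEMA}

# The tail did not appear in the shot, so tail_length is missing.
extracted = {"ear_shape": 3, "eye_color": 7, "torso_size": 2, "leg_length": 5}
print(build_feature_set(extracted)["tail_length"])  # unextracted -> 0
```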
Further, the present invention also proposes a fourth embodiment of the apparatus for pet type identification based on the third embodiment of the apparatus for pet type identification of the present invention, wherein the database of physical characteristics of pets comprises:
a plurality of pet type matching image sets 301, each of which includes one or more preset pet matching images, including one pet matching image carrying a pet type mark, and is marked as a representative pet image;
a reference pet body feature set 302 extracted from each pet matching image, wherein the reference pet body feature set comprises a plurality of features of the pet in the pet matching image;
and a distinguishing feature set 303 corresponding to the matching image set of each pet type, wherein the distinguishing feature set represents one or more distinguishing features of a certain type of pets, and the distinguishing features are features for distinguishing the type of pets from other types of pets.
The matching image set 301 is an image set of a certain pet type; for example, for the Persian cat, the pet body feature database stores an image set of Persian cats. So that the data does not overfit any particular individual pet, no more than three images of a single individual are stored.
The reference pet body feature set 302 is the pet body feature set corresponding to each picture stored in the database. The pet body features are extracted as described in the second and third embodiments.
The distinguishing feature set 303 refers to the set of features used to distinguish one type of pet from other types. By collecting pet images of the type, the body features of each pet image can be extracted and the distribution of each body feature counted. Within the distinguishing feature set 303, important distinguishing features can be separated from general distinguishing features: a feature present in more than 80% of the pet images of the type is an important distinguishing feature, while a feature present in more than 60% but less than 80% of the images is a general distinguishing feature. Within a single distinguishing feature, several data ranges may be set and different weights defined for different ranges. For example, a certain feature may be divided into three data ranges [0,1], (1,2] and (2,3], with [0,1] given a weight of 0.8, (1,2] a weight of 0.6, and (2,3] a weight of 0.4; a weight of 0.8 means that more than 80% of the pet images of the type have the feature value in that range.
Fig. 6 is a schematic structural view of a fifth embodiment of the pet type identification device of the present invention, as shown in fig. 6. Further, the present invention also proposes a fifth embodiment of the apparatus for pet type identification based on the fourth embodiment of the apparatus for pet type identification of the present invention, and the matching module 30 includes:
a matching unit 31, configured to determine whether the pet body feature set includes a distinguishing feature set;
a result analysis unit 32, configured to analyze whether there is more than one distinguishing feature set if the distinguishing feature set is included;
the multi-result processing unit 33 is configured to, when the result analysis unit has multiple results, record the included distinctive feature sets as screening distinctive feature sets, and screen the screening distinctive feature sets by using a matching mechanism to obtain matched pet types;
and a single result processing unit 34, configured to, when the result analysis unit has only one result, obtain a pet type label of the matching image set of the pet type corresponding to the distinguishing feature set, so as to obtain a matched pet type.
In the matching unit 31, assume the pet body feature set is {a1, a2, a3, …, an}, denoted set A. The distinguishing feature set of one type is {{X1}, {X2}, {X3}}, denoted set X; it is assumed here that the distinguishing feature set X contains only three distinguishing features. The distinguishing feature set of another type is {{Q1}, {Q2}}, denoted set Q; it is assumed here that the distinguishing feature set Q contains only two distinguishing features. Suppose a1, a2 and a3 correspond to {X1}, {X2} and {X3} respectively, and a4 and a5 correspond to {Q1} and {Q2} respectively. If a1 ∈ {X1}, a2 ∈ {X2} and a3 ∈ {X3}, it can be determined that set A contains the distinguishing feature set X. Likewise, if a4 ∈ {Q1} and a5 ∈ {Q2}, it can be determined that set A contains the distinguishing feature set Q. If any subset Xn of the distinguishing feature set X does not contain the corresponding feature element an of set A, it can be determined that set A does not contain the distinguishing feature set X.
In the result analysis unit 32, if set A contains no distinguishing feature set, an unrecognizable result is output. In the single result processing unit 34, if set A contains only one distinguishing feature set, the representative pet image of the corresponding pet type is output; clicking the representative pet image loads the corresponding matched pet image set, letting the user examine the characteristics of different individuals of the pet type and deepen their understanding of it. In the multi-result processing unit 33, if set A contains several distinguishing feature sets, a screening mechanism further filters the results: the matching degree of each distinguishing feature set is obtained and the matching results are output in order of matching degree.
Fig. 7 is a schematic structural view of a pet type identification device according to a sixth embodiment of the present invention, as shown in fig. 7. Further, the present invention also proposes a sixth embodiment of the apparatus for pet type recognition based on the fifth embodiment of the apparatus for pet type recognition of the present invention, and the matching mechanism in the multi-result processing unit 33 includes:
the calling unit 331 is configured to obtain a screening matching image set of pet types corresponding to the screening distinguishing feature set; calling all pet images in the screened matching image set to obtain a reference pet body characteristic set corresponding to each pet image;
a matching unit 332, configured to compare the features in the pet body feature set one by one with the features in each retrieved reference pet body feature set and calculate the matching degree between the pet body feature set and each retrieved reference pet body feature set; and, for each pet type, to average the matching degrees between the pet body feature set and all the retrieved reference pet body feature sets in that type's screened matching image set, so as to obtain the overall matching degree between the pet image and that pet type;
and the adaptation unit 333 is configured to take the pet type whose screened matching image set yields the highest overall matching degree as the matched pet type.
In the retrieving unit 331, if the pet types corresponding to the screened distinguishing feature sets are B, C and D, the pet image sets corresponding to pet types B, C and D are retrieved, along with the reference pet body feature sets corresponding to all pet images in those sets. Taking pet type B as an example: if pet type B has n pet images, there are n corresponding reference pet body feature sets, denoted B1, B2, …, Bn, where a reference pet body feature set Bt = {b1, b2, …, bn} and t is a positive integer in the range [1, n].
In the matching unit 332, set A is compared with B1, B2, …, Bn in turn, and the corresponding matching degrees η1, η2, …, ηn are calculated. The overall matching degree ηB for pet type B can be obtained as the average of η1, η2, …, ηn. By the same method, the overall matching degree ηC of pet type C and the overall matching degree ηD of pet type D can be obtained.
In the adaptation unit 333, the corresponding representative pet images are output in order of overall matching degree, with the pet type of the highest overall matching degree displayed first; clicking a representative pet image loads the corresponding matched pet image set. Assuming ηB is the highest, the user can view multiple type-B pet images in pet image set B, visually compare them with the actual pet, and reach their own judgment.
As shown in fig. 8, fig. 8 is a conceptual analysis diagram of the pet type recognition method and apparatus according to the present invention. The pet type identification method provided by the invention comprises the steps of judging the position of a pet (namely the outline of the pet) through a training mechanism, dividing the outline of the pet into four parts, namely a head part, a trunk part, four limbs and a tail part, extracting the characteristics of each part, matching the characteristics with data in a pet body characteristic database, and outputting a matching result.
The invention provides a pet type identification method and a device, wherein in the method, an intelligent terminal acquires a pet image; extracting a plurality of body features of the pet according to the pet image; and matching the plurality of physical characteristics of the pet with the data in the pet physical characteristic database to obtain the matched pet type, so that the user can quickly know the pet type, the understanding of the pet is improved, and the user is helped to select the favorite pet.
It is noted that the pet type identification method and the pet type identification apparatus of the present invention can be implemented as a computer readable medium of a computer program product. Further, the present invention provides a readable medium for pet type identification, wherein the readable medium for pet type identification stores a pet type identification method or a pet type identification device, and the readable medium for pet type identification has the functions of identifying an image and measuring the content of the image.
In other words, the pet type identifying method and the pet type identifying apparatus may be implemented in various forms, for example: hardware embodiments, software embodiments (including firmware, resident software, or microcode, etc.). In addition, the pet type identification method and the pet type identification apparatus can be implemented as software and hardware embodiments, for example: a pet type identification electronic module, and a pet type identification embedded device, but not limited thereto. In practical applications, the pet type identification method and the pet type identification device of the present invention can be implemented as a computer program product in a tangible medium, wherein the computer program product has a plurality of image program codes, and the image program codes include the pet type identification method or the pet type identification device.
Combinations of one or more computer usable or readable media may be utilized. For example, the computer-usable or computer-readable medium may be, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium. More specific examples of computer readable media include the following (non-limiting examples): an electrical connection consisting of one or more wires, a portable computer diskette, a hard disk drive, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc (CD-ROM), an optical storage device, a transmission media such as an Internet or intranet based connection, or a magnetic storage device. Note that the computer-usable or computer-readable medium could also be paper or any suitable medium upon which the program is printed for the purpose of electronically rendering the program, such as by optically scanning the paper or other medium, then compiled, interpreted, or otherwise processed in a suitable manner if necessary, and then stored again in a computer memory. In this context, a computer-usable or computer-readable medium may be any medium that can contain, store, communicate, propagate, or transport the program code for use by or in connection with the instruction execution system, apparatus, or device. The computer usable medium may include a propagated data signal with computer usable program code stored thereon, either in baseband or partially carrier form. The computer usable program code may be transmitted using any suitable medium, including but not limited to wireless, wireline, optical fiber cable, Radio Frequency (RF), etc.
Computer program code for carrying out operations of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk or C++, and conventional procedural programming languages such as the C programming language or similar languages. The program code may execute entirely on the user's computer, partly on the user's computer, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server, as a stand-alone software package. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made through an external computer (for example, through an Internet service provider).
The present invention is described with reference to flowchart illustrations and/or block diagrams of systems, apparatuses, methods and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and any combination of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, implement the functions or acts specified in the flowchart and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable medium that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable medium produce an article of manufacture including instructions which implement the function or act specified in the flowchart and/or block diagram block or blocks.
The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus create means for implementing the functions or acts specified in the flowchart and/or block diagram block or blocks.
The architecture, functionality, and operation of possible implementations of systems, devices, methods and computer program products according to various embodiments of the present invention are illustrated in the accompanying drawings. Accordingly, each block in the flowchart or block diagrams may represent a module, segment, or portion of program code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, blocks shown in two or more figures may be executed substantially concurrently, or the functions may be executed in reverse order, depending on the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The above description is only a preferred embodiment of the present invention and is not intended to limit its scope; any equivalent structural or process transformation made using the contents of this specification and the accompanying drawings, whether applied directly or indirectly in other related technical fields, likewise falls within the scope of protection of the present invention.

Claims (10)

1. A method of pet type identification, comprising:
acquiring a pet image;
extracting a plurality of body features of the pet according to the pet image;
matching the plurality of body features of the pet against data in a pet body feature database to obtain a matched pet type; wherein the pet type is not limited to a single result: the results are output in order of matching degree, the result with the highest matching degree is displayed first, and if the user is dissatisfied with the preferentially displayed result, the display can be switched to the next result with a lower matching degree;
performing a dispersion judgment on the extracted plurality of body features;
if the dispersion of a feature is large, that body feature is not a distinguishing feature of the pet type and is given a small weight, or ignored, in the matching calculation; if the dispersion is small, that body feature is a distinguishing feature of the pet type and is given a large weight in the matching calculation.
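The dispersion-based weighting in claim 1 can be sketched in Python as follows. This is a minimal illustration under assumptions, not the patented implementation: all function names are hypothetical, the threshold value is illustrative, and features are reduced to scalars in [0, 1] rather than real image descriptors.

```python
import statistics

def feature_weights(feature_samples, dispersion_threshold=0.1):
    """Assign a matching weight to each body feature based on its
    dispersion across the reference images of one pet type: a feature
    with large dispersion is not distinguishing and is ignored (weight
    0), while a feature with small dispersion gets a large weight."""
    weights = {}
    for name, values in feature_samples.items():
        dispersion = statistics.pvariance(values)
        if dispersion > dispersion_threshold:
            weights[name] = 0.0                       # not distinguishing: ignore
        else:
            weights[name] = 1.0 / (1.0 + dispersion)  # low dispersion: large weight
    return weights

def weighted_matching_degree(query, reference, weights):
    """Weighted matching degree between the extracted feature set and
    one reference feature set (scalar features assumed)."""
    total = sum(weights.values())
    if total == 0.0:
        return 0.0
    score = sum(w * (1.0 - abs(query[k] - reference[k]))
                for k, w in weights.items())
    return score / total
```

For example, if ear-shape values cluster tightly across one type's reference images while coat color scatters widely, the coat-color feature receives weight 0 and drops out of the matching calculation entirely.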
2. The method of pet type identification as claimed in claim 1, wherein said obtaining a pet image comprises:
shooting a pet by a camera to obtain an initial pet image;
recording displacement information of a camera when the initial pet image is shot;
and judging whether the displacement information is within a set displacement threshold range that still allows recognition, and taking the initial pet image as the pet image when the displacement information is judged to be within that range.
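The shake check of claim 2 amounts to a simple gate on the recorded displacement. A sketch, assuming the displacement is a scalar magnitude reported by the device's motion sensor and with an illustrative threshold value:

```python
def accept_initial_image(initial_image, displacement, threshold=2.0):
    """Use the initial pet image as the pet image only when the camera
    displacement recorded at shooting time lies within the threshold
    range that still allows recognition; otherwise reject the shot as
    too shaken, signaling that it should be retaken."""
    if abs(displacement) <= threshold:
        return initial_image
    return None  # shot rejected: displacement outside the allowed range
```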
3. The method of pet type identification according to claim 1, wherein said extracting a plurality of body features of the pet according to the pet image comprises:
extracting the overall contour of the pet from the pet image;
locating the position areas of the face, trunk, tail and limbs from the overall contour of the pet;
extracting face features, trunk features, tail features and limb features of the pet in the position areas of the face, trunk, tail and limbs respectively, wherein the face features comprise eye features, mouth features, nose features, ear features and the face contour;
and combining the face features, trunk features, tail features and limb features of the pet into a pet body feature set, wherein each feature comprises two parameters: color and shape.
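The feature set described in claim 3 can be modeled as a small data structure. The field names and the string encoding of the color and shape parameters are assumptions for illustration; a real system would store numeric descriptors:

```python
from dataclasses import dataclass, field
from typing import Dict, Optional

@dataclass
class BodyFeature:
    """Every extracted feature carries the two parameters named in the
    claim: a color descriptor and a shape descriptor."""
    color: str
    shape: str

@dataclass
class PetBodyFeatureSet:
    """Combination of the face, trunk, tail and limb features; the face
    bundle holds eye, mouth, nose and ear features plus the contour."""
    face: Dict[str, BodyFeature] = field(default_factory=dict)
    trunk: Optional[BodyFeature] = None
    tail: Optional[BodyFeature] = None
    limbs: Optional[BodyFeature] = None
```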
4. The method of pet type identification according to claim 3, wherein the data in the pet body feature database comprises:
a plurality of pet type matching image sets, wherein each pet type matching image set comprises one or more preset pet matching images, and each set contains one pet matching image that carries a pet type mark and is designated as the representative pet image;
a reference pet body feature set extracted from each pet matching image, the reference pet body feature set comprising a plurality of features of the pet in that pet matching image;
and a distinguishing feature set corresponding to the matching image set of each pet type, representing one or more distinguishing features of that pet type, namely the features used to distinguish it from other types of pets.
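One possible shape for the database entries of claim 4, with all names hypothetical and feature values simplified to plain dictionaries rather than image descriptors:

```python
from dataclasses import dataclass, field
from typing import Dict, List, Set

@dataclass
class PetMatchingImage:
    reference_features: Dict[str, float]  # reference pet body feature set
    carries_type_mark: bool = False       # True for the representative pet image

@dataclass
class PetTypeRecord:
    """One pet type in the body feature database: its matching image
    set, of which exactly one image carries the pet type mark, and the
    distinguishing feature set separating this type from the others."""
    type_mark: str
    matching_images: List[PetMatchingImage] = field(default_factory=list)
    distinguishing_features: Set[str] = field(default_factory=set)
```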
5. The method of pet type identification according to claim 4, wherein said matching the plurality of body features of the pet against data in the pet body feature database to obtain a matched pet type comprises:
judging whether the pet body feature set contains any distinguishing feature set;
if a distinguishing feature set is contained, judging whether more than one distinguishing feature set is contained;
if so, recording the contained distinguishing feature sets as screening distinguishing feature sets, and screening the plurality of screening distinguishing feature sets using a matching mechanism to obtain the matched pet type;
and if exactly one distinguishing feature set is contained, acquiring the pet type mark of the matching image set of the pet type corresponding to that distinguishing feature set, thereby obtaining the matched pet type.
6. The method of claim 5, wherein screening the plurality of screening distinguishing feature sets using a matching mechanism to obtain the matched pet type comprises:
acquiring the screening matching image set of the pet type corresponding to each screening distinguishing feature set;
retrieving all pet matching images in the screening matching image set to obtain the reference pet body feature set corresponding to each pet matching image;
comparing the features in the pet body feature set one by one with the features in each retrieved reference pet body feature set, and calculating the matching degree of the pet body feature set with each retrieved reference pet body feature set;
calculating the average of the matching degrees over all the reference pet body feature sets in the screening matching image set, and taking this average as the overall matching degree of the pet image with that pet type;
and taking the pet type corresponding to the screening matching image set with the highest overall matching degree as the matched pet type.
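Claims 5 and 6 together describe a screening pass: when several distinguishing-feature sets match, average the per-image matching degrees within each candidate type and keep the type with the highest overall degree. A sketch under assumptions (features reduced to scalars in [0, 1]; all names hypothetical):

```python
def matching_degree(query, reference):
    """Compare features one by one and average the per-feature
    similarities between two scalar feature sets."""
    shared = set(query) & set(reference)
    if not shared:
        return 0.0
    return sum(1.0 - abs(query[k] - reference[k]) for k in shared) / len(shared)

def screen_candidates(query, candidates):
    """candidates: list of (type_mark, reference_feature_sets) pairs.
    For each candidate type, average the matching degrees over all of
    its reference feature sets (the overall matching degree of the pet
    image with that type), then return the best-scoring type mark."""
    best_mark, best_overall = None, -1.0
    for type_mark, reference_sets in candidates:
        degrees = [matching_degree(query, ref) for ref in reference_sets]
        overall = sum(degrees) / len(degrees)
        if overall > best_overall:
            best_mark, best_overall = type_mark, overall
    return best_mark
```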
7. An apparatus for pet type identification, comprising:
the acquisition module is used for acquiring a pet image;
the feature extraction module is used for extracting a plurality of body features of the pet according to the pet image;
the matching module is used for matching the plurality of body features of the pet against the data in the pet body feature database to obtain a matched pet type; wherein the pet type is not limited to a single result: the results are output in order of matching degree, the result with the highest matching degree is displayed first, and if the user is dissatisfied with the preferentially displayed result, the display can be switched to the next result with a lower matching degree;
the matching module further performs a dispersion judgment on the extracted plurality of body features;
if the dispersion of a feature is large, that body feature is not a distinguishing feature of the pet type and is given a small weight, or ignored, in the matching calculation; if the dispersion is small, that body feature is a distinguishing feature of the pet type and is given a large weight in the matching calculation.
8. The apparatus for pet type identification as claimed in claim 7, wherein the feature extraction module comprises:
the overall contour extraction unit is used for extracting an overall contour of the pet from the pet image;
a body position locating unit for locating the position areas of the face, trunk, tail and limbs from the overall contour of the pet;
a feature extraction unit for extracting face features, trunk features, tail features and limb features of the pet in the position areas of the face, trunk, tail and limbs respectively, wherein the face features comprise eye features, mouth features, nose features, ear features and the face contour;
and for combining the face features, trunk features, tail features and limb features of the pet into a pet body feature set, wherein each feature comprises two parameters: color and shape.
9. The pet type identification device of claim 8, wherein the pet body feature database comprises:
a plurality of pet type matching image sets, wherein each pet type matching image set comprises one or more preset pet matching images, and each set contains one pet matching image that carries a pet type mark and is designated as the representative pet image;
a reference pet body feature set extracted from each pet matching image, the reference pet body feature set comprising a plurality of features of the pet in that pet matching image;
and a distinguishing feature set corresponding to the matching image set of each pet type, representing one or more distinguishing features of that pet type, namely the features used to distinguish it from other types of pets.
10. The apparatus for pet type identification of claim 9, wherein the matching module comprises:
a matching unit for judging whether the pet body feature set contains any distinguishing feature set;
a result analysis unit for judging, if a distinguishing feature set is contained, whether more than one distinguishing feature set is contained;
a multi-result processing unit for recording the contained distinguishing feature sets as screening distinguishing feature sets when the result analysis unit yields several results, and for screening the plurality of screening distinguishing feature sets using a matching mechanism to obtain the matched pet type;
and a single-result processing unit for acquiring, when the result analysis unit yields only one result, the pet type mark of the matching image set of the pet type corresponding to that distinguishing feature set, thereby obtaining the matched pet type.
CN201611032333.8A 2016-11-22 2016-11-22 Pet type identification method and device Active CN106577350B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201611032333.8A CN106577350B (en) 2016-11-22 2016-11-22 Pet type identification method and device
PCT/CN2017/074840 WO2018094892A1 (en) 2016-11-22 2017-02-24 Pet type recognition method and device, and terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201611032333.8A CN106577350B (en) 2016-11-22 2016-11-22 Pet type identification method and device

Publications (2)

Publication Number Publication Date
CN106577350A CN106577350A (en) 2017-04-26
CN106577350B true CN106577350B (en) 2020-10-09

Family

ID=58591589

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611032333.8A Active CN106577350B (en) 2016-11-22 2016-11-22 Pet type identification method and device

Country Status (2)

Country Link
CN (1) CN106577350B (en)
WO (1) WO2018094892A1 (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108875564A (en) * 2018-05-02 2018-11-23 浙江工业大学 A kind of pet face recognition method
CN109409319A (en) * 2018-11-07 2019-03-01 北京旷视科技有限公司 A kind of pet image beautification method, device and its storage medium
CN109274891B (en) * 2018-11-07 2021-06-22 北京旷视科技有限公司 Image processing method, device and storage medium thereof
CN113366525A (en) 2019-02-01 2021-09-07 雀巢产品有限公司 Pet food recommendation device and method
CN112101070B (en) * 2019-06-18 2022-09-02 财团法人农业科技研究院 Animal identity recognition system and method for improving recognition rate by nasal print
CN110704646A (en) * 2019-10-16 2020-01-17 支付宝(杭州)信息技术有限公司 Method and device for establishing stored material file
CN111666441A (en) * 2020-04-24 2020-09-15 北京旷视科技有限公司 Method, device and electronic system for determining personnel identity type
CN112132026B (en) * 2020-09-22 2024-07-05 深圳赛安特技术服务有限公司 Animal identification method and device
CN113091248B (en) * 2021-03-29 2022-09-06 青岛海尔空调器有限总公司 Control method, device and equipment of air conditioner and storage medium
CN113657318B (en) * 2021-08-23 2024-05-07 平安科技(深圳)有限公司 Pet classification method, device, equipment and storage medium based on artificial intelligence
CN115546838B (en) * 2022-10-20 2023-11-24 星宠王国(北京)科技有限公司 Wedding system and method based on dog face image recognition technology

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5134277A (en) * 1983-11-07 1992-07-28 Australian Meat And Live-Stock Corporation Remote data transfer system with ambient light insensitive circuitry
EP0821912A2 (en) * 1996-07-25 1998-02-04 Oki Electric Industry Co., Ltd. Animal body identifying device and body identifying system
CN101309357A (en) * 2007-05-18 2008-11-19 卡西欧计算机株式会社 Image pickup apparatus equipped with function of detecting image shaking
CN201409150Y (en) * 2008-12-19 2010-02-17 康佳集团股份有限公司 Mobile phone capable of identifying pet
CN102523380A (en) * 2011-11-10 2012-06-27 深圳市同洲电子股份有限公司 Method for preventing camera of mobile terminal from shaking, and mobile terminal
JP2012129933A (en) * 2010-12-17 2012-07-05 Casio Comput Co Ltd Imaging apparatus and program
CN102594857A (en) * 2010-10-11 2012-07-18 微软公司 Image identification and sharing on mobile devices
CN103124311A (en) * 2013-01-24 2013-05-29 广东欧珀移动通信有限公司 Mobile phone photographing method
WO2014018531A1 (en) * 2012-07-23 2014-01-30 Clicrweight, LLC Body condition score determination for an animal
CN105744164A (en) * 2016-02-24 2016-07-06 惠州Tcl移动通信有限公司 Photographing method and system for mobile terminal

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101320319B (en) * 2008-05-14 2010-06-09 山东大学 Apparatus for timely recognizing aquatic animal and working method thereof
AU2012253551A1 (en) * 2011-05-09 2014-01-09 Catherine Grace Mcvey Image analysis for determining characteristics of animal and humans
CN104573745B (en) * 2015-01-21 2017-10-03 中国计量学院 Fruit-fly classified method based on magnetic resonance imaging
CN105954281B (en) * 2016-04-21 2018-07-27 南京农业大学 A kind of paddy goes mouldy the method for fungus colony non-damage drive


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Research on Face Information Recognition Technology and Its Application to Virtual Pets; Liu Zishan; China Master's Theses Full-text Database, Information Science and Technology; 2013-06-15 (No. 6); full text *
Design of an Image-Recognition-Based Information Acquisition System for Endangered Animals; Gao Xu et al.; Electronics World; 2016-08-15 (No. 15); pp. 96, 99 *
Research on Pet Cat Face Detection Methods; Xie Suyi; China Master's Theses Full-text Database, Information Science and Technology; 2010-12-15 (No. 12); full text *
Research on Image Monitoring and Recognition Technology for Rare Wildlife Protection; Zeng Chenying; China Master's Theses Full-text Database, Information Science and Technology; 2015-10-15 (No. 10); full text *

Also Published As

Publication number Publication date
CN106577350A (en) 2017-04-26
WO2018094892A1 (en) 2018-05-31

Similar Documents

Publication Publication Date Title
CN106577350B (en) Pet type identification method and device
US8923570B2 (en) Automated memory book creation
KR20180081104A (en) A method for acquiring and analyzing aerial images
JP5875917B2 (en) Moving body image discrimination apparatus and moving body image discrimination method
CN111753594B (en) Dangerous identification method, device and system
US20160179846A1 (en) Method, system, and computer readable medium for grouping and providing collected image content
EP2709062B1 (en) Image processing device, image processing method, and computer readable medium
CN111723729A (en) Intelligent identification method for dog posture and behavior of surveillance video based on knowledge graph
KR100956159B1 (en) Apparatus and auto tagging method for life-log
US20170200068A1 (en) Method and a System for Object Recognition
US8593557B2 (en) Shooting assist method, program product, recording medium, shooting device, and shooting system
US20210390335A1 (en) Generation of labeled synthetic data for target detection
CN109685106A (en) A kind of image-recognizing method, face Work attendance method, device and system
WO2019198611A1 (en) Feature estimation device and feature estimation method
CN109120844A (en) Video camera controller, camera shooting control method and storage medium
CN110728188A (en) Image processing method, device, system and storage medium
JP2022100358A (en) Search method and device in search support system
JP2023526390A (en) Golf course divot detection system and detection method using the same
KR20180094554A (en) Apparatus and method for reconstructing 3d image
US20180260620A1 (en) Scouting image management system
CN110581950B (en) Camera, system and method for selecting camera settings
CN109960965A (en) Methods, devices and systems based on unmanned plane identification animal behavior
US10860876B2 (en) Image presentation system, image presentation method, program, and recording medium
CN112733809B (en) Intelligent image identification method and system for natural protection area monitoring system
CN111247790A (en) Image processing method and device, image shooting and processing system and carrier

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20210915

Address after: 518000 201, No.26, yifenghua Innovation Industrial Park, Xinshi community, Dalang street, Longhua District, Shenzhen City, Guangdong Province

Patentee after: Shenzhen waterward Information Co.,Ltd.

Address before: 5 / F, block B, huayuancheng digital building, 1079 Nanhai Avenue, Nanshan District, Shenzhen City, Guangdong Province

Patentee before: SHENZHEN WATER WORLD Co.,Ltd.