CN114283452A - Animal character analysis method, device, electronic equipment and storage medium - Google Patents

Animal character analysis method, device, electronic equipment and storage medium

Info

Publication number
CN114283452A
Authority
CN
China
Prior art keywords
animal
character
nose
information
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111626480.9A
Other languages
Chinese (zh)
Inventor
彭永鹤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
New Ruipeng Pet Healthcare Group Co Ltd
Original Assignee
New Ruipeng Pet Healthcare Group Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by New Ruipeng Pet Healthcare Group Co Ltd filed Critical New Ruipeng Pet Healthcare Group Co Ltd
Priority to CN202111626480.9A priority Critical patent/CN114283452A/en
Publication of CN114283452A publication Critical patent/CN114283452A/en
Pending legal-status Critical Current

Landscapes

  • Image Analysis (AREA)

Abstract

The embodiment of the application discloses an animal character analysis method, an animal character analysis device, an electronic device, and a storage medium. The method includes the following steps: determining breed information corresponding to an animal and acquiring nose print information of the animal; determining a character database corresponding to the animal according to the breed information of the animal; acquiring subjective feedback information of a user; and determining the character of the animal according to the nose print information, the subjective feedback information, and the character database. In the embodiment of the application, the electronic device can preliminarily determine the character of the animal from the animal's nose print information.

Description

Animal character analysis method, device, electronic equipment and storage medium
Technical Field
The application relates to the technical field of deep learning, in particular to an animal character analysis method and device, electronic equipment and a storage medium.
Background
Because of an animal's growth environment and its own character development, different animals have different characters, such as mild, quiet, enthusiastic, or fierce.
However, when an adopter wants to adopt an animal, the animal's character is unknown to the adopter, who is meeting the animal for the first time. This can make it difficult for the adopter to decide which animal to adopt, and may even result in the animal later being abandoned because its character proves unsuitable.
Disclosure of Invention
The embodiment of the application provides an animal character analysis method and device, electronic equipment and a storage medium. The animal character analysis method can analyze the character of the animal.
In a first aspect, the present embodiments provide a method for analyzing animal character, including:
determining breed information corresponding to the animal, and acquiring nose print information of the animal;
determining a character database corresponding to the animal according to the breed information of the animal;
acquiring subjective feedback information of a user;
and determining the character of the animal according to the nose print information, the subjective feedback information and the character database.
In a second aspect, embodiments of the present application provide an animal character analysis device, including:
the first determining module is used for determining breed information corresponding to the animal and acquiring the nose print information of the animal;
the second determining module is used for determining a character database corresponding to the animal according to the breed information of the animal;
the acquisition module is used for acquiring subjective feedback information of a user;
and the analysis module is used for determining the character of the animal according to the nose print information, the subjective feedback information and the character database.
In a third aspect, an electronic device is provided in an embodiment of the present application, including a memory storing executable program code, a processor coupled to the memory; the processor calls the executable program code stored in the memory to execute the steps in the animal character analysis method provided by the embodiment of the application.
In a fourth aspect, embodiments of the present application provide a storage medium storing a plurality of instructions, where the instructions are suitable for being loaded by a processor to perform steps in the animal character analysis method provided by embodiments of the present application.
In the embodiment of the application, the electronic equipment determines the breed information corresponding to the animal and acquires the nose print information of the animal; determining a character database corresponding to the animal according to the breed information of the animal; acquiring subjective feedback information of a user; and determining the character of the animal according to the nose print information, the subjective feedback information and the character database. In the embodiment of the application, the electronic device can preliminarily determine the character of the animal through the nose print information of the animal.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application, and those skilled in the art can obtain other drawings from these drawings without creative effort.
Fig. 1 is a schematic view of a first process of an animal character analysis method provided in an embodiment of the present application.
Fig. 2 is a scene schematic diagram of acquiring nasal print information according to an embodiment of the present application.
Fig. 3 is a schematic diagram of a second process of the animal character analysis method provided in the embodiment of the present application.
Fig. 4 is a schematic structural diagram of an animal character analysis device provided in an embodiment of the present application.
Fig. 5 is a schematic structural diagram of an electronic device provided in an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Because of an animal's growth environment and its own character development, different animals have different characters, such as mild, quiet, enthusiastic, or fierce.
However, when an adopter wants to adopt an animal, the animal's character is unknown to the adopter, who is meeting the animal for the first time. This can make it difficult for the adopter to decide which animal to adopt, and may even result in the animal later being abandoned because its character proves unsuitable.
Therefore, preliminarily determining the character of an animal is a technical problem to be solved.
In order to solve this technical problem, embodiments of the present application provide an animal character analysis method, an animal character analysis device, an electronic device, and a storage medium. The animal character analysis method can analyze the character of an animal.
The animal character analysis method can be applied to common electronic devices such as computers, mobile phones, and tablet computers, and is also suitable for wearable electronic devices such as smart glasses, smart watches, and smart rings. No limitation is intended here.
Referring to fig. 1, fig. 1 is a first flowchart of a method for analyzing animal characters according to an embodiment of the present disclosure. The animal character analysis method can comprise the following steps:
110. Determine breed information corresponding to the animal, and acquire nose print information of the animal.
In some embodiments, the electronic device may capture a picture of the animal through its camera and then analyze the breed information of the animal from that picture. For example, the electronic device may recognize the animal's body shape to determine which family it belongs to, and recognize characteristics such as coat pattern, coat color, and voice to determine which breed it belongs to, thereby determining the breed information of the animal.
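As a rough sketch of how such breed recognition could be wired up, the following Python fragment assumes a generic pretrained torchvision classifier and a caller-supplied label list; the function name, the label list, and the mapping from predicted label to breed information are illustrative assumptions rather than the implementation specified by this application.

import torch
from torchvision import models, transforms
from PIL import Image

# Standard ImageNet-style preprocessing for the pretrained classifier.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def predict_breed(image_path, labels):
    """Return the most likely breed label for the animal in the photo.
    'labels' is a hypothetical list mapping class indices to breed names."""
    model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
    model.eval()
    img = preprocess(Image.open(image_path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        logits = model(img)
    return labels[int(logits.argmax(dim=1))]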
The user may also actively enter breed information for the animal on the electronic device.
Referring to fig. 2, fig. 2 is a schematic view of a scene for acquiring nose print information according to an embodiment of the present disclosure.
The electronic device S10 is provided with a sensor S11, which can scan the animal's nose to obtain the animal's nose print information. The sensor S11 may be a laser sensor, an ultrasonic sensor, an image sensor, or the like, and the electronic device can scan the animal's nose from within a certain distance to obtain the nose print information.
Specifically, when the electronic device scans with laser light, the animal's nose can be scanned by a ToF (Time of Flight) sensor. After the laser strikes the nose, because different nose prints lie at different depths, the time for the laser to reach different nose prints differs, so the time for the different laser signals to reflect back to the laser sensor also differs. A phase difference is therefore produced between each emitted laser signal and the corresponding reflected laser signal, and from these phase differences the electronic device constructs a three-dimensional nose print image corresponding to the animal's nose prints, that is, a 3D nose print image.
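For a continuous-wave ToF sensor, the phase difference maps to depth as d = c * delta_phi / (4 * pi * f_mod). The short sketch below illustrates this conversion; the modulation frequency and the placeholder phase map are assumptions for demonstration only.

import numpy as np

C = 299_792_458.0  # speed of light in m/s

def phase_to_depth(phase_diff, mod_freq_hz=60e6):
    """Convert per-pixel phase differences (radians) from a continuous-wave
    ToF sensor into depths (metres): d = c * phi / (4 * pi * f_mod)."""
    return C * phase_diff / (4.0 * np.pi * mod_freq_hz)

# A per-pixel phase map measured over the nose region yields a depth map,
# which serves as the 3D nose print image described above.
phase_map = np.random.uniform(0.0, 0.02, size=(128, 128))  # placeholder data
depth_map = phase_to_depth(phase_map)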
The electronic device can also scan the animal's nose prints with ultrasound by transmitting ultrasonic waves toward the nose. Because different nose prints lie at different depths, the time for the ultrasonic waves to reach different nose prints differs, so the time for the different ultrasonic signals to reflect back to the ultrasonic sensor also differs. A phase difference is produced between each transmitted ultrasonic signal and the corresponding reflected ultrasonic signal, and the electronic device constructs a three-dimensional nose print image of the animal's nose prints from these phase differences.
The electronic device can determine the number of nose prints and the depth of the nose prints from the three-dimensional nose print image. For example, peaks and valleys exist in the 3D nose print image and alternate to form the nose prints; the number of nose prints can be determined from the number of valleys, that is, the number of valleys can be taken as the number of nose prints, and the depth from a peak to a valley is the depth of the corresponding nose print.
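As an illustration of this counting, the sketch below assumes a one-dimensional depth profile sampled across the 3D nose print image and uses scipy peak finding; the profile extraction itself is not shown, and this is only one possible reading of the description.

import numpy as np
from scipy.signal import find_peaks

def nose_print_stats(depth_profile):
    """Count nose prints as the number of valleys in a 1D depth profile taken
    across the nose, and estimate the mean peak-to-valley depth."""
    valleys, _ = find_peaks(-depth_profile)  # valleys are peaks of the negated signal
    peaks, _ = find_peaks(depth_profile)
    n_prints = len(valleys)
    if len(peaks) and len(valleys):
        mean_depth = float(depth_profile[peaks].mean() - depth_profile[valleys].mean())
    else:
        mean_depth = 0.0
    return n_prints, mean_depth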
The electronic device can also capture images of the animal's nose with the camera, optimize these images to obtain a plurality of optimized images, derive a nose print image of the animal from the optimized images, and then recognize the nose print image to determine the number of the animal's nose prints and their distribution.
For example, the electronic device may capture a plurality of images of the animal's nose from the same shooting position, perform grayscale processing on these images to obtain a plurality of grayscale images, and apply processing such as brightness adjustment, contrast adjustment, and sharpening to the grayscale images to obtain a plurality of optimized images. The electronic device then performs image fusion on the optimized images to obtain the nose print image.
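A minimal preprocessing-and-fusion sketch is shown below, assuming OpenCV; the brightness and contrast factors, the sharpening kernel, and averaging as the fusion step are illustrative stand-ins for whatever concrete optimization and fusion the device applies.

import cv2
import numpy as np

SHARPEN_KERNEL = np.array([[0, -1, 0],
                           [-1, 5, -1],
                           [0, -1, 0]], dtype=np.float32)

def optimize_and_fuse(nose_images):
    """Grayscale, adjust brightness/contrast, and sharpen each capture of the
    same nose, then fuse the optimized images into one nose print image by
    simple averaging."""
    optimized = []
    for img in nose_images:
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        adjusted = cv2.convertScaleAbs(gray, alpha=1.2, beta=10)  # contrast, brightness
        sharpened = cv2.filter2D(adjusted, -1, SHARPEN_KERNEL)
        optimized.append(sharpened.astype(np.float32))
    fused = np.mean(optimized, axis=0)  # naive fusion by averaging
    return fused.astype(np.uint8)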
The electronic device can then recognize the nose print image to determine the final number of the animal's nose prints and their distribution.
For example, because the peaks and valleys of the nose prints differ, the pixel information corresponding to peaks and valleys in the nose print image also differs. For instance, the gray values of peaks and valleys in the nose print image are different, and the gray value of a peak-region pixel is lower than that of a valley-region pixel. The electronic device can therefore identify peaks and valleys from the gray values of the pixels, determine the number of nose prints from the number of valleys, and determine the distribution of the nose prints from the distribution of the peaks and valleys.
After the electronic device obtains the image of the animal's nose, the image can be fed into a neural network model for image segmentation. For example, image segmentation models such as a U-Net model or an Encoder-Decoder model can be used to segment the nose image, that is, to separate the regions corresponding to peaks from the regions corresponding to valleys, thereby obtaining the image corresponding to the peaks and/or the image corresponding to the valleys of the nose prints. The number of nose prints and their distribution are then determined from the image corresponding to the peaks and/or the image corresponding to the valleys.
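A segmentation step of this kind could look like the sketch below, assuming the third-party segmentation_models_pytorch package and a two-class (peak vs. valley) setup; the encoder choice and the untrained weights are assumptions, since the application only names U-Net and Encoder-Decoder models in general terms.

import torch
import segmentation_models_pytorch as smp

# Two output classes: 0 = peak (ridge) pixels, 1 = valley (groove) pixels.
model = smp.Unet(encoder_name="resnet34", in_channels=1, classes=2)
model.eval()

def segment_nose_print(gray_image):
    """gray_image: (1, 1, H, W) float tensor, with H and W divisible by 32 for
    this encoder. Returns per-pixel class ids from which the number and
    distribution of nose prints can be derived."""
    with torch.no_grad():
        logits = model(gray_image)
    return logits.argmax(dim=1)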
It should be noted that the nose prints on the animal's nose are formed by alternating peaks and valleys, so once the image corresponding to the peaks or the image corresponding to the valleys is obtained, the number of nose prints can be determined from the number of peaks or the number of valleys.
In some embodiments, after the electronic device obtains the number of nose prints, the depth of the nose prints, and the distribution of the nose prints, the nose print information of the animal can be generated from this information.
Specifically, the electronic device may determine at least one target area on the nose of the animal and obtain the nose print information corresponding to the at least one target area.
The nose prints are more pronounced in some areas of the animal's nose, such as the middle of the nose, while other areas, such as the edge of the nose, have no obvious nose print features. At least one target area can therefore be determined in the middle of the animal's nose, and the nose print information corresponding to that target area can then be obtained.
The electronic device may also determine the nostril regions and the nose edge region, remove them, take the remaining region of the nose as the region from which nose prints can be acquired, and then determine at least one target area within that region.
In some embodiments, the electronic device can obtain the number of nose prints and/or the nose print depth corresponding to at least one target area, and then determine the nose print information of the animal according to the number of nose prints and/or the nose print depth.
For example, the electronic device may determine the distribution of the depth of each nose print, obtain a nose print depth distribution map, and then determine the nose print information of the animal according to the nose print depth distribution map. As described above, the depth of each nose print in the target area may be obtained by laser scanning, ultrasonic scanning, or the like, and a nose print depth distribution map is then constructed from the depth of each nose print.
In some embodiments, the electronic device may further determine a plurality of target peak points and a plurality of target valley points within a target area of the three-dimensional nose print image (the nose print depth distribution map), where the target peak points are peak points whose height exceeds a first preset height and the target valley points are valley points whose height is below a second preset height. The electronic device then determines the vector distance between each target peak point and its nearest target valley point, and generates the animal's nose print information from these vector distances.
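The numpy/scipy sketch below illustrates one way to compute these vector distances on a height map of the target area; the two threshold parameters correspond to the first and second preset heights, and how the resulting vectors are packed into nose print information is left open, as in the description above.

import numpy as np
from scipy.spatial import cKDTree

def peak_to_valley_vectors(height_map, first_preset_height, second_preset_height):
    """Find target peak points (height above the first preset height) and
    target valley points (height below the second preset height), and return,
    for each target peak, the vector to its nearest target valley."""
    peaks = np.argwhere(height_map > first_preset_height)    # (row, col) coordinates
    valleys = np.argwhere(height_map < second_preset_height)
    if len(peaks) == 0 or len(valleys) == 0:
        return np.empty((0, 2))
    tree = cKDTree(valleys)
    _, nearest = tree.query(peaks)   # index of the nearest valley for each peak
    return valleys[nearest] - peaks  # vectors from each peak to its nearest valley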
In some embodiments, the electronic device may further determine a plurality of target sub-areas within the target area, determine the number of nose prints corresponding to each target sub-area, and then determine the nose print information of the animal according to the number of nose prints corresponding to each target sub-area.
For example, after the electronic device determines the target area on the animal's nose, the target area may be divided into a plurality of target sub-areas, and the number of nose prints in each target sub-area is then determined. The electronic device generates the nose print information of the animal according to the number of nose prints corresponding to each target sub-area.
When the target area is divided, it may be divided according to a preset division rule. For example, the shape of the animal's nose is determined first, and the shape of each target sub-area is then determined according to the shape of the nose. The number of target sub-areas is determined by the area covered by the target area; for example, the larger the target area, the more target sub-areas are divided, and the smaller the target area, the fewer target sub-areas are divided.
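As a sketch of this division, the code below splits a binary nose print mask of the target area into a grid of sub-areas and counts connected print components in each; the rule tying grid size to the area of the target region is an illustrative assumption.

import numpy as np
from scipy import ndimage

def grid_for_area(area_px):
    """Illustrative rule of thumb: larger target areas get more sub-areas."""
    side = max(2, int(np.sqrt(area_px) // 100))
    return side, side

def count_prints_per_subarea(print_mask, grid):
    """Divide a binary nose print mask into grid[0] x grid[1] sub-areas and
    count connected nose print components in each sub-area."""
    rows, cols = grid
    h, w = print_mask.shape
    counts = np.zeros((rows, cols), dtype=int)
    for i in range(rows):
        for j in range(cols):
            block = print_mask[i * h // rows:(i + 1) * h // rows,
                               j * w // cols:(j + 1) * w // cols]
            _, n_components = ndimage.label(block)
            counts[i, j] = n_components
    return counts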
In some embodiments, after the electronic device determines at least one target area on the animal's nose, it may obtain the number of nose prints in the target area and the depth of each nose print, and then generate the animal's nose print information according to the number of nose prints and the depth of each nose print.
The above description of acquiring the animal's nose print information is only an example; in practice, the nose print information of the animal may be determined in other ways. The nose print information of the animal can reflect the identity of the animal.
120. Determine a character database corresponding to the animal according to the breed information of the animal.
In some embodiments, different breeds of animals have different typical characters. For example, among dogs, Golden Retrievers tend to be mild, Teddy dogs tend to be lively, and Poodles tend to be lively and lovable. Each animal may exhibit several characters, but animals of a given breed generally share one dominant character; for example, most Golden Retrievers are mild and docile.
The electronic device can determine the character database corresponding to the animal's breed from the breed information. For example, the characters associated with the Poodle include lively and lovable, timid and afraid of people, and easily frightened; these characters can form the character database corresponding to the Poodle.
130. Acquire subjective feedback information of the user.
When the user interacts with the animal, the animal's character can be roughly judged from its actions, its cries, its expressions, and so on.
The user can then enter on the electronic device the approximate character of the animal as judged subjectively. For example, when the user encounters a cat that actively approaches the user, meows, and rubs against people, the user can subjectively judge that the cat is affectionate toward people, and the character entered for the cat can be, for example, gentle and clingy.
140. Determine the character of the animal according to the nose print information, the subjective feedback information, and the character database.
After the electronic device obtains the nose print information, the subjective feedback information, and the character database, it can generate an input sample from the nose print information and the subjective feedback information, and then input the input sample into a character analysis model to determine the character analysis result of the animal. Finally, the character analysis result is matched against the character database to determine the character of the animal.
In some embodiments, before the input sample is input into the character analysis model, the base model corresponding to the character analysis model needs to be trained to obtain the character analysis model.
For example, a large amount of data pairing animal nose prints with animal characters may be obtained and used as training data. The training data includes multiple sets of correspondences between nose prints and characters, such as a set pairing the nose print of animal A with the character of animal A, and a set pairing the nose print of animal B with the character of animal B.
It should be noted that the training data includes positive sample data and negative sample data, the positive sample data may be a correct correspondence between an animal nose print and an animal character, and the negative sample data may be an incorrect correspondence between an animal nose print and an animal character.
A corresponding loss function is set for the base model, the training data are input into the base model, and the base model is trained over multiple rounds until it converges; at that point the training of the base model is considered complete and the character analysis model is obtained.
Specifically, the electronic device may compare the character analysis result output by the base model with the actual character of the animal; if the similarity between the two is greater than a preset similarity threshold, the base model is considered to have been trained, and the base model can then be taken as the character analysis model.
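A minimal training sketch under assumed choices (PyTorch, a cross-entropy loss over discrete character classes, a fixed number of rounds) is given below; the base model architecture and the exact convergence test are not specified by this application.

import torch
import torch.nn as nn

def train_base_model(model, loader, epochs=20, lr=1e-3):
    """Train the base model on (input sample, character label) pairs built from
    the positive and negative training data, then return it as the character
    analysis model."""
    criterion = nn.CrossEntropyLoss()  # assumed loss function
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    model.train()
    for _ in range(epochs):
        for samples, characters in loader:
            optimizer.zero_grad()
            loss = criterion(model(samples), characters)
            loss.backward()
            optimizer.step()
    return model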
In some embodiments, the electronic device may generate a first feature vector based on the nose print information, generate a second feature vector based on the subjective feedback information, and finally generate the input sample based on the first feature vector and the second feature vector.
In some embodiments, the electronic device may normalize the first feature vector and the second feature vector to obtain a first normalized vector and a second normalized vector, and then generate the input sample from the first normalized vector and the second normalized vector. Such as adding the first normalized vector and the second normalized vector to generate the input sample.
In the process of normalizing the first feature vector and the second feature vector, a normalization function may be used to map the values of the first feature vector and the second feature vector into a certain range. For example, a Sigmoid function is used to normalize the values of the first feature vector and the second feature vector into the range of 0 to 1. This facilitates the subsequent processing of the input sample by the character analysis model and improves the model's prediction results.
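A minimal sketch of this normalization and combination, assuming the two feature vectors have already been extracted and share the same length:

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def build_input_sample(nose_print_vec, feedback_vec):
    """Normalize the first (nose print) and second (subjective feedback)
    feature vectors into the 0-1 range with a Sigmoid function, then add them
    element-wise to form the input sample."""
    first_normalized = sigmoid(np.asarray(nose_print_vec, dtype=float))
    second_normalized = sigmoid(np.asarray(feedback_vec, dtype=float))
    return first_normalized + second_normalized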
In some embodiments, after the input sample is input into the character analysis model, the character analysis model outputs the corresponding character analysis result based on the input sample. The electronic device can then match the character analysis result with the character database to determine the character of the animal.
Specifically, the electronic device may determine cosine similarity between the character analysis result and each character in the character database, and if the cosine similarity is greater than a preset similarity threshold, determine that the character corresponding to the cosine similarity is the character of the animal.
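The matching step could be sketched as follows, assuming each character in the character database is represented by a vector comparable to the character analysis result; the 0.8 value is only a placeholder for the preset similarity threshold.

import numpy as np

def match_character(result_vec, character_db, threshold=0.8):
    """Return the first character whose database vector has cosine similarity
    with the analysis result above the preset threshold, or None if no
    character matches."""
    result_vec = np.asarray(result_vec, dtype=float)
    for name, vec in character_db.items():
        vec = np.asarray(vec, dtype=float)
        cos = float(np.dot(result_vec, vec) /
                    (np.linalg.norm(result_vec) * np.linalg.norm(vec) + 1e-12))
        if cos > threshold:
            return name
    return None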
In the embodiment of the application, the electronic equipment determines the breed information corresponding to the animal and acquires the nose print information of the animal; determining a character database corresponding to the animal according to the breed information of the animal; acquiring subjective feedback information of a user; and determining the character of the animal according to the nose print information, the subjective feedback information and the character database. In the embodiment of the application, the electronic device can preliminarily determine the character of the animal through the nose print information of the animal.
For a more detailed understanding of the animal character analysis method provided in the embodiments of the present application, please refer to fig. 3, and fig. 3 is a second flowchart of the animal character analysis method provided in the embodiments of the present application. The animal character analysis method can comprise the following steps:
201. Determine breed information corresponding to the animal.
In some embodiments, the electronic device may capture a picture of the animal through its camera and then analyze the breed information of the animal from that picture. For example, the electronic device may recognize the animal's body shape to determine which family it belongs to, and recognize characteristics such as coat pattern, coat color, and voice to determine which breed it belongs to, thereby determining the breed information of the animal.
The user may also actively enter breed information for the animal on the electronic device.
202. Determine at least one target area on the nose of the animal.
The nose prints are more pronounced in some areas of the animal's nose, such as the middle of the nose, while other areas, such as the edge of the nose, have no obvious nose print features. At least one target area can therefore be determined in the middle of the animal's nose, and the nose print information corresponding to that target area can then be obtained.
The electronic device may also determine the nostril regions and the nose edge region, remove them, take the remaining region of the nose as the region from which nose prints can be acquired, and then determine at least one target area within that region.
203. Acquire the number of nose prints and/or the nose print depth corresponding to the at least one target area.
Specifically, when the electronic device scans with laser light, the animal's nose can be scanned by a ToF (Time of Flight) sensor. After the laser strikes the nose, because different nose prints lie at different depths, the time for the laser to reach different nose prints differs, so the time for the different laser signals to reflect back to the laser sensor also differs. A phase difference is therefore produced between each emitted laser signal and the corresponding reflected laser signal, and from these phase differences the electronic device constructs a three-dimensional nose print image corresponding to the animal's nose prints, that is, a 3D nose print image.
The electronic device can also scan the animal's nose prints with ultrasound by transmitting ultrasonic waves toward the nose. Because different nose prints lie at different depths, the time for the ultrasonic waves to reach different nose prints differs, so the time for the different ultrasonic signals to reflect back to the ultrasonic sensor also differs. A phase difference is produced between each transmitted ultrasonic signal and the corresponding reflected ultrasonic signal, and the electronic device constructs a three-dimensional nose print image of the animal's nose prints from these phase differences.
The electronic device can determine the number of nose prints and the depth of the nose prints from the three-dimensional nose print image. For example, peaks and valleys exist in the 3D nose print image and alternate to form the nose prints; the number of nose prints can be determined from the number of valleys, that is, the number of valleys can be taken as the number of nose prints, and the depth from a peak to a valley is the depth of the corresponding nose print.
The electronic device can also capture images of the animal's nose with the camera, optimize these images to obtain a plurality of optimized images, derive a nose print image of the animal from the optimized images, and then recognize the nose print image to determine the number of the animal's nose prints and their distribution.
For example, the electronic device may capture a plurality of images of the animal's nose from the same shooting position, perform grayscale processing on these images to obtain a plurality of grayscale images, and apply processing such as brightness adjustment, contrast adjustment, and sharpening to the grayscale images to obtain a plurality of optimized images. The electronic device then performs image fusion on the optimized images to obtain the nose print image.
The electronic device can then recognize the nose print image to determine the final number of the animal's nose prints and their distribution.
For example, because the peaks and valleys of the nose prints differ, the pixel information corresponding to peaks and valleys in the nose print image also differs. For instance, the gray values of peaks and valleys in the nose print image are different, and the gray value of a peak-region pixel is lower than that of a valley-region pixel. The electronic device can therefore identify peaks and valleys from the gray values of the pixels, determine the number of nose prints from the number of valleys, and determine the distribution of the nose prints from the distribution of the peaks and valleys.
After the electronic device obtains the image of the animal's nose, the image can be fed into a neural network model for image segmentation. For example, image segmentation models such as a U-Net model or an Encoder-Decoder model can be used to segment the nose image, that is, to separate the regions corresponding to peaks from the regions corresponding to valleys, thereby obtaining the image corresponding to the peaks and/or the image corresponding to the valleys of the nose prints. The number of nose prints and their distribution are then determined from the image corresponding to the peaks and/or the image corresponding to the valleys.
It should be noted that the nose prints on the animal's nose are formed by alternating peaks and valleys, so once the image corresponding to the peaks or the image corresponding to the valleys is obtained, the number of nose prints can be determined from the number of peaks or the number of valleys.
204. Determine the nose print information of the animal according to the number of nose prints and/or the nose print depth.
In some embodiments, the electronic device may determine the distribution of the depth of each nose print, obtain a nose print depth distribution map, and then determine the nose print information of the animal according to the nose print depth distribution map.
For example, the electronic device may further determine a plurality of target peak points and a plurality of target valley points within the target area of the nose print depth distribution map, where the target peak points are peak points whose height exceeds a first preset height and the target valley points are valley points whose height is below a second preset height. The electronic device then determines the vector distance between each target peak point and its nearest target valley point, and generates the animal's nose print information from these vector distances.
In some embodiments, the electronic device may further determine a plurality of target sub-areas within the target area, determine the number of nose prints corresponding to each target sub-area, and then determine the nose print information of the animal according to the number of nose prints corresponding to each target sub-area.
For example, after the electronic device determines the target area on the animal's nose, the target area may be divided into a plurality of target sub-areas, and the number of nose prints in each target sub-area is then determined. The electronic device generates the nose print information of the animal according to the number of nose prints corresponding to each target sub-area.
In some embodiments, after the electronic device determines at least one target area on the animal's nose, it may obtain the number of nose prints in the target area and the depth of each nose print, and then generate the animal's nose print information according to the number of nose prints and the depth of each nose print.
205. Generate an input sample according to the nose print information and the subjective feedback information.
In some embodiments, the electronic device may generate a first feature vector based on the nose print information, generate a second feature vector based on the subjective feedback information, and finally generate the input sample based on the first feature vector and the second feature vector.
In some embodiments, the electronic device may normalize the first feature vector and the second feature vector to obtain a first normalized vector and a second normalized vector, and then generate the input sample from the first normalized vector and the second normalized vector. Such as adding the first normalized vector and the second normalized vector to generate the input sample.
In the process of normalizing the first feature vector and the second feature vector, a normalization function may be used to map the values of the first feature vector and the second feature vector into a certain range. For example, a Sigmoid function is used to normalize the values of the first feature vector and the second feature vector into the range of 0 to 1. This facilitates the subsequent processing of the input sample by the character analysis model and improves the model's prediction results.
206. Input the input sample into a character analysis model, and determine the character analysis result of the animal.
In some embodiments, the input sample is input into the character analysis model, which can predict the corresponding character analysis result from the input sample.
207. Determine the cosine similarity between the character analysis result and each character in the character database.
The electronic device may determine a first target vector corresponding to the result of the character analysis and a second target vector corresponding to each character.
The cosine similarity between the first target vector and each second target vector is then determined, thereby determining the cosine similarity between the character analysis result and each character in the character database.
208. If the cosine similarity is greater than a preset similarity threshold, determine that the character corresponding to that cosine similarity is the character of the animal.
In some embodiments, the preset similarity threshold may be set according to the prediction accuracy of the character analysis model. For example, some lightweight character analysis models have fewer neural network layers, fewer convolution kernels, fewer input feature maps, or fewer input channels, so their prediction accuracy tends to be lower; in that case, a lower preset similarity threshold may be set.
For a character analysis model that has been trained more extensively and has a more complex structure, a higher preset similarity threshold can be set, since its prediction accuracy is higher.
It should be noted that other ways may be used to match the character analysis result with the character database; the above process of comparing the character analysis result with each character in the character database is only an example and should not limit the present application.
In the embodiment of the application, the electronic device determines breed information corresponding to the animal, determines at least one target area on the nose of the animal, obtains the number of nose prints and/or the nose print depth corresponding to the at least one target area, determines the nose print information of the animal according to the number of nose prints and/or the nose print depth, and generates an input sample according to the nose print information and the subjective feedback information. The input sample is input into the character analysis model to determine the character analysis result of the animal. The cosine similarity between the character analysis result and each character in the character database is determined, and if the cosine similarity is greater than a preset similarity threshold, the character corresponding to that cosine similarity is determined to be the character of the animal. In the embodiment of the application, the electronic device can preliminarily determine the character of the animal from the animal's nose print information.
Correspondingly, an embodiment of the present application further provides an animal character analysis device. As shown in fig. 4, fig. 4 is a schematic structural diagram of the animal character analysis device provided in an embodiment of the present application. The animal character analysis device includes:
the first determining module 310 is configured to determine breed information corresponding to the animal, and acquire nose print information of the animal.
The first determining module 310 is further configured to determine at least one target area on the nose of the animal, and acquire the nose print information corresponding to the at least one target area.
The first determining module 310 is further configured to acquire the number of nose prints and/or the nose print depth corresponding to the at least one target area, and determine the nose print information of the animal according to the number of nose prints and/or the nose print depth.
And a second determining module 320, configured to determine a character database corresponding to the animal according to the breed information of the animal.
The obtaining module 330 is configured to obtain subjective feedback information of the user.
And the analysis module 340 is used for determining the character of the animal according to the nose print information, the subjective feedback information and the character database.
The analysis module 340 is further configured to generate an input sample according to the nose print information and the subjective feedback information, input the input sample into a character analysis model to determine the character analysis result of the animal, and match the character analysis result with the character database to determine the character of the animal.
The analysis module 340 is further configured to generate a first feature vector according to the nose print information, generate a second feature vector according to the subjective feedback information, and generate the input sample from the first feature vector and the second feature vector.
The analysis module 340 is further configured to normalize the first feature vector and the second feature vector to obtain a first normalized vector and a second normalized vector, and add the first normalized vector and the second normalized vector to generate the input sample.
The analysis module 340 is further configured to determine cosine similarity between the result of the character analysis and each character in the character database; and if the cosine similarity is greater than a preset similarity threshold, determining that the character corresponding to the cosine similarity is the character of the animal.
In the embodiment of the application, the electronic equipment determines the breed information corresponding to the animal and acquires the nose print information of the animal; determining a character database corresponding to the animal according to the breed information of the animal; acquiring subjective feedback information of a user; and determining the character of the animal according to the nose print information, the subjective feedback information and the character database. In the embodiment of the application, the electronic device can preliminarily determine the character of the animal through the nose print information of the animal.
Accordingly, an electronic device may include, as shown in fig. 5, a memory 401 having one or more computer-readable storage media, an input unit 402, a display unit 403, a sensor 404, a processor 405 having one or more processing cores, and a power supply 406. Those skilled in the art will appreciate that the electronic device configuration shown in fig. 5 does not constitute a limitation of the electronic device and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components. Wherein:
the memory 401 may be used to store software programs and modules, and the processor 405 executes various functional applications and data processing by operating the software programs and modules stored in the memory 401. The memory 401 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the electronic device, and the like. Further, the memory 401 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device. Accordingly, the memory 401 may further include a memory controller to provide the processor 405 and the input unit 402 with access to the memory 401.
The input unit 402 may be used to receive input numeric or character information and generate keyboard, mouse, joystick, optical or trackball signal inputs related to user settings and function control. In particular, in one particular embodiment, input unit 402 may include a touch-sensitive surface as well as other input devices. The touch-sensitive surface, also referred to as a touch display screen or a touch pad, may collect touch operations by a user (e.g., operations by a user on or near the touch-sensitive surface using a finger, a stylus, or any other suitable object or attachment) thereon or nearby, and drive the corresponding connection device according to a predetermined program. Alternatively, the touch sensitive surface may comprise two parts, a touch detection means and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 405, and can receive and execute commands sent by the processor 405. In addition, touch sensitive surfaces may be implemented using various types of resistive, capacitive, infrared, and surface acoustic waves. The input unit 402 may include other input devices in addition to a touch-sensitive surface. In particular, other input devices may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like.
The display unit 403 may be used to display information input by or provided to a user and various graphical user interfaces of the electronic device, which may be made up of graphics, text, icons, video, and any combination thereof. The Display unit 403 may include a Display panel, and optionally, the Display panel may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like. Further, the touch-sensitive surface may overlay the display panel, and when a touch operation is detected on or near the touch-sensitive surface, the touch operation is transmitted to the processor 405 to determine the type of touch event, and then the processor 405 provides a corresponding visual output on the display panel according to the type of touch event. Although in FIG. 5 the touch-sensitive surface and the display panel are two separate components to implement input and output functions, in some embodiments the touch-sensitive surface may be integrated with the display panel to implement input and output functions.
The electronic device may also include at least one sensor 404, such as a light sensor, motion sensor, and other sensors. In particular, the light sensor may include an ambient light sensor that may adjust the brightness of the display panel according to the brightness of ambient light, and a proximity sensor that may turn off the display panel and/or the backlight when the electronic device is moved to the ear. As one of the motion sensors, the gravity acceleration sensor can detect the magnitude of acceleration in each direction (generally, three axes), detect the magnitude and direction of gravity when the motion sensor is stationary, and can be used for applications (such as horizontal and vertical screen switching, related games, magnetometer attitude calibration) for recognizing the attitude of an electronic device, vibration recognition related functions (such as pedometer and tapping), and the like; as for other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which may be further configured to the electronic device, detailed descriptions thereof are omitted.
The processor 405 is a control center of the electronic device, connects various parts of the whole electronic device by using various interfaces and lines, performs various functions of the electronic device and processes data by operating or executing software programs and/or modules stored in the memory 401 and calling data stored in the memory 401, thereby performing overall monitoring of the electronic device. Optionally, processor 405 may include one or more processing cores; preferably, the processor 405 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 405.
The electronic device also includes a power source 406 (e.g., a battery) for powering the various components, which may preferably be logically coupled to the processor 405 via a power management system to manage charging, discharging, and power consumption management functions via the power management system. The power supply 406 may also include any component of one or more dc or ac power sources, recharging systems, power failure detection circuitry, power converters or inverters, power status indicators, and the like.
Although not shown, the electronic device may further include a camera, a Bluetooth module, and the like, which are not described in detail herein. Specifically, in this embodiment, the processor 405 in the electronic device loads the computer program stored in the memory 401 and executes it, thereby implementing the following functions:
determining breed information corresponding to the animal, and acquiring nose print information of the animal;
determining a character database corresponding to the animal according to the breed information of the animal;
acquiring subjective feedback information of a user;
and determining the character of the animal according to the nose print information, the subjective feedback information and the character database.
It will be understood by those skilled in the art that all or part of the steps of the methods of the above embodiments may be performed by instructions or by associated hardware controlled by the instructions, which may be stored in a computer readable storage medium and loaded and executed by a processor.
To this end, embodiments of the present application provide a computer-readable storage medium having stored therein a plurality of instructions that can be loaded by a processor to perform the steps of any of the animal character analysis methods provided by the embodiments of the present application. For example, the instructions may perform the steps of:
determining breed information corresponding to the animal, and acquiring nose print information of the animal;
determining a character database corresponding to the animal according to the breed information of the animal;
acquiring subjective feedback information of a user;
and determining the character of the animal according to the nose print information, the subjective feedback information and the character database.
For the specific implementation of the above operations, refer to the foregoing embodiments; details are not repeated here.
Wherein the storage medium may include: read Only Memory (ROM), Random Access Memory (RAM), magnetic or optical disks, and the like.
Because the instructions stored in the storage medium can execute the steps of any animal character analysis method provided in the embodiments of the present application, the beneficial effects achievable by any animal character analysis method provided in the embodiments of the present application can also be achieved; for details, see the foregoing embodiments, which are not repeated here.
The animal character analysis method, the animal character analysis device, the electronic device and the storage medium provided by the embodiments of the present application are described in detail above, and specific examples are applied herein to illustrate the principles and embodiments of the present application, and the description of the embodiments is only used to help understand the method and the core ideas of the present application; meanwhile, for those skilled in the art, according to the idea of the present application, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present application.

Claims (10)

1. An animal character analysis method, comprising:
determining breed information corresponding to the animal, and acquiring nose print information of the animal;
determining a character database corresponding to the animal according to the breed information of the animal;
acquiring subjective feedback information of a user;
and determining the character of the animal according to the nose print information, the subjective feedback information and the character database.
2. The animal character analysis method of claim 1, wherein the acquiring nose print information of the animal comprises:
determining at least one target area on the nose of the animal;
and acquiring the nose print information corresponding to the at least one target area.
3. The animal character analysis method of claim 2, wherein the acquiring the nose print information corresponding to the at least one target area comprises:
acquiring the number of nose prints and/or the nose print depth corresponding to the at least one target area;
determining the nose print information of the animal according to the number of nose prints and/or the nose print depth.
4. The animal character analysis method of claim 1, wherein the determining the character of the animal according to the nose print information, the subjective feedback information, and the character database comprises:
generating an input sample according to the nose print information and the subjective feedback information;
inputting the input sample into a character analysis model, and determining a character analysis result of the animal;
matching the character analysis result with the character database to determine the character of the animal.
5. The animal character analysis method of claim 4, wherein the generating an input sample according to the nose print information and the subjective feedback information comprises:
generating a first feature vector according to the nose print information;
generating a second feature vector according to the subjective feedback information;
generating the input sample from the first feature vector and the second feature vector.
6. The animal character analysis method of claim 5, wherein the generating the input sample from the first feature vector and the second feature vector comprises:
normalizing the first feature vector and the second feature vector to obtain a first normalized vector and a second normalized vector;
adding the first normalized vector and the second normalized vector to generate the input sample.
7. The animal character analysis method of claim 4, wherein the matching the character analysis result with the character database to determine the character of the animal comprises:
determining cosine similarity between the character analysis result and each character in the character database;
and if the cosine similarity is greater than a preset similarity threshold, determining that the character corresponding to the cosine similarity is the character of the animal.
8. An animal character analysis device, comprising:
the first determining module is used for determining breed information corresponding to the animal and acquiring the nose print information of the animal;
the second determining module is used for determining a character database corresponding to the animal according to the breed information of the animal;
the acquisition module is used for acquiring subjective feedback information of a user;
and the analysis module is used for determining the character of the animal according to the nose print information, the subjective feedback information and the character database.
9. An electronic device, comprising:
a memory storing executable program code, a processor coupled with the memory;
the processor invokes the executable program code stored in the memory to perform the steps in the animal character analysis method of any one of claims 1 to 7.
10. A storage medium storing a plurality of instructions adapted to be loaded by a processor to perform the steps of the animal character analysis method of any one of claims 1 to 7.
CN202111626480.9A 2021-12-28 2021-12-28 Animal character analysis method, device, electronic equipment and storage medium Pending CN114283452A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111626480.9A CN114283452A (en) 2021-12-28 2021-12-28 Animal character analysis method, device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111626480.9A CN114283452A (en) 2021-12-28 2021-12-28 Animal character analysis method, device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN114283452A true CN114283452A (en) 2022-04-05

Family

ID=80877010

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111626480.9A Pending CN114283452A (en) 2021-12-28 2021-12-28 Animal character analysis method, device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114283452A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination