CN112040132A - Animal external feature obtaining method and device and computer equipment - Google Patents

Animal external feature obtaining method and device and computer equipment

Info

Publication number
CN112040132A
CN112040132A (application CN202010940011.3A / CN202010940011A)
Authority
CN
China
Prior art keywords
animal
detected
shooting
image
key feature
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010940011.3A
Other languages
Chinese (zh)
Inventor
陶品
颜嘉辰
史元春
韩亮
刘志强
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tsinghua University
Qinghai University
Original Assignee
Tsinghua University
Qinghai University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tsinghua University, Qinghai University filed Critical Tsinghua University
Priority to CN202010940011.3A priority Critical patent/CN112040132A/en
Priority to PCT/CN2020/118607 priority patent/WO2022052189A1/en
Publication of CN112040132A publication Critical patent/CN112040132A/en
Pending legal-status Critical Current


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/62Control of parameters via user interfaces
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/46Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
    • G06V10/462Salient features, e.g. scale invariant feature transforms [SIFT]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/63Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H04N23/632Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters for displaying or modifying preview images prior to image capturing, e.g. variety of image resolutions or capturing parameters

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)
  • Studio Devices (AREA)

Abstract

The application provides a method, an apparatus and computer equipment for obtaining external features of an animal. An ordinary image collector is used to acquire a target image of an animal to be detected together with the shooting attitude information of the image collector when the target image is shot. Feature extraction is performed on the target image to obtain information of a plurality of key feature points of the animal to be detected. The horizontal shooting distance between the center point of the image collector and the animal to be detected is then determined from the position information of a first key feature point and the shooting attitude information, and the external feature value to be measured is obtained from the horizontal shooting distance and the position information of the second key feature points corresponding to the external feature to be measured. Injury to the animal to be measured or to the measuring user caused by manual measurement is thereby avoided, the equipment cost and the professional requirements on the shooting user and the shooting site are reduced, and the universality of the animal external feature obtaining method provided by the application is improved.

Description

Animal external feature obtaining method and device and computer equipment
Technical Field
The application relates to the technical field of image processing, in particular to a method and a device for acquiring external features of animals and computer equipment.
Background
In the field of livestock breeding, in order to feed and manage animals scientifically and effectively, external features such as weight and body size (for example body height, body length, chest circumference and body oblique length) generally need to be measured to determine the growth and nutrition condition of the corresponding animal. However, because animals rarely stay still on command, measuring their external features is difficult, especially for large animals, and during manual measurement an animal that is not cooperatively restrained may be harmed or may injure the measuring personnel.
To avoid such injuries, it has been proposed to scan the animal with a laser detector or similar device and obtain the external feature values of the animal to be measured from the resulting three-dimensional reconstruction. However, this measurement method requires a dedicated lane to be built for the animal to pass through before the measurement can be completed, the required measurement equipment is expensive, and the demands on the measurement environment are high, which limits the application and popularization of this method.
Disclosure of Invention
In view of this, in order to avoid injury to the animal to be measured or to the measuring user caused by manual measurement, an embodiment of the present application provides an animal external feature obtaining method, where the method includes:
acquiring a target image of an animal to be detected and shooting posture information when an image collector shoots the target image;
extracting features of the target image to obtain a plurality of pieces of key feature point information of the animal to be detected, wherein the key feature point information comprises position information of corresponding key feature points in the target image;
acquiring a horizontal shooting distance between the center point of the image collector and the animal to be detected according to the shooting posture information and position information of a first key feature point, wherein the first key feature point is a key feature point, among the plurality of key feature points of the animal to be detected, whose distance from the ground is smaller than a first threshold value;
and obtaining the corresponding external characteristic value to be detected of the animal to be detected according to the horizontal shooting distance and the position information of the second key characteristic point corresponding to the external characteristic to be detected.
Optionally, the external feature to be detected includes at least one body size to be detected and/or at least one attribute to be detected of the animal to be detected, and obtaining the external feature value to be detected corresponding to the animal to be detected according to the horizontal shooting distance and the position information of the second key feature point corresponding to the external feature to be detected includes:
obtaining the corresponding body size parameter to be measured of the animal to be measured according to the horizontal shooting distance and the position information of the second key feature point corresponding to the body size to be measured;
and obtaining the corresponding attribute value to be detected of the animal to be detected according to the obtained at least one body size parameter to be measured.
Optionally, the shooting attitude information includes a shooting pitch angle and a shooting roll angle, and the horizontal shooting distance between the image collector center point and the animal to be tested is obtained according to the shooting attitude information and the position information of the first key feature point, including:
determining a first distance between the first key feature point and a horizontal center line of the image collector by using the position information of the first key feature point;
obtaining a first included angle of the first key characteristic point relative to the horizontal center line by using the first distance and the shooting roll angle;
obtaining a second included angle formed by the first key characteristic point and a first vertical point relative to the central point of the image collector by using the first included angle and the shooting pitch angle, wherein the first vertical point is a vertical projection point of the central point of the image collector relative to a ground plane on which the animal to be detected stands;
acquiring the shooting height of the image collector when the target image is shot;
and obtaining the horizontal shooting distance between the center point of the image collector and the animal to be detected by performing a tangent function operation using the shooting height and the second included angle.
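For illustration only, the steps listed above could be composed as in the following sketch, under a pinhole-camera assumption; the parameter names (vertical pixel offset of the first key feature point, focal length in pixels, sign conventions for the angles) are assumptions of this sketch rather than values defined by the application.

import math

def horizontal_shooting_distance(v_offset_px, roll_deg, pitch_deg,
                                 focal_len_px, shooting_height_m):
    # First distance: pixel offset of the first key feature point from the
    # image collector's horizontal center line, corrected for the shooting
    # roll angle so it is measured along the true vertical direction.
    first_distance = v_offset_px * math.cos(math.radians(roll_deg))
    # First included angle: angular offset of the first key feature point
    # relative to the horizontal center line (pinhole model, assumed focal
    # length in pixels).
    first_angle = math.atan(first_distance / focal_len_px)
    # Second included angle: angle at the image collector's center point
    # between the ray to the first key feature point and the vertical line
    # through the first vertical point (the camera's projection onto the
    # ground plane on which the animal to be detected stands).
    second_angle = math.pi / 2 - (math.radians(pitch_deg) + first_angle)
    # Horizontal shooting distance via a tangent function operation on the
    # shooting height and the second included angle.
    return shooting_height_m * math.tan(second_angle)

Under these assumptions, a camera held about 1.5 m above the ground plane, pitched 20 degrees downward, with the hoof-ground key feature point 300 px below the center line and an assumed focal length of 2800 px, would give a horizontal shooting distance on the order of 3 m.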
Optionally, the obtaining of the corresponding body size parameter to be measured of the animal to be measured according to the horizontal shooting distance and the position information of the second key feature point corresponding to the body size to be measured includes:
according to the respective position information of any two key feature points of the animal to be detected, obtaining a third included angle formed by the two corresponding key feature points relative to the central point of the image collector;
obtaining a characteristic point distance between two corresponding key characteristic points by using the third included angle and the horizontal shooting distance;
and obtaining the corresponding body size parameter to be measured of the animal to be measured by using the obtained feature point distances between the at least two second key feature points corresponding to each body size to be measured.
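A companion sketch for the steps above, under the same pinhole-camera assumption; pixel coordinates are taken relative to the image center, both key feature points are assumed to lie roughly at the horizontal shooting distance from the camera, and all names are illustrative.

import math

def included_angle(p1_px, p2_px, focal_len_px):
    # Third included angle: angle formed by two key feature points relative
    # to the image collector's center point, from their pixel positions.
    r1 = (p1_px[0], p1_px[1], focal_len_px)
    r2 = (p2_px[0], p2_px[1], focal_len_px)
    dot = sum(a * b for a, b in zip(r1, r2))
    n1 = math.sqrt(sum(a * a for a in r1))
    n2 = math.sqrt(sum(a * a for a in r2))
    return math.acos(dot / (n1 * n2))

def feature_point_distance(p1_px, p2_px, focal_len_px, horizontal_dist_m):
    # Feature point distance between two key feature points, using the third
    # included angle and the horizontal shooting distance; both points are
    # assumed to lie near the plane at that distance from the camera.
    theta = included_angle(p1_px, p2_px, focal_len_px)
    return 2.0 * horizontal_dist_m * math.tan(theta / 2.0)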
Optionally, if a first height difference exists between the ground on which the operating body is located and the ground on which the animal to be measured is located, the acquiring of the shooting height of the image collector when the target image is shot includes:
calling the height of an operation body for shooting the target image and the first height difference;
and acquiring the shooting height of the image collector relative to the ground plane on which the animal to be detected stands according to the height of the operating body and the first height difference.
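As a minimal sketch of these two steps; the assumption that the hand-held image collector sits at roughly eye level, i.e. a fixed fraction of the operating body's height, is illustrative and is not stated by the application.

def shooting_height(operator_height_m, first_height_diff_m, hold_ratio=0.93):
    # Shooting height of the image collector relative to the ground plane on
    # which the animal to be detected stands: the height at which the operating
    # body holds the device plus the first height difference between the two
    # ground planes. hold_ratio is an illustrative assumption.
    return operator_height_m * hold_ratio + first_height_diff_m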
Optionally, performing feature extraction on the target image to obtain the information of the plurality of key feature points of the animal to be tested contained in the target image includes:
inputting the target image into an image feature extraction model to obtain a plurality of key feature points of the animal to be detected contained in the target image;
outputting a plurality of key feature points of the animal to be detected contained in the target image;
and responding to the position adjustment operation aiming at the output at least one key characteristic point to obtain a plurality of pieces of key characteristic point information of the animal to be detected.
Optionally, the method further includes:
performing integrity verification on the obtained multiple key characteristic points of the animal to be tested;
and if the verification fails, outputting first prompt information, wherein the first prompt information is used to remind the operator to re-shoot the target image of the animal to be detected.
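A small sketch of this verification step; the set of required key feature points is hypothetical and would in practice come from the predefined correspondence for the animal category.

REQUIRED_POINTS = {"back_top", "hoof_ground", "muzzle", "tail_root"}  # hypothetical set

def verify_keypoints(detected):
    # Integrity verification: every required key feature point must be present
    # with valid coordinates; otherwise output the first prompt information.
    missing = REQUIRED_POINTS - {k for k, v in detected.items() if v is not None}
    if missing:
        print("Key feature points incomplete (%s); please re-shoot the target "
              "image of the animal to be detected." % ", ".join(sorted(missing)))
        return False
    return True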
The application also provides an animal external feature obtaining apparatus, the apparatus including:
the information acquisition module is used for acquiring a target image of an animal to be detected and shooting posture information when the image collector shoots the target image;
the characteristic extraction module is used for extracting characteristics of the target image to obtain a plurality of pieces of key characteristic point information of the animal to be detected, wherein the key characteristic point information comprises position information of corresponding key characteristic points in the target image;
the horizontal shooting distance obtaining module is used for obtaining a horizontal shooting distance between the center point of the image collector and the animal to be detected according to the shooting posture information and the position information of a first key feature point, wherein the first key feature point is a key feature point, among the plurality of key feature points of the animal to be detected, whose distance from the ground is smaller than a first threshold value;
and the external feature value obtaining module is used for obtaining the corresponding external feature value to be detected of the animal to be detected according to the horizontal shooting distance and the position information of the second key feature point corresponding to the external feature to be detected.
The present application further proposes a computer device, the computer device comprising:
a communication interface;
a memory for storing a program for implementing the animal external feature obtaining method described above;
and a processor for loading and executing the program to implement the steps of the animal external feature obtaining method.
Optionally, the computer device is a mobile terminal or a server;
when the computer device is the mobile terminal, the computer device further includes:
the image collector is used for shooting an image of an animal to be detected;
the display screen is used for outputting the image of the animal to be detected, which is shot by the image collector;
the multi-posture sensor is used for sensing shooting posture information when the image collector shoots a target image of the animal to be detected;
when the computer device is a server, the communication interface is used for receiving the target image of the animal to be detected shot and sent by the image collector, together with the shooting attitude information of the image collector when shooting the target image, and for feeding back the corresponding external feature value to be detected of the animal to be detected to a preset mobile terminal for output.
Based on the above technical solution, the present application provides an animal external feature obtaining method, an apparatus and a computer device. To avoid the danger to the animal to be measured or to the measuring user caused by manual measurement when external features such as weight and body height need to be obtained, this embodiment obtains the external feature value to be measured of the animal by image analysis. Specifically, an image collector is used to obtain a target image of the animal to be measured and the shooting posture information of the image collector when the target image is shot; feature extraction is performed on the target image to obtain the information, such as position information, of a plurality of key feature points of the animal; the horizontal shooting distance between the center point of the image collector and the animal is then determined from the position information of the first key feature point and the shooting posture information; and the corresponding external feature value to be detected is obtained from the horizontal shooting distance and the position information of the second key feature points corresponding to the external feature to be detected. Because the target image can be shot by an ordinary image collector, without requiring a professional depth camera, the equipment cost of obtaining the external features of the animal is greatly reduced, the professional requirements on the user shooting the image and on the site where the animal is located are lowered, and the universality of the animal external feature obtaining method provided by the application is improved.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present application, and that other drawings can be obtained from them by those skilled in the art without creative effort.
FIG. 1a is a schematic diagram of a hardware configuration of an alternative example of a computer device proposed in the present application;
FIG. 1b is a schematic diagram of a hardware configuration of yet another alternative example of a computer device proposed in the present application;
fig. 2 is a schematic diagram of an alternative application scenario applicable to the animal appearance feature acquisition method proposed in the present application;
fig. 3 is a schematic diagram of another alternative application scenario applicable to the animal appearance feature acquisition method proposed in the present application;
fig. 4 is a schematic diagram of another alternative application scenario applicable to the animal appearance feature acquisition method proposed in the present application;
fig. 5 is a schematic flow chart of an alternative example of the method for obtaining the external characteristics of the animal provided by the application;
fig. 6 is a schematic view of a shooting posture of an image collector shooting an image of an animal to be detected in the method for obtaining external characteristics of an animal according to the present application;
fig. 7 is a schematic view of another alternative animal shooting scene suitable for the animal appearance feature acquisition method proposed in the present application;
fig. 8 is a schematic flow chart of yet another alternative example of the method for obtaining the external characteristics of the animal proposed by the present application;
fig. 9a is an alternative schematic diagram of a target image of an animal to be tested obtained in the method for obtaining external characteristics of an animal according to the present application;
fig. 9b is an alternative schematic diagram of a target image of an animal to be tested obtained in the method for obtaining external characteristics of an animal according to the present application;
fig. 10 is a schematic view of an optional scene for adjusting key feature points of an animal to be detected in the animal external feature obtaining method provided in the present application;
fig. 11 is a schematic view of a further alternative animal shooting scene suitable for the animal appearance feature acquisition method proposed in the present application;
fig. 12 is a schematic structural view of an alternative example of the animal external feature obtaining apparatus proposed in the present application;
fig. 13 is a schematic structural view of still another alternative example of the animal external feature obtaining apparatus proposed in the present application;
fig. 14 is a schematic structural view of still another alternative example of the animal external feature obtaining apparatus proposed by the present application.
Detailed Description
In order to solve the prior-art problems described in the Background section, an image analysis approach has been proposed for obtaining the external features of an animal: a depth camera is used to photograph the animal to be detected, key feature point information is obtained from the image of the animal, and the external feature value of at least one external feature to be detected is then calculated using the distance measured by the depth camera.
Based on this, the present application aims to photograph the animal to be measured with the ordinary camera of electronic equipment that is already widely available, and to obtain the required external features by analyzing the captured image, thereby reducing equipment cost, avoiding injury to the animal to be measured and to the measuring personnel during the acquisition process, and remaining applicable to various application scenarios. The electronic equipment may be a terminal with an image acquisition function, such as a smart phone, a tablet computer or a wearable device; the device type of the electronic equipment is not limited and can be determined according to the actual requirements of the application scenario.
It should be noted that, regarding how the present application obtains the external feature to be measured of the animal to be measured from an ordinary captured two-dimensional image, the technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the drawings. Obviously, the described embodiments are only some of the embodiments of the present application, not all of them. All other embodiments obtained by a person skilled in the art based on the embodiments given herein without creative effort shall fall within the protection scope of the present application. For convenience of description, only the portions related to the invention are shown in the drawings. The embodiments and the features of the embodiments in the present application may be combined with each other without conflict.
It should be understood that "system", "apparatus", "unit" and/or "module" as used herein is a method for distinguishing different components, elements, parts or assemblies at different levels. However, other words may be substituted by other expressions if they accomplish the same purpose.
As used in this application and the appended claims, the terms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. In general, the terms "comprises" and "comprising" merely indicate that the explicitly identified steps and elements are included; these steps and elements do not form an exclusive list, and a method or apparatus may also include other steps or elements. An element defined by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
In the description of the embodiments herein, "/" means "or" unless otherwise specified, for example, a/B may mean a or B; "and/or" herein is merely an association describing an associated object, and means that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone. In addition, in the description of the embodiments of the present application, "a plurality" means two or more than two. The terms "first", "second" and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature.
Additionally, flow charts are used herein to illustrate operations performed by systems according to embodiments of the present application. It should be understood that the preceding or following operations are not necessarily performed in the exact order in which they are performed. Rather, the various steps may be processed in reverse order or simultaneously. Meanwhile, other operations may be added to the processes, or a certain step or several steps of operations may be removed from the processes.
Referring to fig. 1a, a schematic diagram of a hardware structure of a computer device provided in an embodiment of the present application is shown, where the computer device may include, but is not limited to: a communication interface 11, a memory 12, and a processor 13, wherein:
in the embodiment of the present application, the number of the communication interface 11, the memory 12, and the processor 13 may be at least one, and the communication interface 11, the memory 12, and the processor 13 may complete communication with each other through a communication bus.
Optionally, the communication interface 11 may include interfaces of communication modules such as a GSM module or a WIFI module, or an interface for data communication over a mobile communication network (e.g. a 5G or 6G network); in practical application it can be used for data interaction with other devices, such as transmitting the image of the animal to be tested or the obtained external features of the animal to be tested. The communication interface 11 may further include interfaces such as a USB interface and serial/parallel interfaces for data interaction between internal components of the computer device, for example for the various intermediate data and input parameters generated or required during implementation of the animal external feature obtaining method provided by the present application, which can be determined according to the requirements of the actual application scenario and are not described in detail here.
The memory 12 may store a program formed by a plurality of instructions for implementing the animal external feature obtaining method provided in the embodiment of the present application, and the processor 13 calls and loads the program stored in the memory 12, so as to implement the animal external feature obtaining method provided in the embodiment of the present application, and the specific implementation process may refer to, but is not limited to, the description of the corresponding embodiment below.
In the present embodiment, the memory 12 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device or other non-volatile solid-state storage device. The processor 13 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), a digital signal processor (DSP), a field-programmable gate array (FPGA) or another programmable logic device.
In one possible implementation, the memory 12 may include a program storage area and a data storage area, and the program storage area may store an operating system, and application programs required for at least one function (such as an image display function), programs for implementing the animal appearance acquisition method proposed in the present application, and the like; the data storage area can store data generated in the use process of the computer equipment, such as shot images of each frame of an animal to be detected, input shooting parameters of an image collector (such as a camera), the height of an operation body using the image collector to shoot images and the like, and the data can be determined according to the situation.
It should be understood that the structure of the computer device shown in fig. 1a does not constitute a limitation of the computer device in the embodiment of the present application, and in practical applications, the computer device may include more or less components than those shown in fig. 1a, and the present application is not specifically described herein.
In some embodiments, the computer device may be a mobile terminal with certain image processing capability, such as a smart phone, a tablet computer, and the like, in which case, as shown in fig. 1b, the computer device may further include:
and the image collector 14 is used for shooting an image of the animal to be detected. The image collector can be a built-in camera of the mobile terminal, such as a common optical camera, and the structure and the working principle of the image collector are not detailed in the application.
And the display screen 15 is used for outputting the image of the animal to be detected, which is shot by the image collector 14.
In practical application, the display screen 15 may be a touch display screen, which includes a touch sensing unit capable of sensing a touch event on the touch display panel, and a user using the mobile terminal may perform touch operations such as zooming in, zooming out, and selecting a shooting focus based on an image displayed on the display screen to obtain a target image of an animal to be tested that meets requirements.
Of course, the display screen 15 may also be a non-touch display screen, and the user may utilize the input device of the mobile terminal to perform an input operation on the content presented on the display screen 15, which includes, but is not limited to, the above-listed various operations on the display image, as the case may be.
And the various attitude sensors 16 are used for sensing shooting attitude information when the image collector shoots a target image of the animal to be detected. The attitude sensor 16 may specifically include, but is not limited to, a gyroscope sensor, various angle sensors, a position sensor, a distance sensor, an acceleration sensor, a touch sensor, a temperature sensor, and the like, and may be determined according to the use requirements of the mobile terminal itself, and the like. Therefore, the posture sensor of the mobile terminal can be utilized to meet the requirement for acquiring the external characteristics of the animal, and the specific implementation process can refer to the description of the corresponding part of the following embodiment, which is not described in detail herein. In the case that the computer device is a mobile terminal, the communication interface 11 may include various memory interfaces, earphone interfaces, antenna communication interfaces, and the like, and may be determined according to the actual communication requirements of the external device and the communication requirements of the internal components, which is not described in detail in this embodiment.
It should be understood that the structure of the mobile terminal shown in fig. 1b does not constitute a limitation of the mobile terminal in the embodiment of the present application, and in practical applications, the mobile terminal may include more or less components than those shown in fig. 1b, or may combine some components, such as a power management module, a battery, a charging management module, a vibration mechanism, an indicator light, an audio module, a speaker (i.e., a loudspeaker), a microphone, a receiver (i.e., a receiver), a mobile communication module, an antenna, and other input devices and output devices, which are not listed herein.
In still other embodiments, the computer device may also be a server, in which case, a user only needs to upload a target image (which may be a picture or a video, etc.) of an animal to be tested and related parameters to the server, and the server obtains an external feature value of the corresponding animal to be tested according to the animal external feature obtaining method provided by the present application, and a device for capturing an image of the animal to be tested does not need to have high image processing capability, so that requirements on the device are reduced, for example, a common camera, a video camera, etc. may be used.
Therefore, in an application scenario in which the animal external feature obtaining method provided by the present application is implemented through data interaction between the image collector and the server, the communication interface 11 is configured to receive the target image of the animal to be detected shot and sent by the image collector and the shooting posture information of the image collector when shooting the target image, and to feed back the obtained external feature value to be detected of the animal to be detected to the preset mobile terminal for output. Based on the above analysis, in practical landing scenarios, in some embodiments the images of the animal to be tested may be captured by various image capturing devices, such as the cameras and mobile devices with image collectors shown in fig. 2, and then transmitted to a computer device; the computer device may be a server, or, if the images are captured by cameras deployed in a livestock farm as shown in fig. 3, it may also be another electronic device with data processing capability, but is not limited thereto. The computer device then obtains the external feature values of the corresponding animal to be tested according to the animal external feature obtaining method described in the following method embodiments.
In still other embodiments, as shown in fig. 4, for an application scenario of the animal external feature obtaining method provided in the present application, a user may also hold a mobile terminal with certain image processing capability and a built-in image collector to capture an image of an animal to be tested, and the mobile terminal executes the animal external feature obtaining method provided in the present application to obtain and output information related to the external feature of the animal to be tested.
Therefore, the computer device suitable for the animal external feature obtaining method and the animal external feature obtaining device provided by the application can be a server or a mobile terminal which is integrated with an image collector and has certain image processing capability, the computer device can be configured with a program corresponding to the animal external feature obtaining method provided by the embodiment of the application, the animal external feature obtaining method provided by the embodiment of the application is implemented, the program can be stored in a memory of the computer device, and a processor calls the program to implement a program function, and a specific implementation process is not limited.
The method for obtaining the external characteristics of the animal provided by the embodiment of the present application is described below from the perspective of a computer device, and the method steps described below can be implemented by the computer device executing a corresponding program.
Referring to fig. 5, a schematic flow chart of an alternative example of the animal external feature obtaining method provided in the embodiment of the present application is shown, the method may be applied to a computer device, and it can be understood from the foregoing analysis that the present application does not limit the device type of the computer device, and the device type can be determined according to a specific application scenario, and the application scenario may refer to, but is not limited to, the several animal external feature obtaining scenarios listed above, which is only exemplified by the present application. As shown in fig. 5, the method for acquiring the external characteristics of the animal according to the present embodiment may include, but is not limited to, the following steps:
step S51, acquiring a target image of an animal to be detected and shooting posture information when the image collector shoots the target image;
with reference to the description of the technical concept of the present application in the foregoing embodiment, the image shooting of the animal to be tested may be implemented by a mobile terminal having an image collector, that is, an operation body (such as an animal manager, a measurer, and the like) may hold the mobile terminal by hand, start the image collector (such as a camera) of the mobile terminal, and direct the image collecting direction toward the current direction of the animal to be tested, so that the animal to be tested appears in the preview frame, and click the shooting button to obtain the target image of the animal to be tested.
Of course, in some embodiments, the image of each animal to be tested may also be captured by a camera deployed in the livestock breeding area, and then transmitted to a computer device with data processing capability, such as the above mobile terminal or the server, through a wireless communication manner, where the computer device obtains the target image of the animal to be tested, and the like.
It should be noted that, during image shooting of the animal to be tested, multiple frames of the animal may be captured continuously (for example continuous multi-frame images or a video; for video shooting, the computer device can extract the corresponding continuous frames from the video, and this process is not described in detail). However, since the application requires feature points over the whole body of the animal to be tested, the captured image must contain the complete animal, so the target image in this embodiment refers to an image containing the complete animal to be tested. In the scenario where the operating body holds the mobile terminal to shoot, it can be checked that the whole body of the animal appears in the image acquisition preview frame; when the preview image shown on the display screen of the mobile terminal contains the complete animal to be detected, shooting is confirmed, and the image obtained at that moment is taken as the target image of the animal to be detected. For other scenarios, one frame containing the complete animal can be selected directly from the multi-frame images of the animal to be detected as the target image, which is not described in detail here.
In the process of shooting the image of the animal to be detected, the relative height between the image collector and the animal is uncertain: it is influenced by factors such as the heights of different animals to be detected, the height difference between the ground where the operating body (i.e. the user shooting the image) stands and the ground where the animal stands, the distance between the operating body and the animal, the height of the operating body, and the installation height of the image collector (mainly in the scenario of a camera in a livestock breeding place). In order to shoot a complete target image of the animal, parameters such as the shooting angle, the shooting height and the focal length of the image collector may therefore need to be adjusted. The shooting parameters of the image collector thus bear a certain relation to the relative position between the operating body and the animal in the current shooting scene, so when the horizontal distance between the image collector and the animal needs to be estimated in the present application, the relevant shooting parameters of the image collector at the time of shooting can be taken into account.
Experiments and analysis show that, in the embodiment of the application, when the image collector shoots the target image of the animal to be detected, shooting parameters such as the shooting attitude information of the image collector relate to the horizontal shooting distance between the image collector and the animal at the moment of shooting. Therefore, in this embodiment the shooting attitude information of the image collector when shooting the target image needs to be acquired; it may specifically include the shooting pitch angle, the shooting roll angle, the shooting zoom magnification and the like of the image collector.
As shown in fig. 6, the shooting pitch angle (Pitch) of the image collector, which may also be referred to as the tilt angle, is the angle by which the image collector rotates around the X axis (the X axis of the camera coordinate system, which can also be represented as the horizontal center line of the image collector); rotating around this axis changes the pitch angle of the image collector. In practical application, the shooting pitch angle is usually determined by factors such as the horizontal distance between the animal to be detected and the image collector and the height of the animal, so as to ensure that the collected image contains the complete animal. Of course, in this process the image collector can also zoom to adjust the display size of the animal within the image preview frame and meet the shooting requirements; the shooting pitch angle is determined by the needs of the actual shooting scene and is not detailed here.
The shooting roll angle (Roll) refers to the angle of rotation of the image collector around the Z axis (generally the collecting direction of the image collector, i.e. the direction the lens faces), as shown in fig. 6; in an actual shooting process, for a mobile terminal carrying the image collector, this corresponds to rotation around the axis perpendicular to the screen of the mobile terminal. Similarly, the heading angle (Yaw) of the image collector refers to the angle of rotation around the Y axis, i.e. rotation around the upward direction of the electronic device.
In practical application, shooting attitude information such as the shooting pitch angle and the shooting roll angle described above can be determined from parameters sensed by the sensors of the mobile terminal carrying the image collector, or of a standalone image collector; these sensors may include a gyroscope, various angle sensors, position sensors and the like, and their types and number can be determined according to the actual detection requirements, the detection method not being limited to the description of this embodiment.
Step S52, extracting the features of the target image to obtain a plurality of key feature point information of the animal to be detected contained in the target image;
In this embodiment of the application, the key feature point information includes the position information of the corresponding key feature point in the target image. Since a key feature point may be a pixel point of the animal to be detected in the target image, the corresponding position information may be determined by the coordinates of that pixel point in the image coordinate system (for example together with its R, G, B color values, but not limited thereto), or by converting the coordinates of the pixel point in the image coordinate system into three-dimensional coordinates in the camera coordinate system, as the case may be; the content contained in each piece of key feature point information is not limited in this application.
The key feature points of the animal to be detected are feature points capable of representing its different external features. They are usually located on the body surface of the animal, i.e. they are edge feature points of the animal in the target image (edge feature points in the sense of the three-dimensional object). Specifically, according to the category of the animal to be detected and the types of external features to be measured (such as the types of body size parameters and attributes to be measured), it can be determined which positions on the animal need to be known in order to obtain each external feature value, and the feature point at such a position is regarded as a key feature point.
Therefore, for animals to be detected of different categories and sizes, the positions and the number of the corresponding key feature points may differ; they can be determined in advance in the above manner in combination with the actual measurement requirements. When the key feature points of an animal are extracted from its captured target image, the positions whose feature points need to be extracted as key feature points can then be known directly from the preset correspondence between each animal to be detected and its key feature points; the specific implementation process is not limited.
For example, in combination with the application scenarios shown in fig. 2 to 4, suppose the animal to be tested is a yak whose external feature values such as body height and body length (which may be referred to as body size parameters) are to be measured. To obtain the body height of the yak, the distance in the vertical direction between its highest point (the highest point of the yak's back) and its lowest point (the contact point between the yak's hoof and the ground) must be known. Therefore, after the target image of the yak is obtained, the highest feature point on the yak's back in the target image can be recorded as a key feature point, and the feature point where the yak's hoof contacts the ground can be recorded as a key feature point. It should be understood that, in order to improve the reliability of the image analysis, several feature points adjacent to each of these positions may also be selected when determining the key feature points, so as to avoid reduced detection accuracy.
Similarly, to obtain the body length of the yak, following the above analysis, one or more feature points protruding furthest at the yak's mouth in the target image can be determined as key feature points, one or more feature points protruding furthest at the yak's tail can be determined as key feature points, and the distance between the two key feature points that are furthest apart in these two groups is then calculated to obtain the body length. The process of obtaining the other body size parameters to be measured of the yak, and of the body size parameters to be measured of other animals, is similar and is not described in detail here.
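To make the yak example concrete, the following sketch combines hypothetical key feature point coordinates (pixels relative to the image center) with the feature_point_distance sketch given earlier; every number here is illustrative, not measured data.

keypoints = {
    "back_top":    (12.0, -140.0),   # highest point on the yak's back
    "hoof_ground": (15.0,  180.0),   # hoof-ground contact point
    "muzzle":      (-310.0, 20.0),   # most protruding point of the mouth
    "tail_root":   (330.0,  35.0),   # most protruding point of the tail
}

# Body height: highest back point to hoof-ground contact point;
# body length: muzzle to tail. Focal length and horizontal shooting
# distance are placeholder values.
body_height_m = feature_point_distance(keypoints["back_top"],
                                       keypoints["hoof_ground"],
                                       focal_len_px=2800.0,
                                       horizontal_dist_m=4.2)
body_length_m = feature_point_distance(keypoints["muzzle"],
                                       keypoints["tail_root"],
                                       focal_len_px=2800.0,
                                       horizontal_dist_m=4.2)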
In some embodiments, a preset feature extraction algorithm may be used to process the target image of the animal to be detected to obtain its key feature points, their position information and the like; alternatively, a feature extraction model may be trained in advance for the animal to be detected, so that after the target image is obtained it is input directly into the feature extraction model and the key feature point information of the animal to be detected is output.
Step S53, acquiring a horizontal shooting distance between the image collector center point and the animal to be detected according to the shooting posture information and the position information of the first key feature point;
it should be noted that, the first key feature point in this embodiment refers to a key feature point whose distance from the ground is smaller than a first threshold value among a plurality of key feature points of the animal to be tested, and in the above yak example, the first key feature point may refer to a feature point on the hoof of the yak, which contacts the ground, and/or an adjacent feature point thereof, and the position and the number of the first key feature point may be determined according to the category of the animal to be tested. It should be understood that the distance between the first key feature point and the ground surface may be zero or a small value, and therefore, the first threshold value is usually a relatively small value, but the specific value of the first threshold value is not limited in the present application.
In combination with the above description of the shooting attitude information: in the process of shooting the animal to be detected, the adjusted shooting attitude of the image collector is determined by factors such as the horizontal shooting distance between the center point of the image collector and the animal and the height of the animal. Reasoning in reverse, in the embodiment of the application the horizontal shooting distance between the animal and the center point of the image collector can be calculated from the shooting attitude information at the time the target image was shot and the position information of the first key feature point. During this calculation, corresponding auxiliary parameters can be combined according to actual requirements, such as the height of the operating body holding the electronic equipment with the image collector, or the shooting height of a standalone, fixed image collector shooting the animal. The specific process of obtaining the horizontal shooting distance is not limited in this application.
Referring to the schematic diagram of the animal shooting scene shown in fig. 7, assuming that the operating body holds the mobile terminal to shoot the image of the animal to be detected, the horizontal shooting distance calculated in the above manner is the horizontal distance GP between the center point of the image collector and the first key feature point of the animal, i.e. it represents the distance along the ground between the image collector and the animal, rather than a distance along the X axis of the image collector's coordinate system.
Step S54, obtaining a corresponding external characteristic value to be detected of the animal to be detected according to the horizontal shooting distance and the position information of the second key characteristic point corresponding to the external characteristic to be detected;
Following the description in the corresponding parts above, different external features to be measured of the animal may correspond to key feature points located at different positions of the animal. In this embodiment of the application, the key feature points corresponding to any external feature to be measured are recorded as second key feature points; the second key feature points may include the first key feature point as well as other key feature points, depending on the measurement requirement of each external feature to be measured, which is not described in detail here.
Based on this, for any external feature value to be detected of the animal, such as a body size parameter or an attribute value (such as weight), the position information of the second key feature points corresponding to that external feature is obtained according to the preset correspondence. Since one external feature value usually corresponds to several second key feature points, such as two, three or even more, in this embodiment the corresponding external feature value to be detected can be obtained by calculation from the position information of these second key feature points.
It should be understood that the present application needs the real external feature value of the animal to be detected, not the value measured in the image. Therefore, during the calculation a coordinate-system conversion may be performed so that the position information of the second key feature points corresponds to the real positions on the animal, making the obtained value the actual external feature value; alternatively, after the external feature value of the animal in the image has been calculated, the actual value may be obtained by processing such as coordinate conversion. The conversion process is not described in detail in this application.
In some embodiments, in combination with the above analysis, the external feature to be measured of the animal may include at least one body size to be measured (such as body height, body length, body oblique length, chest circumference, etc.) and/or at least one attribute to be measured (such as weight). In practical application, the attribute value of the animal is usually calculated from the corresponding body size parameters, so in actual measurement each body size parameter to be measured can be obtained first, and the attribute value to be measured is then obtained from the obtained body size parameters.
Based on this, step S54 may specifically include: obtaining the corresponding body size parameter to be measured of the animal according to the horizontal shooting distance and the position information of the second key feature points corresponding to that body size, and obtaining the corresponding attribute value to be measured of the animal according to the obtained at least one body size parameter.
In this optional embodiment, in combination with the above description of the correspondence between external feature values and key feature points, the correspondence between body size parameters to be measured and key feature points is similar: the key feature points at the positions required for obtaining each body size parameter can be determined in advance, i.e. a correspondence is established between each body size to be measured and the key feature points at the relevant positions of the different animals to be measured.
Then, since there is usually an operational relationship between the body size parameters of the animal and the attribute to be measured (for example expressed as a formula), which can be determined from actual measurement experience, in this embodiment, after at least one body size parameter to be measured has been obtained in the above manner, the attribute value of the corresponding attribute to be measured of the animal, such as its weight, can be estimated from the predetermined operational relationship.
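For instance, a sketch of one such operational relationship; the heart-girth formula below (girth squared times body length, in inches, divided by 300, giving pounds) is a widely quoted empirical cattle estimate and is only an illustration, not the relationship used by the application.

def estimated_weight_kg(chest_girth_m, body_length_m):
    # Attribute value (weight) estimated from two body size parameters via an
    # empirical formula; coefficients are illustrative, not from the application.
    girth_in = chest_girth_m / 0.0254
    length_in = body_length_m / 0.0254
    weight_lb = girth_in ** 2 * length_in / 300.0
    return weight_lb * 0.45359237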
It should be understood that, for different attributes to be measured, the body size parameters required to obtain the attribute value may differ, and, as analysed above, the positions and number of second key feature points that need to be obtained differ between body size parameters. Therefore, in practical applications the body size parameters required for the attribute to be measured may be determined first, the second key feature points and other parameters needed to calculate those body size parameters may then be determined, and the position information of those second key feature points may be obtained in the manner analysed above.
To sum up, in the embodiment of the present application, in order to avoid the danger to the animal to be measured or to the measuring user caused by manual measurement when external features such as weight, height and body length need to be obtained, an image analysis method is provided for obtaining the external features to be measured. Specifically, an image collector is used to obtain a target image of the animal and the shooting posture information of the image collector when shooting that image; feature extraction is performed on the target image to obtain information, such as position information, of a plurality of key feature points of the animal; the horizontal shooting distance between the center point of the image collector and the animal is then determined from the position information of a first key feature point and the shooting posture information; and the corresponding external feature to be measured is obtained from that horizontal shooting distance and the position information of the second key feature points corresponding to the body size to be measured. Because the target image can be shot by an ordinary image collector, without requiring a professional depth camera, the equipment cost of obtaining animal external features is greatly reduced, the professional requirements on the user who shoots the image and the requirements on the site where the animal is located are lowered, and the universality of the animal external feature obtaining method provided by the application is improved.
Referring to fig. 8, which shows a schematic flow chart of yet another alternative example of the animal external feature obtaining method provided in the embodiment of the present application, this embodiment may be an optional refined implementation of the method proposed in the foregoing embodiments, without being limited to the refinement described here; as shown in fig. 8, the method may include:
step S81, acquiring a target image of an animal to be detected, and a shooting pitch angle and a shooting roll angle when the image collector shoots the target image;
for a specific implementation process of step S81, reference may be made to the description of the corresponding parts in the foregoing embodiments, and details are not repeated.
With regard to the shooting pitch angle and the shooting roll angle, in combination with the description of their meaning and acquisition in the above embodiment: if a certain shooting pitch angle exists when the image collector shoots the target image, then, as shown in fig. 7, the horizontal center line of the image collector (i.e., the X axis) is not parallel to the horizon but intersects it; if a certain shooting roll angle exists, the image of the animal to be detected in the shot image is tilted as shown in fig. 9a, rather than level as in the horizontally shot image shown in fig. 9b. Parameter calculation can be performed directly on the actual presentation of the animal in the target image, or tilt calibration can first be applied to the target image and the subsequent calculation performed on the calibrated image shown in fig. 9b; the present application does not limit this, and it can be determined according to the situation.
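As a hedged sketch of the tilt-calibration option mentioned above, assuming OpenCV is available, the snippet below rotates the target image by the negative of the shooting roll angle so that the animal appears level before further processing; the function name, the sign convention, and the interpolation choice are illustrative assumptions rather than the application's own implementation.

import cv2
import numpy as np

def calibrate_roll(image: np.ndarray, roll_deg: float) -> np.ndarray:
    """Rotate the image about its center by -roll_deg to counteract camera roll (assumed sign convention)."""
    h, w = image.shape[:2]
    rot = cv2.getRotationMatrix2D((w / 2, h / 2), -roll_deg, 1.0)
    return cv2.warpAffine(image, rot, (w, h), flags=cv2.INTER_LINEAR)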
In the process of shooting the animal to be detected with the image collector, feature point information from different positions of the animal is needed, so a complete image of the animal must be captured. An image preview interface of the image collector or of the computer device can display a preview of the image under the current shooting lens, and the shooting user can adjust the shooting angle, distance and so on according to actual requirements, so that the animal to be detected sits at the central, foreground position of the preview, i.e., at the focusing center, and is captured in full; the shooting button is then pressed to obtain the target image of the animal. The implementation is, however, not limited to this manner.
In still other embodiments, for each acquired frame of preview image, the computer device may obtain the contour information (i.e., edge information) of the animal to be detected in the preview image by using an edge detection algorithm and determine whether that contour information meets the shooting requirement. If it does, the frame is taken as the target image. If it does not, the image collector is instructed to adjust its shooting posture information, such as the shooting angle, the shooting distance or the zoom magnification, according to the degree of mismatch between the obtained contour information and the shooting requirement (zoom adjustment may rely on the digital or optical zoom function of the image collector and its computer device to change the size of the animal within the preview; the specific zoom adjustment process is not detailed in this embodiment), so that the contour information of the animal contained in the preview image obtained after the adjustment satisfies the shooting requirement. The specific implementation process is not described in detail in this embodiment.
It should be noted that the specific edge detection algorithm used to obtain the contour information of the animal to be detected is not limited; for example, a target detection algorithm based on semantic segmentation, a search-based edge detection algorithm, a zero-crossing-based edge detection algorithm and the like can be selected flexibly according to the actual application requirements, and are not described in detail in the present application.
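One hedged sketch of such a preview check, assuming OpenCV is available, uses Canny edges and the largest external contour to decide whether the animal is large enough in the frame and not cut off at the border; the thresholds, the margin, and the area criterion below are illustrative assumptions rather than values specified by the application.

import cv2
import numpy as np

def preview_meets_requirement(frame: np.ndarray,
                              min_area_ratio: float = 0.2,
                              border_margin: int = 10) -> bool:
    """Return True if the largest contour is big enough and does not touch the frame border."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return False
    largest = max(contours, key=cv2.contourArea)
    x, y, w, h = cv2.boundingRect(largest)
    fh, fw = gray.shape
    large_enough = (w * h) / (fw * fh) >= min_area_ratio      # animal occupies enough of the frame
    fully_inside = (x > border_margin and y > border_margin and
                    x + w < fw - border_margin and y + h < fh - border_margin)
    return large_enough and fully_inside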
Step S82, inputting the target image into an image feature extraction model to obtain a plurality of key feature points of the animal to be detected contained in the target image;
in this embodiment, the image feature extraction model may be obtained by training, with a machine learning algorithm, on sample images of sample animals, and is used to extract the key feature points of an animal of the same kind. The machine learning algorithm, such as a deep neural network, may be selected flexibly according to the requirements of the actual application scenario.
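One common way to realize such a model, assumed here purely for illustration, is a convolutional network that outputs one heatmap per key feature point; the keypoint position and its confidence are then read from each heatmap. The decoding step might look like the sketch below, where the network itself and the heatmap layout are assumptions, not elements specified by the application.

import numpy as np

def decode_heatmaps(heatmaps: np.ndarray, image_w: int, image_h: int):
    """heatmaps: array of shape (num_keypoints, H, W) from an assumed keypoint network.

    Returns a list of (x, y, confidence) in image pixel coordinates, taking the
    heatmap maximum as the keypoint location and its value as the confidence.
    """
    num_kp, hm_h, hm_w = heatmaps.shape
    keypoints = []
    for k in range(num_kp):
        idx = np.argmax(heatmaps[k])
        row, col = divmod(idx, hm_w)
        conf = float(heatmaps[k, row, col])
        x = col * image_w / hm_w          # rescale heatmap coordinates to image coordinates
        y = row * image_h / hm_h
        keypoints.append((x, y, conf))
    return keypoints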
In some embodiments provided by the application, in order to improve the reliability of the animal external feature obtaining method, a complete set of key feature points of the animal to be detected is needed. Therefore, after the plurality of key feature points contained in the target image have been extracted, an integrity verification can be performed on them: if the verification passes, the subsequent steps continue; if it fails, first prompt information can be output to remind the operator (i.e., the shooting user) to shoot the target image of the animal again.
Specifically, in one possible implementation, this embodiment may compute an overall score for the extracted key feature points from the confidence of each key feature point, the topological position relationship formed by their respective position information, and so on. If the overall score reaches a preset score, the key feature points of the animal are considered to have been extracted completely. Otherwise, the key feature points are considered incomplete and insufficient for calculating the body size parameters to be measured; in that case, corresponding prompt information can be output to remind the shooting user to capture the animal again, and feature extraction and verification are repeated on the newly acquired target image in the manner described above, so as to ensure that the key feature points of the animal are complete.
It should be noted that how to verify whether the plurality of key feature points of the mentioned animal to be tested are complete is not limited to the implementation method described above in this embodiment, and may be flexibly adjusted according to the actual application requirements, and the detailed description of the present application is omitted here.
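A minimal sketch of such an integrity check is given below; the weighting of per-point confidence against a simple detection-ratio check, and the preset score, are assumptions for illustration, since the application leaves the exact verification method open.

def integrity_score(keypoints, min_conf=0.3, preset_score=0.8):
    """keypoints: list of (x, y, confidence) tuples for all expected key feature points.

    Combines the mean confidence with the fraction of points whose confidence
    exceeds min_conf; returns (score, passed).
    """
    if not keypoints:
        return 0.0, False
    confs = [c for (_, _, c) in keypoints]
    mean_conf = sum(confs) / len(confs)
    detected_ratio = sum(c >= min_conf for c in confs) / len(confs)
    score = 0.5 * mean_conf + 0.5 * detected_ratio
    return score, score >= preset_score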
Step S83, outputting a plurality of key feature points of the animal to be detected contained in the target image;
step S84, responding to the position adjustment operation aiming at least one output key feature point to obtain a plurality of key feature point information of the animal to be detected;
in practical applications of this embodiment, the key feature points extracted from the target image may not be identified accurately enough. To ensure that parameter calculation based on their position information is reliable, the application proposes calibrating the identified key feature points, although the calibration is not limited to the manual scheme described in this embodiment.
As described above, in the key feature point calibration provided in this embodiment, the identified key feature points of the animal may first be output. Referring to the output schematic diagram shown in fig. 10, each extracted key feature point may be rendered as a black dot on the output target image containing the animal, so that the user can see on the display interface of the mobile terminal where each extracted point falls on the animal. If, based on experience, the user judges that one or more of the proposed key feature points is not accurate enough, that point can be selected and dragged to a more accurate position: as shown in fig. 10, the user may press a key feature point with a finger, move it in the direction of the arrow, and release the finger once the point has reached the position the user considers accurate, thereby adjusting the position of that key feature point.
During this adjustment, the electronic device may respond to the position adjustment operation for at least one output key feature point and obtain the corresponding key feature point information from the adjusted position of that key feature point.
It should be noted that manual calibration of the positions of the extracted key feature points is not limited to the adjustment manner described above; it may also be realized with the corresponding function keys of an input device of the electronic device (e.g., the direction keys of a keyboard or a preset click operation of a mouse), and may be determined according to the actual application scenario or by the user's individual operating habits, which the present application does not limit.
In addition, in practical applications, when the output of the pre-trained image feature extraction model is sufficiently accurate, the corresponding key feature point information can be obtained directly after the key feature points of the animal contained in the target image are obtained, without calibrating the positions of the extracted key feature points.
Step S85, determining a first distance between the first key feature point and the horizontal center line of the image collector by using the position information of the first key feature point;
in combination with the description of the corresponding part of the above embodiment, since the first key feature point is the key feature point, among the plurality of key feature points of the animal to be detected, whose distance from the ground is smaller than the first threshold, referring to fig. 7 and assuming the first key feature point is G and the horizontal center line of the image collector is the X axis in fig. 7, a ray perpendicular to the X axis can be generated from the first key feature point G, and the distance between G and the intersection of that ray with the X axis is taken as the first distance. Referring to figs. 9a and 9b, the first distance may be the vertical distance from the pixel point g to the X axis; the specific way of obtaining the first distance is not limited in the present application.
Step S86, obtaining a first included angle of the first key feature point relative to the horizontal center line of the image collector by using the first distance and the shooting roll angle;
in combination with the above description of the shooting roll angle and referring to the scene diagram shown in fig. 7, after the first distance from the first key feature point to the horizontal center line of the image collector is obtained, this embodiment may use the first distance and the shooting roll angle, together with the coordinate conversion relationships between the different coordinate systems (such as the image coordinate system and the world coordinate system), to obtain the first included angle ∠XOG of the first key feature point G relative to the horizontal center line OX of the image collector. According to actual requirements, the calculation of the first included angle can also take into account the current configuration parameters of the image collector, such as image resolution and zoom magnification; the specific way of obtaining the first included angle is not limited.
Step S87, obtaining a second included angle formed by the first key characteristic point and the first vertical point relative to the central point of the image collector by using the first included angle and the shooting pitch angle;
the first vertical point is the vertical projection point of the center point of the image collector onto the ground plane on which the animal to be detected stands. As shown in fig. 7, the angle ∠XOY formed by the horizontal center line OX and the vertical center line OY of the image collector is a right angle; therefore, the second included angle ∠GOP formed by the first key feature point G and the first vertical point P relative to the center point O of the image collector satisfies ∠GOP = 90° - ∠YOP - ∠XOG, where ∠YOP is the shooting pitch angle and ∠XOG is the first included angle obtained above. It should be understood that the first included angle, the shooting pitch angle and the second included angle in step S87 are all determined after conversion into the same coordinate system, so that the angle relationship above holds and step S87 can be executed reliably.
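The angle geometry of steps S85 to S87 can be sketched as below, assuming the roll has already been calibrated out (as in fig. 9b) and a simple pinhole camera model in which the vertical field of view converts a pixel offset into an angle; the camera model and the function names are assumptions, not steps the application prescribes in this form.

import math

def first_distance_px(g_y: float, image_height: int) -> float:
    """Vertical pixel distance of keypoint G from the horizontal center line of the image (step S85)."""
    return abs(g_y - image_height / 2)

def first_angle_deg(dist_px: float, image_height: int, vertical_fov_deg: float) -> float:
    """Pinhole approximation (assumed camera model): angle of G below the optical axis (step S86)."""
    focal_px = (image_height / 2) / math.tan(math.radians(vertical_fov_deg / 2))
    return math.degrees(math.atan(dist_px / focal_px))

def second_angle_deg(first_angle: float, pitch_deg: float) -> float:
    """Angle GOP between keypoint G and the vertical projection point P, per the relation in the text (step S87)."""
    return 90.0 - pitch_deg - first_angle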
Step S88, acquiring the shooting height of the image collector when shooting the target image;
in the image shooting process, the animal to be measured and the user usually stand on substantially flat ground at a distance of roughly 5-10 meters from each other (this is not a limitation and can be adapted to actual needs, for example according to the habits and body type of the animal); in this case the user taking the image and the animal can be regarded as standing on the same ground plane, as shown in fig. 7. The shooting height of the image collector can be estimated from the height of an ordinary person: for example, the user's height is 1.6-1.8 meters (adaptable to the actual situation), and when shooting, the image collector is usually held close to the user's eye level, so the shooting height may be taken as the user's height minus 0.1 meter (again merely an example that can be adapted), i.e., about 1.5-1.7 meters. Alternatively, the shooting height of the current user when using the image collector to photograph the animal can be input in advance according to the actual situation.
However, in some shooting scenes there may be a height difference, denoted the first height difference, between the ground on which the user taking the image stands and the ground on which the animal stands, as shown in fig. 11. The first height difference may be determined in advance and input as a configuration parameter of the image collector, so that it can be taken into account when the shot target image is processed in the manner described above. It should be noted that, in such a scene, the shooting height of the image collector acquired in this embodiment may be a relative shooting height, that is, the height of the center point O of the image collector relative to the ground plane on which the animal stands, such as the shooting height OP in fig. 11, which includes the first height difference between the two ground planes.
In one possible implementation, step S88 may include: retrieving the height of the operating body (i.e., the user) who shoots the target image and the first height difference, and obtaining the shooting height of the image collector relative to the ground plane on which the animal stands from the height of the operating body and the first height difference; the implementation is, however, not limited to this manner.
Step S89, obtaining the horizontal shooting distance between the image collector center point and the animal to be detected through tangent function operation by utilizing the shooting height;
referring to the shooting scenes shown in figs. 7 and 11, and in combination with the principles of trigonometric functions, the horizontal shooting distance GP between the image collector center point O and the animal to be detected can be obtained from the calculation formula GP = (H - 0.1 + m) × tan(∠GOP), where H denotes the height of the user shooting the image of the animal; 0.1 denotes the height difference between the user's height and the image collector; m denotes the first height difference mentioned above (with m = 0 when there is no height difference between the ground on which the user stands and the ground on which the animal stands); and tan() denotes the tangent function.
It should be noted that the calculation formula for calculating the horizontal shooting distance is not limited to the above-listed calculation formula, and can be flexibly adjusted according to the requirements of the actual application scenario, and the detailed description of the present application is omitted here.
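A minimal sketch of this distance calculation, under the same assumptions as the formula above (user height H, a 0.1 m offset between eye level and the camera, and an optional first height difference m), is:

import math

def horizontal_shooting_distance(user_height_m: float,
                                 second_angle_deg: float,
                                 first_height_diff_m: float = 0.0,
                                 camera_offset_m: float = 0.1) -> float:
    """GP = (H - 0.1 + m) * tan(angle GOP), as given in the text."""
    shooting_height = user_height_m - camera_offset_m + first_height_diff_m
    return shooting_height * math.tan(math.radians(second_angle_deg))

# Example: a 1.7 m tall user, angle GOP of 75 degrees, level ground
print(round(horizontal_shooting_distance(1.7, 75.0), 2))  # about 5.97 m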
Step S810, according to the respective position information of any two key feature points of the animal to be detected, obtaining a third included angle formed by the two corresponding key feature points relative to the central point of the image collector;
after the horizontal shooting distance GP from the image collector center point O to the animal to be detected has been obtained in the above manner, it can be used as a basic parameter for calculating the various external features of the animal. Specifically, in combination with the above analysis, the two key feature points required for a given external feature to be measured are determined first, and the third included angle formed by those two key feature points relative to the image collector center point O can then be calculated from their respective position information.
Illustratively, referring to the shooting scene diagram shown in fig. 7, for key feature points A and B (which may correspond to the key feature points a and b shown in figs. 9a and 9b), the corresponding third included angle is denoted ∠AOB. Similarly, for any other two key feature points on the animal, the acute angle formed by connecting them with the center point O of the image collector can be used as the third included angle in the same way; the implementation of calculating the third included angle from the position information of the two key feature points is not detailed in the present application.
Step S811, obtaining a feature point distance between two corresponding key feature points by using the third included angle and the horizontal shooting distance;
following the above analysis and still taking the third included angle ∠AOB shown in fig. 7 as the example, once the horizontal shooting distance GP between the animal to be detected and the image collector center point O is available, the feature point distance AB between the two key feature points can be calculated, using the principles of trigonometric functions, as AB = GP × sin(∠AOB).
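A direct sketch of this step, treating the relation stated in the text as given:

import math

def feature_point_distance(horizontal_distance_m: float, third_angle_deg: float) -> float:
    """AB = GP * sin(angle AOB), per the relation stated in the text."""
    return horizontal_distance_m * math.sin(math.radians(third_angle_deg))

# Example: GP of 6 m and an angle AOB of 16 degrees
print(round(feature_point_distance(6.0, 16.0), 2))  # about 1.65 m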
Step S812, obtaining corresponding body size parameters to be measured of the animal to be measured by using the obtained feature point distance between at least two second key feature points corresponding to each body size to be measured;
step S813, obtaining a corresponding attribute value to be measured of the animal to be measured according to the obtained at least one body size parameter to be measured.
In combination with the above description of the relationship between body size parameters and key feature points of the animal, the second key feature points corresponding to a body size parameter can be determined according to the body size parameter that actually needs to be measured, and that parameter can then be calculated from the feature point distances between the determined second key feature points. For the key feature points A and B exemplified above, the chest size of the animal can be calculated from the distance between them; similarly, as shown for key feature points C and D in fig. 10, the feature point distance between C and D obtained in the above manner gives the body length of the animal, and so on.
It should be understood that some body size parameters of the animal may require the feature point distances between several second key feature points, rather than being obtained directly from the distance between just two second key feature points as listed above; this can be determined according to actual needs and is not detailed in the present application.
After the body size parameters of the animal to be measured have been obtained, the corresponding attribute value to be measured can be further calculated from an empirical formula relating the body size parameters to the attribute. For example, regression analysis may be used to determine a calculation relationship between the body size parameters and the weight of the animal; once the corresponding body size parameters have been obtained in the above manner, the weight is then obtained from that relationship. Attribute values of other types of attributes are obtained similarly and are not described in detail in this application.
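As a hedged sketch of such a regression step, a least-squares fit of weight against chest size and body length over previously measured animals could look as follows; the data values, the linear model form, and the feature choice are illustrative assumptions, not figures from the application.

import numpy as np

# Hypothetical training data: (chest size m, body length m) -> measured weight kg
body_sizes = np.array([[1.5, 1.3], [1.7, 1.5], [1.9, 1.6], [2.1, 1.8]])
weights = np.array([250.0, 330.0, 410.0, 510.0])

# Fit weight ~ w0 + w1 * chest + w2 * length by ordinary least squares
X = np.hstack([np.ones((len(body_sizes), 1)), body_sizes])
coeffs, *_ = np.linalg.lstsq(X, weights, rcond=None)

def predict_weight(chest_m: float, length_m: float) -> float:
    return float(coeffs @ np.array([1.0, chest_m, length_m]))

print(round(predict_weight(1.8, 1.55), 1))  # interpolated weight estimate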
In summary, in this embodiment, in order to avoid the risk of injury to the animal to be measured or to the measuring user caused by manual measurement when external features such as weight and height need to be obtained, an image analysis approach is provided so that the external features of the animal are measured without contact, which also removes the heavy physical labour of manual measurement.
Specifically, in order to reduce equipment cost and the professional requirements on the photographer, an ordinary optical camera (i.e., the image collector) can be used to photograph the animal to be measured. The relevant shooting parameters of the target image and the position information of the key feature points of the animal in that image are then used to compute the horizontal shooting distance between the animal and the image collector center point, from which the feature point distance between any two key feature points is obtained, and the body size parameters and attribute values to be measured are calculated in turn. This meets the need of the application scenario to measure various external features of the animal and improves the universality of the animal external feature obtaining method.
In the above embodiment, to give a clearer understanding of the animal external feature obtaining method provided by the present application, the computer device is taken to be a user's mobile phone and the scenario is measuring the weight of a yak. It should be understood that the computer device for executing the scheme of the present application is not limited to a mobile phone, that the animal to be measured is not limited to a yak and may also be another large animal such as a camel, a horse or an elephant, and that the attribute to be measured (i.e., the external feature value of the animal body) is not limited to the weight; these may be determined according to the actual measurement requirements and are not described in detail here.
In combination with the application scenario diagram shown in fig. 4, when the weight of the yak needs to be measured, the user may point the camera of the mobile phone at the yak to be measured and check, through the preview frame on the display screen, whether the currently acquired image meets the measurement requirement; if the yak is centred in the preview frame and its complete contour is included, the requirement is met, the shooting button is pressed, and the obtained image is taken as the target image. Feature extraction is then performed on the target image by the image feature extraction model built into the mobile phone, and the extracted key feature points of the yak are displayed on the image, such as the 16 key feature points shown in fig. 10, but not limited thereto. During this period, the user may check whether each displayed key feature point of the yak is accurate and, if not, manually adjust the position of the corresponding key feature point to obtain an ideal image of the yak to be measured; the adjustment process may refer to, but is not limited to, the description of the corresponding part of the above embodiment. Of course, the mobile phone may also use default parameters and carry out the subsequent processing on the obtained image directly, without the adjustment step, as determined by the actual situation.
Then, the mobile phone may use the shooting posture information recorded when the ideal target image was shot (for its content refer to the corresponding part of the above embodiment) together with parameters input by the user, such as the user's height H, the first height difference between the ground planes on which the user and the animal stand (which may be controlled to within 10 cm or determined according to the actual situation), the height difference between the shooting height of the camera and the user's height (such as 0.1 meter, but not limited thereto), and the position of the contact point between the yak and the ground plane, to estimate, in the processing manner described in the above embodiment, the horizontal shooting distance between the yak and the user; the body size parameters of the yak are then calculated in combination with the position information of each key feature point, and the weight of the yak is further estimated from those body sizes.
It should be understood that the process of obtaining other attribute values of the yak to be measured, and of obtaining the external feature values of other animals, is similar and is not described in detail in this application. In addition, the application scenarios of the animal external feature obtaining method provided by the application are not limited to photographing a yak with a mobile phone and can be adapted to actual needs.
Referring to fig. 12, which shows a schematic structural diagram of an alternative example of the animal external feature obtaining apparatus proposed in the present application and applicable to the above computer device, the apparatus may include:
the information acquisition module 121 is configured to acquire a target image of an animal to be detected and shooting posture information of the image collector when shooting the target image;
a feature extraction module 122, configured to perform feature extraction on the target image to obtain a plurality of pieces of key feature point information of the animal to be tested, where the key feature point information includes position information of a corresponding key feature point in the target image;
in one possible implementation, the feature extraction module 122 may include:
the model processing unit is used for inputting the target image into an image feature extraction model to obtain a plurality of key feature points of the animal to be detected contained in the target image;
the key feature point output unit is used for outputting a plurality of key feature points of the animal to be detected contained in the target image;
and the position adjusting unit is used for responding to the position adjusting operation aiming at the output at least one key characteristic point to obtain a plurality of pieces of key characteristic point information of the animal to be detected.
Further, in order to improve the processing reliability and accuracy, the animal external feature obtaining apparatus provided by the present application may further include:
the verification module is used for carrying out integrity verification on the obtained multiple key characteristic points of the animal to be tested; if the verification is passed, executing a subsequent process;
and the prompt information output module is used for outputting first prompt information under the condition that the result of the verification module is that the verification fails, wherein the first prompt information is used for reminding the operator to newly shoot the target image of the animal to be detected.
A horizontal shooting distance obtaining module 123, configured to obtain a horizontal shooting distance between the image collector center point and the animal to be tested according to the shooting posture information and position information of a first key feature point, where the first key feature point is a key feature point, of a plurality of key feature points of the animal to be tested, whose distance from the ground is smaller than a first threshold;
and the external characteristic value obtaining module 124 is configured to obtain a corresponding external characteristic value to be detected of the animal to be detected according to the horizontal shooting distance and the position information of the second key characteristic point corresponding to the external feature to be detected.
In some embodiments, since the external feature to be measured may include at least one body size to be measured and/or at least one attribute to be measured of the animal to be measured, as shown in fig. 13, the external feature value obtaining module 124 to be measured may include:
the body ruler parameter obtaining unit is used for obtaining the body ruler parameters to be measured corresponding to the animal to be measured according to the horizontal shooting distance and the position information of the second key feature point corresponding to the body ruler to be measured;
and the attribute value obtaining unit is used for obtaining the corresponding attribute value to be tested of the animal to be tested according to the obtained at least one body size parameter to be tested.
In some embodiments provided in the application, the shooting attitude information may include the shooting pitch angle, the shooting roll angle, the shooting zoom magnification and other information; the meaning and acquisition of each piece of shooting attitude information may refer to the description of the corresponding parts of the above embodiments and is not repeated here. Based on this, as shown in fig. 13, the above-mentioned horizontal shooting distance obtaining module 123 may include:
a first distance determining unit 1231, configured to determine, by using the position information of the first key feature point, a first distance between the first key feature point and a horizontal center line of the image collector;
a first included angle obtaining unit 1232, configured to obtain, by using the first distance and the shooting roll angle, a first included angle of the first key feature point with respect to the horizontal center line;
a second included angle obtaining unit 1233, configured to obtain, by using the first included angle and the shooting pitch angle, a second included angle formed by the first key feature point and a first vertical point with respect to the center point of the image collector, where the first vertical point is a vertical projection point of the center point of the image collector with respect to a ground plane on which the animal to be detected stands;
a shooting height acquiring unit 1234, configured to acquire a shooting height of the image collector when the target image is shot;
in a possible implementation, if there is a first height difference between the ground on which the operating body stands and the ground on which the animal to be tested stands, as shown in fig. 11, the shooting height acquiring unit 1234 may include:
the parameter calling unit is used for calling the height of an operation body for shooting the target image and the first height difference;
and the shooting height obtaining unit is used for obtaining the shooting height of the image collector relative to the ground plane on which the animal to be detected stands according to the height of the operation body and the first height difference.
And a horizontal shooting distance obtaining unit 1235, configured to obtain, by using the shooting height and through tangent function operation, a horizontal shooting distance between the image collector center point and the animal to be detected.
In combination with the description of the above embodiment, as shown in fig. 14, the above body size parameter obtaining unit may include:
a third included angle obtaining unit 1241, configured to obtain, according to respective position information of any two key feature points of the animal to be detected, a third included angle formed by the two corresponding key feature points with respect to the center point of the image collector;
a feature point distance obtaining unit 1242, configured to obtain a feature point distance between two corresponding key feature points by using the third included angle and the horizontal shooting distance;
and a to-be-measured body ruler parameter obtaining unit 1243, configured to obtain a to-be-measured body ruler parameter corresponding to the to-be-measured animal by using the obtained feature point distance between the at least two second key feature points corresponding to each type of body ruler to be measured.
It should be noted that, various modules, units, and the like in the embodiments of the foregoing apparatuses may be stored in the memory as program modules, and the processor executes the program modules stored in the memory to implement corresponding functions, and for the functions implemented by the program modules and their combinations and the achieved technical effects, reference may be made to the description of corresponding parts in the embodiments of the foregoing methods, which is not described in detail in this embodiment.
The present application also provides a storage medium on which a computer program can be stored, the computer program can be called and loaded by a processor to implement the steps of the animal external feature obtaining method described in the above embodiments.
The application also provides an animal external feature acquisition system which can comprise a mobile terminal and a server.
In some embodiments, in combination with the above analysis, the mobile terminal may have an image collector and a certain image processing capability, with the structure described above for the computer device, so that it can execute each step of the animal external feature obtaining method provided by the present application.
In practical applications, to facilitate subsequent queries of each animal's state and of its external feature information at different stages, the mobile terminal may also associate the obtained external feature value of the animal with the corresponding acquisition time and upload it to the server for storage. If required, the shot target image of the animal and other shooting parameters can be uploaded to the server at the same time, so that they can be retrieved and checked later.
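A hedged sketch of such an upload, with a hypothetical endpoint and payload layout (nothing in the application fixes these), might be:

import json
import time
import urllib.request

def upload_measurement(server_url: str, animal_id: str, features: dict) -> int:
    """POST the measured external feature values together with the acquisition time."""
    payload = json.dumps({
        "animal_id": animal_id,
        "acquired_at": int(time.time()),
        "features": features,            # e.g. {"body_length_m": 1.55, "weight_kg": 382.4}
    }).encode("utf-8")
    req = urllib.request.Request(server_url, data=payload,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return resp.status

# Example call with a hypothetical endpoint:
# upload_measurement("https://example.com/api/measurements", "yak-0012",
#                    {"body_length_m": 1.55, "weight_kg": 382.4})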
In still other embodiments, the mobile terminal serves only as an image capturing device and does not perform image processing. In that case, in combination with the description of the corresponding part of the above embodiments, the user may use the mobile terminal to capture the target image of the animal and upload it with the relevant shooting parameters to the server; the server then executes the steps of the animal external feature obtaining method provided by the present application, obtains the required external feature values and feeds them back to the mobile terminal for output, so as to reduce the processing load of the mobile terminal and improve the processing efficiency.
In addition, in the application scenario, a user may use the mobile terminal to shoot images of a plurality of animals to be tested and upload the images to the server, so that external features of the plurality of animals to be tested can be obtained at the same time, and a specific implementation process may refer to the description of the corresponding part in the above embodiment, which is not described in detail in this embodiment.
Finally, it should be noted that, in the present specification, the embodiments are described in a progressive or parallel manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments may be referred to each other. The device, the computer device and the system disclosed by the embodiment correspond to the method disclosed by the embodiment, so that the description is relatively simple, and the relevant points can be referred to the method part for description.
Those of skill would further appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both, and that the various illustrative components and steps have been described above generally in terms of their functionality in order to clearly illustrate this interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in Random Access Memory (RAM), memory, Read Only Memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present application. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the disclosure. Thus, the present application is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (10)

1. A method for obtaining external features of an animal, the method comprising:
acquiring a target image of an animal to be detected and shooting posture information when an image collector shoots the target image;
extracting features of the target image to obtain a plurality of pieces of key feature point information of the animal to be detected, wherein the key feature point information comprises position information of corresponding key feature points in the target image;
acquiring a horizontal shooting distance between the center point of the image collector and the animal to be detected according to the shooting posture information and position information of a first key feature point, wherein the first key feature point is a key feature point, among the plurality of key feature points of the animal to be detected, whose distance from the ground is smaller than a first threshold value;
and obtaining the corresponding external characteristic value to be detected of the animal to be detected according to the horizontal shooting distance and the position information of the second key characteristic point corresponding to the external characteristic to be detected.
2. The method according to claim 1, wherein the external feature to be measured comprises at least one body size to be measured and/or at least one attribute to be measured of the animal to be measured, and the obtaining of the external feature value to be measured corresponding to the animal to be measured according to the horizontal shooting distance and the position information of the second key feature point corresponding to the external feature to be measured comprises:
Obtaining corresponding parameters of the body scale to be measured of the animal to be measured according to the horizontal shooting distance and the position information of the second key feature point corresponding to the body scale to be measured;
and obtaining a corresponding attribute value to be detected of the animal to be detected according to the obtained at least one body size parameter to be detected.
3. The method of claim 2, wherein the shooting attitude information includes a shooting pitch angle and a shooting roll angle, and the obtaining of the horizontal shooting distance between the image collector center point and the animal to be tested according to the shooting attitude information and the position information of the first key feature point comprises:
determining a first distance between the first key feature point and a horizontal center line of the image collector by using the position information of the first key feature point;
obtaining a first included angle of the first key characteristic point relative to the horizontal center line by using the first distance and the shooting roll angle;
obtaining a second included angle formed by the first key characteristic point and a first vertical point relative to the central point of the image collector by using the first included angle and the shooting pitch angle, wherein the first vertical point is a vertical projection point of the central point of the image collector relative to a ground plane on which the animal to be detected stands;
acquiring the shooting height of the image collector when the target image is shot;
and obtaining the horizontal shooting distance between the central point of the image collector and the animal to be detected by utilizing the shooting height and performing tangent function operation.
4. The method according to claim 2 or 3, wherein the obtaining of the body size parameter of the animal to be measured according to the horizontal shooting distance and the position information of the second key feature point corresponding to the body size to be measured comprises:
according to the respective position information of any two key feature points of the animal to be detected, obtaining a third included angle formed by the two corresponding key feature points relative to the central point of the image collector;
obtaining a characteristic point distance between two corresponding key characteristic points by using the third included angle and the horizontal shooting distance;
and obtaining corresponding parameters of the body scale to be measured of the animal to be measured by using the obtained characteristic point distance between the at least two second key characteristic points corresponding to each body scale to be measured.
5. The method according to claim 3, wherein, if a first height difference exists between the ground on which the operating body stands and the ground on which the animal to be tested stands, the obtaining of the shooting height of the image collector when the target image is shot comprises:
calling the height of an operation body for shooting the target image and the first height difference;
and acquiring the shooting height of the image collector relative to the ground plane on which the animal to be detected stands according to the height of the operating body and the first height difference.
6. The method according to claim 1, wherein the performing feature extraction on the target image to obtain a plurality of pieces of key feature point information of the animal to be tested, which are included in the target image, comprises:
inputting the target image into an image feature extraction model to obtain a plurality of key feature points of the animal to be detected, wherein the key feature points are contained in the target image;
outputting a plurality of key feature points of the animal to be detected contained in the target image;
and responding to the position adjustment operation aiming at the output at least one key characteristic point to obtain a plurality of pieces of key characteristic point information of the animal to be detected.
7. The method of claim 6, further comprising:
performing integrity verification on the obtained multiple key characteristic points of the animal to be tested;
and if the verification fails, outputting first prompt information, wherein the first prompt information is used for reminding an operator to newly shoot the target image of the animal to be detected.
8. An animal external feature obtaining apparatus, characterized in that the apparatus comprises:
the information acquisition module is used for acquiring a target image of an animal to be detected and shooting posture information when the image collector shoots the target image;
the characteristic extraction module is used for extracting characteristics of the target image to obtain a plurality of pieces of key characteristic point information of the animal to be detected, wherein the key characteristic point information comprises position information of corresponding key characteristic points in the target image;
the horizontal shooting distance obtaining module is used for obtaining a horizontal shooting distance between the center point of the image collector and the animal to be detected according to the shooting posture information and the position information of a first key feature point, wherein the first key feature point is a key feature point, among the plurality of key feature points of the animal to be detected, whose distance from the ground is smaller than a first threshold value;
and the external characteristic value obtaining module is used for obtaining the corresponding external characteristic value to be detected of the animal to be detected according to the horizontal shooting distance and the position information of the second key characteristic point corresponding to the body scale to be detected.
9. A computer device, characterized in that the computer device comprises:
a communication interface;
a memory for storing a program for implementing the animal external feature obtaining method according to any one of claims 1 to 7;
a processor for loading and executing the program to realize the steps of the animal external feature obtaining method according to any one of claims 1 to 7.
10. The computer device of claim 9, wherein the computer device is a mobile terminal or a server;
when the computer device is the mobile terminal, the computer device further includes:
the image collector is used for shooting an image of an animal to be detected;
the display screen is used for outputting the image of the animal to be detected, which is shot by the image collector;
the multi-posture sensor is used for sensing shooting posture information when the image collector shoots a target image of the animal to be detected;
when the computer equipment is a server, the communication interface is used for receiving a target image of the animal to be detected, which is shot and sent by the image collector, and shooting attitude information when the image collector shoots the target image; and feeding back the corresponding external characteristic value to be detected of the animal to be detected to a preset mobile terminal for outputting.
CN202010940011.3A 2020-09-09 2020-09-09 Animal external feature obtaining method and device and computer equipment Pending CN112040132A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202010940011.3A CN112040132A (en) 2020-09-09 2020-09-09 Animal external feature obtaining method and device and computer equipment
PCT/CN2020/118607 WO2022052189A1 (en) 2020-09-09 2020-09-29 Method and device for acquiring external features of animal, and computer device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010940011.3A CN112040132A (en) 2020-09-09 2020-09-09 Animal external feature obtaining method and device and computer equipment

Publications (1)

Publication Number Publication Date
CN112040132A true CN112040132A (en) 2020-12-04

Family

ID=73584451

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010940011.3A Pending CN112040132A (en) 2020-09-09 2020-09-09 Animal external feature obtaining method and device and computer equipment

Country Status (2)

Country Link
CN (1) CN112040132A (en)
WO (1) WO2022052189A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114931112B (en) * 2022-04-08 2024-01-26 南京农业大学 Sow body ruler detection system based on intelligent inspection robot

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120213412A1 (en) * 2011-02-18 2012-08-23 Fujitsu Limited Storage medium storing distance calculation program and distance calculation apparatus
CN103206919A (en) * 2012-07-31 2013-07-17 广州三星通信技术研究有限公司 Device and method used for measuring object size in portable terminal
CN107180438A (en) * 2017-04-26 2017-09-19 清华大学 Estimate yak body chi, the method for body weight and corresponding portable computer device
US20180045504A1 (en) * 2015-02-11 2018-02-15 Huawei Technologies Co., Ltd. Object Dimension Measurement Method and Apparatus
CN108240793A (en) * 2018-01-26 2018-07-03 广东美的智能机器人有限公司 Dimension of object measuring method, device and system

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5927050B2 (en) * 2012-06-07 2016-05-25 一般財団法人電力中央研究所 Method for measuring the size of any part of an animal photographed with a camera trap
CN103759701B (en) * 2014-01-13 2016-03-09 南通大学 Based on the cell phone intelligent distance-finding method of Android platform
CN107764233B (en) * 2016-08-15 2020-09-04 杭州海康威视数字技术股份有限公司 Measuring method and device
JP7284500B2 (en) * 2018-04-25 2023-05-31 株式会社ノア weight estimator
CN110782467B (en) * 2019-10-24 2023-05-30 新疆农业大学 Horse body ruler measuring method based on deep learning and image processing
CN110693499B (en) * 2019-11-14 2023-10-24 河北农业大学 System and method for detecting animal body ruler and weight

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113518180A (en) * 2021-05-25 2021-10-19 宁夏宁电电力设计有限公司 Vehicle-mounted camera mounting method for electric power working vehicle
CN113518180B (en) * 2021-05-25 2022-08-05 宁夏宁电电力设计有限公司 Vehicle-mounted camera mounting method for electric power working vehicle

Also Published As

Publication number Publication date
WO2022052189A1 (en) 2022-03-17

Similar Documents

Publication Publication Date Title
US11627726B2 (en) System and method of estimating livestock weight
CN111627072B (en) Method, device and storage medium for calibrating multiple sensors
US20130083990A1 (en) Using Videogrammetry to Fabricate Parts
CN108781267A (en) Image processing equipment and method
CN106524909B (en) Three-dimensional image acquisition method and device
WO2019080046A1 (en) Drift calibration method and device for inertial measurement unit, and unmanned aerial vehicle
CN112040132A (en) Animal external feature obtaining method and device and computer equipment
CN110428372B (en) Depth data and 2D laser data fusion method and device and storage medium
CN103986854A (en) Image processing apparatus, image capturing apparatus, and control method
CN112434546A (en) Face living body detection method and device, equipment and storage medium
CN113132717A (en) Data processing method, terminal and server
US20130300899A1 (en) Information processing device, information processing method, and program
CN110731076A (en) Shooting processing method and device and storage medium
CN106197366A (en) Distance information processing method and device
JP7207561B2 (en) Size estimation device, size estimation method, and size estimation program
CN113237556A (en) Temperature measurement method and device and computer equipment
WO2019198611A1 (en) Feature estimation device and feature estimation method
KR20180094554A (en) Apparatus and method for reconstructing 3d image
KR102313437B1 (en) Method for estimation of bathymetry using hyperspectral image
WO2020008711A1 (en) Learning device, learning system, and learning method
US9159131B2 (en) Data matching method and data matching apparatus for matching data groups according to alignment parameters
JP2020194454A (en) Image processing device and image processing method, program, and storage medium
US20220172306A1 (en) Automated mobile field scouting sensor data and image classification devices
CN110930344B (en) Target quality determination method, device and system and electronic equipment
JPWO2022070956A5 (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20201204)