CN112465910A - Target shooting distance obtaining method and device, storage medium and electronic equipment - Google Patents

Target shooting distance obtaining method and device, storage medium and electronic equipment

Info

Publication number
CN112465910A
Authority
CN
China
Prior art keywords
image
shooting distance
target
face
area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011356698.2A
Other languages
Chinese (zh)
Other versions
CN112465910B (en
Inventor
赵小诣
吕文勇
程序
刘洪江
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chengdu New Hope Finance Information Co Ltd
Original Assignee
Chengdu New Hope Finance Information Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chengdu New Hope Finance Information Co Ltd filed Critical Chengdu New Hope Finance Information Co Ltd
Priority to CN202011356698.2A priority Critical patent/CN112465910B/en
Publication of CN112465910A publication Critical patent/CN112465910A/en
Application granted granted Critical
Publication of CN112465910B publication Critical patent/CN112465910B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/77 Determining position or orientation of objects or cameras using statistical methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/60 Analysis of geometric attributes
    • G06T7/62 Analysis of geometric attributes of area, perimeter, diameter or volume
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168 Feature extraction; Face representation
    • G06V40/171 Local features and components; Facial parts; Occluding parts, e.g. glasses; Geometrical relationships
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20081 Training; Learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30196 Human being; Person
    • G06T2207/30201 Face

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Geometry (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Probability & Statistics with Applications (AREA)
  • Image Analysis (AREA)
  • Studio Devices (AREA)

Abstract

In the target shooting distance obtaining method and device, when an image completely contains a target, the ratio of the area of the target to the equivalent area of the image is obtained, and a first shooting distance of the target is determined according to the ratio. In the prior art, obtaining the shooting distance requires either adding extra equipment or adding a reference object of known size to the image. Here, the ratio is obtained from the target area and the image equivalent area, and the shooting distance is determined from that ratio; neither a reference object nor other (binocular or infrared) equipment is required, thereby overcoming the problems of the prior art.

Description

Target shooting distance obtaining method and device, storage medium and electronic equipment
Technical Field
The application relates to the field of image recognition, in particular to a target shooting distance obtaining method and device, a storage medium and electronic equipment.
Background
With the development of computer networks, recognition technologies relying on machines are widely applied by people, such as face recognition, license plate recognition and the like. Among them, recognition of a portrait shooting distance based on computer vision is also of interest to those skilled in the art.
At present, there are three main computer-vision methods for identifying a portrait shooting distance. The first is based on triangular transformation within the image: an object of known size must be found in the image to serve as a distance reference, and the face distance is obtained through complex geometric transformations of the positional relationships in the image, such as triangulation. In practice, however, the backgrounds of customer photographs vary greatly, and an object of known size suitable as a distance reference is hard to find. The second is based on binocular disparity: two cameras separated by a known baseline capture the same object, and the distance of the object is calculated by extracting and matching key points across the two images. The third is depth ranging based on new TOF (time-of-flight) hardware: the hardware actively emits infrared light, and the collected infrared data are analyzed to calculate the face shooting distance. The second and third methods carry relatively high hardware costs and are not suitable for mobile phones.
Identifying the portrait shooting distance based on computer vision while overcoming these defects has become an urgent problem for those skilled in the art.
Disclosure of Invention
An object of the present application is to provide a method, an apparatus, a storage medium, and an electronic device for obtaining a target shooting distance, so as to at least partially improve the above problems.
In order to achieve the above purpose, the embodiments of the present application employ the following technical solutions:
in a first aspect, an embodiment of the present application provides a method for obtaining a target shooting distance, where the method includes:
when the image completely contains the target, acquiring the ratio of the area of the target to the equivalent area of the image;
wherein the image equivalent area is an area matching a width of the image;
and determining a first shooting distance of the target according to the ratio.
In a second aspect, an embodiment of the present application provides an apparatus for acquiring a shooting distance of a target, where the apparatus includes:
the proportion acquiring unit is used for acquiring the ratio of the area of the target to the equivalent area of the image when the image completely contains the target;
wherein the image equivalent area is an area matching a width of the image;
and the distance acquisition unit is used for determining a first shooting distance of the target according to the ratio value.
In a third aspect, the present application provides a storage medium, on which a computer program is stored, and the computer program, when executed by a processor, implements the method described above.
In a fourth aspect, an embodiment of the present application provides an electronic device, including: a processor and memory for storing one or more programs; the one or more programs, when executed by the processor, implement the methods described above.
Compared with the prior art, in the target shooting distance obtaining method and device, storage medium, and electronic device provided herein, when the image completely contains the target, the ratio of the area of the target to the equivalent area of the image is obtained, and the first shooting distance of the target is determined according to the ratio. Prior-art approaches to obtaining the shooting distance require either adding extra equipment or adding a reference object of known size to the image. Here, the ratio is obtained from the target area and the image equivalent area, and the shooting distance is determined from that ratio; neither a reference object nor other (binocular or infrared) equipment is required, thereby overcoming the problems of the prior art.
In order to make the aforementioned objects, features and advantages of the present application more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required by the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present application and therefore should not be considered as limiting its scope; those skilled in the art can obtain other related drawings from them without inventive effort.
Fig. 1 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure;
fig. 2 is a schematic flowchart of a target shooting distance obtaining method according to an embodiment of the present application;
fig. 3 is a schematic view of face images with different aspect ratios according to an embodiment of the present application;
fig. 4 is a graph, provided in an embodiment of the present application, of the similarity between the image equivalent area and the image real area for different values of the aspect ratio;
fig. 5 is a schematic flowchart of a target shooting distance obtaining method according to an embodiment of the present application;
fig. 6 is a schematic flowchart of a target shooting distance obtaining method according to an embodiment of the present application;
fig. 7 is a schematic diagram of a regression verification result provided in the embodiment of the present application;
FIG. 8 is a schematic diagram of a sample provided by an embodiment of the present application;
fig. 9 is a schematic unit diagram of a target shooting distance acquisition apparatus according to an embodiment of the present application.
In the figure: 10-a processor; 11-a memory; 12-a bus; 13-a communication interface; 201-proportion obtaining unit; 202-distance acquisition unit.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures. Meanwhile, in the description of the present application, the terms "first", "second", and the like are used only for distinguishing the description, and are not to be construed as indicating or implying relative importance.
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
In the description of the present application, it should be noted that the terms "upper", "lower", "inner", "outer", and the like indicate orientations or positional relationships based on orientations or positional relationships shown in the drawings or orientations or positional relationships conventionally found in use of products of the application, and are used only for convenience in describing the present application and for simplification of description, but do not indicate or imply that the referred devices or elements must have a specific orientation, be constructed in a specific orientation, and be operated, and thus should not be construed as limiting the present application.
In the description of the present application, it is also to be noted that, unless otherwise explicitly specified or limited, the terms "disposed" and "connected" are to be interpreted broadly, e.g., as being either fixedly connected, detachably connected, or integrally connected; can be mechanically or electrically connected; they may be connected directly or indirectly through intervening media, or they may be interconnected between two elements. The specific meaning of the above terms in the present application can be understood in a specific case by those of ordinary skill in the art.
Some embodiments of the present application will be described in detail below with reference to the accompanying drawings. The embodiments described below and the features of the embodiments can be combined with each other without conflict.
At present, there are three main computer-vision methods for identifying a portrait shooting distance. The first is based on triangular transformation within the image: an object of known size must be found in the image to serve as a distance reference, and the face distance is obtained through complex geometric transformations of the positional relationships in the image, such as triangulation; however, in practice the backgrounds of customer photographs vary greatly, and such a reference object is hard to find. The second is based on binocular disparity: two cameras separated by a known baseline capture the same object, and the object's distance is calculated by extracting and matching key points across the two images. The third is depth ranging based on new TOF (time-of-flight) hardware: infrared light is actively emitted, and the collected infrared data are analyzed to calculate the face shooting distance. The second and third methods carry relatively high hardware costs and are not suitable for mobile phones. That is, current shooting distance recognition methods either require a reference object of known size to be present in the image or incur extra hardware cost.
The embodiment of the present application provides a method for acquiring a target shooting distance, which identifies the portrait shooting distance based on computer vision while overcoming the above defects; details follow.
The embodiment of the application provides electronic equipment which can be computer equipment or other intelligent terminals. Please refer to fig. 1, a schematic structural diagram of an electronic device. The electronic device comprises a processor 10, a memory 11, a bus 12. The processor 10 and the memory 11 are connected by a bus 12, and the processor 10 is configured to execute an executable module, such as a computer program, stored in the memory 11.
The processor 10 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the target shooting distance obtaining method may be implemented by an integrated logic circuit of hardware in the processor 10 or by instructions in the form of software. The processor 10 may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
The memory 11 may comprise a high-speed Random Access Memory (RAM) and may further comprise a non-volatile memory, such as at least one disk memory.
The bus 12 may be an ISA (Industry Standard Architecture) bus, a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, or the like. Only one bidirectional arrow is shown in fig. 1, but this does not indicate that there is only one bus 12 or one type of bus 12.
The memory 11 is used for storing programs, such as programs corresponding to the target shooting distance acquisition means. The target shooting distance acquiring means includes at least one software function module that may be stored in the memory 11 in the form of software or firmware (firmware) or solidified in an Operating System (OS) of the electronic device. The processor 10, upon receiving the execution instruction, executes the program to implement the target shooting distance acquisition method.
Possibly, the electronic device provided by the embodiment of the present application further includes a communication interface 13. The communication interface 13 is connected to the processor 10 via a bus. The processor 10 may receive an image transmitted by another terminal through the communication interface 13, and optionally, the image includes an object with a shooting distance to be identified.
It should be understood that the structure shown in fig. 1 is merely a structural schematic diagram of a portion of an electronic device, which may also include more or fewer components than shown in fig. 1, or have a different configuration than shown in fig. 1. The components shown in fig. 1 may be implemented in hardware, software, or a combination thereof.
The method for obtaining the shooting distance of the target according to the embodiment of the present invention can be applied to, but is not limited to, the electronic device shown in fig. 1, and please refer to fig. 2:
and S108, when the image completely contains the target, acquiring the ratio of the area of the target to the equivalent area of the image.
Wherein the image equivalent area is an area matching the width of the image.
In actual production, the sizes and aspect ratios of images captured by different terminals are inconsistent, so directly calculating the image area from the original height and width yields large differences.
The image equivalent area is related only to the width of the image and not to its height. This effectively avoids interference from inconsistent aspect ratios caused by uncertain factors such as different image cropping, different image capturing tools, and different image capturing devices, and thus avoids their interference with recognizing the target's shooting distance. Fig. 3 is a schematic diagram of different aspect ratios caused by such factors. Note that the short side of the original image is taken as the width.
And S109, determining a first shooting distance of the target according to the ratio.
Prior-art approaches to obtaining the shooting distance require either adding extra equipment or adding a reference object of known size to the image. In the embodiment of the present application, the ratio is obtained from the target area and the image equivalent area, and the shooting distance is determined according to that ratio; neither a reference object nor other (binocular or infrared) equipment is required, thereby overcoming the problems of the prior art.
To sum up, in the target shooting distance obtaining method provided by the embodiment of the present application, when the image completely contains the target, the ratio of the area of the target to the equivalent area of the image is obtained, and the first shooting distance of the target is determined according to the ratio, with no reference object or additional equipment needed.
On the basis of fig. 2, regarding the expression of the equivalent area and the ratio of the image, the embodiment of the present application also provides a possible implementation manner, please refer to the following.
The expression of the image equivalent area and the ratio is as follows:
R_face = S_face ÷ S′_image;
S′_image = (p·W)·W;
where p denotes a preset aspect ratio, S′_image denotes the image equivalent area, W denotes the image width, R_face denotes the ratio, and S_face denotes the area of the target.
Optionally, S_face = H_face × W_face, where H_face denotes the height of the target and W_face denotes the width of the target.
Note that, in the embodiments of the present application, the image equivalent area S′_image is not simply the product of the height and width of the image; it is obtained from a preset aspect ratio and the width of the image, and is therefore related only to the image width and not to the actual image height.
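As a minimal sketch of the two formulas above (the function names and the default p = 1.3310 from the optional embodiment are illustrative, not prescribed by the claims):

```python
def image_equivalent_area(width: float, p: float = 1.3310) -> float:
    """S'_image = (p * W) * W, which depends only on the image width W.

    p = 1.3310 is the preset aspect ratio the embodiment reports as
    bringing the equivalent area closest to the real image area.
    """
    return (p * width) * width


def face_area_ratio(face_height: float, face_width: float,
                    image_width: float, p: float = 1.3310) -> float:
    """R_face = S_face / S'_image, with S_face = H_face * W_face."""
    s_face = face_height * face_width
    return s_face / image_equivalent_area(image_width, p)
```

For example, a 300 × 250 pixel face in an image of width 1080 gives R_face ≈ 0.048 regardless of how tall the original image is, which is the point of the equivalent area.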
For the value of the aspect ratio, the embodiment of the present application further provides a possible implementation; please refer to fig. 4, which plots the similarity between the image equivalent area and the image real area for different values of the aspect ratio.
As shown in fig. 4, the coverage ratio of the image equivalent area (calculated from the aspect ratio) to the image real area varies with the value of the aspect ratio. When p lies in roughly [1.27, 1.39], the equivalent-area calculation shows no difference from the actual area for 90% of images; when p = 1.3310, the image equivalent area is closest to the image real area.
Through extensive analysis of the heights and widths of images from a real production environment, the inventors found that image heights take 84 distinct values, distributed over the interval H ∈ [612, 2454] (a range of 1842), while image widths take only 19 distinct values, distributed over W ∈ [459, 1080] (a range of 621). Adopting the height-to-width ratio p, rather than the width-to-height ratio 1/p, therefore avoids errors caused by the height's large fluctuation and wide value range.
On the basis of fig. 2, as to how to calculate the shooting distance of the target according to the ratio, the embodiment of the present application also provides a possible implementation manner, please refer to the following.
The formula for determining the shooting distance of the target according to the ratio is as follows:
Distance = a·(R_face)^b;
where Distance denotes the first shooting distance, R_face denotes the ratio, and a and b are constants.
Optionally, a = 16.027 and b = -0.557.
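The power-law regression can be sketched as follows. The constants are the optional values above; note that the sign of b is garbled in the source text, and the negative reading is an assumption justified by the physics (a larger face-to-image ratio should yield a smaller distance):

```python
def first_shooting_distance(r_face: float,
                            a: float = 16.027,
                            b: float = -0.557) -> float:
    """Distance = a * (R_face)^b.

    With b < 0, the distance shrinks as the face occupies more of the
    frame; a and b are fitted constants from the optional embodiment.
    """
    if r_face <= 0:
        raise ValueError("ratio must be positive")
    return a * (r_face ** b)
```

A quick sanity check: a face filling a large fraction of the frame yields a smaller distance than a face filling a small fraction, as expected.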
On the basis of fig. 2, when the target is a human face, a possible implementation manner is further provided in the embodiment of the present application with respect to determining whether the image completely includes the target, please refer to fig. 5, where the target shooting distance obtaining method further includes:
s101, when the image contains a human face, a human face image is obtained.
Optionally, before S101 is executed, the target shooting distance obtaining method provided in the embodiment of the present application needs to perform the following steps.
Firstly, an image to be identified is extracted from a database, and the image to be identified is preprocessed, wherein the preprocessing comprises image data standardization processing and image HSV color standardization adjustment processing.
Then, basic image information is extracted from the preprocessed image; the extracted content includes, but is not limited to, the image's length and width in pixels, for example 1080 × 720.
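A minimal sketch of this preprocessing using only the standard library (the patent does not fix a toolkit; the 0–1 scaling and the particular HSV brightness clamp here are illustrative assumptions):

```python
import colorsys

def preprocess_pixels(pixels):
    """Normalize 8-bit RGB pixels to [0, 1] and apply an illustrative
    HSV standardization (clamping the V channel into a fixed range)."""
    out = []
    for r, g, b in pixels:
        # image data standardization: scale 0-255 to 0-1
        r, g, b = r / 255.0, g / 255.0, b / 255.0
        # HSV color standardization: convert, clamp brightness, convert back
        h, s, v = colorsys.rgb_to_hsv(r, g, b)
        v = min(max(v, 0.05), 0.95)
        out.append(colorsys.hsv_to_rgb(h, s, v))
    return out

def basic_info(width_px, height_px):
    """Extract basic image information: length/width pixel data, e.g. 1080x720."""
    return {"width": width_px, "height": height_px, "area": width_px * height_px}
```

A real implementation would of course operate on whole image arrays rather than pixel lists; the per-pixel loop is kept for clarity.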
The image is then input to a face detection module to determine whether the image contains a face. If yes, S101 is carried out; if not, stopping.
In S101, the face image is segmented from the original image to facilitate subsequent processing and recognition. Possibly, the coordinate information of the face image relative to the original image is determined, and the face image is obtained according to its coordinate boundaries.
And S102, detecting key points of the human face on the human face image.
Specifically, the face keypoints to be detected are matched against the face image one by one. Face keypoint detection determines how many face keypoints the face image can match; optionally, it can also determine the type and location (coordinates) of those keypoints.
And S104, judging whether the number of the key points matched with the face image exceeds a preset threshold value. If yes, executing S107; if not, go to S105.
When the number of the key points matched with the face image exceeds a preset threshold value, the face image in the original image is close to a complete face, the image is determined to completely contain a target at the moment, and S107 is executed; otherwise, S105 is executed.
Optionally, in the embodiment of the present application, the face key point detection is a 68-point detection method, and the preset threshold is 70% of the total number of key points included in the detection method.
And S105, determining that the image does not completely contain the target.
And S107, determining that the image completely contains the target.
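The S104 decision above can be sketched as follows; the 68-point scheme and the 70% threshold come from the optional embodiment, while treating the detector output as a plain count of matched keypoints is an assumed interface:

```python
def image_fully_contains_face(matched_keypoints: int,
                              total_keypoints: int = 68,
                              threshold: float = 0.70) -> bool:
    """S104: treat the face as complete (S107) when the number of
    matched keypoints exceeds 70% of the 68 detectable keypoints;
    otherwise the image does not completely contain the target (S105)."""
    return matched_keypoints > threshold * total_keypoints
```

With the defaults, 48 matched keypoints (48 > 47.6) already counts as a complete face, while 40 does not.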
On the basis of fig. 5, regarding how to further ensure the accuracy of the distance recognition result, the embodiment of the present application further provides a possible implementation manner, please refer to fig. 6, where the target shooting distance obtaining method further includes:
and S103, inputting the attribute data of the image and the attribute data of the human face into a tree regression model to obtain a second shooting distance.
The second shooting distance is the face shooting distance output by the tree regression model. The attribute data of the image include the image's height, width, and area; the attribute data of the face include the face's width, height, and area, together with the number and locations (coordinates) of the face keypoints matched by the face image.
The tree regression model may be a regression model trained by machine learning or by a conventional linear model; optionally, it is an Xgboost-based tree regression model.
And S106, taking the second shooting distance as the final shooting distance.
When the image does not completely contain the face, the second shooting distance in S103 is taken as the final shooting distance.
S110, the average value of the second shooting distance and the first shooting distance is used as the final shooting distance.
Specifically, in order to further improve the accuracy of the shooting distance, the average value of the second shooting distance and the first shooting distance is taken as the final shooting distance.
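The branching between S106 and S110 can be sketched in a few lines (hypothetical function name and argument order):

```python
def final_shooting_distance(second_distance: float,
                            first_distance: float = None,
                            face_complete: bool = False) -> float:
    """S106/S110: when the image does not completely contain the face,
    use the tree-regression output alone; otherwise average it with the
    ratio-based first shooting distance to improve accuracy."""
    if face_complete and first_distance is not None:
        return (first_distance + second_distance) / 2.0  # S110
    return second_distance  # S106
```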
Referring to fig. 7, fig. 7 is a schematic diagram illustrating a regression verification result according to an embodiment of the present application. The solid line in fig. 7 is the regression result obtained by averaging the first shooting distance and the second shooting distance, and the two dotted lines are the error-range curves of the regression result. The dots in fig. 7 are the measured distances from the real face to the photographing terminal. R² is the coefficient of determination, a goodness-of-fit indicator for the regression results; the closer it is to 1, the higher the fitting accuracy.
The actual regression results and the test data of the real image capturing distance in the examples of the present application are shown in table 1 below.
[Table 1 is rendered as an image in the original publication and is not reproduced here.]
TABLE 1
where R² is the coefficient of determination; MSE is the mean squared error; RMSE is the root mean squared error; and MAE is the mean absolute error.
Optionally, the inventors measured the single-thread computation speed on a CPU for the target shooting distance obtaining method provided in the embodiment of the present application. As shown in Table 2, the average computation time on the CPU is 58.6 ms.
Table 2 (provided as an image in the original publication; not reproduced here)
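A single-thread CPU latency statistic such as the 58.6 ms figure can be gathered with a loop like the one below (the estimator shown is a placeholder, not the embodiment's implementation, and its constants are hypothetical):

```python
# Measure the mean per-call latency of a distance-estimation routine.
import statistics
import time

def estimate_distance(ratio, a=12.0, b=-0.5):
    # Placeholder for the Distance = a * ratio ** b regression step;
    # a and b are hypothetical calibration constants.
    return a * ratio ** b

def mean_latency_ms(fn, arg, runs=1000):
    samples = []
    for _ in range(runs):
        t0 = time.perf_counter()
        fn(arg)
        samples.append((time.perf_counter() - t0) * 1000.0)
    return statistics.mean(samples)

latency = mean_latency_ms(estimate_distance, 0.1)
```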
Optionally, referring to fig. 8, fig. 8 is a schematic diagram of samples provided in an embodiment of the present application. The last two images were taken at the same shooting distance, and their height-width ratio was changed by cropping; the results show that the face shooting distance can still be regressed accurately within the error range.
Referring to fig. 9, fig. 9 is a diagram of a target shooting distance obtaining apparatus according to an embodiment of the present disclosure, where optionally, the target shooting distance obtaining apparatus is applied to the electronic device described above.
The target shooting distance acquisition apparatus includes: a ratio acquisition unit 201 and a distance acquisition unit 202.
The proportion acquiring unit 201 is configured to acquire a ratio of an area of the target to an equivalent area of the image when the image completely contains the target.
Wherein the image equivalent area is an area matched to the width of the image. Optionally, the proportion acquiring unit 201 may perform S108 described above.
The distance acquisition unit 202 is configured to determine a first shooting distance of the target according to the ratio. Optionally, the distance acquisition unit 202 may perform S109 described above.
Optionally, the expression of the equivalent area of the image is:
S′_image = (p · W) · W;
wherein p represents a preset aspect ratio, S′_image represents the image equivalent area, and W represents the width of the image.
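The equivalent-area expression, together with the power-law distance formula from the description, can be sketched as follows (the constants a and b and the preset aspect ratio p are calibration values; the numbers here are hypothetical):

```python
# Sketch of the equivalent-area normalization and the first shooting
# distance: S'_image = (p * W) * W and Distance = a * (R_face) ** b.
# The values of p, a, and b below are hypothetical stand-ins.

def image_equivalent_area(width, p=16 / 9):
    # Area matched to the image width, independent of the actual height.
    return (p * width) * width

def first_shooting_distance(face_area, width, a=12.0, b=-0.5, p=16 / 9):
    r_face = face_area / image_equivalent_area(width, p)
    return a * r_face ** b

# Because the equivalent area depends only on the width, cropping that
# changes the image's height-width ratio leaves R_face unchanged.
```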
It should be noted that the target shooting distance acquisition apparatus provided in this embodiment may execute the method flows shown in the above method embodiments to achieve the corresponding technical effects. For parts not mentioned in this embodiment, reference may be made to the corresponding content of the above embodiments.
The embodiment of the invention further provides a storage medium storing computer instructions and programs; when the computer instructions and programs are read and run, the target shooting distance acquisition method of the above embodiments is executed. The storage medium may include a memory, a flash memory, a register, or a combination thereof.
The following provides an electronic device, which may be a computer device, a server device, or other intelligent terminal, and as shown in fig. 1, the electronic device may implement the above-mentioned target shooting distance obtaining method; specifically, the electronic device includes: processor 10, memory 11, bus 12. The processor 10 may be a CPU. The memory 11 is used to store one or more programs, which when executed by the processor 10, perform the target shooting distance acquisition method of the above-described embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The apparatus embodiments described above are merely illustrative, and for example, the flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, functional modules in the embodiments of the present application may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The above description is only a preferred embodiment of the present application and is not intended to limit the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.
It will be evident to those skilled in the art that the present application is not limited to the details of the foregoing illustrative embodiments, and that the present application may be embodied in other specific forms without departing from the spirit or essential attributes thereof. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the application being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference sign in a claim should not be construed as limiting the claim concerned.

Claims (10)

1. A method for acquiring a shooting distance of a target is characterized by comprising the following steps:
when the image completely contains the target, acquiring the ratio of the area of the target to the equivalent area of the image;
wherein the image equivalent area is an area matching a width of the image;
and determining a first shooting distance of the target according to the ratio.
2. The target shooting distance acquisition method according to claim 1, wherein the expression of the image equivalent area is:
S′_image = (p · W) · W;
wherein p represents a preset aspect ratio, S′_image represents the image equivalent area, and W represents the width of the image.
3. The target shooting distance acquisition method according to claim 1, wherein the calculation formula for determining the first shooting distance of the target according to the ratio is:
Distance = a · (R_face)^b;
wherein Distance represents the first shooting distance, R_face represents the ratio, and a and b are constants.
4. The target shooting distance acquisition method according to claim 1, wherein the target is a human face, the method further comprising:
when the image contains a human face, acquiring a human face image;
performing face key point detection on the face image, and judging whether the number of key points matched with the face image exceeds a preset threshold value or not;
if yes, determining that the image completely contains the target;
if not, determining that the image does not completely contain the target.
5. The target shooting distance acquisition method according to claim 4, characterized by further comprising:
inputting the attribute data of the image and the attribute data of the face into a tree regression model to obtain a second shooting distance;
the second shooting distance is a face shooting distance output by the tree regression model;
and when the image does not completely contain the target, taking the second shooting distance as a final shooting distance.
6. The target shooting distance acquisition method according to claim 5, characterized by further comprising:
and when the image completely contains the target, taking the average value of the second shooting distance and the first shooting distance as the final shooting distance.
7. A target shooting distance acquisition apparatus, characterized in that the apparatus comprises:
the proportion acquiring unit is used for acquiring the ratio of the area of the target to the equivalent area of the image when the image completely contains the target;
wherein the image equivalent area is an area matching a width of the image;
and the distance acquisition unit is used for determining a first shooting distance of the target according to the ratio value.
8. The target shooting distance acquisition apparatus according to claim 7, wherein the expression of the image equivalent area is:
S′_image = (p · W) · W;
wherein p represents a preset aspect ratio, S′_image represents the image equivalent area, and W represents the width of the image.
9. A storage medium on which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1-6.
10. An electronic device, comprising: a processor and memory for storing one or more programs; the one or more programs, when executed by the processor, implement the method of any of claims 1-6.
CN202011356698.2A 2020-11-26 2020-11-26 Target shooting distance obtaining method and device, storage medium and electronic equipment Active CN112465910B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011356698.2A CN112465910B (en) 2020-11-26 2020-11-26 Target shooting distance obtaining method and device, storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011356698.2A CN112465910B (en) 2020-11-26 2020-11-26 Target shooting distance obtaining method and device, storage medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN112465910A true CN112465910A (en) 2021-03-09
CN112465910B CN112465910B (en) 2021-12-28

Family

ID=74809015

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011356698.2A Active CN112465910B (en) 2020-11-26 2020-11-26 Target shooting distance obtaining method and device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN112465910B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113619482A (en) * 2021-09-10 2021-11-09 无锡天博电器制造有限公司 Control component priority setting system and method
CN113837962A (en) * 2021-09-24 2021-12-24 江苏泰扬金属制品有限公司 Computer type priority setting system and method

Citations (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102721404A (en) * 2012-06-07 2012-10-10 南京航空航天大学 Non-contact distance measurement device using digital camera and measurement method
CN103024165A (en) * 2012-12-04 2013-04-03 华为终端有限公司 Method and device for automatically setting shooting mode
CN104134067A (en) * 2014-07-07 2014-11-05 河海大学常州校区 Road vehicle monitoring system based on intelligent visual Internet of Things
CN105353875A (en) * 2015-11-05 2016-02-24 小米科技有限责任公司 Method and apparatus for adjusting visible area of screen
CN105654038A (en) * 2015-12-22 2016-06-08 上海汽车集团股份有限公司 Vehicle lamp identification method and device
CN106295533A (en) * 2016-08-01 2017-01-04 厦门美图之家科技有限公司 Optimization method, device and the camera terminal of a kind of image of autodyning
CN106845345A (en) * 2016-12-15 2017-06-13 重庆凯泽科技股份有限公司 Biopsy method and device
CN106959076A (en) * 2017-02-20 2017-07-18 广州视源电子科技股份有限公司 Portrait distance detection method and system based on camera
CN108259752A (en) * 2018-02-27 2018-07-06 北京智启科技有限公司 A kind of image pickup method and system
CN108921125A (en) * 2018-07-18 2018-11-30 广东小天才科技有限公司 A kind of sitting posture detecting method and wearable device
CN109191802A (en) * 2018-07-20 2019-01-11 北京旷视科技有限公司 Method, apparatus, system and storage medium for sight protectio prompt
CN109427062A (en) * 2017-08-30 2019-03-05 深圳星行科技有限公司 Roadway characteristic labeling method, device, computer equipment and readable storage medium storing program for executing
CN109579798A (en) * 2018-12-29 2019-04-05 中国汽车技术研究中心有限公司 A kind of video grammetry method and measuring equipment applied to automated parking system
CN109977727A (en) * 2017-12-27 2019-07-05 广东欧珀移动通信有限公司 Sight protectio method, apparatus, storage medium and mobile terminal
CN209181784U (en) * 2018-12-29 2019-07-30 中国汽车技术研究中心有限公司 A kind of video grammetry device applied to automated parking system
CN110288608A (en) * 2018-03-19 2019-09-27 北京京东尚科信息技术有限公司 Crop row center line extraction method and device
CN110310285A (en) * 2019-05-14 2019-10-08 武汉泓毅智云信息有限公司 A kind of burn surface area calculation method accurately rebuild based on 3 D human body
CN110909617A (en) * 2019-10-28 2020-03-24 广州多益网络股份有限公司 Living body face detection method and device based on binocular vision
CN110970057A (en) * 2018-09-29 2020-04-07 华为技术有限公司 Sound processing method, device and equipment
CN111182207A (en) * 2019-12-31 2020-05-19 Oppo广东移动通信有限公司 Image shooting method and device, storage medium and electronic equipment
CN111307331A (en) * 2020-04-02 2020-06-19 广东博智林机器人有限公司 Temperature calibration method, device, equipment and storage medium
CN111327814A (en) * 2018-12-17 2020-06-23 华为技术有限公司 Image processing method and electronic equipment
CN111337142A (en) * 2020-04-07 2020-06-26 北京迈格威科技有限公司 Body temperature correction method and device and electronic equipment
CN111464740A (en) * 2020-04-07 2020-07-28 Oppo广东移动通信有限公司 Image shooting method and device, storage medium and electronic equipment
CN111626241A (en) * 2020-05-29 2020-09-04 北京华捷艾米科技有限公司 Face detection method and device
CN111651539A (en) * 2020-05-22 2020-09-11 西北农林科技大学 Method for realizing quick updating of plane map elements by using close-range remote sensing technology
CN111669462A (en) * 2020-05-30 2020-09-15 华为技术有限公司 Method and related device for displaying image
CN111753813A (en) * 2020-08-10 2020-10-09 腾讯科技(深圳)有限公司 Image processing method, device, equipment and storage medium



Also Published As

Publication number Publication date
CN112465910B (en) 2021-12-28

Similar Documents

Publication Publication Date Title
US8121400B2 (en) Method of comparing similarity of 3D visual objects
CN112465910B (en) Target shooting distance obtaining method and device, storage medium and electronic equipment
CN109272546B (en) Fry length measuring method and system
CN113111844B (en) Operation posture evaluation method and device, local terminal and readable storage medium
JP2017032548A (en) Using 3d vision for automated industrial inspection
CN111811567B (en) Equipment detection method based on curve inflection point comparison and related device
CN103729631A (en) Vision-based connector surface feature automatically-identifying method
CN116824516B (en) Road construction safety monitoring and management system
KR20180098945A (en) Method and apparatus for measuring speed of vehicle by using fixed single camera
CN111325265B (en) Detection method and device for tampered image
TWI543117B (en) Method for recognizing and locating object
CN110675442A (en) Local stereo matching method and system combined with target identification technology
CN114332622A (en) Label detection method based on machine vision
CN111126436B (en) Visual matching method and device
CN111652034A (en) Ship retrieval method and device based on SIFT algorithm
CN112329770B (en) Instrument scale identification method and device
CN114092542A (en) Bolt measuring method and system based on two-dimensional vision
CN115205541A (en) Leak detection method, leak detection apparatus, electronic device, and storage medium
CN114743048A (en) Method and device for detecting abnormal straw picture
WO2015136716A1 (en) Image processing device, image sensor, and image processing method
CN111274899A (en) Face matching method and device, electronic equipment and storage medium
CN111275693A (en) Counting method and counting device for objects in image and readable storage medium
CN113139454B (en) Road width extraction method and device based on single image
CN110378403B (en) Wire spool classification and identification method and system
KR102035245B1 (en) Apparatus and method for estimating position of target marker

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant