CN110285752B - Method for matching, identifying and positioning specified object in image - Google Patents

Method for matching, identifying and positioning specified object in image

Info

Publication number
CN110285752B
CN110285752B
Authority
CN
China
Prior art keywords
edge
measured
reference object
size
pixel size
Prior art date
Legal status
Expired - Fee Related
Application number
CN201910449926.1A
Other languages
Chinese (zh)
Other versions
CN110285752A (en)
Inventor
董宁
罗英靓
Current Assignee
Beijing Zhisheng World Science And Technology Co ltd
Original Assignee
Beijing Zhisheng World Science And Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Beijing Zhisheng World Science And Technology Co ltd filed Critical Beijing Zhisheng World Science And Technology Co ltd
Priority to CN201910449926.1A
Publication of CN110285752A
Application granted granted Critical
Publication of CN110285752B

Links

Images

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/02Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness

Abstract

A method for matching, identifying and positioning a specified object in an image comprises the following steps: S100, the measurer starts the image-capture function of a handheld intelligent terminal and places the measured object and the reference object in the middle of the terminal screen; S200, the relative positions of the measured object and the reference object are adjusted; S300, the measurer photographs the adjusted measured object and reference object; S400, the pixel sizes of the complete edges are calculated; S500, the pixel sizes of the incomplete edges are calculated; S600, edge scale coefficients are calculated from the pixel sizes of the complete and incomplete edges and their actual sizes; S700, the edge scale coefficients are arithmetically averaged to obtain the final scale coefficient; and S800, the actual size of the measured object is calculated from the scale coefficient and the measured pixel size of the measured object. For the problems of reference object identification and matching positioning that arise in practical human-body and object measurement when the reference object and the measured object cannot be placed exactly in the same plane or the reference object cannot be completely displayed, the method effectively calculates the image pixel size of the reference object and, from it, the actual size of the measured object.

Description

Method for matching, identifying and positioning specified object in image
Technical Field
The invention belongs to the field of non-contact image measurement based on a reference object, and particularly relates to a matching identification and positioning method for a specified object in an image.
Background
In daily life and modern industrial production, the size of a human body or an object can be measured either with a traditional measuring tool or device that directly contacts the measured object, or in a non-contact manner with a light, portable intelligent terminal such as a smartphone or tablet computer. Traditional measuring tools and equipment are inconvenient to carry, some scenes are difficult to reach, and measurement cannot be performed at any time; non-contact measurement with a portable intelligent terminal (hereinafter "mobile phone terminal" or "terminal"), based on image capture, recognition processing and correction measurement, is more convenient and efficient in practice.
In non-contact measurement based on image capture, the distance from the photographer to the measurement target is unknown, so the actual size corresponding to a pixel size (a length expressed in pixel units in the image) cannot be known directly; a reference object of known size is therefore generally used to estimate the proportional relationship between actual size and pixel size.
At present, several published documents address measuring the dimensions of human bodies or objects in images captured in a non-contact manner with intelligent terminal devices such as smartphones and tablet computers. For example, a mobile-phone length measurement method based on a bank card (application No. 201510152845.7) and a length measurement method based on banknotes (application No. 201210391787) have been proposed, in which a bank card or banknote of known size serves as the reference, the reference object and the measured object are placed in the same plane for image capture, and the actual size is calculated from the image pixels.
When these methods are used for measurement with an intelligent terminal, the reference object and the measured object must be placed in the same plane, and the bank card must be placed separately so that its complete area is clearly visible. These requirements are usually difficult to satisfy in an actual human-body measurement scene (for example, when the measured person holds up the bank card), so an object identification and matching positioning method suited to practical scenes such as human-body measurement is needed.
Disclosure of Invention
The invention aims to solve the problems of reference object identification and matching positioning that arise in existing human-body and object measurement applications when the reference object and the measured object cannot be placed exactly in the same plane or the reference object cannot be completely displayed, and to calculate the actual size of the measured object from the measurement of the reference object. To solve these technical problems, the invention adopts the following technical scheme:
a matching identification and positioning method for a specified object in an image comprises the following steps:
step S100: the measurer holds the intelligent terminal and keeps a certain distance from the measured object; the image-capture function is started, and the handheld standard reference object and the main body of the measured object are placed in the middle of the display screen of the intelligent terminal;
step S200: the relative position of the measured object and the standard reference object is adjusted so that the measured object and the handheld standard reference object lie as nearly as possible in the same vertical plane; the standard reference object is allowed to be partially occluded, but at least two corner points and three edges should be visible;
step S300: the measurer photographs the measured object and the reference object whose positions were adjusted in step S200;
step S400: calculating the complete-edge pixel sizes: the pixel sizes of the measured standard reference object are calculated from the corner points and edges that are completely visible in the photograph;
step S500: calculating the incomplete-edge pixel sizes: a local search is performed based on the completely visible corner points and the partially visible edges of the standard reference object, the start and end positions of each incomplete edge are determined, and its pixel size is calculated;
step S600: from the complete-edge results of step S400 and the incomplete-edge results of step S500, the actual edge size is divided by the edge pixel size to obtain a scale coefficient for size conversion: edge scale coefficient k_i = actual edge size of the reference object / edge pixel size of the reference object;
step S700: the scale coefficients calculated from the complete and incomplete edges are arithmetically averaged to obtain the final scale coefficient k = (k_1 + k_2 + … + k_n)/n, where n is the number of edges used;
step S800: the actual size of the measured object is calculated from the scale coefficient k and the measured pixel size of the measured object: actual size of the measured object = k × pixel size of the measured object.
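By way of illustration, the size-conversion arithmetic of steps S600–S800 can be sketched in a few lines of Python. This is a minimal sketch, not part of the claimed method; the function name, parameter names and numeric values are hypothetical, and the edge pixel sizes are assumed to come from the edge detection of steps S400–S500.

```python
def estimate_actual_size(edge_actual_sizes_mm, edge_pixel_sizes, target_pixel_size):
    # Step S600: per-edge scale coefficient k_i = actual edge size / edge pixel size
    ks = [a / p for a, p in zip(edge_actual_sizes_mm, edge_pixel_sizes)]
    # Step S700: arithmetic mean of the per-edge coefficients
    k = sum(ks) / len(ks)
    # Step S800: actual size of the measured object = k * its pixel size
    return k * target_pixel_size

# Example with an ISO/IEC 7810 ID-1 card (85.60 mm x 53.98 mm) as the reference:
# two complete edges and one reconstructed incomplete edge, measured in pixels.
print(estimate_actual_size([85.60, 53.98, 85.60], [428.0, 270.0, 430.0], 8500.0))
# -> roughly 1697 mm for a measured dimension spanning 8500 pixels
```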
The method is applicable to the field of reference-object-based non-contact image measurement. For the problems of reference object identification and matching positioning that arise in practical human-body and object measurement when the reference object and the measured object cannot be placed exactly in the same plane or the reference object cannot be completely displayed, it effectively calculates the image pixel size of the reference object and, from it, the actual size of the measured object.
(1) in non-contact image-capture measurement, when the reference object is partially occluded and the reference object and the measured object are not in the same vertical plane, the relative position of the measured object and the reference object is determined using the vertical and horizontal reference lines;
(2) the edge length of a partially occluded reference object is determined with a local feature search algorithm.
Drawings
FIG. 1 general procedure
FIG. 2 center region and measurement reference line of image shot by intelligent terminal
FIG. 3 is a schematic diagram of a standard reference corner point and a complete/incomplete edge
Reference numerals: 1 - display screen of the intelligent terminal; 2 - vertical reference line; 3 - horizontal reference line; 4 - screen center area; 5 - complete edge A of the reference object; 6 - corner point A of the reference object; 7 - complete edge B of the reference object; 8 - corner point B of the reference object; 9 - incomplete edge of the reference object.
Detailed Description
In order to make the technical solutions and advantages of the present invention clearer, the invention is further described in detail below with reference to examples. It should be understood that the detailed description and specific examples are given by way of illustration only and are not intended to limit the invention.
The invention relates to a matching identification and positioning method of a specified object in an image, which comprises the following steps:
step S100: the measurer holds the intelligent terminal and keeps a certain distance from the measured object; the image-capture function is started, and the handheld reference object and the main body of the measured object are placed in the middle of the display screen of the intelligent terminal, as shown in FIG. 2;
in step S100, the typical distance between the measurer and the measured object is 3 m to 5 m, which meets the requirements of most practical application scenes;
in step S100, the handheld reference object is an object of known actual size Ls, such as a bank card, an identity card or a bus card;
step S200: the relative position of the measured object and the standard reference object is adjusted so that the measured object and the handheld standard reference object lie as nearly as possible in the same plane; the standard reference object is allowed to be partially occluded, but at least two corner points and three edges should be visible.
In step S200, the edges of the reference object are allowed to be partially occluded, but at least one edge of the reference object must be completely visible;
in step S200, the edges of the reference object should be as parallel as possible to the vertical or horizontal reference line on the screen of the intelligent terminal (as shown in FIG. 2).
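As a minimal illustration of this parallelism requirement (not part of the claimed steps; the angle tolerance, coordinates and names below are assumptions), an edge can be checked against the vertical and horizontal reference lines by its angle in image coordinates:

```python
import math

def nearly_axis_aligned(p1, p2, tol_deg=5.0):
    """True if the edge p1-p2 is within tol_deg of the vertical or horizontal reference line."""
    angle = math.degrees(math.atan2(p2[1] - p1[1], p2[0] - p1[0])) % 180.0
    return min(angle, abs(angle - 90.0), 180.0 - angle) <= tol_deg

print(nearly_axis_aligned((812, 440), (815, 655)))  # nearly vertical edge -> True
```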
Step S300: the measurer takes an image of the measured object and the reference object whose positions were adjusted in step S200.
Step S400: calculating the complete-edge pixel sizes: the pixel sizes of the measured standard reference object are calculated from the corner points and edges that are completely visible in the photograph;
The pixel size in step S400 is the unit in which a length is measured in the image captured by the intelligent terminal, i.e., the number of pixels occupied by the dimension to be measured; the same definition applies to the pixel sizes in the following steps.
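For illustration only (the corner coordinates and function name below are hypothetical), the pixel size of an edge can be taken as the Euclidean distance, in pixels, between its two corner points:

```python
import math

def edge_pixel_size(corner_a, corner_b):
    """Pixel length of the edge between two corner points given as (x, y) in pixels."""
    return math.hypot(corner_b[0] - corner_a[0], corner_b[1] - corner_a[1])

# e.g. a complete edge whose corners were detected at (812, 440) and (812, 655):
print(edge_pixel_size((812, 440), (812, 655)))  # -> 215.0 pixels
```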
The complete edge in step S400 is an edge whose start and end positions are completely visible, as shown in FIG. 3.
In the step S400, the selected complete borderline should be parallel to the vertical reference line or the horizontal reference line of the screen of the intelligent terminal as much as possible.
Step S500: calculating the incomplete-edge pixel sizes: a local search is performed based on the completely visible corner points and the partially visible edges of the standard reference object, the start and end positions of each incomplete edge are determined, and its pixel size is calculated.
In step S500, the incomplete edge of the reference object is an edge that is partially blocked.
In step S500, the start and end positions of an incomplete edge are determined as the corner points located by the local search algorithm and the line connecting them.
In the step S500, the incomplete edge for calculation should be substantially parallel to the vertical reference line or the horizontal reference line of the intelligent terminal screen.
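The patent does not prescribe a particular local search algorithm. As one plausible sketch under stated assumptions (the window size, thresholds, file name, corner coordinates and function names are all illustrative, not the invention's method), the visible corner points can be refined with OpenCV's corner detector restricted to a small window, and the incomplete edge taken as the line connecting them:

```python
import cv2
import numpy as np

def refine_corner(gray, approx_xy, window=25):
    """Search a small neighbourhood of approx_xy for the strongest corner point."""
    x, y = int(approx_xy[0]), int(approx_xy[1])
    mask = np.zeros(gray.shape, dtype=np.uint8)
    mask[max(y - window, 0):y + window, max(x - window, 0):x + window] = 255
    corners = cv2.goodFeaturesToTrack(gray, maxCorners=1, qualityLevel=0.01,
                                      minDistance=5, mask=mask)
    if corners is None:
        return approx_xy  # fall back to the rough estimate
    cx, cy = corners[0].ravel()
    return (float(cx), float(cy))

# Usage sketch: refine the two visible corners bounding the occluded edge, then
# take the length of their connecting line as the incomplete-edge pixel size.
gray = cv2.cvtColor(cv2.imread("capture.jpg"), cv2.COLOR_BGR2GRAY)
a = refine_corner(gray, (812.0, 440.0))
b = refine_corner(gray, (1027.0, 442.0))
incomplete_edge_pixels = float(np.hypot(b[0] - a[0], b[1] - a[1]))
```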
Step S600: from the complete-edge results of step S400 and the incomplete-edge results of step S500, the actual edge size is divided by the edge pixel size to obtain a scale coefficient for size conversion: edge scale coefficient k_i = actual edge size of the reference object / edge pixel size of the reference object.
Step S700: the scale coefficients calculated from the complete and incomplete edges are arithmetically averaged to obtain the final scale coefficient k = (k_1 + k_2 + … + k_n)/n, where n is the number of edges used.
Step S800: the actual size of the measured object is calculated from the scale coefficient k and the measured pixel size of the measured object: actual size of the measured object = k × pixel size of the measured object.
The matching identification and positioning method for a specified object in an image thus achieves: (1) in non-contact image-capture measurement, when the reference object is partially occluded and the reference object and the measured object are not in the same plane, determining the relative position of the measured object and the reference object using the vertical and horizontal reference lines; (2) determining the edge length of a partially occluded reference object with a local feature search algorithm.

Claims (10)

1. A method for matching, identifying and positioning a specified object in an image, comprising the following steps:
step S100: the measurer holds the intelligent terminal and keeps a certain distance from the measured object; the image-capture function is started, and the handheld standard reference object and the main body of the measured object are placed in the middle of the display screen of the intelligent terminal;
step S200: the relative position of the measured object and the standard reference object is adjusted so that the measured object and the handheld standard reference object lie as nearly as possible in the same plane; the standard reference object is allowed to be partially occluded, but at least two corner points and three edges should be visible;
step S300: the measurer takes an image of the measured object and the reference object whose positions were adjusted in step S200;
step S400: calculating the complete-edge pixel sizes: the pixel sizes of the measured standard reference object are calculated from the corner points and edges that are completely visible in the photograph;
step S500: calculating the incomplete-edge pixel sizes: a local search is performed based on the completely visible corner points and the partially visible edges of the standard reference object, the start and end positions of each incomplete edge are determined, and its pixel size is calculated;
step S600: from the complete-edge results of step S400 and the incomplete-edge results of step S500, the actual edge size is divided by the edge pixel size to obtain a scale coefficient for size conversion: edge scale coefficient k_i = actual edge size of the reference object / edge pixel size of the reference object;
step S700: the scale coefficients calculated from the complete and incomplete edges are arithmetically averaged to obtain the final scale coefficient k = (k_1 + k_2 + … + k_n)/n, where n is the number of edges used;
step S800: the actual size of the measured object is calculated from the scale coefficient k and the measured pixel size of the measured object: actual size of the measured object = k × pixel size of the measured object.
2. The method according to claim 1, characterized in that: in step S100, the typical distance between the measurer and the measured object is 4 m to 5 m, which meets the requirements of most practical application scenes.
3. The method according to claim 1, characterized in that: in step S100, the handheld reference object is an object of known actual size, such as a bank card, an identity card or a bus card.
4. The method according to claim 1, characterized in that: in step S200, the edges of the reference object are allowed to be partially occluded, but at least one edge of the reference object must be completely visible.
5. The method according to claim 1, characterized in that: in step S200, the edges of the reference object should be as parallel as possible to the vertical or horizontal reference line on the screen of the intelligent terminal.
6. The method according to claim 1, characterized in that: the pixel size in step S400 is the unit in which a length is measured in the image captured by the intelligent terminal, i.e. the number of pixels occupied by the dimension to be measured.
7. The method according to claim 1, characterized in that: the complete edge in step S400 is an edge whose start and end positions are completely visible.
8. The method according to claim 1, characterized in that: in step S400, the selected complete edge should be as parallel as possible to the vertical or horizontal reference line of the intelligent terminal screen.
9. The method according to claim 1, characterized in that: in step S500, an incomplete edge of the reference object is an edge that is partially occluded, and its start and end positions are determined as the corner points located by the local search algorithm and the line connecting them.
10. The method according to claim 9, characterized in that: in step S500, the incomplete edge used for calculation should be substantially parallel to the vertical or horizontal reference line of the intelligent terminal screen.
CN201910449926.1A 2019-05-28 2019-05-28 Method for matching, identifying and positioning specified object in image Expired - Fee Related CN110285752B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910449926.1A CN110285752B (en) 2019-05-28 2019-05-28 Method for matching, identifying and positioning specified object in image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910449926.1A CN110285752B (en) 2019-05-28 2019-05-28 Method for matching, identifying and positioning specified object in image

Publications (2)

Publication Number Publication Date
CN110285752A CN110285752A (en) 2019-09-27
CN110285752B true CN110285752B (en) 2021-05-18

Family

ID=68002570

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910449926.1A Expired - Fee Related CN110285752B (en) 2019-05-28 2019-05-28 Method for matching, identifying and positioning specified object in image

Country Status (1)

Country Link
CN (1) CN110285752B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115496399B (en) * 2022-10-12 2023-07-25 杭州余杭建筑设计院有限公司 Unmanned aerial vehicle-based foundation pit survey task instant updating and distributing method and system

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5303254B2 (en) * 2008-12-15 2013-10-02 東京エレクトロン株式会社 Foreign matter removal method and storage medium
CN102102978A (en) * 2009-12-16 2011-06-22 Tcl集团股份有限公司 Handheld terminal, and method and device for measuring object by using same
CN103630074B (en) * 2013-11-29 2016-09-21 北京京东尚科信息技术有限公司 A kind of method and apparatus of Measuring Object minimum package volume
CN104359403B (en) * 2014-11-21 2017-03-29 天津工业大学 Planar part dimension measurement method based on sub-pixel edge algorithm
CN105928598A (en) * 2016-04-20 2016-09-07 上海斐讯数据通信技术有限公司 Method and system for measuring object mass based on photographing

Also Published As

Publication number Publication date
CN110285752A (en) 2019-09-27

Similar Documents

Publication Publication Date Title
CN106909911B (en) Image processing method, image processing apparatus, and electronic apparatus
CN106920279B (en) Three-dimensional map construction method and device
WO2019200837A1 (en) Method and system for measuring volume of parcel, and storage medium and mobile terminal
JP4813517B2 (en) Image processing apparatus, image processing program, image processing method, and electronic apparatus
CN111683204A (en) Unmanned aerial vehicle shooting method and device, computer equipment and storage medium
CN109654676B (en) Adjusting method, device and system of air supply device, computer equipment and storage medium
CN110570460B (en) Target tracking method, device, computer equipment and computer readable storage medium
CN107103056B (en) Local identification-based binocular vision indoor positioning database establishing method and positioning method
CN111307039A (en) Object length identification method and device, terminal equipment and storage medium
KR100810326B1 (en) Method for generation of multi-resolution 3d model
WO2014084249A1 (en) Facial recognition device, recognition method and program therefor, and information device
WO2021136386A1 (en) Data processing method, terminal, and server
CN110263662B (en) Human body contour key point and key part identification method based on grading
CN112017231A (en) Human body weight identification method and device based on monocular camera and storage medium
JP2019531788A (en) Image capturing method and system for wound evaluation by self-tone correction
CN110285752B (en) Method for matching, identifying and positioning specified object in image
CN114298902A (en) Image alignment method and device, electronic equipment and storage medium
CN114005108A (en) Pointer instrument degree identification method based on coordinate transformation
CN113012407A (en) Eye screen distance prompt myopia prevention system based on machine vision
JPWO2008041518A1 (en) Image processing apparatus, image processing apparatus control method, and image processing apparatus control program
CN113838151B (en) Camera calibration method, device, equipment and medium
CN109410272B (en) Transformer nut recognition and positioning device and method
CN111145254B (en) Door valve blank positioning method based on binocular vision
JP2008224323A (en) Stereoscopic photograph measuring instrument, stereoscopic photograph measuring method, and stereoscopic photograph measuring program
CN108537745B (en) Face image problem skin enhancement method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20210518