CN106264537B - System and method for measuring human body posture height in image - Google Patents


Publication number
CN106264537B
Authority
CN
China
Prior art keywords
image
calibration
measurement
measured
calibration reference
Prior art date
Legal status
Active
Application number
CN201510270981.6A
Other languages
Chinese (zh)
Other versions
CN106264537A (en)
Inventor
李苑
刁一平
张彬
潘静
马程
全晓臣
Current Assignee
Hangzhou Hikvision System Technology Co Ltd
Original Assignee
Hangzhou Hikvision System Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Hangzhou Hikvision System Technology Co Ltd filed Critical Hangzhou Hikvision System Technology Co Ltd
Priority to CN201510270981.6A priority Critical patent/CN106264537B/en
Publication of CN106264537A publication Critical patent/CN106264537A/en
Application granted granted Critical
Publication of CN106264537B publication Critical patent/CN106264537B/en

Landscapes

  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a system and a method for measuring the human body posture height in an image. The measuring method comprises the following steps: acquiring a measurement image containing a target to be measured; determining a valid calibration reference object in the measurement image; obtaining a mapping relation between the measurement image and three-dimensional space according to the calibration reference object in the measurement image and its actual length; and obtaining the posture height of the target to be measured according to the mapping relation between the measurement image and the three-dimensional space. The invention adopts a mapping technique between three-dimensional space and a planar image, reconstructs three-dimensional spatial information for the scene where the target to be measured is located, and converts image line segments into the real distances they correspond to in space. This overcomes the excessive complexity of professional measurement; the posture height of the measured person is finally obtained by drawing a height line segment in the image according to the person's posture.

Description

System and method for measuring human body posture height in image
Technical Field
The invention relates to the technical field of measurement, in particular to a system and a method for measuring the height of a human body posture in an image.
Background
Measurement is essential both in daily life and in high-end technology: in daily life, the length of furniture, the height of a human body, or the height of a tree; in high-end technology, the distance between a mobile robot and an obstacle that must be sensed for obstacle avoidance.
Measurement techniques can be broadly divided into two categories: active and passive. Active measurement mainly includes ultrasonic ranging, lidar ranging, infrared ranging, and the like. Passive measurement is mainly image/video based. Since image/video based measurement only requires simply shooting the scene to obtain image and video information, it is widely applied in some fields; for example, when a public security organ investigates a case, the height of a suspect in a surveillance video image must be estimated, which can only be obtained by such a passive measurement method.
A prior method for measuring the human body posture height in an image discloses the following technical scheme: (1) acquiring a scene image including a human body posture through an image acquisition device; (2) removing the scene background and segmenting the human body image from the scene image; (3) recognizing the human body posture of the human body image and judging whether the posture is upright; if so, calculating the actual spatial distance between the highest point and the lowest point of the human body image to obtain the height of the human body. Although this technical scheme can obtain the human body posture height with a certain precision, it needs to remove the scene background in the image and segment the human body image from the scene image, so it is complex to operate and highly specialized, and cannot easily be popularized among general users. In addition, it requires a special camera when shooting the human body image: the camera must be capable of capturing depth images, rather than being a conventional general-purpose monitoring camera, so the scheme is not easy to apply widely in daily work.
Therefore, it is desirable to provide a solution that does not require complicated operations and can measure the body posture height in an image using a conventional camera.
Disclosure of Invention
One of the technical problems to be solved by the present invention is to provide a method for measuring the height of a human body posture in an image, which is simple to operate and does not need to use a special camera.
In order to solve the above technical problem, an embodiment of the present invention first provides a method for measuring the height of a human body posture in an image, including: acquiring a measurement image containing a target to be measured; determining a valid calibration reference object in the measurement image; obtaining a mapping relation between the measurement image and three-dimensional space according to the calibration reference object in the measurement image and its actual length; and obtaining the posture height of the target to be measured according to the mapping relation between the measurement image and the three-dimensional space.
Preferably, the effective calibration references are at least four calibration references, and the calibration references comprise a vertical calibration reference and a horizontal calibration reference.
Preferably, the actual length of the calibration reference is within the same order of magnitude as the length of the object to be measured.
Preferably, the calibration reference surrounds the object to be measured.
Preferably, if no effective calibration reference object exists in the measurement image, the method further comprises the following steps: acquiring a calibration image, wherein the calibration image is an image of a required calibration reference object existing in the same scene in the measurement image; and carrying out image fusion on the calibration image and the measurement image, and taking the fused image as the measurement image.
Preferably, the mapping relation between the measurement image and the three-dimensional space is obtained by: drawing a corresponding calibration line according to the calibration reference object in the measurement image; and determining coordinates corresponding to the calibration lines, and obtaining a mapping relation between the measurement image and the three-dimensional space according to the coordinates of the calibration lines and the actual lengths of the calibration reference objects corresponding to the calibration lines.
Preferably, the posture height of the target to be measured is obtained by: drawing a corresponding measurement line according to the target to be measured on the measurement image; determining the coordinates of the measurement line; and calculating the actual length corresponding to the measurement line according to the mapping relation and the coordinates of the measurement line.
In another aspect, the present invention further provides a system for measuring the height of a human body posture in an image, including: an acquisition module that acquires a measurement image containing a target to be measured; a judging module that determines a valid calibration reference object in the measurement image; a mapping relation calculation module that obtains the mapping relation between the measurement image and three-dimensional space according to the calibration reference object in the measurement image and its actual length; and a height calculation module that obtains the posture height of the target to be measured according to the mapping relation between the measurement image and the three-dimensional space.
Preferably, the effective calibration references are at least four calibration references, and the calibration references comprise a vertical calibration reference and a horizontal calibration reference.
Preferably, the actual length of the calibration reference is within the same order of magnitude as the length of the object to be measured.
Preferably, the calibration reference surrounds the object to be measured.
Preferably, the method further comprises the following steps: and the fusion module is used for acquiring a calibration image when the judgment module judges that no effective calibration reference object exists in the measurement image, carrying out image fusion on the calibration image and the measurement image, and taking the fused image as the measurement image, wherein the calibration image is an image in which a required calibration reference object exists in the same scene in the measurement image.
Preferably, the mapping relation calculation module is further configured to: drawing a corresponding calibration line according to the calibration reference object in the measurement image; and determining coordinates corresponding to the calibration lines, and obtaining a mapping relation between the measurement image and the three-dimensional space according to the coordinates of the calibration lines and the actual lengths of the calibration reference objects corresponding to the calibration lines.
Preferably, the height calculation module is further configured to: drawing a corresponding measuring line according to a target to be measured on the measuring image; and determining the coordinates of the measuring line, and calculating the actual length corresponding to the measuring line according to the mapping relation and the coordinates of the measuring line.
One or more embodiments of the above-described aspects may have the following advantages or benefits over the prior art.
The invention adopts a mapping technique between three-dimensional space and a planar image, reconstructs three-dimensional spatial information for the scene where the target to be measured is located, and converts image line segments into the real distances they correspond to in space. It remains applicable even when the measured target has already left the actual scene, overcomes the prior art's bottleneck regarding the acquisition source, and overcomes the excessive complexity of professional measurement; the posture height of the measured person is finally obtained by drawing a height line segment in the image according to the person's posture.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention may be realized and attained by the structure and/or process particularly pointed out in the written description and claims hereof as well as the appended drawings.
Drawings
The accompanying drawings are included to provide a further understanding of the invention or of the prior art, and are incorporated in and constitute a part of this specification. The drawings showing the embodiments of the present invention are provided to explain the technical solutions of the present invention together with the embodiments of the present invention, and do not limit the technical solutions of the present invention.
Fig. 1 is a schematic flow chart of a method for measuring a height of a human body posture in an image according to an embodiment of the present invention.
Fig. 2A to 2E are explanatory diagrams of a method for measuring the height of the human body posture in the image according to the embodiment of the present invention.
Fig. 3 is a schematic structural diagram of a system for measuring the height of a human body gesture in an image according to an embodiment of the present invention.
Fig. 4 is a schematic flow chart of the operation of the measurement system shown in fig. 3.
Detailed Description
The following detailed description of the embodiments of the present invention will be provided with reference to the accompanying drawings and examples, so that how to apply the technical means to solve the technical problems and achieve the corresponding technical effects can be fully understood and implemented. The embodiments of the present invention and the features of the embodiments can be combined with each other without conflict, and the formed technical solutions are within the scope of the present invention.
Additionally, the steps illustrated in the flow charts of the figures may be performed in a computer system such as a set of computer-executable instructions. Also, while a logical order is shown in the flow diagrams, in some cases, the steps shown or described may be performed in an order different than here.
The method for measuring the height of a human body posture in an image is mainly used for measuring the height of human body postures appearing in surveillance video, where the postures can include standing, sitting, walking, and the like; the method can measure the height of each of these different postures. The invention first determines the calibration reference objects in the measurement image, then calculates the mapping relation between the measurement image and three-dimensional space according to the calibration reference objects, and finally obtains the height of the human body posture according to this mapping relation and the target to be measured in the measurement image. The method needs neither a special camera to acquire the image nor the complex operations of removing the scene background and segmenting the human body image from the scene image. The measuring process is not restricted by the image acquisition device, the operation is simple, and the human body posture height in the image can be measured accurately.
In the present invention, the term "measurement image" refers to an image containing an object to be measured. The term "calibration image" refers to a video image captured by a camera in the same scene as the target to be measured, and the image contains a calibration reference object. The term "calibration reference" refers to a physical reference that provides a plotted calibration line in an image.
Example one
Fig. 1 is a schematic flow chart of a method for measuring the height of a human body posture in an image according to an embodiment of the present invention. The flow of the method is described below with reference to fig. 1 and fig. 2A to 2E. As shown in fig. 1, the method for measuring the height of the human body posture in the image according to the embodiment of the present invention mainly includes the following steps.
In step S110, a measurement image containing an object to be measured is acquired.
In this step, a measurement image is acquired from video images captured by a general monitoring camera; the required measurement images are then extracted from these video images. The invention is not limited to a specific type of camera, as long as the camera can capture images; of course, a special type of surveillance camera may also be used.
It should be noted that, in order to ensure calculation accuracy and reduce error, the target to be measured in the acquired measurement image should have a certain sharpness; therefore, the sharpness of the target to be measured in the measurement image needs to be determined.
In this embodiment, the sharpness of the image may be analyzed using an image sharpness evaluation function. Such functions mainly fall into the following categories: (1) gray-scale change functions: because a focused image contains more gray-level variation than a defocused one, the change in image gray values can serve as an evaluation function; (2) gradient functions, used in image processing to extract edges: the better the focus, the sharper the image edges and the larger the image gradient values; the gray gradient energy function, the Roberts gradient, and the Laplace operator all belong to this class; (3) image gray entropy: the information entropy of a well-focused image is larger than that of a defocused one, and the entropy reflects the richness of information in the image, so it can also serve as an evaluation function. Using these evaluation functions, the sharpness of the measurement image can be computed and compared against the required sharpness, finally yielding a measurement image with a clear target.
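As a minimal illustration (our own sketch, not the patent's implementation; all function names are assumptions), the three families of evaluation functions can be computed on a grayscale image as follows:

```python
import numpy as np

def gray_variance(img):
    # (1) Gray-scale change: variance of intensities; a focused
    # image shows more gray-level variation than a defocused one.
    return float(np.var(img.astype(np.float64)))

def laplacian_energy(img):
    # (2) Gradient function: mean squared response of a discrete
    # 5-point Laplacian; sharper edges yield larger values.
    f = img.astype(np.float64)
    lap = (-4.0 * f[1:-1, 1:-1]
           + f[:-2, 1:-1] + f[2:, 1:-1]
           + f[1:-1, :-2] + f[1:-1, 2:])
    return float(np.mean(lap ** 2))

def gray_entropy(img):
    # (3) Information entropy of the gray histogram; a well-focused
    # image carries richer information, hence higher entropy.
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))
```

Any of the three scores (or a combination) can then be thresholded to decide whether the measurement image meets the required sharpness.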
In step S120, a valid calibration reference object in the measurement image is determined, i.e., whether a valid calibration reference object exists in the measurement image is determined.
In this step, preferably, the valid calibration references are at least four calibration reference objects, including both vertical and horizontal calibration references, for example three vertical calibration references and one horizontal calibration reference. A horizontal calibration reference is generally a line on the ground, while a vertical calibration reference needs to lie in the same plane as the point where the target to be measured touches the ground (for example, the position of a person's feet), and the straight line formed by the whole object must be perpendicular to the ground; in practice, a wall corner edge, a railing, a telegraph pole, the edge of a square table, and the like can be selected.
In a preferred embodiment, if there are many (more than four) candidate calibration references in the measurement image, calibration references whose actual length is within the same order of magnitude as the length of the target to be measured should be chosen. For example, the height of a person's posture generally falls between 1 m and 2 m, so a calibration reference only a few centimeters long should not be selected, nor should one 4 m, 5 m, or more in length: if the actual length of the calibration reference is too far from the length of the target to be measured, a large error is introduced into the measurement precision. Calibration references between 1 m and 2 m long are therefore preferred.
In addition, if only vertical calibration references of 5 m or more are available among the candidates, this indicates that the target to be measured occupies a small proportion of the measurement picture; the accuracy of the resulting human posture height may then be low, so such a measurement image is not a preferred one.
Furthermore, when determining the calibration references, those surrounding the target to be measured are preferred, since this reduces the influence of image distortion on measurement accuracy.
In addition, only two or three calibration references may exist in the measurement image; in this case, it is determined that no valid calibration reference exists in the measurement image and step S130 is executed; otherwise, step S140 is executed.
In step S130, a calibration image is obtained, the calibration image and the measurement image are subjected to image fusion, and the fused image is used as a new measurement image, where the calibration image is an image in which a required calibration reference object exists in the same scene in the measurement image.
Specifically, if there are fewer than four calibration reference objects in the measurement image, additional calibration reference objects are placed in the same scene as the measurement image, the scene is captured by the camera, and the calibration image is then extracted from the video recording.
As shown in fig. 2A, there are insufficient calibration references; only the edge line of the table in the dotted circle meets the requirement. Therefore, we photograph the same scene with a calibration object of known size (in this case, a measuring tape) placed in front of the camera, obtaining the calibration image shown in fig. 2B.
Then the two images are fused, i.e., the measurement image and the calibration image are transparently superimposed. During the transparent superimposition, it is necessary to ensure that the backgrounds coincide completely in the fused image. As shown in fig. 2C, the fixed background objects, such as the tables, chairs, and items on the floor, coincide completely. Only when the backgrounds coincide completely can the calibration reference object in the calibration image be treated as a valid reference.
Through the above steps, enough calibration lines corresponding to the calibration reference object can be found in the fused image, as shown in fig. 2D, the solid line is the calibration line (including three vertical lines and one horizontal line).
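The transparent superposition described above can be sketched as a simple alpha blend of the two frames (a minimal illustration under our own assumptions; the patent does not specify the blending computation, and the function name is ours):

```python
import numpy as np

def fuse_images(measurement, calibration, alpha=0.5):
    # Alpha-blend the calibration frame over the measurement frame.
    # Both frames must come from the same fixed camera viewpoint so
    # that the static background coincides pixel-for-pixel, which is
    # the condition for the fused calibration reference to be valid.
    m = measurement.astype(np.float64)
    c = calibration.astype(np.float64)
    fused = (1.0 - alpha) * m + alpha * c
    return np.clip(fused, 0.0, 255.0).astype(np.uint8)
```

Background pixels that agree in both frames keep their value, while the target and the added calibration object each appear semi-transparently, so the measurement line and all calibration lines can be drawn on one image.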
Because the choice of calibration reference object has a great influence on the final measured value, when a suitable calibration reference object cannot be found in the measurement image, an ideal calibration reference can still be constructed afterwards by fusing a calibration image with the measurement image; the method is thus not limited by the conditions at the scene.
In step S140, a mapping relationship between the measurement image and the three-dimensional space is obtained according to the calibration reference object in the measurement image and the actual length of the calibration reference object.
Specifically, corresponding calibration lines are drawn according to the calibration reference objects in the measurement image. As shown in fig. 2D, the solid lines are the calibration lines (three vertical lines and one horizontal line).
And then, determining coordinates corresponding to the calibration lines, and obtaining a mapping relation between the measurement image and the three-dimensional space according to the coordinates of the calibration lines and the actual lengths of the calibration reference objects corresponding to the calibration lines.
As shown in fig. 2E, the top coordinate of the first vertical calibration line is (u1, v1), its bottom coordinate is (u2, v2), and its actual length is h1; the top coordinate of the second vertical calibration line is (u6, v6), its bottom coordinate is (u5, v5), and its actual length is h2; the top coordinate of the third vertical calibration line is (u8, v8), its bottom coordinate is (u7, v7), and its actual length is h3; the first endpoint of the horizontal calibration line is (u3, v3), the second endpoint is (u4, v4), and the distance between the two points is d.
The actual length of the calibration reference object can be measured on site. Note that for a vertical calibration reference, if the calibration line drawn in the image covers only part of the reference object, the actual length corresponding to the calibration line is the actual length of that part, not of the complete object. For example, if the calibration line follows an entire railing, its actual length is the height of the whole railing above the ground; if it follows half the railing, its actual length is the height of that half above the ground.
If the calibration line is a horizontal calibration line, the following expression is used:
[Expressions for horizontal calibration lines: equation images not reproduced in this text]
if the calibration line is a vertical calibration line, the following expression is used:
[Expression for vertical calibration lines: equation image not reproduced in this text]

where (u1, v1), (u2, v2) and h are input parameters: (u1, v1) is the top coordinate of a calibration line in the measurement image, (u2, v2) is its bottom coordinate, h is the actual height corresponding to a vertical calibration line, and d is the actual distance corresponding to a horizontal calibration line. H, f and φ are unknowns, where H is the actual height of the camera, f is the normalized focal length of the camera, and φ is the tilt angle of the camera.

It is easy to understand that obtaining the mapping relation between the measurement image and three-dimensional space in this step essentially amounts to solving for the three unknowns H, f and φ. By substituting the input parameters of at least four calibration lines (including vertical and horizontal calibration lines) into the above expressions, a sufficient system of equations is formed, and H, f and φ can be computed by iterative computation on a computer, thereby obtaining the mapping relation between the measurement image and three-dimensional space.
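The iterative solution of the unknowns can be sketched as follows. The patent's own projection expressions appear only as images in this text, so the model below is our own stand-in: a pinhole camera at height H with normalized focal length f and downward tilt phi, zero roll, principal point at v = 0, image v increasing downward, flat ground, and vertical calibration objects. The function names and the Gauss-Newton details are likewise our own assumptions.

```python
import numpy as np

def height_from_line(params, v_top, v_bottom):
    # Simplified pinhole model (an assumption, not the patent's exact
    # expressions): the depression angle of image row v below the
    # horizon is phi + arctan(v / f).  The foot point lies on the
    # ground plane, which fixes its depth Z; the top point is then
    # back-projected at that same depth.
    H, f, phi = params
    Z = H / np.tan(phi + np.arctan(v_bottom / f))
    return H - Z * np.tan(phi + np.arctan(v_top / f))

def calibrate(lines, x0=(3.0, 1000.0, 0.3), iters=100):
    # Gauss-Newton with a finite-difference Jacobian and backtracking
    # line search: adjust (H, f, phi) until the predicted height of
    # every vertical calibration line matches its measured length.
    # `lines` is a list of (v_top, v_bottom, actual_height) tuples.
    p = np.array(x0, dtype=float)

    def residuals(q):
        return np.array([height_from_line(q, vt, vb) - h
                         for vt, vb, h in lines])

    for _ in range(iters):
        r = residuals(p)
        J = np.empty((len(lines), 3))
        for j in range(3):
            dq = np.zeros(3)
            dq[j] = 1e-6 * max(abs(p[j]), 1.0)
            J[:, j] = (residuals(p + dq) - r) / dq[j]
        step, *_ = np.linalg.lstsq(J, -r, rcond=None)
        t, r_norm = 1.0, np.linalg.norm(r)
        while t > 1e-8:  # halve the step until the residual decreases
            r_new = residuals(p + t * step)
            if np.all(np.isfinite(r_new)) and np.linalg.norm(r_new) < r_norm:
                break
            t *= 0.5
        else:
            return p  # no further progress possible
        p = p + t * step
    return p
```

With three vertical calibration lines this simplified system has as many equations as unknowns; the patent additionally uses a horizontal calibration line, which the sketch omits.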
Since this step uses only the calibration lines in the image to construct the mapping relationship between the two-dimensional image and the three-dimensional space, the amount of calculation is greatly reduced compared to the prior art, and the calculation is simpler. In addition, the required calibration line is easier to acquire in a real scene, and therefore, the calibration line is not limited by the scene.
In step S150, the posture height of the target to be measured is obtained according to the mapping relation between the measurement image and the three-dimensional space.
Specifically, a measurement line is drawn according to an object to be measured on the measurement image, coordinates of the measurement line are determined, and an actual length (for example, a human body posture height) corresponding to the measurement line is calculated according to the mapping relation and the coordinates of the measurement line. More specifically, the top coordinates and the bottom coordinates of the measurement line are substituted into the following expression.
[Expression: equation image not reproduced in this text]

At this point H, f and φ are known quantities, (u1, v1) is the top coordinate of the measurement line corresponding to the target to be measured in the image, and (u2, v2) is the bottom coordinate of the measurement line. h is the measured value to be obtained, namely the human body posture height.
As shown in fig. 2E, the dashed lines are drawn measurement lines, where (u9, v9) and (u10, v10) are the top and bottom coordinates of the measurement line, respectively, and substituting them into the above equation can result in the size of h.
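As a concrete illustration of this final substitution, the sketch below computes h from the endpoints of a drawn measurement line. Since the patent's own expression appears only as an image in this text, the model here is our own assumption (pinhole camera at height H, normalized focal length f, downward tilt phi, zero roll, principal point at v = 0, image v increasing downward), as is the function name:

```python
import numpy as np

def posture_height(H, f, phi, v_top, v_bottom):
    # Depth of the foot point, assuming it lies on the ground plane:
    # the depression angle of image row v is phi + arctan(v / f).
    Z = H / np.tan(phi + np.arctan(v_bottom / f))
    # Back-project the head point at the same depth; subtracting its
    # height below the camera from H gives the posture height h.
    return H - Z * np.tan(phi + np.arctan(v_top / f))
```

For example, with a camera 2.5 m above the ground, f = 800 and phi = 0.2 rad, the v-coordinates of the top and bottom of the drawn measurement line map directly to the posture height h.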
In summary, the invention adopts a mapping technique between three-dimensional space and a planar image, reconstructs three-dimensional spatial information for the scene where the target to be measured is located, and converts image line segments into the real distances they correspond to in space. It remains applicable even when the measured target has already left the actual scene, overcomes the prior art's bottleneck regarding the acquisition source, and overcomes the excessive complexity of professional measurement; finally, a height line segment can be drawn in the image according to the posture of the measured person, yielding the person's posture height.
Example two
Fig. 3 is a schematic structural diagram of a system for measuring the height of a human body posture in an image according to an embodiment of the present invention. The constituent structures and functions of the measurement system are explained in detail below with reference to fig. 3.
As shown in fig. 3, the measuring system mainly includes an obtaining module 210, a determining module 220, a fusing module 230, a mapping relation calculating module 240, and a height calculating module 250.
Fig. 4 is a schematic flow chart of the operation of the measurement system shown in fig. 3, and the operation of the system will be described with reference to fig. 3 and 4.
The acquisition module 210 acquires a measurement image containing an object to be measured (as in step S410).
The acquisition module 210 intercepts a required measurement image from a video image acquired by a general monitoring camera. The invention is not limited to a specific type of camera as long as the camera can capture images, and of course, a special type of surveillance camera may be applied.
It should be noted that, in order to ensure calculation accuracy and reduce error, the target to be measured in the measurement image acquired by the acquisition module 210 should have a certain sharpness; therefore, the acquisition module 210 determines the sharpness of the target to be measured in the measurement image. In this embodiment, the acquisition module 210 may analyze the sharpness of the image using image sharpness evaluation functions; from these, the sharpness of the measurement image is obtained and compared against the required sharpness, finally yielding a measurement image with a clear target.
The determining module 220, connected to the obtaining module 210, determines a valid calibration reference object in the measurement image (as in step S420), that is, determines whether a valid calibration reference object exists in the measurement image.
Preferably, the determining module 220 determines whether the current calibration references include at least four calibration reference objects, for example three vertical calibration references and one horizontal calibration reference. A horizontal calibration reference is generally a line on the ground, while a vertical calibration reference needs to lie in the same plane as the point where the target to be measured touches the ground (for example, the position of a person's feet), and the straight line formed by the whole object must be perpendicular to the ground; in practice, a wall corner edge, a railing, a telegraph pole, the edge of a square table, and the like can be selected.
In a preferred embodiment, if there are multiple (more than four) calibration references in the measurement image, the determining module 220 determines the calibration references that have actual lengths within the same order of magnitude as the length of the target to be measured.
Furthermore, when the determination module 220 determines the calibration reference objects, it is preferable that the calibration reference objects surround the target to be measured, so that the influence of image distortion on the measurement accuracy can be reduced.
The mapping relation calculating module 240, connected to the determining module 220, obtains the mapping relation between the measurement image and the three-dimensional space according to the calibration reference object in the measurement image and the actual length of the calibration reference object (in step S430).
Specifically, the mapping relationship calculation module 240 first draws corresponding calibration lines according to the calibration reference objects in the measurement image, then determines the coordinates corresponding to the calibration lines, and obtains the mapping relationship between the measurement image and the three-dimensional space according to the coordinates of the calibration lines and the actual lengths of the calibration reference objects corresponding to the calibration lines.
If the calibration line is a horizontal calibration line, its two endpoints are first projected onto the ground plane by

Xi = H·ui/(vi·cos θ + f·sin θ), Yi = H·(f·cos θ − vi·sin θ)/(vi·cos θ + f·sin θ), i = 1, 2,

and the following expression is used:

D = √((X1 − X2)² + (Y1 − Y2)²);

if the calibration line is a vertical calibration line, the following expression is used:

h = H·f·(v2 − v1)/((v2·cos θ + f·sin θ)·(f·cos θ − v1·sin θ)),

wherein (u1, v1), (u2, v2), h and D are input parameters: (u1, v1) represents the top coordinates of a calibration line in the measurement image, (u2, v2) represents the bottom coordinates of a calibration line in the measurement image, h represents the actual height corresponding to a vertical calibration line, and D represents the actual distance corresponding to a horizontal calibration line. H represents the actual height of the camera, f represents the normalized focal length of the camera, and θ represents the tilt angle of the camera; image coordinates are taken relative to the principal point, with v increasing downward.
It is easy to understand that the mapping relation between the measurement image and the three-dimensional space obtained by the mapping relation calculation module 240 is substantially determined by the three unknowns H, f and θ. The mapping relation calculation module 240 substitutes the input parameters of at least four calibration lines (including the vertical calibration lines and the horizontal calibration line) into the above expressions to form a sufficient equation set, solves for the unknowns H, f and θ by iterative operation on a computer, and thus obtains the mapping relation between the measurement image and the three-dimensional space.
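Under a standard pinhole model (camera at height H, tilted down by angle θ, normalized focal length f, image coordinates measured from the principal point with v increasing downward), the calibration relations can be written down and the three unknowns recovered numerically. The sketch below is an illustration under these assumptions, not the patent's own code; it recovers H, f and θ from synthetic calibration lines by a simple iterative search over candidate values.

```python
import numpy as np

def ground_point(u, v, H, f, theta):
    """Project pixel (u, v) onto the ground plane (camera height H, tilt theta, focal f)."""
    s, c = np.sin(theta), np.cos(theta)
    d = v * c + f * s
    return H * u / d, H * (f * c - v * s) / d

def horizontal_length(top, bot, H, f, theta):
    """Actual ground distance D between the endpoints of a horizontal calibration line."""
    x1, y1 = ground_point(*top, H, f, theta)
    x2, y2 = ground_point(*bot, H, f, theta)
    return float(np.hypot(x1 - x2, y1 - y2))

def vertical_height(v_top, v_bot, H, f, theta):
    """Actual height h of a vertical calibration line from its top/bottom image rows."""
    s, c = np.sin(theta), np.cos(theta)
    return float(H * f * (v_bot - v_top) / ((v_bot * c + f * s) * (f * c - v_top * s)))

def solve_parameters(vert_lines, horiz_lines, H_grid, f_grid, t_grid):
    """Search candidate (H, f, theta) triples minimising the squared calibration residuals.

    vert_lines:  list of (v_top, v_bot, actual_height h)
    horiz_lines: list of ((u1, v1), (u2, v2), actual_distance D)
    """
    best_err, best = float("inf"), None
    for H in H_grid:
        for f in f_grid:
            for t in t_grid:
                err = sum((vertical_height(v1, v2, H, f, t) - h) ** 2
                          for v1, v2, h in vert_lines)
                err += sum((horizontal_length(p1, p2, H, f, t) - D) ** 2
                           for p1, p2, D in horiz_lines)
                if err < best_err:
                    best_err, best = err, (H, f, t)
    return best
```

With at least four calibration lines the residual system is overdetermined enough to pin down the three unknowns; in practice the coarse search above would be refined by a proper least-squares iteration around the best grid point.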
The height calculating module 250, connected to the mapping relation calculating module 240, obtains the attitude height of the target to be measured according to the mapping relation between the measurement image and the three-dimensional space (as in step S440).
The height calculating module 250 draws a measuring line according to the target to be measured on the measuring image, determines the coordinates of the measuring line, and calculates the actual length value (e.g. the height of the human body posture) corresponding to the measuring line according to the mapping relation and the coordinates of the measuring line. More specifically, the height calculation module 250 substitutes the top and bottom coordinates of the measurement line into the following expression.
h = H·f·(v2 − v1)/((v2·cos θ + f·sin θ)·(f·cos θ − v1·sin θ))
At this time H, f and θ are known quantities, (u1, v1) is the top coordinate of the measurement line corresponding to the target to be measured in the image, and (u2, v2) is the bottom coordinate of the measurement line. h is the measured value to be obtained, namely the human body posture height.
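Assuming a pinhole camera at height H with tilt angle θ and normalized focal length f, and image rows measured downward from the principal point, the measurement step reduces to evaluating the vertical-line expression with the calibrated parameters. The snippet below is a minimal sketch under those assumptions (the coordinate convention and the symbol θ are assumptions, not the patent's notation):

```python
import numpy as np

def posture_height(v_top, v_bot, H, f, theta):
    """Posture height h of the measured person from the measurement line's image rows.

    v_top / v_bot: image row of the line's top (head) and bottom (feet),
    measured downward from the principal point; H, f, theta are the
    calibrated camera height, normalized focal length and tilt angle.
    """
    s, c = np.sin(theta), np.cos(theta)
    return float(H * f * (v_bot - v_top) / ((v_bot * c + f * s) * (f * c - v_top * s)))
```

A standing person yields h close to true stature, while a crouching person yields the height of the crouched posture, which is what the patent calls the posture height.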
In addition, when the determining module 220 determines that no valid calibration reference object exists in the measurement image, the fusing module 230 starts to operate and sends the generated result to the mapping relation calculating module 240.
And a fusion module 230 connected to the determination module 220 and the mapping relation calculation module 240, wherein when the determination module 220 determines that no valid calibration reference object exists in the measurement image, the fusion module 230 obtains the calibration image, performs image fusion on the calibration image and the measurement image, and sends the fused image as a new measurement image to the mapping relation calculation module 240, where the calibration image is an image in which a required calibration reference object exists in the same scene as the measurement image.
In processing the measurement image and the calibration image, the fusion module 230 needs to fuse the two images, that is, to overlay them transparently. When performing the transparent overlay, it is necessary to ensure that the backgrounds of the two images coincide completely in the fused image; only when the backgrounds coincide completely can the calibration reference object in the calibration image serve as a valid reference.
Since the selection of the calibration reference object has a great influence on the final measured value, and a suitable calibration reference object cannot always be found in the measurement image, the fusion module 230, by fusing the calibration image with the measurement image, makes it possible to introduce an ideal calibration reference object at a later stage without being limited by the conditions on site.
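A transparent overlay of this kind can be sketched as a per-pixel weighted blend. The snippet below is an illustration (the fixed 50/50 weight is an assumption), valid only under the patent's condition that both images come from the same fixed camera so their backgrounds coincide pixel-for-pixel:

```python
import numpy as np

def fuse_images(measurement: np.ndarray, calibration: np.ndarray,
                alpha: float = 0.5) -> np.ndarray:
    """Transparently overlay the calibration image onto the measurement image.

    Both frames must come from the same fixed camera viewing the same scene,
    so that the backgrounds already coincide; otherwise the calibration
    reference in the blended result is not a valid reference.
    """
    if measurement.shape != calibration.shape:
        raise ValueError("images must have identical shape (same camera, same scene)")
    blended = (alpha * measurement.astype(np.float64)
               + (1.0 - alpha) * calibration.astype(np.float64))
    return np.clip(blended, 0, 255).astype(measurement.dtype)
```

The blended frame shows both the target to be measured and the calibration reference at half opacity, so the calibration lines and the measurement line can be drawn on a single image.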
In summary, the invention adopts a mapping technique between three-dimensional space and a planar image, and applies three-dimensional spatial information reconstruction to the scene where the target to be measured is located, so that image line segments are converted into the real distances they correspond to in space. This overcomes the objective condition that the measured target has disappeared from the actual scene, removes the bottleneck that the prior art imposes on the acquisition source, and avoids the excessive complexity of professional measurement; finally, a height line segment can be drawn in the image according to the posture of the measured person, so that the posture height of the measured human body is obtained.
It will be appreciated by those skilled in the art that the components of the systems and the steps of the methods in the embodiments of the invention described above may be centralized on a single computing device or distributed across a network of multiple computing devices. Alternatively, they may be implemented in program code executable by a computing device, so that they may be stored in a memory device for execution by a computing device; or they may be fabricated separately as individual integrated circuit modules, or multiple modules or steps among them may be fabricated as a single integrated circuit module. Thus, the present invention is not limited to any specific combination of hardware and software.
Although the embodiments of the present invention have been described above, the above description is only for the convenience of understanding the technical solution of the present invention, and is not intended to limit the present invention. It will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims.
Those skilled in the art will appreciate that all or part of the steps of the methods in the above embodiments may be implemented by a program instructing the relevant hardware, where the program may be stored in a computer-readable storage medium and, when executed, includes the steps of the first embodiment; the storage medium may be, for example, a ROM/RAM, a magnetic disk, or an optical disk.

Claims (12)

1. A method for measuring the height of a human body posture in an image comprises the following steps:
acquiring a measurement image containing a target to be measured;
determining a valid calibration reference in the measurement image;
obtaining the mapping relation between the measurement image and the three-dimensional space according to the calibration reference object in the measurement image and the actual length of the calibration reference object,
obtaining the attitude height of the target to be measured according to the mapping relation between the measurement image and the three-dimensional space,
the mapping relation between the measurement image and the three-dimensional space is obtained through the following steps:
drawing a corresponding calibration line according to the calibration reference object in the measurement image;
and determining coordinates corresponding to the calibration lines, obtaining the actual height of the camera, the normalized focal length of the camera and the inclination angle of the camera according to the coordinates of the calibration lines and the actual length of the calibration reference object corresponding to the calibration lines, and obtaining the mapping relation between the measurement image and the three-dimensional space according to the actual height of the camera, the normalized focal length of the camera and the inclination angle of the camera.
2. The measuring method according to claim 1,
the effective calibration references are at least four calibration references, and the calibration references comprise a vertical calibration reference and a horizontal calibration reference.
3. The measuring method according to claim 1,
the actual length of the calibration reference object and the length of the target to be measured are in the same order of magnitude.
4. The measuring method according to claim 1,
the calibration reference surrounds the object to be measured.
5. The measurement method according to any one of claims 1 to 4, further comprising the following steps if no valid calibration reference exists in the measurement image:
acquiring a calibration image, wherein the calibration image is an image of a required calibration reference object existing in the same scene in the measurement image;
and carrying out image fusion on the calibration image and the measurement image, and taking the fused image as the measurement image.
6. The measurement method according to claim 1, characterized in that the attitude height of the object to be measured is obtained by:
drawing a corresponding measuring line according to a target to be measured on the measuring image;
and determining the coordinates of the measuring line, and calculating the actual length corresponding to the measuring line according to the mapping relation and the coordinates of the measuring line.
7. A system for measuring the height of a human body gesture in an image, comprising:
an acquisition module that acquires a measurement image containing an object to be measured;
the judging module is used for determining an effective calibration reference object in the measurement image;
a mapping relation calculation module for obtaining the mapping relation between the measurement image and the three-dimensional space according to the calibration reference object in the measurement image and the actual length of the calibration reference object,
a height calculation module for obtaining the attitude height of the target to be measured according to the mapping relation between the measurement image and the three-dimensional space,
wherein the mapping relation calculation module is further configured to:
drawing a corresponding calibration line according to the calibration reference object in the measurement image;
and determining coordinates corresponding to the calibration lines, obtaining the actual height of the camera, the normalized focal length of the camera and the inclination angle of the camera according to the coordinates of the calibration lines and the actual length of the calibration reference object corresponding to the calibration lines, and obtaining the mapping relation between the measurement image and the three-dimensional space according to the actual height of the camera, the normalized focal length of the camera and the inclination angle of the camera.
8. The measurement system of claim 7,
the effective calibration references are at least four calibration references, and the calibration references comprise a vertical calibration reference and a horizontal calibration reference.
9. The measurement system of claim 7,
the actual length of the calibration reference object and the length of the target to be measured are in the same order of magnitude.
10. The measurement system of claim 7,
the calibration reference surrounds the object to be measured.
11. The measurement system of any one of claims 7 to 10, further comprising:
a fusion module for obtaining a calibration image when the judgment module judges that no effective calibration reference object exists in the measurement image, performing image fusion on the calibration image and the measurement image, taking the fused image as the measurement image,
and the calibration image is an image of a required calibration reference object existing in the same scene in the measurement image.
12. The measurement system of claim 7, wherein the height calculation module is further configured to:
drawing a corresponding measuring line according to a target to be measured on the measuring image;
and determining the coordinates of the measuring line, and calculating the actual length corresponding to the measuring line according to the mapping relation and the coordinates of the measuring line.
CN201510270981.6A 2015-05-25 2015-05-25 System and method for measuring human body posture height in image Active CN106264537B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510270981.6A CN106264537B (en) 2015-05-25 2015-05-25 System and method for measuring human body posture height in image


Publications (2)

Publication Number Publication Date
CN106264537A CN106264537A (en) 2017-01-04
CN106264537B true CN106264537B (en) 2020-02-18

Family

ID=57634473

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510270981.6A Active CN106264537B (en) 2015-05-25 2015-05-25 System and method for measuring human body posture height in image

Country Status (1)

Country Link
CN (1) CN106264537B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10582095B2 (en) 2016-10-14 2020-03-03 MP High Tech Solutions Pty Ltd Imaging apparatuses and enclosures
US11765323B2 (en) * 2017-05-26 2023-09-19 Calumino Pty Ltd. Apparatus and method of location determination in a thermal imaging system
CN110338800A (en) * 2019-07-17 2019-10-18 成都泰盟软件有限公司 The chaise longue of automatic measurement height and weight
CN111161339B (en) * 2019-11-18 2020-11-27 珠海随变科技有限公司 Distance measuring method, device, equipment and computer readable medium
CN111442845A (en) * 2020-03-26 2020-07-24 浙江大华技术股份有限公司 Infrared temperature measurement method and device based on distance compensation and computer storage medium
CN111557666B (en) * 2020-05-28 2022-10-25 苏州市职业大学 Height measuring method and height measuring device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101876535A (en) * 2009-12-02 2010-11-03 北京中星微电子有限公司 Method, device and monitoring system for height measurement
CN103905797A (en) * 2014-04-18 2014-07-02 山东神戎电子股份有限公司 Monitoring equipment with distance measurement function and distance measurement method
CN104274180A (en) * 2013-07-03 2015-01-14 中山大学深圳研究院 Single-image human body height measuring method based on constructed planes
CN104508704A (en) * 2012-05-25 2015-04-08 波可斯有限公司 Body measurement

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9338409B2 (en) * 2012-01-17 2016-05-10 Avigilon Fortress Corporation System and method for home health care monitoring
CN104392450A (en) * 2014-11-27 2015-03-04 苏州科达科技股份有限公司 Method for determining focal length and rotary angles of camera, camera calibration method and camera calibration system


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research on Camera Calibration and Single-View Measurement Technology; Wang Bin; China Master's Theses Full-text Database, Information Science and Technology; 2009-01-15 (No. 1); pp. 45-47 *


Similar Documents

Publication Publication Date Title
CN106264537B (en) System and method for measuring human body posture height in image
US8988317B1 (en) Depth determination for light field images
JP6295645B2 (en) Object detection method and object detection apparatus
CN106959691B (en) Mobile electronic equipment and instant positioning and map construction method
US20210342990A1 (en) Image coordinate system transformation method and apparatus, device, and storage medium
US20180012371A1 (en) Image Registration with Device Data
US10762649B2 (en) Methods and systems for providing selective disparity refinement
JP6619927B2 (en) Calibration device
CN104715479A (en) Scene reproduction detection method based on augmented virtuality
CN109147027B (en) Monocular image three-dimensional rebuilding method, system and device based on reference planes
JP6174104B2 (en) Method, apparatus and system for generating indoor 2D plan view
WO2021136386A1 (en) Data processing method, terminal, and server
KR102263152B1 (en) Method and apparatus for object detection in 3d point clouds
JP5027741B2 (en) Image monitoring device
US8599278B2 (en) Method for estimating a plane in a range image and range image camera
WO2022237026A1 (en) Plane information detection method and system
US20130135446A1 (en) Street view creating system and method thereof
JP2017205135A (en) Individual identification device, individual identification method, and individual identification program
US20230394834A1 (en) Method, system and computer readable media for object detection coverage estimation
KR101469099B1 (en) Auto-Camera Calibration Method Based on Human Object Tracking
CN111047678A (en) Three-dimensional face acquisition device and method
JP4914870B2 (en) Congestion degree measuring device, congestion degree measuring method, congestion degree measuring program, and recording medium recording the program
CN110443228B (en) Pedestrian matching method and device, electronic equipment and storage medium
CN109242900B (en) Focal plane positioning method, processing device, focal plane positioning system and storage medium
Aliakbarpour et al. Multi-sensor 3D volumetric reconstruction using CUDA

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant