CN104539926A - Distance determination method and equipment - Google Patents

Distance determination method and equipment

Info

Publication number
CN104539926A
CN104539926A CN201410804150.8A
Authority
CN
China
Prior art keywords
first position
information
reference object
intersection point
distance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201410804150.8A
Other languages
Chinese (zh)
Other versions
CN104539926B (en)
Inventor
王正翔
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Zhigu Ruituo Technology Services Co Ltd
Original Assignee
Beijing Zhigu Ruituo Technology Services Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Zhigu Ruituo Technology Services Co Ltd filed Critical Beijing Zhigu Ruituo Technology Services Co Ltd
Priority to CN201410804150.8A priority Critical patent/CN104539926B/en
Publication of CN104539926A publication Critical patent/CN104539926A/en
Application granted granted Critical
Publication of CN104539926B publication Critical patent/CN104539926B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Analysis (AREA)

Abstract

The invention provides a distance determination method and equipment, relating to the field of image processing. The method includes: obtaining first position information of a first position where an imaging device is located; determining a reference object and a target object in an image formed by the imaging device at the first position; determining an intersection point position on the reference object or an extended image of the reference object according to the first position information and the target object; determining height information of the intersection point at the intersection point position; determining a reference horizontal distance from the reference object to the first position according to the first position information and reference position information of the reference object; and determining a target horizontal distance from the target object to the first position according to the reference horizontal distance, the intersection height information and the first position information. The method and equipment facilitate further subdivision of different objects in distant-view images.

Description

Distance determination method and device
Technical field
The present application relates to the field of image processing, and in particular to a distance determination method and device.
Background technology
With the development of communication technology, more and more imaging devices such as digital cameras, SLR cameras and smartphones have entered people's lives and greatly enriched them.
Current imaging devices mainly measure the depth of objects in an image by methods such as infrared ranging or binocular stereo vision. The depth measurements for nearby objects are fairly satisfactory, but the measurement error for distant objects is large; distant regions of an image can often only be uniformly classified as background, and are difficult to subdivide further according to their actual depths.
Summary of the invention
The object of the present application is to provide a distance determination method and device.
According to a first aspect of at least one embodiment of the present application, a distance determination method is provided, the method comprising:
obtaining first position information of a first position where an imaging device is located;
determining a reference object and a target object in an image formed by the imaging device at the first position;
determining an intersection point position on the reference object or on an extended image of the reference object according to the first position information and the target object;
determining intersection height information of the intersection point position;
determining a reference horizontal distance from the reference object to the first position according to the first position information and reference position information of the reference object;
determining a target horizontal distance from the target object to the first position according to the reference horizontal distance, the intersection height information and the first position information.
With reference to any possible implementation of the first aspect, in a second possible implementation, the first position information comprises: height information and horizontal coordinate information of the first position.
With reference to any possible implementation of the first aspect, in a third possible implementation, the reference object is determined in the image by image recognition.
With reference to any possible implementation of the first aspect, in a fourth possible implementation, the target horizontal distance is greater than the reference horizontal distance.
With reference to any possible implementation of the first aspect, in a fifth possible implementation, the method further comprises: determining the bottom of the target object.
With reference to any possible implementation of the first aspect, in a sixth possible implementation, said determining an intersection point position on the reference object or on an extended image of the reference object according to the first position information and the target object comprises:
determining an intersection point position on the reference object or on an extended image of the reference object according to the first position information and the bottom of the target object, wherein the first position, the bottom of the target object and the intersection point position are on the same line.
With reference to any possible implementation of the first aspect, in a seventh possible implementation, said determining the bottom of the target object comprises:
in response to the bottom of the target object being completely occluded by a third object, determining the bottom of the third object as the bottom of the target object.
With reference to any possible implementation of the first aspect, in an eighth possible implementation, the method further comprises:
obtaining reference height information of the reference object;
determining the bottom and the top of the reference object.
With reference to any possible implementation of the first aspect, in a ninth possible implementation, said determining the intersection height information of the intersection point position comprises:
determining the intersection height information of the intersection point position according to the reference height information and the bottom and the top of the reference object.
With reference to any possible implementation of the first aspect, in a tenth possible implementation, said determining the bottom of the reference object comprises:
in response to the bottom of the reference object being completely occluded by a fourth object, determining the bottom of the fourth object as the bottom of the reference object.
According to a second aspect of at least one embodiment of the present application, a distance determination device is provided, the device comprising:
a first position obtaining module, configured to obtain first position information of a first position where an imaging device is located;
an object determination module, configured to determine a reference object and a target object in an image formed by the imaging device at the first position;
an intersection point position determination module, configured to determine an intersection point position on the reference object or on an extended image of the reference object according to the first position information and the target object;
an intersection height determination module, configured to determine intersection height information of the intersection point position;
a reference horizontal distance determination module, configured to determine a reference horizontal distance from the reference object to the first position according to the first position information and reference position information of the reference object;
a target horizontal distance determination module, configured to determine a target horizontal distance from the target object to the first position according to the reference horizontal distance, the intersection height information and the first position information.
With reference to any possible implementation of the second aspect, in a second possible implementation, the object determination module is configured to determine the reference object in the image by image recognition.
With reference to any possible implementation of the second aspect, in a third possible implementation, the device further comprises:
a target object part determination module, configured to determine the bottom of the target object.
With reference to any possible implementation of the second aspect, in a fourth possible implementation, the intersection point position determination module is configured to determine an intersection point position on the reference object or on an extended image of the reference object according to the first position information and the bottom of the target object, wherein the first position, the bottom of the target object and the intersection point position are on the same line.
With reference to any possible implementation of the second aspect, in a fifth possible implementation, the target object part determination module is configured to, in response to the bottom of the target object being completely occluded by a third object, determine the bottom of the third object as the bottom of the target object.
With reference to any possible implementation of the second aspect, in a sixth possible implementation, the device further comprises:
a reference height obtaining module, configured to obtain reference height information of the reference object;
a reference object part determination module, configured to determine the bottom and the top of the reference object.
With reference to any possible implementation of the second aspect, in a seventh possible implementation, the intersection height determination module is configured to determine the intersection height information of the intersection point position according to the reference height information and the bottom and the top of the reference object.
With reference to any possible implementation of the second aspect, in an eighth possible implementation, the reference object part determination module is configured to, in response to the bottom of the reference object being completely occluded by a fourth object, determine the bottom of the fourth object as the bottom of the reference object.
According to the distance determination method and device of the embodiments of the present application, the horizontal distance between the first position and a target object in the formed image is obtained based on the proportional relationship of similar triangles, using the first position information of the first position where the imaging device is located and related information of a reference object in the formed image. A method for determining the depth of a target object in an image is thus provided, which facilitates further subdividing different objects in distant-view images.
Accompanying drawing explanation
Fig. 1 is a flowchart of the distance determination method according to an embodiment of the present application;
Fig. 2 is a flowchart of the distance determination method according to an implementation of the present application;
Fig. 3 is a side view of the imaging scene in an implementation of the present application;
Fig. 4 is a schematic diagram of the formed image in an implementation of the present application;
Fig. 5 is a flowchart of the distance determination method according to another implementation of the present application;
Fig. 6 is a schematic diagram of the module structure of the distance determination device according to an embodiment of the present application;
Fig. 7 is a schematic diagram of the module structure of the distance determination device according to an implementation of the present application;
Fig. 8 is a schematic diagram of the module structure of the distance determination device according to another implementation of the present application;
Fig. 9 is a schematic diagram of the hardware structure of the distance determination device according to an embodiment of the present application.
Detailed description
The embodiments of the present application are described in further detail below with reference to the drawings and embodiments. The following embodiments are used to illustrate the present application, but are not intended to limit its scope.
Those skilled in the art will understand that, in the embodiments of the present application, the sequence numbers of the following steps do not imply an order of execution; the execution order of the steps should be determined by their functions and internal logic, and should not constitute any limitation on the implementation of the embodiments of the present application.
Fig. 1 is a flowchart of the distance determination method according to an embodiment of the present application; the method may be implemented in, for example, a distance determination device. As shown in Fig. 1, the method comprises:
S120: obtaining first position information of a first position where an imaging device is located;
S140: determining a reference object and a target object in an image formed by the imaging device at the first position;
S160: determining an intersection point position on the reference object or on an extended image of the reference object according to the first position information and the target object;
S180: determining intersection height information of the intersection point position;
S200: determining a reference horizontal distance from the reference object to the first position according to the first position information and reference position information of the reference object;
S220: determining a target horizontal distance from the target object to the first position according to the reference horizontal distance, the intersection height information and the first position information.
According to the method of the embodiments of the present application, the horizontal distance between the first position and a target object in the formed image is obtained based on the proportional relationship of similar triangles, using the first position information of the first position where the imaging device is located and related information of a reference object in the formed image. A method for determining the depth of a target object in an image is thus provided, which facilitates further subdividing different objects in distant-view images.
The functions of steps S120, S140, S160, S180, S200 and S220 are described in detail below with reference to embodiments.
S120: obtaining first position information of a first position where an imaging device is located.
Here, the first position is the imaging position of the imaging device, i.e., the position where the image is captured. The first position information may comprise: height information and horizontal coordinate information of the first position. The height information may be the height of the first position above the ground plane, and the horizontal coordinate information may be longitude and latitude information. The first position information may be obtained by, for example, accessing GPS (Global Positioning System), the BeiDou system, or the like.
S140: determining a reference object and a target object in the image formed by the imaging device at the first position.
Here, the reference object may be a notable object, such as a landmark building. In other words, the reference object should have distinct visual features so that it can be identified by, for example, image recognition, and so that related information of the reference object can be obtained by querying a relevant database. The related information may comprise height information, longitude and latitude information, etc. of the reference object. For example, if the image contains the Central TV Tower, the Central TV Tower may be determined as the reference object by image recognition.
The target object is the object to be measured, and may generally be determined according to a user's designation.
S160: determining an intersection point position on the reference object or on an extended image of the reference object according to the first position information and the target object.
Referring to Fig. 2, in one implementation, the method further comprises:
S150: determining the bottom of the target object.
The bottom of the target object may be determined by, for example, image recognition. In some application scenarios, the bottom of the target object may be completely occluded, i.e., it cannot be identified directly, for example because it is blocked by greenery. In this case, since the occluding object is close to the target object, the bottom of the occluding object may be determined as the bottom of the target object. That is, step S150 is further:
S150': in response to the bottom of the target object being completely occluded by a third object, determining the bottom of the third object as the bottom of the target object.
In one implementation, step S160 further comprises:
S160': determining an intersection point position on the reference object or on an extended image of the reference object according to the first position information and the bottom of the target object, wherein the first position, the bottom of the target object and the intersection point position are on the same line.
In one implementation, the positional relationship of the first position, the target object and the reference object may be as shown in Fig. 3a and Fig. 3b.
Fig. 3a is a side view of the imaging scene, in which the target object may be building B, the reference object may be building L, and the first position may be the top A2 of building A. Since the first position, the bottom of the target object and the intersection point position are on the same line, a line may be drawn between the top A2 of building A and the bottom of building B; the intersection of this line with building L is the intersection point position O.
Fig. 3b is a schematic diagram of the formed image, i.e., a front view of the imaging scene. S1 denotes the horizon at the bottom of building B, and S2 denotes the horizon at the bottom of building L; the intersection point O in Fig. 3a corresponds to the line segment between O1 and O2 in Fig. 3b. It can be seen that in Fig. 3b, S1 is above S2, because, relative to the first position, building L is in front of building B. In other words, the horizontal distance between building L and the first position is smaller than the horizontal distance between building B and the first position. It should be noted that if the formed image is tilted, it may first be rectified.
In another implementation, the positional relationship of the first position, the target object and the reference object may be as shown in Fig. 4a and Fig. 4b.
Fig. 4a is a side view of the imaging scene, in which the target object may be building B, the reference object may be building L, and the first position may be the top A2 of building A. Since the first position, the bottom of the target object and the intersection point position are on the same line, a line may be drawn between the top A2 of building A and the bottom of building B. Because building L is relatively low, this line does not intersect building L directly; an extended image of building L (the dashed portion in Fig. 4a) may therefore be constructed, and the intersection point O of the line with the extended image may be obtained.
Fig. 4b is a schematic diagram of the formed image, i.e., a front view of the imaging scene. S1 denotes the horizon at the bottom of building B, and S2 denotes the horizon at the bottom of building L; the intersection point O in Fig. 4a corresponds to the line segment between O1 and O2 in Fig. 4b. Since building L is low, in Fig. 4b the top of building L is below the bottom of building B, and the intersection point O in fact lies on the extended image of building L (the vertical dashed portion in the figure).
It should be noted that the extended image of the reference object may be the image of the region covered by extending the edges of the reference object in the vertical direction.
In addition, those skilled in the art will appreciate that the target object and the reference object are not necessarily buildings; they may also be, for example, hills.
S180: determining the intersection height information of the intersection point position.
Here, the intersection height information may be the height of the intersection point position above the ground plane.
Referring to Fig. 5, in one implementation, the method further comprises:
S171: obtaining reference height information of the reference object;
S172: determining the bottom and the top of the reference object.
The reference height information is the height value of the reference object; it may be obtained by querying a relevant database according to the recognition result of the reference object. For example, supposing the reference object is the Central TV Tower, its height value of 405 meters may be obtained by querying, for example, Baidu Baike.
The bottom and the top of the reference object may be determined by, for example, image recognition. In some application scenarios, the bottom of the reference object may be completely occluded, i.e., it cannot be identified directly, for example because it is blocked by greenery. In this case, since the occluding object is close to the reference object, the bottom of the occluding object may be determined as the bottom of the reference object. That is, step S172 is further:
S172': in response to the bottom of the reference object being completely occluded by a fourth object, determining the bottom of the fourth object as the bottom of the reference object.
Accordingly, step S180 further comprises:
S180': determining the intersection height information of the intersection point position according to the reference height information and the bottom and the top of the reference object.
In one implementation, as shown in Fig. 3a, suppose the bottom of the reference object, i.e., building L, is L1 and its top is L2; suppose the distance from L1 to the intersection point position (i.e., intersection point O) is h_l1, and the distance from L2 to the intersection point position is h_l2. By image processing, the ratio of the number of pixels between O and L2 to the number of pixels between O and L1 can be obtained; this pixel ratio equals the length ratio of h_l2 to h_l1. Supposing the pixel ratio is k, then:
h_l2 / h_l1 = k;  (1)
Supposing the total height of the reference building L obtained according to the reference height information is h_l, then:
h_l1 + h_l2 = h_l;  (2)
Combining with formula (1) gives:
h_l1 + k × h_l1 = h_l;  (3)
from which h_l1 can be calculated, i.e., the intersection height information of the intersection point position is determined.
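As a minimal numeric sketch of formulas (1)–(3), assuming the pixel counts along building L have already been measured by some image-processing step (the function name and example values are illustrative, not from the patent):

```python
def intersection_height_on_reference(pixels_o_to_top, pixels_o_to_bottom, h_l):
    """Formulas (1)-(3): k = h_l2 / h_l1 equals the pixel ratio, and
    h_l1 + k * h_l1 = h_l, so h_l1 = h_l / (1 + k)."""
    k = pixels_o_to_top / pixels_o_to_bottom
    return h_l / (1 + k)

# Invented example: O lies 150 px below the top L2 and 50 px above the
# bottom L1 of a 405 m reference building, so k = 3 and h_l1 = 101.25 m.
print(intersection_height_on_reference(150, 50, 405.0))  # 101.25
```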
In another implementation, as shown in Fig. 4a, the ratio of the number of pixels between O and L2 to the number of pixels between L2 and L1 can be obtained by image processing; this pixel ratio equals the length ratio of h_x to h_l. Supposing the pixel ratio is k', then:
h_x / h_l = k';  (4)
from which h_x can be calculated; adding it to h_l then gives the intersection height information of the intersection point position.
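The extended-image case of formula (4) can be sketched in the same way; again, the function name and numbers are invented for illustration:

```python
def intersection_height_on_extension(pixels_top_to_o, pixels_bottom_to_top, h_l):
    """Formula (4): k' = h_x / h_l equals the pixel ratio, so h_x = k' * h_l;
    the intersection point then lies h_l + h_x above the ground plane."""
    k_prime = pixels_top_to_o / pixels_bottom_to_top
    h_x = k_prime * h_l
    return h_l + h_x

# Invented example: a 20 m reference building imaged 60 px tall, with O
# on the extension 30 px above its top: h_x = 10 m, intersection height 30 m.
print(intersection_height_on_extension(30, 60, 20.0))  # 30.0
```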
S200: determining the reference horizontal distance from the reference object to the first position according to the first position information and the reference position information of the reference object.
The position information of the reference object, i.e., the reference position information, may be obtained by querying a relevant database. Combining it with the horizontal coordinate information in the first position information, the distance from the reference object to the first position, i.e., the reference horizontal distance, can then be calculated. For example, supposing the reference object is the Central TV Tower, its longitude and latitude information may be obtained by querying the background database of Baidu Maps.
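The patent does not specify how the distance is computed from the two longitude/latitude pairs; one common choice is the haversine great-circle formula, sketched below under the assumption of a spherical Earth (the function name, coordinates and radius are illustrative):

```python
import math

def reference_horizontal_distance(lat1, lon1, lat2, lon2, radius_m=6371000.0):
    """Haversine distance between the first position and the reference
    object, both given as (latitude, longitude) in degrees; result in meters."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * radius_m * math.asin(math.sqrt(a))

# Invented example: two points 0.01 degrees of latitude apart on the
# same meridian are roughly 1.1 km apart.
print(round(reference_horizontal_distance(39.95, 116.28, 39.96, 116.28)))  # 1112
```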
S220: determining the target horizontal distance from the target object to the first position according to the reference horizontal distance, the intersection height information and the first position information.
In one implementation, with reference to Fig. 3a, the reference horizontal distance may correspond to the distance between the bottom A1 of building A and the bottom L1 of building L, and is denoted d_L; the intersection height information may be the intersection height value h_l1; and the height value of the first position may be obtained from the first position information. Supposing the first position is A2, the height value of the first position is h, and the target horizontal distance is d_B, then triangle BOL1 and triangle BA1A2 are similar triangles, so that:
(d_B − d_L) / d_B = h_l1 / h;  (5)
from which the target horizontal distance d_B can be calculated.
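Solving formula (5) for d_B gives d_B = d_L × h / (h − h_l1). A minimal sketch with invented numbers:

```python
def target_distance_fig3(d_L, h, h_l1):
    """Formula (5): (d_B - d_L) / d_B = h_l1 / h, solved for d_B."""
    return d_L * h / (h - h_l1)

# Invented example: first position 100 m above the ground, reference
# building 500 m away, intersection point 60 m up the reference building.
print(target_distance_fig3(500.0, 100.0, 60.0))  # 1250.0
```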
In another implementation, with reference to Fig. 4a, triangle BOL1 and triangle BA1A2 can likewise be shown to be similar triangles, so that:
(d_B − d_L) / d_B = (h_l + h_x) / h;  (6)
from which the target horizontal distance d_B can be calculated.
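Formula (6) solves the same way, with the intersection height h_l + h_x in place of h_l1; the numbers below are invented for illustration:

```python
def target_distance_fig4(d_L, h, h_l, h_x):
    """Formula (6): (d_B - d_L) / d_B = (h_l + h_x) / h, solved for d_B."""
    return d_L * h / (h - (h_l + h_x))

# Invented example: first position 100 m up, a 20 m reference building
# 400 m away, intersection 10 m above its top (height 30 m): d_B ~ 571.4 m.
print(round(target_distance_fig4(400.0, 100.0, 20.0, 10.0), 1))  # 571.4
```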
In addition, it should be noted that if the ground plane at the first position, the plane where the reference object is located and the plane where the target object is located are not the same plane, the corresponding height values need to be corrected first, and the processing then carried out according to the method described above, so as to reduce error. For example, if the reference object is on a hillside, the intersection height information needs to be corrected.
In addition, since the Earth's surface is spherical, when the horizontal distance between the first position and the reference building (or the target building) is relatively large (for example, greater than 11 kilometers), the ground at the first position and the bottom of the reference building (or the target building) will clearly not be in the same plane, and the difference between the two levels may reach 10 meters. In this case, the height value of the first position (or of the intersection point position) may need to be corrected before processing according to the method described in the present application, so as to reduce the error of the target horizontal distance.
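The magnitude of this curvature effect can be estimated with the standard first-order drop approximation d² / (2R) for a spherical Earth (this estimate and the radius value are illustrative, not taken from the patent):

```python
def curvature_drop(d_m, radius_m=6371000.0):
    """Approximate height by which the ground falls away over a horizontal
    distance d_m on a sphere of radius radius_m: drop ~ d^2 / (2R)."""
    return d_m ** 2 / (2 * radius_m)

# At the 11 km distance cited above, the drop is roughly 9.5 m, consistent
# with the "may reach 10 meters" figure in the text.
print(round(curvature_drop(11000.0), 1))  # 9.5
```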
In addition, an embodiment of the present application also provides a computer-readable medium comprising computer-readable instructions that, when executed, perform the following operations: executing the operations of steps S120, S140 and S160 of the method in the embodiment shown in Fig. 1 above.
In summary, according to the method of the embodiments of the present application, the horizontal distance between the first position and a target object in the formed image is obtained based on the proportional relationship of similar triangles, using the first position information of the first position where the imaging device is located and related information of a reference object in the formed image. A method for determining the depth of a target object in an image is thus provided, which facilitates further subdividing different objects in distant-view images.
Fig. 6 is the modular structure schematic diagram of distance determination equipment described in one embodiment of the invention, described distance determines that equipment can be arranged at for user in the imaging device such as smart mobile phone, slr camera as a functional module, can certainly as an autonomous device for user.As shown in Figure 6, described equipment 600 can also comprise:
Primary importance acquisition module 610, for obtaining the primary importance information of imaging device place primary importance;
Object determination module 620, for becoming in image to determine a references object and a destination object at described imaging device in described primary importance place;
Position of intersecting point determination module 630, for determining a position of intersecting point according to described primary importance information and described destination object on the extension bitmap picture of described references object or described references object;
Intersection height determination module 640, for determining the intersection height information of described position of intersecting point;
Reference levels distance determination module 650, for determining the reference levels distance of described references object to described primary importance according to the reference position information of described primary importance information and described references object;
Target level distance determination module 660, for determining the target level distance of described destination object to described primary importance according to described reference levels distance, described intersection height information and described primary importance information.
The device described in this embodiment of the present application uses the first position information of the first position where the imaging device is located and related information of a reference object in the formed image, and obtains, based on the proportional relationship of similar triangles, the horizontal distance between the first position and a target object in the formed image. It thus provides a device for determining the depth of a target object in an image, which facilitates further segmentation of different objects in a distant-view image.
The functions of the first position acquisition module 610, the object determination module 620, the intersection position determination module 630, the intersection height determination module 640, the reference horizontal distance determination module 650, and the target horizontal distance determination module 660 are described in detail below with reference to specific embodiments.
The first position acquisition module 610 is configured to acquire the first position information of the first position where the imaging device is located.
The first position is the imaging position of the imaging device, i.e., the position where the image is captured. The first position information may comprise height information and horizontal coordinate information of the first position; the height information may be the height of the first position above ground level, and the horizontal coordinate information may be longitude and latitude information. The first position acquisition module 610 may acquire the first position information by accessing, for example, the GPS (Global Positioning System) or the BeiDou system.
The object determination module 620 is configured to determine a reference object and a target object in the image formed by the imaging device at the first position.
The reference object may be a salient object, such as a landmark building; in other words, the reference object should have distinct visual features so that the object determination module 620 can identify it, for example by image recognition, and so that related information of the reference object can be obtained by querying an associated database. The related information may comprise height information, longitude and latitude information, and the like of the reference object.
The target object is the object to be measured, and may generally be determined according to a user's designation.
The intersection position determination module 630 is configured to determine an intersection position on the reference object or an extension image of the reference object according to the first position information and the target object.
Referring to Fig. 7, in one embodiment, the device 600 further comprises:
a target object part determination module 670, configured to determine the bottom of the target object.
The bottom of the target object may be determined, for example, by image recognition. In some application scenarios, the bottom of the target object may be completely occluded, i.e., it cannot be identified directly, for example when it is blocked by greenery; in this case, because the occluding object is close to the target object, the bottom of the occluding object may be determined as the bottom of the target object.
That is, in one embodiment, the target object part determination module 670 is configured to, in response to the bottom of the target object being completely occluded by a third object, determine the bottom of the third object as the bottom of the target object.
Accordingly, in one embodiment, the intersection position determination module 630 is configured to determine an intersection position on the reference object or an extension image of the reference object according to the first position information and the bottom of the target object, wherein the first position, the bottom of the target object, and the intersection position are on the same line.
As in the embodiments above, the positional relationship among the first position, the target object, and the reference object may be as shown in Fig. 3a, 3b or Fig. 4a, 4b, which is not repeated here.
The intersection height determination module 640 is configured to determine the intersection height information of the intersection position.
The intersection height information may be the height of the intersection position above ground level.
Referring to Fig. 8, in one embodiment, the device 600 further comprises:
a reference height acquisition module 680, configured to acquire reference height information of the reference object;
a reference object part determination module 690, configured to determine the bottom and the top of the reference object.
The reference height information is the height of the reference object, which may be obtained by querying an associated database according to the recognition result for the reference object. For example, if the reference object is the Central Television Tower, its height of 405 meters may be obtained by querying a source such as Baidu Baike.
The bottom and the top of the reference object may be determined, for example, by image recognition. In some application scenarios, the bottom of the reference object may be completely occluded, i.e., it cannot be identified directly, for example when it is blocked by greenery; in this case, because the occluding object is close to the reference object, the bottom of the occluding object may be determined as the bottom of the reference object. That is, in one embodiment, the reference object part determination module 690 is configured to, in response to the bottom of the reference object being completely occluded by a fourth object, determine the bottom of the fourth object as the bottom of the reference object.
Accordingly, the intersection height determination module 640 is configured to determine the intersection height information of the intersection position according to the reference height information and the bottom and the top of the reference object.
In one embodiment, as shown in Fig. 3a, suppose the bottom of the reference object, i.e., building L, is L1 and its top is L2; suppose the distance from L1 to the intersection position (i.e., intersection O) is h_L1, and the distance from L2 to the intersection position is h_L2. By image processing, the ratio of the number of pixels between intersection O and L2 to the number of pixels between O and L1 can be obtained, and this pixel ratio equals the length ratio between h_L2 and h_L1. From this pixel ratio and the total height h_L of the reference building L, the intersection height information can be obtained.
In another embodiment, as shown in Fig. 4a, the ratio of the number of pixels between intersection O and L2 to the number of pixels between L2 and L1 can be obtained by image processing, and this pixel ratio equals the length ratio between h_x and h_L. From this ratio and the total height h_L of the reference building L, the intersection height information can be obtained.
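As an illustration of the Fig. 3a case, the pixel-ratio computation could be sketched as follows (a minimal sketch; the function name is illustrative, and the pixel counts would come from the image-processing step):

```python
def intersection_height_fig3a(h_L, pix_O_L2, pix_O_L1):
    """Height h_L1 of intersection O above the bottom L1 of reference building L.

    h_L: total height of the reference building (e.g., from a database query).
    pix_O_L2 / pix_O_L1 approximates the length ratio h_L2 / h_L1, and
    h_L1 + h_L2 = h_L, so h_L1 = h_L / (1 + ratio).
    """
    ratio = pix_O_L2 / pix_O_L1
    return h_L / (1.0 + ratio)
```

For instance, with the 405-meter tower as reference and three times as many pixels above the intersection as below it, the intersection height comes out to 101.25 meters.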
The reference horizontal distance determination module 650 is configured to determine the reference horizontal distance from the reference object to the first position according to the first position information and the reference position information of the reference object.
The position information of the reference object, i.e., the reference position information, may be obtained by querying an associated database; combined with the horizontal coordinate information in the first position information, the distance from the reference object to the first position, i.e., the reference horizontal distance, can then be calculated. For example, if the reference object is the Central Television Tower, its longitude and latitude information may be obtained by querying the backend database of Baidu Maps.
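Given the two longitude/latitude pairs, the reference horizontal distance could be computed, for example, with the haversine great-circle formula (one common choice; the patent does not mandate a particular formula):

```python
import math

def horizontal_distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points (degrees),
    using the haversine formula with a mean Earth radius of 6371 km."""
    R = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))
```

At the distances relevant here (a few kilometers), the spherical approximation is well within the error budget of the rest of the method.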
The target horizontal distance determination module 660 is configured to determine the target horizontal distance from the target object to the first position according to the reference horizontal distance, the intersection height information, and the first position information.
In one embodiment, referring to Fig. 3a, the reference horizontal distance may correspond to the distance between the bottom A1 of building A and the bottom L1 of building L, assumed to be d_L; the intersection height information may be the intersection height value h_L1; the height of the first position, which can be obtained from the first position information, is assumed to be h, with the first position itself denoted A2; and the target horizontal distance is denoted d_B. Triangle BOL1 and triangle BA1A2 are then similar triangles, and the target horizontal distance d_B can be obtained from the proportional relationship of the similar triangles.
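Under these labels, the proportion reads h_L1 / (d_B − d_L) = h / d_B, which rearranges to d_B = h·d_L / (h − h_L1). A minimal sketch (the closed form is our reading of the Fig. 3a similar-triangle relationship; the patent text itself only states the proportionality):

```python
def target_horizontal_distance(d_L, h, h_L1):
    """Fig. 3a case: camera at height h, intersection at height h_L1 on the
    reference object at horizontal distance d_L from the first position.
    Similar triangles BOL1 and BA1A2 give h_L1 / (d_B - d_L) = h / d_B."""
    if h <= h_L1:
        raise ValueError("intersection must lie below the camera height")
    return h * d_L / (h - h_L1)
```

For example, a camera 2 m above the ground whose sight line to the target's bottom crosses the reference object at a height of 1 m, 100 m away, places the target at 200 m.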
In another embodiment, referring to Fig. 4b, triangle BOL1 and triangle BA1A2 can likewise be shown to be similar triangles, and the target horizontal distance d_B can then be calculated according to formula (6).
In addition, it should be noted that if the ground plane at the first position, the plane where the reference object is located, and the plane where the target object is located are not the same plane, the corresponding height values need to be corrected first before processing according to the method described above, so as to reduce error. For example, if the reference object is on a hillside, the intersection height information needs to be corrected.
In addition, because the Earth's surface is approximately spherical, when the horizontal distance between the first position and the reference building (or the target building) is relatively large (for example, greater than 11 kilometers), the bottom of the first position and the bottom of the reference building (or the target building) will clearly not be in the same plane, and the height difference between the two may reach 10 meters. In this case, the height value of the first position (or of the intersection position) may need to be corrected before processing according to the method described in this application, so as to reduce the error of the target horizontal distance.
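The magnitude of this curvature effect can be estimated with the standard small-angle approximation drop ≈ d² / (2R) (our illustration; the patent only states the 10-meter figure):

```python
def curvature_drop_m(horizontal_distance_m, earth_radius_m=6371000.0):
    """Approximate height difference (meters) between two ground points caused
    by Earth's curvature: drop ~ d^2 / (2R), valid for d much smaller than R."""
    return horizontal_distance_m ** 2 / (2.0 * earth_radius_m)
```

At 11 km this gives roughly 9.5 m, consistent with the "may reach 10 meters" figure above.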
An application scenario of the distance determination method and device described in this embodiment of the present application may be as follows: a user standing in the Xishan area of Beijing takes pictures with a hand-held SLR camera; when the image of the Central Television Tower appears on the camera screen, the camera identifies it as the Central Television Tower by image recognition, queries a database for its related information, and then marks on the screen the horizontal distance from the user to the tower; when the user taps an unknown building behind the tower on the screen, the screen also marks the distance from that unknown building to the user.
The hardware structure of the distance determination device described in an embodiment of the present application is shown in Fig. 9. The specific embodiments of the present application do not limit the specific implementation of the distance determination device. Referring to Fig. 9, the device 900 may comprise:
a processor (Processor) 910, a communications interface (Communications Interface) 920, a memory (Memory) 930, and a communication bus 940, wherein:
the processor 910, the communications interface 920, and the memory 930 communicate with one another through the communication bus 940;
the communications interface 920 is configured to communicate with other network elements; and
the processor 910 is configured to execute a program 932, and may specifically perform the relevant steps of the method embodiment shown in Fig. 1 above.
Specifically, the program 932 may comprise program code, and the program code comprises computer operation instructions.
The processor 910 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC, Application Specific Integrated Circuit), or one or more integrated circuits configured to implement the embodiments of the present application.
The memory 930 is configured to store the program 932. The memory 930 may comprise a high-speed RAM memory, and may further comprise a non-volatile memory (non-volatile memory), for example at least one disk memory. The program 932 may specifically perform the following steps:
acquiring the first position information of the first position where the imaging device is located;
determining a reference object and a target object in the image formed by the imaging device at the first position;
determining an intersection position on the reference object or an extension image of the reference object according to the first position information and the target object;
determining the intersection height information of the intersection position;
determining the reference horizontal distance from the reference object to the first position according to the first position information and the reference position information of the reference object; and
determining the target horizontal distance from the target object to the first position according to the reference horizontal distance, the intersection height information, and the first position information.
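The steps above could be sketched end to end as follows (a minimal sketch for the Fig. 3a geometry; all input names are illustrative, and the database lookup and image-recognition steps are assumed to have already produced the coordinates, the reference height, and the pixel counts):

```python
import math

def distance_to_target(cam_lat, cam_lon, cam_height_m,
                       ref_lat, ref_lon, ref_height_m,
                       pix_O_top, pix_O_bottom):
    """End-to-end sketch of the program steps:
    1) reference horizontal distance from lat/lon (haversine),
    2) intersection height from the pixel ratio on the reference object,
    3) target horizontal distance from the similar-triangle proportion."""
    R = 6371000.0
    p1, p2 = math.radians(cam_lat), math.radians(ref_lat)
    dp = math.radians(ref_lat - cam_lat)
    dl = math.radians(ref_lon - cam_lon)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    d_L = 2 * R * math.asin(math.sqrt(a))                     # step 1
    h_L1 = ref_height_m / (1.0 + pix_O_top / pix_O_bottom)    # step 2
    return cam_height_m * d_L / (cam_height_m - h_L1)         # step 3
```

This assumes the target lies beyond the reference object and the camera is higher than the intersection point, as in Fig. 3a.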
For the specific implementation of each step in the program 932, reference may be made to the corresponding steps or modules in the above embodiments, which are not repeated here. Those skilled in the art can clearly understand that, for convenience and brevity of description, the specific working processes of the devices and modules described above may refer to the corresponding processes in the foregoing method embodiments, and are not repeated here.
Those of ordinary skill in the art may realize that the units and method steps of each example described in conjunction with the embodiments disclosed herein can be implemented with electronic hardware, or with a combination of computer software and electronic hardware. Whether these functions are performed in hardware or software depends on the specific application and the design constraints of the technical solution. Skilled persons may use different methods for each specific application to implement the described functions, but such implementation should not be considered beyond the scope of the present application.
If the functions are implemented in the form of software functional units and sold or used as independent products, they may be stored in a computer-readable storage medium. Based on such understanding, the part of the technical solution of the present application that in essence contributes to the prior art, or a part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and comprises several instructions for causing a computer device (which may be a personal computer, a controller, a network device, or the like) to perform all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a portable hard disk, a read-only memory (ROM, Read-Only Memory), a random access memory (RAM, Random Access Memory), a magnetic disk, or an optical disc.
The above implementations are only for illustrating the present application and not for limiting it. Those of ordinary skill in the relevant technical field can make various changes and modifications without departing from the spirit and scope of the present application, so all equivalent technical solutions also belong to the scope of the present application, and the patent protection scope of the present application shall be defined by the claims.

Claims (10)

1. A distance determination method, characterized in that the method comprises:
acquiring first position information of a first position where an imaging device is located;
determining a reference object and a target object in an image formed by the imaging device at the first position;
determining an intersection position on the reference object or an extension image of the reference object according to the first position information and the target object;
determining intersection height information of the intersection position;
determining a reference horizontal distance from the reference object to the first position according to the first position information and reference position information of the reference object; and
determining a target horizontal distance from the target object to the first position according to the reference horizontal distance, the intersection height information, and the first position information.
2. The method of claim 1, characterized in that the method further comprises: determining a bottom of the target object.
3. The method of claim 2, characterized in that the determining an intersection position on the reference object or an extension image of the reference object according to the first position information and the target object comprises:
determining an intersection position on the reference object or an extension image of the reference object according to the first position information and the bottom of the target object, wherein the first position, the bottom of the target object, and the intersection position are on the same line.
4. The method of any one of claims 1 to 3, characterized in that the method further comprises:
acquiring reference height information of the reference object; and
determining a bottom and a top of the reference object.
5. The method of claim 4, characterized in that the determining intersection height information of the intersection position comprises:
determining the intersection height information of the intersection position according to the reference height information and the bottom and the top of the reference object.
6. A distance determination device, characterized in that the device comprises:
a first position acquisition module, configured to acquire first position information of a first position where an imaging device is located;
an object determination module, configured to determine a reference object and a target object in an image formed by the imaging device at the first position;
an intersection position determination module, configured to determine an intersection position on the reference object or an extension image of the reference object according to the first position information and the target object;
an intersection height determination module, configured to determine intersection height information of the intersection position;
a reference horizontal distance determination module, configured to determine a reference horizontal distance from the reference object to the first position according to the first position information and reference position information of the reference object; and
a target horizontal distance determination module, configured to determine a target horizontal distance from the target object to the first position according to the reference horizontal distance, the intersection height information, and the first position information.
7. The device of claim 6, characterized in that the device further comprises:
a target object part determination module, configured to determine a bottom of the target object.
8. The device of claim 7, characterized in that the intersection position determination module is configured to determine an intersection position on the reference object or an extension image of the reference object according to the first position information and the bottom of the target object, wherein the first position, the bottom of the target object, and the intersection position are on the same line.
9. The device of any one of claims 6 to 8, characterized in that the intersection height determination module is configured to determine the intersection height information of the intersection position according to the reference height information and the bottom and the top of the reference object.
10. An imaging device, characterized in that the imaging device comprises the distance determination device of any one of claims 6 to 9.
CN201410804150.8A 2014-12-19 2014-12-19 Distance determination method and equipment Active CN104539926B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410804150.8A CN104539926B (en) 2014-12-19 2014-12-19 Distance determination method and equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410804150.8A CN104539926B (en) 2014-12-19 2014-12-19 Distance determination method and equipment

Publications (2)

Publication Number Publication Date
CN104539926A true CN104539926A (en) 2015-04-22
CN104539926B CN104539926B (en) 2016-10-26

Family

ID=52855385

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410804150.8A Active CN104539926B (en) Distance determination method and equipment

Country Status (1)

Country Link
CN (1) CN104539926B (en)


Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101526353A (en) * 2008-03-03 2009-09-09 株式会社拓普康 Geographical data collecting device
CN102113017A (en) * 2008-08-05 2011-06-29 高通股份有限公司 System and method to generate depth data using edge detection
CN102369413A (en) * 2009-03-31 2012-03-07 阿尔卡特朗讯公司 A method for determining the relative position of a first and a second imaging device and devices therefore
CN101943580A (en) * 2009-07-07 2011-01-12 宏达国际电子股份有限公司 Method and device for detecting distance from target and computer program product thereof
CN101858741A (en) * 2010-05-26 2010-10-13 沈阳理工大学 Zoom ranging method based on single camera
CN101858742A (en) * 2010-05-27 2010-10-13 沈阳理工大学 Fixed-focus ranging method based on single camera
CN102445148A (en) * 2010-09-30 2012-05-09 西门子公司 Method, device and system for acquiring position parameters
CN103282741A (en) * 2011-01-11 2013-09-04 高通股份有限公司 Position determination using horizontal angles
US20130310071A1 (en) * 2011-01-11 2013-11-21 Qualcomm Incorporated Position determination using horizontal angles
CN102761700A (en) * 2011-04-29 2012-10-31 国际商业机器公司 Shooting device and method for obtaining distance between different points on shot object
CN103245337A (en) * 2012-02-14 2013-08-14 联想(北京)有限公司 Method for acquiring position of mobile terminal, mobile terminal and position detection system
CN103424083A (en) * 2012-05-24 2013-12-04 北京数码视讯科技股份有限公司 Object depth detection method, device and system

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105180817A (en) * 2015-08-06 2015-12-23 联想(北京)有限公司 Data processing method and electronic equipment
CN106092055A (en) * 2016-06-16 2016-11-09 河海大学 The method measuring object height based on slr camera
CN106482646A (en) * 2016-10-10 2017-03-08 河海大学 Based on the method that slr camera measures object width
CN106482646B (en) * 2016-10-10 2018-12-28 河海大学 Method based on slr camera measurement object width
CN109147123A (en) * 2018-08-03 2019-01-04 北京旷视科技有限公司 Unlocking method, device, electronic equipment and the computer storage medium of door-control lock
CN109147123B (en) * 2018-08-03 2021-05-04 北京旷视科技有限公司 Unlocking method and device of access control lock, electronic equipment and computer storage medium

Also Published As

Publication number Publication date
CN104539926B (en) 2016-10-26

Similar Documents

Publication Publication Date Title
Verhoeven et al. Computer vision‐based orthophoto mapping of complex archaeological sites: The ancient quarry of Pitaranha (Portugal–Spain)
US9641755B2 (en) Reimaging based on depthmap information
TWI483215B (en) Augmenting image data based on related 3d point cloud data
US10534870B2 (en) Methods, apparatuses and computer program products for automatic, non-parametric, non-iterative three dimensional geographic modeling
CN109520500B (en) Accurate positioning and street view library acquisition method based on terminal shooting image matching
US20140362082A1 (en) Automated Overpass Extraction from Aerial Imagery
WO2020055928A1 (en) Calibration for vision in navigation systems
EP2769183A2 (en) Three dimensional routing
JP6854195B2 (en) Image processing device, image processing method and program for image processing
CN110703805B (en) Method, device and equipment for planning three-dimensional object surveying and mapping route, unmanned aerial vehicle and medium
CN104539926A (en) Distance determination method and equipment
US11959749B2 (en) Mobile mapping system
US20220148219A1 (en) Method and system for visual localization
CN104539927A (en) Distance determination method and equipment
US20220139032A1 (en) Method of generating map and visual localization system using the map
CN107193820B (en) Position information acquisition method, device and equipment
US9372081B2 (en) Method and system for geo-referencing at least one sensor image
CN111982076B (en) Single-lens unmanned aerial vehicle flight parameter setting method
US9811889B2 (en) Method, apparatus and computer program product for generating unobstructed object views
US9031281B2 (en) Identifying an area of interest in imagery
US9240055B1 (en) Symmetry-based interpolation in images
Bakuła et al. Capabilities of a smartphone for georeferenced 3dmodel creation: An evaluation
RU2583756C2 (en) Method of signature-based positioning of urban area images in visible and ir bands
US9852542B1 (en) Methods and apparatus related to georeferenced pose of 3D models
US10976179B1 (en) Geolocating contents of a video with device orientation, and application thereof

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant