CN104539926B - Distance determination method and device - Google Patents

Distance determination method and device

Info

Publication number
CN104539926B
Authority
CN
China
Prior art keywords
object
first position
information
bottom
Prior art date
Application number
CN201410804150.8A
Other languages
Chinese (zh)
Other versions
CN104539926A (en)
Inventor
王正翔
Original Assignee
北京智谷睿拓技术服务有限公司
Priority date
Filing date
Publication date
Application filed by 北京智谷睿拓技术服务有限公司
Priority to CN201410804150.8A
Publication of CN104539926A
Application granted
Publication of CN104539926B

Abstract

The present application provides a distance determination method and device, relating to the field of image processing. The method includes: acquiring first position information of a first position where an imaging device is located; determining a reference object and a target object in an image captured by the imaging device at the first position; determining an intersection position on the reference object or on an extended image of the reference object according to the first position information and the target object; determining intersection height information of the intersection position; determining a reference horizontal distance from the reference object to the first position according to the first position information and reference position information of the reference object; and determining a target horizontal distance from the target object to the first position according to the reference horizontal distance, the intersection height information and the first position information. The method and device facilitate further segmentation of different objects in a distant-view image.

Description

Distance determination method and device

Technical field

The present application relates to the field of image processing, and in particular to a distance determination method and device.

Background art

With the development of communication technology, more and more imaging devices such as digital cameras, SLR cameras and smartphones have entered people's lives and greatly enriched them.

For depth calculation of objects in a captured image, current imaging devices mainly rely on measurement methods such as infrared ranging or binocular stereo vision. The depth measurements for nearby objects are fairly satisfactory, but the depth measurement error for distant scenes is large: distant image regions can often only be lumped together as a background class, and are difficult to segment further according to their actual depth.

Summary of the invention

The purpose of the present application is to provide a distance determination method and device.

According to an aspect of at least one embodiment of the present application, a distance determination method is provided. The method includes:

acquiring first position information of a first position where an imaging device is located;

determining a reference object and a target object in an image captured by the imaging device at the first position;

determining an intersection position on the reference object or on an extended image of the reference object according to the first position information and the target object;

determining intersection height information of the intersection position;

determining a reference horizontal distance from the reference object to the first position according to the first position information and reference position information of the reference object;

determining a target horizontal distance from the target object to the first position according to the reference horizontal distance, the intersection height information and the first position information.

According to another aspect of at least one embodiment of the present application, a distance determination device is provided. The device includes:

a first position acquisition module, configured to acquire first position information of a first position where an imaging device is located;

an object determination module, configured to determine a reference object and a target object in an image captured by the imaging device at the first position;

an intersection position determination module, configured to determine an intersection position on the reference object or on an extended image of the reference object according to the first position information and the target object;

an intersection height determination module, configured to determine intersection height information of the intersection position;

a reference horizontal distance determination module, configured to determine a reference horizontal distance from the reference object to the first position according to the first position information and reference position information of the reference object;

a target horizontal distance determination module, configured to determine a target horizontal distance from the target object to the first position according to the reference horizontal distance, the intersection height information and the first position information.

The distance determination method and device of the embodiments of the present application use the first position information of the first position where the imaging device is located and information about a reference object in the captured image, and obtain the horizontal distance between the first position and a target object in the captured image based on the proportional relationship of similar triangles. This provides a way to determine the depth of a target object in an image and facilitates further segmentation of different objects in a distant-view image.

Brief description of the drawings

Fig. 1 is a flowchart of the distance determination method according to an embodiment of the present application;

Fig. 2 is a flowchart of the distance determination method according to an embodiment of the present application;

Fig. 3 is a side view of an image scene in an embodiment of the present application;

Fig. 4 is a schematic diagram of a captured image in an embodiment of the present application;

Fig. 5 is a flowchart of the distance determination method according to another embodiment of the present application;

Fig. 6 is a schematic diagram of the module structure of the distance determination device according to an embodiment of the present application;

Fig. 7 is a schematic diagram of the module structure of the distance determination device according to an embodiment of the present application;

Fig. 8 is a schematic diagram of the module structure of the distance determination device according to another embodiment of the present application;

Fig. 9 is a schematic diagram of the hardware structure of the distance determination device according to an embodiment of the present application.

Detailed description of the invention

The specific embodiments of the present application are described in further detail below with reference to the accompanying drawings and embodiments. The following embodiments are used to illustrate the present application, but do not limit its scope.

Those skilled in the art will understand that, in the embodiments of the present application, the numbering of the following steps does not imply an order of execution; the execution order of the steps should be determined by their functions and internal logic, and should not constitute any limitation on the implementation of the embodiments of the present application.

Fig. 1 is a flowchart of the distance determination method according to an embodiment of the present application. The method can be implemented, for example, on a distance determination device. As shown in Fig. 1, the method includes:

S120: acquiring first position information of a first position where an imaging device is located;

S140: determining a reference object and a target object in an image captured by the imaging device at the first position;

S160: determining an intersection position on the reference object or on an extended image of the reference object according to the first position information and the target object;

S180: determining intersection height information of the intersection position;

S200: determining a reference horizontal distance from the reference object to the first position according to the first position information and reference position information of the reference object;

S220: determining a target horizontal distance from the target object to the first position according to the reference horizontal distance, the intersection height information and the first position information.

The method of the embodiments of the present application uses the first position information of the first position where the imaging device is located and information about a reference object in the captured image, and obtains the horizontal distance between the first position and a target object in the captured image based on the proportional relationship of similar triangles. It thus provides a way to determine the depth of a target object in an image and facilitates further segmentation of different objects in a distant-view image.

The functions of steps S120, S140, S160, S180, S200 and S220 are described in detail below with reference to specific embodiments.

S120: acquiring first position information of a first position where an imaging device is located.

The first position is the imaging position of the imaging device, i.e. the position from which the image is captured. The first position information may include height information and horizontal coordinate information of the first position. The height information may be the height of the first position above the ground plane, and the horizontal coordinate information may be latitude and longitude information. The first position information may be obtained, for example, by accessing GPS (Global Positioning System), the BeiDou system, or the like.
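
For illustration only, the first position information can be represented as a small record holding the height above the ground plane and the latitude/longitude coordinates. The following Python sketch and its field names are an assumption made for this description, not part of the patent:

```python
from dataclasses import dataclass

@dataclass
class FirstPosition:
    """First position information of the imaging device (hypothetical structure).

    height_m: height of the imaging position above the ground plane, in meters
    lat_deg, lon_deg: horizontal coordinates as latitude/longitude, in degrees
    """
    height_m: float
    lat_deg: float
    lon_deg: float

# Example: a camera held at the top of a building, 50 m above ground level.
first_position = FirstPosition(height_m=50.0, lat_deg=39.9123, lon_deg=116.2456)
```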

S140: determining a reference object and a target object in an image captured by the imaging device at the first position.

The reference object may be a salient object, such as a landmark building. In other words, the reference object should have distinctive visual features so that it can be identified, for example, by image recognition, and information about the reference object can be obtained by querying a relevant database. The information may include the height information, latitude and longitude information, and so on of the reference object. For example, if the image includes the Central TV Tower, the Central TV Tower may be determined as the reference object by image recognition.

The target object is the object to be measured, and can typically be determined according to a user's designation.

S160: determining an intersection position on the reference object or on an extended image of the reference object according to the first position information and the target object.

Referring to Fig. 2, in one embodiment, the method further includes:

S150: determining the bottom of the target object.

The bottom of the target object can be determined, for example, by image recognition. In some application scenarios, the bottom of the target object may be completely occluded, i.e. the bottom of the target object cannot be recognized directly, for example because it is blocked by greenery. In this case, since the occluding object is close to the target object, the bottom of the occluding object may be taken as the bottom of the target object. That is, step S150 is further:

S150': in response to the bottom of the target object being completely occluded by a third object, determining the bottom of the third object as the bottom of the target object.

In one embodiment, step S160 further includes:

S160': determining an intersection position on the reference object or on an extended image of the reference object according to the first position information and the bottom of the target object, wherein the first position, the bottom of the target object and the intersection position lie on the same straight line.

In one embodiment, the positional relationship among the first position, the target object and the reference object may be as shown in Fig. 3a and Fig. 3b.

Fig. 3a is a side view of the image scene, in which the target object may be building B, the reference object may be building L, and the first position may be the top A2 of building A. Since the first position, the bottom of the target object and the intersection position lie on the same straight line, a line can be drawn between the top A2 of building A and the bottom of building B; the intersection of this line with building L is the intersection position O.

Fig. 3b is a schematic diagram of the captured image, which may also be regarded as a front view of the image scene. S1 denotes the horizontal line at the bottom of building B, S2 denotes the horizontal line at the bottom of building L, and the intersection O in Fig. 3a corresponds to the line segment between O1 and O2 in Fig. 3b. It can be seen that in Fig. 3b, S1 is above S2; for the first position, this means that building L is in front of building B. In other words, the horizontal distance between building L and the first position is smaller than the horizontal distance between building B and the first position. It should be noted that if the captured image is tilted, the image may first be rectified.

In another embodiment, the positional relationship among the first position, the target object and the reference object may be as shown in Fig. 4a and Fig. 4b.

Fig. 4a is a side view of the image scene, in which the target object may be building B, the reference object may be building L, and the first position may be the top A2 of building A. Since the first position, the bottom of the target object and the intersection position lie on the same straight line, a line can be drawn between the top A2 of building A and the bottom of building B. Because the height of building L is relatively low, this line does not intersect building L directly; an extended image of building L (the dashed portion in Fig. 4a) can therefore be used, and the intersection O of the line with the extended image is then obtained.

Fig. 4b is a schematic diagram of the captured image, which may also be regarded as a front view of the image scene. S1 denotes the horizontal line at the bottom of building B, S2 denotes the horizontal line at the bottom of building L, and the intersection O in Fig. 4a corresponds to the line segment between O1 and O2 in Fig. 4b. It can be seen that, because building L is relatively low, the top of building L is below the bottom of building B in Fig. 4b, and the intersection O lies on the extended image of building L (the vertical dashed portion in the figure).

It should be noted that the extended image of the reference object may be the image in the region enclosed by the extension lines of the edges of the reference object along the vertical direction.

In addition, those skilled in the art will understand that the target object and the reference object are not necessarily buildings; they may also be, for example, hills.

S180: determining intersection height information of the intersection position.

The intersection height information may be the height of the intersection position above the ground plane.

Referring to Fig. 5, in one embodiment, the method further includes:

S171: acquiring reference height information of the reference object;

S172: determining the bottom and the top of the reference object.

The reference height information is the height value of the reference object, and it can be obtained by querying a relevant database according to the recognition result of the reference object. For example, if the reference object is the Central TV Tower, its height value of 405 meters can be obtained by querying, for example, Baidu Baike.

The bottom and the top of the reference object can be determined, for example, by image recognition. In some application scenarios, the bottom of the reference object may be completely occluded, i.e. the bottom of the reference object cannot be recognized directly, for example because it is blocked by greenery. In this case, since the occluding object is close to the reference object, the bottom of the occluding object may be taken as the bottom of the reference object. That is, step S172 is further:

S172': in response to the bottom of the reference object being completely occluded by a fourth object, determining the bottom of the fourth object as the bottom of the reference object.

Accordingly, step S180 further includes:

S180': determining the intersection height information of the intersection position according to the reference height information and the bottom and the top of the reference object.

In one embodiment, as shown in Fig. 3a, assume that the bottom of the reference object, i.e. building L, is L1 and its top is L2, that the distance from L1 to the intersection position (i.e. the intersection O) is h_l1, and that the distance from L2 to the intersection position is h_l2. By image processing, the ratio of the number of pixels between the intersection O and L2 to the number of pixels between O and L1 can be obtained; this pixel ratio is the length ratio of h_l2 to h_l1. Assuming the pixel ratio is k, then:

h_l2 / h_l1 = k;    (1)

Assume that the total height of the reference building L obtained from the reference height information is h_l; then:

h_l1 + h_l2 = h_l;    (2)

Combining with formula (1):

h_l1 + k × h_l1 = h_l;    (3)

so that h_l1 = h_l / (1 + k) can be calculated, i.e. the intersection height information of the intersection position is determined.
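
For illustration, the calculation of formulas (1) to (3) can be written as a short Python function; this is a sketch under the assumptions above (the pixel ratio k measured in the image and the total reference height h_l known from a database), and the function name is chosen here, not taken from the patent:

```python
def intersection_height_on_reference(h_l: float, k: float) -> float:
    """Height h_l1 of the intersection O above the bottom L1 of the reference object.

    h_l: total height of the reference object (e.g. from a database), in meters
    k:   pixel ratio of the O-to-top segment over the O-to-bottom segment,
         i.e. k = h_l2 / h_l1 as measured in the captured image
    Solves h_l1 + k * h_l1 = h_l, which is formula (3).
    """
    return h_l / (1.0 + k)

# Example: a 405 m reference tower with the intersection splitting it so that the
# part above O is twice as long (in pixels) as the part below O (k = 2).
print(intersection_height_on_reference(h_l=405.0, k=2.0))  # 135.0 m
```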

In another embodiment, as shown in Fig. 4a, the ratio of the number of pixels between the intersection O and L2 to the number of pixels between L2 and L1 can be obtained by image processing; this pixel ratio is the length ratio of h_x to h_l. Assuming the pixel ratio is k', then:

h_x / h_l = k';    (4)

so that h_x can be calculated and added to h_l to obtain the intersection height information of the intersection position.
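
Similarly, when the intersection lies on the extended image above the top of the reference object, formula (4) yields the additional height h_x; a minimal sketch under the same assumptions, with illustrative names:

```python
def intersection_height_on_extension(h_l: float, k_prime: float) -> float:
    """Height of the intersection O above the ground when O lies above the top L2.

    h_l:     total height of the reference object, in meters
    k_prime: pixel ratio of the O-to-L2 segment over the L2-to-L1 segment,
             i.e. k' = h_x / h_l as measured in the captured image
    The intersection height is h_l + h_x = h_l * (1 + k').
    """
    h_x = k_prime * h_l          # formula (4)
    return h_l + h_x

# Example: the intersection sits 0.3 reference heights above the top of a 40 m building.
print(intersection_height_on_extension(h_l=40.0, k_prime=0.3))  # 52.0 m
```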

S200: determining a reference horizontal distance from the reference object to the first position according to the first position information and reference position information of the reference object.

The position information of the reference object, i.e. the reference position information, can be obtained by querying a corresponding database; combined with the horizontal coordinate information in the first position information, the distance from the reference object to the first position, i.e. the reference horizontal distance, can then be calculated. For example, if the reference object is the Central TV Tower, its latitude and longitude information can be obtained by querying the Baidu Maps backend database.
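
As an illustration, given the latitude/longitude of the first position and of the reference object, the reference horizontal distance can be approximated with the great-circle (haversine) formula. The patent does not prescribe a particular distance formula, so the following Python sketch is only one reasonable choice:

```python
import math

def reference_horizontal_distance(lat1: float, lon1: float,
                                  lat2: float, lon2: float) -> float:
    """Approximate horizontal distance in meters between two latitude/longitude points.

    Uses the haversine formula on a spherical Earth of mean radius ~6371 km,
    which is adequate for the distances considered here (a few kilometers).
    """
    r_earth = 6371000.0  # meters
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * r_earth * math.asin(math.sqrt(a))

# Example (coordinates are made up): camera position vs. a reference tower.
d_L = reference_horizontal_distance(39.9123, 116.2456, 39.9204, 116.2550)
print(round(d_L, 1), "m")
```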

S220: determining a target horizontal distance from the target object to the first position according to the reference horizontal distance, the intersection height information and the first position information.

In one embodiment, with reference to Fig. 3a, the reference horizontal distance may correspond to the distance between the bottom A1 of building A and the bottom L1 of building L, denoted d_L; the intersection height information may be the intersection height value h_l1; the height value of the first position can be obtained from the first position information and is denoted h, the first position being A2; and the target horizontal distance is denoted d_B. Triangle B-O-L1 and triangle B-A1-A2 are then similar triangles, so that:

(d_B − d_L) / d_B = h_l1 / h;    (5)

from which the target horizontal distance d_B can be calculated.

In another embodiment, with reference to Fig. 4b, triangle B-O-L1 and triangle B-A1-A2 can likewise be shown to be similar triangles, so that:

(d_B − d_L) / d_B = (h_l + h_x) / h;    (6)

from which the target horizontal distance d_B can be calculated.
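
Formulas (5) and (6) have the same form, (d_B − d_L) / d_B = h_int / h, where h_int is the height of the intersection above the ground (h_l1 in Fig. 3a, h_l + h_x in Fig. 4a). Solving for d_B gives d_B = d_L × h / (h − h_int); the Python sketch below uses names chosen here for illustration:

```python
def target_horizontal_distance(d_L: float, h: float, h_int: float) -> float:
    """Horizontal distance d_B from the target object to the first position.

    d_L:   reference horizontal distance from the reference object to the first position
    h:     height of the first position above the ground plane
    h_int: intersection height (h_l1 for formula (5), h_l + h_x for formula (6))
    Rearranges (d_B - d_L) / d_B = h_int / h into d_B = d_L * h / (h - h_int).
    """
    if h <= h_int:
        raise ValueError("the first position must be higher than the intersection point")
    return d_L * h / (h - h_int)

# Example: camera 50 m up, reference object 1200 m away, intersection 35 m above ground.
print(round(target_horizontal_distance(d_L=1200.0, h=50.0, h_int=35.0), 1))  # 4000.0 m
```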

In addition, it should be noted that if the ground plane at the first position, the plane where the reference object is located and the plane where the target object is located are not in the same plane, a corresponding height correction needs to be applied first, after which processing continues according to the method described, in order to reduce the error. For example, if the reference object stands on a hillside, the intersection height information needs to be corrected.

Furthermore, since the surface of the Earth is approximately spherical, when the horizontal distance between the first position and the reference building (or the target building) is large (for example, more than 11 km), the bottom of the reference building (or the target building) may not be in the same plane as the first position, and the height difference between the two planes may reach about 10 meters. In this case, the height value of the first position (or of the intersection position) may need to be corrected before processing according to the method described herein, in order to reduce the error of the target horizontal distance.
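
The 10-meter figure is consistent with the usual first-order estimate of the curvature drop over a chord of length d on a sphere of radius R (R ≈ 6371 km is assumed here; the patent does not state it):

```latex
\Delta h \approx \frac{d^{2}}{2R}, \qquad
\Delta h \approx \frac{(11\,\mathrm{km})^{2}}{2 \times 6371\,\mathrm{km}} \approx 9.5\,\mathrm{m}.
```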

In addition, an embodiment of the present application also provides a computer-readable medium comprising computer-readable instructions that, when executed, perform the following operations: performing the operations of steps S120, S140 and S160 of the method in the embodiment shown in Fig. 1 above.

In summary, the method of the embodiments of the present application uses the first position information of the first position where the imaging device is located and information about a reference object in the captured image, and obtains the horizontal distance between the first position and a target object in the captured image based on the proportional relationship of similar triangles. It thus provides a way to determine the depth of a target object in an image and facilitates further segmentation of different objects in a distant-view image.

Fig. 6 is a schematic diagram of the module structure of the distance determination device according to an embodiment of the present invention. The distance determination device may be provided as a functional module in an imaging device such as a smartphone or an SLR camera for use by a user, or of course may be provided as a standalone device. As shown in Fig. 6, the device 600 may include:

a first position acquisition module 610, configured to acquire first position information of a first position where an imaging device is located;

an object determination module 620, configured to determine a reference object and a target object in an image captured by the imaging device at the first position;

an intersection position determination module 630, configured to determine an intersection position on the reference object or on an extended image of the reference object according to the first position information and the target object;

an intersection height determination module 640, configured to determine intersection height information of the intersection position;

a reference horizontal distance determination module 650, configured to determine a reference horizontal distance from the reference object to the first position according to the first position information and reference position information of the reference object;

a target horizontal distance determination module 660, configured to determine a target horizontal distance from the target object to the first position according to the reference horizontal distance, the intersection height information and the first position information.

The device of the embodiments of the present application uses the first position information of the first position where the imaging device is located and information about a reference object in the captured image, and obtains the horizontal distance between the first position and a target object in the captured image based on the proportional relationship of similar triangles. It thus provides a device for determining the depth of a target object in an image and facilitates further segmentation of different objects in a distant-view image.

The functions of the first position acquisition module 610, the object determination module 620, the intersection position determination module 630, the intersection height determination module 640, the reference horizontal distance determination module 650 and the target horizontal distance determination module 660 are described in detail below with reference to specific embodiments.

The first position acquisition module 610 is configured to acquire first position information of a first position where an imaging device is located.

The first position is the imaging position of the imaging device, i.e. the position from which the image is captured. The first position information may include height information and horizontal coordinate information of the first position. The height information may be the height of the first position above the ground plane, and the horizontal coordinate information may be latitude and longitude information. The first position acquisition module 610 may obtain the first position information, for example, by accessing GPS (Global Positioning System), the BeiDou system, or the like.

The object determination module 620 is configured to determine a reference object and a target object in an image captured by the imaging device at the first position.

The reference object may be a salient object, such as a landmark building. In other words, the reference object should have distinctive visual features so that the object determination module 620 can determine it, for example, by image recognition, and information about the reference object can be obtained by querying a relevant database. The information may include the height information, latitude and longitude information, and so on of the reference object.

The target object is the object to be measured, and can typically be determined according to a user's designation.

The intersection position determination module 630 is configured to determine an intersection position on the reference object or on an extended image of the reference object according to the first position information and the target object.

Referring to Fig. 7, in one embodiment, the device 600 further includes:

a target object part determination module 670, configured to determine the bottom of the target object.

The bottom of the target object can be determined, for example, by image recognition. In some application scenarios, the bottom of the target object may be completely occluded, i.e. the bottom of the target object cannot be recognized directly, for example because it is blocked by greenery. In this case, since the occluding object is close to the target object, the bottom of the occluding object may be taken as the bottom of the target object.

That is, in one embodiment, the target object part determination module 670 is configured to, in response to the bottom of the target object being completely occluded by a third object, determine the bottom of the third object as the bottom of the target object.

Accordingly, in one embodiment, the intersection position determination module 630 is configured to determine an intersection position on the reference object or on an extended image of the reference object according to the first position information and the bottom of the target object, wherein the first position, the bottom of the target object and the intersection position lie on the same straight line.

As described in the above embodiments, the positional relationship among the first position, the target object and the reference object may be as shown in Fig. 3a and 3b or Fig. 4a and 4b, and is not repeated here.

The intersection height determination module 640 is configured to determine intersection height information of the intersection position.

The intersection height information may be the height of the intersection position above the ground plane.

Referring to Fig. 8, in one embodiment, the device 600 further includes:

a reference height acquisition module 680, configured to acquire reference height information of the reference object;

a reference object part determination module 690, configured to determine the bottom and the top of the reference object.

The reference height information is the height value of the reference object, and it can be obtained by querying a relevant database according to the recognition result of the reference object. For example, if the reference object is the Central TV Tower, its height value of 405 meters can be obtained by querying, for example, Baidu Baike.

The bottom and the top of the reference object can be determined, for example, by image recognition. In some application scenarios, the bottom of the reference object may be completely occluded, i.e. the bottom of the reference object cannot be recognized directly, for example because it is blocked by greenery. In this case, since the occluding object is close to the reference object, the bottom of the occluding object may be taken as the bottom of the reference object. That is, in one embodiment, the reference object part determination module 690 is configured to, in response to the bottom of the reference object being completely occluded by a fourth object, determine the bottom of the fourth object as the bottom of the reference object.

Accordingly, the intersection height determination module 640 is configured to determine the intersection height information of the intersection position according to the reference height information and the bottom and the top of the reference object.

In one embodiment, as shown in Fig. 3a, assume that the bottom of the reference object, i.e. building L, is L1 and its top is L2, that the distance from L1 to the intersection position (i.e. the intersection O) is h_l1, and that the distance from L2 to the intersection position is h_l2. By image processing, the ratio of the number of pixels between the intersection O and L2 to the number of pixels between O and L1 can be obtained; this pixel ratio is the length ratio of h_l2 to h_l1. From this pixel ratio and the total height h_l of the reference building L, the intersection height information can be obtained.

In another embodiment, as shown in Fig. 4a, the ratio of the number of pixels between the intersection O and L2 to the number of pixels between L2 and L1 can be obtained by image processing; this pixel ratio is the length ratio of h_x to h_l. From this length ratio and the total height h_l of the reference building L, the intersection height information can be obtained.

The reference horizontal distance determination module 650 is configured to determine a reference horizontal distance from the reference object to the first position according to the first position information and reference position information of the reference object.

The position information of the reference object, i.e. the reference position information, can be obtained by querying a corresponding database; combined with the horizontal coordinate information in the first position information, the distance from the reference object to the first position, i.e. the reference horizontal distance, can then be calculated. For example, if the reference object is the Central TV Tower, its latitude and longitude information can be obtained by querying the Baidu Maps backend database.

The target horizontal distance determination module 660 is configured to determine a target horizontal distance from the target object to the first position according to the reference horizontal distance, the intersection height information and the first position information.

In one embodiment, with reference to Fig. 3a, the reference horizontal distance may correspond to the distance between the bottom A1 of building A and the bottom L1 of building L, denoted d_L; the intersection height information may be the intersection height value h_l1; the height value of the first position can be obtained from the first position information and is denoted h, the first position being A2; and the target horizontal distance is denoted d_B. Triangle B-O-L1 and triangle B-A1-A2 are then similar triangles, and the target horizontal distance d_B can be obtained from the proportional relationship of similar triangles.

In another embodiment, with reference to Fig. 4b, triangle B-O-L1 and triangle B-A1-A2 can likewise be shown to be similar triangles, and the target horizontal distance d_B can then be calculated according to formula (6).

In addition, it should be noted that if the ground plane at the first position, the plane where the reference object is located and the plane where the target object is located are not in the same plane, a corresponding height correction needs to be applied first, after which processing continues according to the method described, in order to reduce the error. For example, if the reference object stands on a hillside, the intersection height information needs to be corrected.

Furthermore, since the surface of the Earth is approximately spherical, when the horizontal distance between the first position and the reference building (or the target building) is large (for example, more than 11 km), the bottom of the reference building (or the target building) may not be in the same plane as the first position, and the height difference between the two planes may reach about 10 meters. In this case, the height value of the first position (or of the intersection position) may need to be corrected before processing according to the method described herein, in order to reduce the error of the target horizontal distance.

An application scenario of the distance determination method and device of the embodiments of the present application may be as follows: a user stands in the Western Hills area of Beijing taking pictures with a handheld SLR camera. When the image of the Central TV Tower appears on the camera screen, the camera determines by image recognition that it is the Central TV Tower and queries a database for its information, then marks on the screen the user's horizontal distance to the tower; when the user taps an unknown building behind the tower on the screen, the distance from that unknown building to the user is likewise marked on the screen.

The hardware structure of the distance determination device according to an embodiment of the present application is shown in Fig. 9. The specific embodiments of the present application do not limit the concrete implementation of the distance determination device. Referring to Fig. 9, the device 900 may include:

a processor 910, a communications interface 920, a memory 930, and a communication bus 940, wherein:

the processor 910, the communications interface 920 and the memory 930 communicate with one another via the communication bus 940;

the communications interface 920 is used for communicating with other network elements;

the processor 910 is used for executing a program 932, and can specifically perform the relevant steps of the method embodiment shown in Fig. 1 above.

Specifically, the program 932 may include program code, and the program code includes computer operation instructions.

The processor 910 may be a central processing unit (CPU), or an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of the present application.

The memory 930 is used for storing the program 932. The memory 930 may comprise high-speed RAM memory, and may also comprise non-volatile memory, for example at least one disk memory. The program 932 can specifically perform the following steps:

acquiring first position information of a first position where an imaging device is located;

determining a reference object and a target object in an image captured by the imaging device at the first position;

determining an intersection position on the reference object according to the first position information and the target object;

determining intersection height information of the intersection position;

determining a reference horizontal distance from the reference object to the first position according to the first position information and reference position information of the reference object;

determining a target horizontal distance from the target object to the first position according to the reference horizontal distance, the intersection height information and the first position information.

For the specific implementation of each step in the program 932, reference may be made to the corresponding steps or modules in the above embodiments, which are not repeated here. Those skilled in the art can clearly understand that, for convenience and brevity of description, reference may be made to the corresponding processes in the preceding method embodiments for the specific working processes of the devices and modules described above, which are likewise not repeated here.

Those of ordinary skill in the art will appreciate that the units and method steps of the examples described in connection with the embodiments disclosed herein can be implemented in electronic hardware, or in a combination of computer software and electronic hardware. Whether these functions are performed in hardware or software depends on the specific application and the design constraints of the technical solution. Skilled artisans may use different methods for each specific application to implement the described functions, but such implementations should not be considered beyond the scope of the present application.

If the functions are implemented in the form of software functional units and sold or used as standalone products, they may be stored in a computer-readable storage medium. Based on this understanding, the part of the technical solution of the present application that contributes to the prior art, or part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a controller, a network device, or the like) to perform all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disc.

The above embodiments are merely intended to illustrate the present application and are not a limitation of it. Those of ordinary skill in the relevant technical field may also make various changes and modifications without departing from the spirit and scope of the present application, so all equivalent technical solutions also fall within the scope of the present application, and the scope of patent protection of the present application should be defined by the claims.

Claims (14)

1. A distance determination method, characterized in that the method comprises:
acquiring first position information of a first position where an imaging device is located;
determining a reference object and a target object in an image captured by the imaging device at the first position;
determining the bottom of the target object;
determining an intersection position on the reference object or on an extended image of the reference object according to the first position information and the bottom of the target object, wherein the first position, the bottom of the target object and the intersection position lie on the same straight line;
determining intersection height information of the intersection position;
determining a reference horizontal distance from the reference object to the first position according to the first position information and reference position information of the reference object;
determining a target horizontal distance from the target object to the first position according to the reference horizontal distance, the intersection height information and the first position information;
wherein the target horizontal distance is greater than the reference horizontal distance.
2. The method according to claim 1, characterized in that the first position information comprises: height information and horizontal coordinate information of the first position.
3. The method according to claim 1 or 2, characterized in that the reference object is determined in the image by image recognition.
4. The method according to claim 1, characterized in that determining the bottom of the target object comprises:
in response to the bottom of the target object being completely occluded by a third object, determining the bottom of the third object as the bottom of the target object.
5. The method according to claim 1, characterized in that the method further comprises:
acquiring reference height information of the reference object;
determining the bottom and the top of the reference object.
6. The method according to claim 5, characterized in that determining the intersection height information of the intersection position comprises:
determining the intersection height information of the intersection position according to the reference height information and the bottom and the top of the reference object.
7. The method according to claim 5, characterized in that determining the bottom of the reference object comprises:
in response to the bottom of the reference object being completely occluded by a fourth object, determining the bottom of the fourth object as the bottom of the reference object.
8. A distance determination device, characterized in that the device comprises:
a first position acquisition module, configured to acquire first position information of a first position where an imaging device is located;
an object determination module, configured to determine a reference object and a target object in an image captured by the imaging device at the first position;
a target object part determination module, configured to determine the bottom of the target object;
an intersection position determination module, configured to determine an intersection position on the reference object or on an extended image of the reference object according to the first position information and the bottom of the target object, wherein the first position, the bottom of the target object and the intersection position lie on the same straight line;
an intersection height determination module, configured to determine intersection height information of the intersection position;
a reference horizontal distance determination module, configured to determine a reference horizontal distance from the reference object to the first position according to the first position information and reference position information of the reference object;
a target horizontal distance determination module, configured to determine a target horizontal distance from the target object to the first position according to the reference horizontal distance, the intersection height information and the first position information;
wherein the target horizontal distance is greater than the reference horizontal distance.
9. The device according to claim 8, characterized in that the object determination module is configured to determine the reference object in the image by image recognition.
10. The device according to claim 8, characterized in that the target object part determination module is configured to, in response to the bottom of the target object being completely occluded by a third object, determine the bottom of the third object as the bottom of the target object.
11. The device according to any one of claims 8 to 10, characterized in that the device further comprises:
a reference height acquisition module, configured to acquire reference height information of the reference object;
a reference object part determination module, configured to determine the bottom and the top of the reference object.
12. The device according to claim 11, characterized in that the intersection height determination module is configured to determine the intersection height information of the intersection position according to the reference height information and the bottom and the top of the reference object.
13. The device according to claim 11, characterized in that the reference object part determination module is configured to, in response to the bottom of the reference object being completely occluded by a fourth object, determine the bottom of the fourth object as the bottom of the reference object.
14. An imaging device, characterized in that the imaging device comprises the distance determination device according to any one of claims 8 to 13.
CN201410804150.8A 2014-12-19 2014-12-19 Distance determination method and device CN104539926B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410804150.8A CN104539926B (en) 2014-12-19 2014-12-19 Distance determination method and device

Publications (2)

Publication Number Publication Date
CN104539926A CN104539926A (en) 2015-04-22
CN104539926B 2016-10-26

Family

ID=52855385

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410804150.8A CN104539926B (en) 2014-12-19 2014-12-19 Distance determination method and device

Country Status (1)

Country Link
CN (1) CN104539926B (en)

Also Published As

Publication number Publication date
CN104539926A (en) 2015-04-22

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant