CN105606086A - Positioning method and terminal - Google Patents

Positioning method and terminal

Info

Publication number
CN105606086A
Authority
CN
China
Prior art keywords
object content
picture
content
filming apparatus
terminal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201510542616.6A
Other languages
Chinese (zh)
Inventor
刘东声
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yulong Computer Telecommunication Scientific Shenzhen Co Ltd
Original Assignee
Yulong Computer Telecommunication Scientific Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yulong Computer Telecommunication Scientific Shenzhen Co Ltd filed Critical Yulong Computer Telecommunication Scientific Shenzhen Co Ltd
Priority to CN201510542616.6A
Publication of CN105606086A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

An embodiment of the invention provides a positioning method. The method includes: finding, according to the target content of a target picture, a reference picture containing the target content in a preset picture group, where the pictures in the preset picture group contain content whose locations are known; determining the location of the target content according to the reference picture; acquiring the relative location of the target content with respect to a shooting apparatus at the time the target picture was shot; and determining the location of the shooting apparatus according to the location of the target content and the relative location. Correspondingly, the invention also provides a terminal. According to the invention, the location of the target content in the target picture is determined from the preset picture group, the location of the shooting apparatus is then determined from the relative location of the target content with respect to the shooting apparatus at the time of shooting, and the location of the photographing user can thus be determined, improving positioning accuracy.

Description

Positioning method and terminal
Technical field
The present invention relates to the field of positioning technology, and in particular to a positioning method and a terminal.
Background art
Two main positioning technologies exist in the prior art: GPS positioning and network positioning.
GPS positioning only works within the coverage area of satellite signals, and its accuracy is often limited, typically to a range of tens of meters. In particular, where buildings block the signal severely, the positioning result often deviates.
Network positioning is another widely used positioning method; it obtains positioning results over the network from the server of a location service provider. Network positioning is mainly implemented as follows: the terminal obtains a network identifier, such as a cell identifier (Cell ID) and/or the basic service set identifier (Basic Service Set Identification, BSSID) of a Wi-Fi access point (Access Point, AP), and queries a network server for the geographical position to which that identifier is mapped, thereby achieving coarse positioning. However, the accuracy of network positioning is not ideal either, typically on the order of a hundred meters.
Summary of the invention
The embodiments of the present invention provide a positioning method and a terminal. The position of target content in a target picture is determined from a preset picture group, the position of the shooting apparatus is then determined from the relative position of the target content with respect to the shooting apparatus at the time the picture was taken, and the position of the photographing user can thus be determined, improving positioning accuracy.
A first aspect of the embodiments of the present invention provides a positioning method, comprising:
finding, according to target content in a target picture, a reference picture that contains the target content in a preset picture group, where the pictures in the preset picture group contain content whose positions are known;
determining the position of the target content according to the found reference picture;
obtaining the relative position of the target content with respect to a shooting apparatus at the time the target picture was taken;
determining the position of the shooting apparatus according to the position of the target content and the relative position.
In conjunction with the first aspect, in a first possible implementation, the relative position comprises direction information and distance information, and determining the position of the shooting apparatus according to the position of the target content and the relative position comprises:
if the position of the target content is an absolute geographical position, calculating the absolute geographical position of the shooting apparatus from the position of the target content, the direction information and the distance information; or,
if the position of the target content is a relative geographical position, taking the position of the target content as a reference point and expressing the position of the shooting apparatus by the direction information and the distance information.
In conjunction with the first aspect, in a second possible implementation, the preset picture group is stored on a remote server, and before the target picture is compared with the preset picture group the method further comprises: establishing a communication connection with the remote server; and uploading the target picture to the remote server over the communication connection, so as to trigger the remote server to perform the step of comparing the target picture with the preset picture group.
In conjunction with the first aspect, in a third possible implementation, finding, according to the target content in the target picture, the reference picture that contains the target content in the preset picture group comprises:
identifying the content contained in each picture in the preset picture group;
judging whether any picture contains content corresponding to the same shooting object as the target content, and if so, determining that picture to be the reference picture.
In conjunction with the first aspect, in a fourth possible implementation, the method further comprises: after the target picture is taken, marking the position of the shooting apparatus on a target map according to the determined position of the shooting apparatus.
A second aspect of the embodiments of the present invention provides a terminal, comprising:
a searching unit, configured to find, according to target content in a target picture, a reference picture that contains the target content in a preset picture group, where the pictures in the preset picture group contain content whose positions are known;
a first determining unit, configured to determine the position of the target content according to the found reference picture;
an acquiring unit, configured to obtain the relative position of the target content with respect to a shooting apparatus at the time the target picture was taken;
a second determining unit, configured to determine the position of the shooting apparatus according to the position of the target content and the relative position.
In conjunction with the second aspect, in a first possible implementation, the relative position comprises direction information and distance information, and the second determining unit is specifically configured to:
if the position of the target content is an absolute geographical position, calculate the absolute geographical position of the shooting apparatus from the position of the target content, the direction information and the distance information; or,
if the position of the target content is a relative geographical position, take the position of the target content as a reference point and express the position of the shooting apparatus by the direction information and the distance information.
In conjunction with the second aspect, in a second possible implementation, the preset picture group is stored on a remote server, and the terminal further comprises a connecting unit and an uploading unit, wherein:
the connecting unit is configured to establish a communication connection with the remote server;
the uploading unit is configured to upload the target picture to the remote server over the communication connection, so as to trigger the remote server to perform the step of finding, according to the target content in the target picture, the reference picture that contains the target content in the preset picture group.
In conjunction with the second aspect, in a third possible implementation, the searching unit comprises a recognition unit and a judging unit, wherein:
the recognition unit is configured to identify the content contained in each picture in the preset picture group;
the judging unit is configured to judge whether any picture contains content corresponding to the same shooting object as the target content, and if so, to determine that picture to be the reference picture.
In conjunction with the second aspect, in a fourth possible implementation, the terminal further comprises: a display unit, configured to mark, after the target picture is taken, the position of the shooting apparatus on a target map according to the position of the shooting apparatus determined by the second determining unit.
By implementing the embodiments of the present invention, the position of the target content in the target picture is determined from the preset picture group, the position of the shooting apparatus is then determined from the relative position of the target content with respect to the shooting apparatus at the time the picture was taken, and the position of the photographing user can thus be determined, improving positioning accuracy.
Brief description of the drawings
To describe the technical solutions in the embodiments of the present invention more clearly, the accompanying drawings required for describing the embodiments are briefly introduced below. The drawings described below show only some embodiments of the present invention; persons of ordinary skill in the art may derive other drawings from them without creative effort.
Fig. 1 is a flowchart of a first embodiment of the positioning method provided by an embodiment of the present invention;
Fig. 2 is a schematic diagram of the relative positional relationship between a dual camera and a shooting object, provided by an embodiment of the present invention;
Fig. 3 is a flowchart of a second embodiment of the positioning method provided by an embodiment of the present invention;
Fig. 4 is a schematic structural diagram of a first embodiment of the terminal provided by an embodiment of the present invention;
Fig. 5 is a schematic structural diagram of a second embodiment of the terminal provided by an embodiment of the present invention;
Fig. 6 is a schematic structural diagram of a third embodiment of the terminal provided by an embodiment of the present invention.
Detailed description of the invention
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. The described embodiments are only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
Referring to Fig. 1, which is a flowchart of a first embodiment of the positioning method provided by an embodiment of the present invention. In the embodiment of Fig. 1, the terminal determines the position of the target content in the target picture from a preset picture group, then determines the position of the shooting apparatus from the relative position of the target content with respect to the shooting apparatus at the time the picture was taken, and thereby determines the position of the photographing user, improving positioning accuracy. As shown in Fig. 1, the method comprises:
Step S101: according to target content in a target picture, find a reference picture that contains the target content in a preset picture group; the pictures in the preset picture group contain content whose positions are known.
In the embodiments of the present invention, the target picture may be a picture taken by the terminal, and the target content may be a shooting object of the terminal, for example a scenic spot or a building. Some or all of the pictures in the preset picture group may contain content whose positions are known. For example, the preset picture group is a set of 100 scenic-spot photographs, where the position of the scenic spot in each photograph is known.
In the embodiments of the present invention, the reference picture and the target picture contain the same shooting object, namely the target content. The shooting angles of this shooting object in the reference picture and in the target picture may differ. For example, the reference picture contains "Tiananmen, Beijing" shot from the front, while the target picture contains "Tiananmen, Beijing" shot from the side.
Specifically, the terminal may identify the content contained in each picture in the preset picture group, judge whether any picture contains content corresponding to the same shooting object as the target content, and if so, determine that picture to be the reference picture.
In one implementation, if the shooting objects corresponding to the pictures in the preset picture group are known, the terminal may first identify the shooting object corresponding to the target content, then search the preset picture group for pictures containing that shooting object, and determine such a picture to be the reference picture.
In another implementation, the terminal may first identify the shooting object corresponding to the target content, then obtain pictures of that shooting object from various angles over the network, match those pictures against the photographs in the preset picture group, and take the matching picture as the reference picture.
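The patent does not prescribe a particular recognition or matching algorithm for this step. Purely as an illustrative sketch (the feature detector, thresholds and function names below are assumptions, not part of the disclosure), local-feature matching such as ORB could be used to decide which picture in the preset picture group contains the same shooting object as the target picture:

```python
# Illustrative sketch only: one possible way to pick the reference picture that
# contains the same shooting object as the target picture. ORB matching and the
# thresholds below are assumptions; the patent does not specify an algorithm.
import cv2

MIN_GOOD_MATCHES = 40   # assumed threshold for declaring "same shooting object"
MAX_HAMMING_DIST = 50   # assumed per-match descriptor distance cutoff

def find_reference_picture(target_img, preset_group):
    """preset_group: iterable of (picture, known_position) pairs, images as
    greyscale arrays. Returns (reference_picture, known_position) or None."""
    orb = cv2.ORB_create()
    _, target_des = orb.detectAndCompute(target_img, None)
    if target_des is None:
        return None
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    best = None
    for picture, known_position in preset_group:
        _, des = orb.detectAndCompute(picture, None)
        if des is None:
            continue
        good = [m for m in matcher.match(target_des, des)
                if m.distance < MAX_HAMMING_DIST]
        if len(good) >= MIN_GOOD_MATCHES and (best is None or len(good) > best[0]):
            best = (len(good), picture, known_position)
    return None if best is None else (best[1], best[2])
```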
It should be noted that the preset picture group may be stored locally on the terminal, in an external memory of the terminal, or on a remote server; no restriction is imposed here.
It should also be noted that a single photograph in the preset picture group may correspond to multiple pieces of content whose positions are known; in the preset picture group, the position corresponding to each such piece of content in each picture is known.
Step S103: determine the position of the target content according to the found reference picture.
As noted in step S101, the reference picture and the target picture contain the same shooting object, namely the target content. Therefore, the terminal may take the known position corresponding to the target content in the reference picture as the position of the target content.
It should be noted that the position of the target content may be expressed as a relative geographical position or as an absolute geographical position.
Step S105: obtain the relative position of the target content with respect to the shooting apparatus at the time the target picture was taken.
The relative position referred to in the embodiments of the present invention may be expressed by a direction (quantified as an angle) and a distance.
In the embodiments of the present invention, the shooting apparatus may be a dual camera. As is known from the prior art, when taking a picture with a dual camera, the terminal can obtain the relative position of the shooting object with respect to the shooting apparatus of the terminal.
In a specific implementation, the terminal may use a "triangle ranging" algorithm to quickly calculate the distance and direction of the shooting object with respect to the dual camera, based on the different angles and distances of the shooting object with respect to the two cameras and the fixed distance between the two cameras.
Referring to Fig. 2, which shows the relative positional relationship between a dual camera and a shooting object. A and B correspond to the two cameras, and S is the shooting object. In the prior art, the terminal can obtain, when shooting, the distances between the shooting object and the two cameras, L1 and L2, and the directions of the shooting object with respect to the two cameras (expressed as angles), angle a and angle b. The distance between cameras A and B is a fixed value.
In Fig. 2, the relative position of the shooting object S with respect to the camera (at point C) can be expressed by angle c and distance L3.
As follows from the foregoing, angle c and distance L3 can be calculated from the distances L1 and L2, the direction angles a and b, and the fixed distance between cameras A and B. The specific formulas follow the related "triangle ranging" techniques and are not repeated here.
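For concreteness only, the sketch below works through one such "triangle ranging" computation. It assumes that cameras A and B lie on a straight baseline of known length d, that the reference point C is the midpoint of AB, and that angle c is measured from the baseline; the angle at A is recovered here from L1, L2 and d by the law of cosines, although the measured angles a and b could be used directly instead. None of these conventions is fixed by the patent.

```python
# Illustrative sketch of a "triangle ranging" computation, not the patented
# algorithm itself. Assumptions (not stated in the patent): A and B lie on a
# common baseline of length d, C is the midpoint of AB, and angle c is the
# angle between the baseline and the line CS.
import math

def relative_position(L1, L2, d):
    """L1 = |AS|, L2 = |BS|, d = fixed distance |AB|.
    Returns (L3, c): distance |CS| and angle at C, in radians."""
    # Angle at A in triangle ABS, from the law of cosines.
    cos_a = (L1 ** 2 + d ** 2 - L2 ** 2) / (2.0 * L1 * d)
    a = math.acos(max(-1.0, min(1.0, cos_a)))
    # Coordinates of S with A at the origin and B at (d, 0).
    sx, sy = L1 * math.cos(a), L1 * math.sin(a)
    cx = d / 2.0                   # C is assumed to sit midway between A and B
    L3 = math.hypot(sx - cx, sy)   # distance from C to the shooting object
    c = math.atan2(sy, sx - cx)    # direction of S as seen from C
    return L3, c
```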
It should be noted that, after the target picture is taken, the terminal may save the relative position of the target content with respect to the shooting apparatus in the attributes of the target picture. In practical applications, the terminal may also save the relative position in other forms; no restriction is imposed here.
It should also be noted that, besides using a dual camera, the terminal may obtain the relative position of the target content with respect to the shooting apparatus by other means, for example a single camera assisted by a ranging component (such as a laser or infrared ranging component); no restriction is imposed here.
Step S107: determine the position of the shooting apparatus according to the position of the target content and the relative position.
In one implementation, if the position of the target content obtained in step S103 is an absolute geographical position, the terminal may calculate the absolute geographical position of the shooting apparatus from the position of the target content and the relative position (direction information and distance information).
Taking Fig. 2 as an example, the target content corresponds to S in Fig. 2 and the shooting apparatus corresponds to C in Fig. 2. Suppose the position coordinates of the target content (corresponding to S in Fig. 2) determined in step S103 are (x1, y1). Since the relative position of S with respect to C, angle c and distance L3, has been obtained in step S105, the terminal can calculate the position coordinates of C from (x1, y1), angle c and distance L3 as (x1 - L3*sin c, y1 - L3*cos c).
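Written as a short sketch under the convention implied by that formula (angle c is treated as the bearing from the shooting apparatus to the target content, so the target lies at the camera position plus L3*(sin c, cos c); the coordinate values in the example call are made up):

```python
# Minimal sketch of step S107 for the absolute-position case, under the
# convention implied by the formula above; the sample coordinates are invented.
import math

def camera_position(target_xy, L3, c):
    """target_xy: absolute coordinates (x1, y1) of the target content.
    L3, c: distance and bearing of the target relative to the camera."""
    x1, y1 = target_xy
    return (x1 - L3 * math.sin(c), y1 - L3 * math.cos(c))

# Example: target at (100.0, 200.0), 20 m away at a bearing of 45 degrees.
print(camera_position((100.0, 200.0), 20.0, math.radians(45)))
```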
This example is only one implementation of the embodiments of the present invention; practical applications may differ, and no restriction is imposed here.
In another implementation, if the position of the target content obtained in step S103 is a relative geographical position, the terminal may take the position of the target content as a reference point and express the position of the shooting apparatus by the relative position (direction information and distance information).
For example, the target content is the famous scenic spot "Tiananmen, Beijing", the direction information in the relative position is "southeast", and the distance information in the relative position is 20 meters. The terminal may then express the position of the shooting apparatus as: 20 meters to the southeast of "Tiananmen, Beijing".
This example is only one implementation of the embodiments of the present invention; practical applications may differ, and no restriction is imposed here.
Further, to better apply the positioning method provided by the present invention, the terminal may also, after the target picture is taken, mark the position of the shooting apparatus determined in step S107 on a target map (such as a Baidu map), so that the user knows his or her own position more intuitively, improving the user experience.
By implementing the embodiments of the present invention, the position of the target content in the target picture is determined from the preset picture group, the position of the shooting apparatus is then determined from the relative position of the target content with respect to the shooting apparatus at the time the picture was taken, and the position of the photographing user can thus be determined, improving positioning accuracy.
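Putting the first embodiment together, a hypothetical end-to-end flow might look like the sketch below; it simply chains the illustrative helpers from the earlier sketches and is not code defined by the patent:

```python
# Hypothetical glue code for steps S101-S107, reusing the illustrative helpers
# find_reference_picture, relative_position and camera_position defined above.
def locate_shooting_apparatus(target_img, preset_group, L1, L2, baseline_d):
    hit = find_reference_picture(target_img, preset_group)   # step S101
    if hit is None:
        return None
    _, target_position = hit                                  # step S103
    L3, c = relative_position(L1, L2, baseline_d)             # step S105
    return camera_position(target_position, L3, c)            # step S107
```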
Referring to Fig. 3, which is a flowchart of a second embodiment of the positioning method provided by an embodiment of the present invention. In the embodiment of Fig. 3, the preset picture group used for locating picture content is stored on a remote server. After taking the target picture, the terminal uploads it to the remote server so as to trigger the remote server to determine the position of the target content in the target picture from the preset picture group; the position of the shooting apparatus is then determined from the relative position of the target content with respect to the shooting apparatus. In this way the powerful computational resources of the remote server are used to complete the process of determining the position of the target content, improving positioning efficiency. The embodiment of Fig. 3 is a further improvement of the embodiment of Fig. 1; for content not mentioned here, refer to the embodiment of Fig. 1. As shown in Fig. 3, the method comprises:
Step S201: the terminal establishes a communication connection with the remote server; the preset picture group is stored on the remote server, and the pictures in the preset picture group contain content whose positions are known.
In the embodiments of the present invention, the target picture may be a picture taken by the terminal, and the target content may be a shooting object of the terminal, for example a scenic spot or a building. Some or all of the pictures in the preset picture group may contain content whose positions are known. For example, the preset picture group is a set of 100 scenic-spot photographs, where the position of the scenic spot in each photograph is known.
Step S203: the terminal uploads the target picture to the server over the communication connection.
Specifically, after taking the target picture, the terminal may send it to the server so as to trigger the server to determine the position of the target content in the target picture from the preset picture group; the detailed process is given in steps S205 and S207 below.
Step S205: the server finds, according to the target content in the target picture, a reference picture that contains the target content in the preset picture group.
Specifically, in the embodiments of the present invention, the reference picture and the target picture contain the same shooting object, namely the target content. The shooting angles of this shooting object in the reference picture and in the target picture may differ. For example, the reference picture contains "Tiananmen, Beijing" shot from the front, while the target picture contains "Tiananmen, Beijing" shot from the side.
Specifically, the server may identify the content contained in each picture in the preset picture group, judge whether any picture contains content corresponding to the same shooting object as the target content, and if so, determine that picture to be the reference picture.
Step S207: the server determines the position of the target content according to the found reference picture.
In the embodiments of the present invention, the server may take the known position corresponding to the target content in the reference picture as the position of the target content.
It should be noted that the position of the target content may be expressed as a relative geographical position or as an absolute geographical position.
Step S209: the server sends the determined position of the target content to the terminal.
Specifically, having determined the position of the target content, the server may send that position to the terminal, so that the terminal, using the position of the target content combined with the relative position of the target content with respect to the shooting apparatus, can work back to the position of the shooting apparatus; the detailed process is given in steps S211 and S213 below.
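As an illustration of the exchange in steps S203 to S209 only, the sketch below uploads the target picture and reads back the position of the target content; the endpoint URL, form field name and response format are hypothetical, since the patent does not specify a transport protocol:

```python
# Illustrative sketch of steps S203-S209. The endpoint URL, form field name and
# response JSON layout are hypothetical; the patent only requires that the
# target picture is uploaded and the position of the target content returned.
import requests

SERVER_URL = "http://example.com/locate"   # hypothetical remote-server endpoint

def query_target_content_position(picture_path):
    with open(picture_path, "rb") as f:
        resp = requests.post(SERVER_URL, files={"target_picture": f}, timeout=10)
    resp.raise_for_status()
    data = resp.json()                      # e.g. {"x": 116.397, "y": 39.908}
    return data["x"], data["y"]
```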
Step S211: the terminal obtains the relative position of the target content with respect to the shooting apparatus at the time the target picture was taken.
Specifically, refer to step S105 in the embodiment of Fig. 1, which is not repeated here.
Step S213: the terminal determines the position of the shooting apparatus according to the position of the target content and the relative position.
Specifically, refer to step S107 in the embodiment of Fig. 1, which is not repeated here.
Further, to better apply the positioning method provided by the present invention, the terminal may also, after the target picture is taken, mark the position of the shooting apparatus determined in step S213 on a target map (such as a Baidu map), so that the user knows his or her own position more intuitively, improving the user experience.
By implementing the embodiments of the present invention, after taking the target picture the terminal uploads it to the remote server so as to trigger the remote server to determine the position of the target content in the target picture from the preset picture group; the position of the shooting apparatus is then determined from the relative position of the target content with respect to the shooting apparatus. The powerful computational resources of the remote server are thus used to complete the process of determining the position of the target content, improving positioning efficiency.
Referring to Fig. 4, which is a schematic structural diagram of a first embodiment of the terminal provided by an embodiment of the present invention. As shown in Fig. 4, the terminal 40 may comprise: a searching unit 401, a first determining unit 403, an acquiring unit 405 and a second determining unit 407, wherein:
the searching unit 401 is configured to find, according to target content in a target picture, a reference picture that contains the target content in a preset picture group, where the pictures in the preset picture group contain content whose positions are known;
the first determining unit 403 is configured to determine the position of the target content according to the found reference picture;
the acquiring unit 405 is configured to obtain the relative position of the target content with respect to a shooting apparatus at the time the target picture was taken;
the second determining unit 407 is configured to determine the position of the shooting apparatus according to the position of the target content and the relative position.
In the embodiments of the present invention, the target picture may be a picture taken by the terminal 40, and the target content may be a shooting object of the terminal 40, for example a scenic spot or a building. Some or all of the pictures in the preset picture group may contain content whose positions are known. For example, the preset picture group is a set of 100 scenic-spot photographs, where the position of the scenic spot in each photograph is known.
In the embodiments of the present invention, the reference picture and the target picture contain the same shooting object, namely the target content. The shooting angles of this shooting object in the reference picture and in the target picture may differ. For example, the reference picture contains "Tiananmen, Beijing" shot from the front, while the target picture contains "Tiananmen, Beijing" shot from the side.
In a specific implementation, the searching unit 401 may further comprise a recognition unit and a judging unit, wherein:
the recognition unit is configured to identify the content contained in each picture in the preset picture group;
the judging unit is configured to judge whether any picture contains content corresponding to the same shooting object as the target content, and if so, to determine that picture to be the reference picture.
Specifically, the first determining unit 403 may take the known position corresponding to the target content in the reference picture as the position of the target content.
The relative position referred to in the embodiments of the present invention may be expressed by a direction (quantified as an angle) and a distance.
In the embodiments of the present invention, the shooting apparatus may be a dual camera. In a specific implementation, the acquiring unit 405 may use a "triangle ranging" algorithm to quickly calculate the distance and direction of the shooting object with respect to the dual camera, based on the different angles and distances of the shooting object with respect to the two cameras and the fixed distance between the two cameras.
In one implementation, if the position of the target content determined by the first determining unit 403 is an absolute geographical position, the second determining unit 407 may calculate the absolute geographical position of the shooting apparatus from the position of the target content and the relative position (direction information and distance information).
Taking Fig. 2 as an example, the target content corresponds to S in Fig. 2 and the shooting apparatus corresponds to C in Fig. 2. Suppose the first determining unit 403 determines the position coordinates of the target content (corresponding to S in Fig. 2) to be (x1, y1), and the acquiring unit 405 has obtained the relative position of S with respect to C, angle c and distance L3. The second determining unit 407 can then calculate the position coordinates of C from (x1, y1), angle c and distance L3 as (x1 - L3*sin c, y1 - L3*cos c).
This example is only one implementation of the embodiments of the present invention; practical applications may differ, and no restriction is imposed here.
In another implementation, if the position of the target content determined by the first determining unit 403 is a relative geographical position, the second determining unit 407 may take the position of the target content as a reference point and express the position of the shooting apparatus by the relative position (direction information and distance information).
For example, the target content is the famous scenic spot "Tiananmen, Beijing", the direction information in the relative position is "southeast", and the distance information in the relative position is 20 meters. The terminal may then express the position of the shooting apparatus as: 20 meters to the southeast of "Tiananmen, Beijing".
This example is only one implementation of the embodiments of the present invention; practical applications may differ, and no restriction is imposed here.
Further, as shown in Fig. 5, in addition to the searching unit 401, the first determining unit 403, the acquiring unit 405 and the second determining unit 407, the terminal 40 may also comprise a display unit, configured to mark, after the target picture is taken, the position of the shooting apparatus on a target map according to the position of the shooting apparatus determined by the second determining unit, so that the user knows his or her own position more intuitively, improving the user experience.
In one implementation of the embodiments of the present invention, the preset picture group may be stored on a remote server. The terminal 40 may also comprise a connecting unit and an uploading unit, wherein:
the connecting unit is configured to establish a communication connection with the remote server;
the uploading unit is configured to upload the target picture to the remote server over the communication connection, so as to trigger the remote server to find, according to the target content in the target picture, the reference picture that contains the target content in the preset picture group and to determine the position of the target content according to the found reference picture; the powerful computational resources of the remote server can thus be used to complete the process of determining the position of the target content, improving positioning efficiency.
It will be understood that the functions of the functional modules in the terminal 40 may be implemented according to the methods in the embodiments of Fig. 1 or Fig. 3 above, and are not repeated here.
Referring to Fig. 6, which is a schematic structural diagram of a third embodiment of the terminal provided by the present invention. As shown in Fig. 6, the terminal 100 may comprise: a shooting apparatus 1006, an input apparatus 1003, an output apparatus 1004, a memory 1005 and a processor 1001 coupled to the memory 1005 (the number of processors 1001 in the terminal 100 may be one or more; one processor is taken as an example in Fig. 6). In some embodiments of the present invention, the shooting apparatus 1006, the input apparatus 1003, the output apparatus 1004, the memory 1005 and the processor 1001 may be connected by a bus or in another manner; connection by a bus is taken as an example in Fig. 6.
The shooting apparatus 1006 is configured to take pictures; in a specific implementation it may comprise a bionic dual camera, a single camera, or the like. The input apparatus 1003 is configured to receive externally input data; in a specific implementation it may comprise a keyboard, a mouse, a photoelectric input apparatus, a sound input apparatus, a touch input apparatus, a scanner, or the like. The output apparatus 1004 is configured to output data; in a specific implementation it may comprise a display, a loudspeaker, a printer, or the like. The memory 1005 is configured to store program code; in a specific implementation, the memory 1005 may be a read-only memory (Read Only Memory, ROM). The processor 1001, for example a CPU, is configured to call the program code stored in the memory 1005 to perform the following steps:
finding, according to target content in a target picture, a reference picture that contains the target content in a preset picture group, where the pictures in the preset picture group contain content whose positions are known;
determining the position of the target content according to the found reference picture;
obtaining the relative position of the target content with respect to a shooting apparatus at the time the target picture was taken;
determining the position of the shooting apparatus according to the position of the target content and the relative position.
In the embodiments of the present invention, the target picture may be a picture taken by the shooting apparatus 1006, and the target content may be a shooting object of the shooting apparatus 1006, for example a scenic spot or a building. Some or all of the pictures in the preset picture group may contain content whose positions are known. For example, the preset picture group is a set of 100 scenic-spot photographs, where the position of the scenic spot in each photograph is known.
In the embodiments of the present invention, the reference picture and the target picture contain the same shooting object, namely the target content. The shooting angles of this shooting object in the reference picture and in the target picture may differ. For example, the reference picture contains "Tiananmen, Beijing" shot from the front, while the target picture contains "Tiananmen, Beijing" shot from the side.
Specifically, the processor 1001 may identify the content contained in each picture in the preset picture group, judge whether any picture contains content corresponding to the same shooting object as the target content, and if so, determine that picture to be the reference picture.
Specifically, the processor 1001 may take the known position corresponding to the target content in the reference picture as the position of the target content.
In the embodiments of the present invention, the shooting apparatus 1006 may be a bionic dual camera. As is known from the prior art, the processor 1001 can obtain, when the target picture is taken with the bionic dual camera, the relative position of the target content with respect to the bionic dual camera.
It should be noted that, besides using a dual camera, the processor 1001 may obtain the relative position of the target content with respect to the shooting apparatus by other means, for example a single camera assisted by a ranging component (such as a laser or infrared ranging component); no restriction is imposed here.
In one implementation, if the position of the target content obtained by the processor 1001 is an absolute geographical position, the processor 1001 may calculate the absolute geographical position of the shooting apparatus from the position of the target content and the relative position (direction information and distance information).
Taking Fig. 2 as an example, the target content corresponds to S in Fig. 2 and the shooting apparatus corresponds to C in Fig. 2. Suppose the position coordinates of the target content (corresponding to S in Fig. 2) are (x1, y1). Since the processor 1001 has calculated the relative position of S with respect to C, angle c and distance L3, it can calculate the position coordinates of C from (x1, y1), angle c and distance L3 as (x1 - L3*sin c, y1 - L3*cos c).
This example is only one implementation of the embodiments of the present invention; practical applications may differ, and no restriction is imposed here.
In another implementation, if the position of the target content obtained by the processor 1001 is a relative geographical position, the processor 1001 may take the position of the target content as a reference point and express the position of the shooting apparatus by the relative position (direction information and distance information).
For example, the target content is the famous scenic spot "Tiananmen, Beijing", the direction information in the relative position is "southeast", and the distance information in the relative position is 20 meters. The processor 1001 may then express the position of the shooting apparatus, through the output apparatus 1004, as: 20 meters to the southeast of "Tiananmen, Beijing".
This example is only one implementation of the embodiments of the present invention; practical applications may differ, and no restriction is imposed here.
Further, to better apply the positioning method provided by the present invention, the processor 1001 may also, after the shooting apparatus 1006 takes the target picture, mark the position of the shooting apparatus determined as in step S107 on a target map (such as a Baidu map) through the output apparatus 1004, so that the user knows his or her own position more intuitively, improving the user experience.
In one implementation of the embodiments of the present invention, the preset picture group may be stored on a remote server. The processor 1001 may also establish a communication connection with the remote server and upload the target picture to the remote server over the communication connection, so as to trigger the remote server to find, according to the target content in the target picture, the reference picture that contains the target content in the preset picture group and to determine the position of the target content according to the found reference picture; the powerful computational resources of the remote server can thus be used to complete the process of determining the position of the target content, improving positioning efficiency.
It will be understood that the steps performed by the processor 1001 may also be implemented with reference to the content of the method embodiments of Fig. 1 or Fig. 3 above, and are not repeated here.
In summary, the position of the target content in the target picture is determined from the preset picture group, the position of the shooting apparatus is then determined from the relative position of the target content with respect to the shooting apparatus at the time the picture was taken, and the position of the photographing user can thus be determined, improving positioning accuracy.
The modules or sub-modules in all embodiments of the present invention may be implemented by a general-purpose integrated circuit, for example a CPU (Central Processing Unit), or by an ASIC (Application Specific Integrated Circuit).
The sequence of steps of the methods of the embodiments of the present invention may be adjusted, combined or deleted according to actual needs. The modules of the terminal of the embodiments of the present invention may be combined, further divided or deleted according to actual needs.
Persons of ordinary skill in the art will understand that all or part of the processes of the methods in the above embodiments may be completed by a computer program instructing the relevant hardware. The program may be stored in a computer-readable storage medium and, when executed, may include the processes of the embodiments of the methods above. The storage medium may be a magnetic disk, an optical disc, a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), or the like.
The above disclosure is merely a preferred embodiment of the present invention and certainly cannot be used to limit the scope of rights of the present invention; equivalent variations made according to the claims of the present invention therefore still fall within the scope covered by the present invention.

Claims (10)

1. A positioning method, characterized by comprising:
finding, according to target content in a target picture, a reference picture that contains the target content in a preset picture group, where the pictures in the preset picture group contain content whose positions are known;
determining the position of the target content according to the found reference picture;
obtaining the relative position of the target content with respect to a shooting apparatus at the time the target picture was taken;
determining the position of the shooting apparatus according to the position of the target content and the relative position.
2. The method according to claim 1, characterized in that the relative position comprises direction information and distance information, and determining the position of the shooting apparatus according to the position of the target content and the relative position comprises:
if the position of the target content is an absolute geographical position, calculating the absolute geographical position of the shooting apparatus from the position of the target content, the direction information and the distance information; or,
if the position of the target content is a relative geographical position, taking the position of the target content as a reference point and expressing the position of the shooting apparatus by the direction information and the distance information.
3. The method according to claim 1, characterized in that the preset picture group is stored on a remote server, and the method further comprises: establishing a communication connection with the remote server; and uploading the target picture to the remote server over the communication connection, so as to trigger the remote server to perform the step of finding, according to the target content in the target picture, the reference picture that contains the target content in the preset picture group.
4. The method according to claim 1, characterized in that finding, according to the target content in the target picture, the reference picture that contains the target content in the preset picture group comprises:
identifying the content contained in each picture in the preset picture group;
judging whether any picture contains content corresponding to the same shooting object as the target content, and if so, determining that picture to be the reference picture.
5. The method according to claim 1, characterized by further comprising: after the target picture is taken, marking the position of the shooting apparatus on a target map according to the determined position of the shooting apparatus.
6. A terminal, characterized by comprising:
a searching unit, configured to find, according to target content in a target picture, a reference picture that contains the target content in a preset picture group, where the pictures in the preset picture group contain content whose positions are known;
a first determining unit, configured to determine the position of the target content according to the found reference picture;
an acquiring unit, configured to obtain the relative position of the target content with respect to a shooting apparatus at the time the target picture was taken;
a second determining unit, configured to determine the position of the shooting apparatus according to the position of the target content and the relative position.
7. The terminal according to claim 6, characterized in that the relative position comprises direction information and distance information, and the second determining unit is specifically configured to:
if the position of the target content is an absolute geographical position, calculate the absolute geographical position of the shooting apparatus from the position of the target content, the direction information and the distance information; or,
if the position of the target content is a relative geographical position, take the position of the target content as a reference point and express the position of the shooting apparatus by the direction information and the distance information.
8. The terminal according to claim 6, characterized in that the preset picture group is stored on a remote server, and the terminal further comprises a connecting unit and an uploading unit, wherein:
the connecting unit is configured to establish a communication connection with the remote server;
the uploading unit is configured to upload the target picture to the remote server over the communication connection, so as to trigger the remote server to perform the step of finding, according to the target content in the target picture, the reference picture that contains the target content in the preset picture group.
9. The terminal according to claim 6, characterized in that the searching unit comprises a recognition unit and a judging unit, wherein:
the recognition unit is configured to identify the content contained in each picture in the preset picture group;
the judging unit is configured to judge whether any picture contains content corresponding to the same shooting object as the target content, and if so, to determine that picture to be the reference picture.
10. The terminal according to claim 6, characterized by further comprising: a display unit, configured to mark, after the target picture is taken, the position of the shooting apparatus on a target map according to the position of the shooting apparatus determined by the second determining unit.
CN201510542616.6A 2015-08-28 2015-08-28 Positioning method and terminal Pending CN105606086A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510542616.6A CN105606086A (en) 2015-08-28 2015-08-28 Positioning method and terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510542616.6A CN105606086A (en) 2015-08-28 2015-08-28 Positioning method and terminal

Publications (1)

Publication Number Publication Date
CN105606086A true CN105606086A (en) 2016-05-25

Family

ID=55986218

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510542616.6A Pending CN105606086A (en) 2015-08-28 2015-08-28 Positioning method and terminal

Country Status (1)

Country Link
CN (1) CN105606086A (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2006137667A1 (en) * 2005-06-20 2006-12-28 Samsung Electronics Co., Ltd. Method and system for providing image-related information to user, and mobile terminal therefor
CN101945327A (en) * 2010-09-02 2011-01-12 郑茂 Wireless positioning method and system based on digital image identification and retrieve
CN103067856A (en) * 2011-10-24 2013-04-24 康佳集团股份有限公司 Geographic position locating method and system based on image recognition
CN102620713A (en) * 2012-03-26 2012-08-01 梁寿昌 Method for measuring distance and positioning by utilizing dual camera
CN103344213A (en) * 2013-06-28 2013-10-09 三星电子(中国)研发中心 Method and device for measuring distance of double-camera
CN103557834A (en) * 2013-11-20 2014-02-05 无锡儒安科技有限公司 Dual-camera-based solid positioning method
CN103630112A (en) * 2013-12-03 2014-03-12 青岛海尔软件有限公司 Method for achieving target positioning through double cameras
CN103884334A (en) * 2014-04-09 2014-06-25 中国人民解放军国防科学技术大学 Moving target positioning method based on wide beam laser ranging and single camera

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107771408B (en) * 2016-06-17 2020-02-21 华为技术有限公司 Mobile terminal and positioning method thereof
CN107771408A (en) * 2016-06-17 2018-03-06 华为技术有限公司 Mobile terminal and its localization method
CN106331639A (en) * 2016-08-31 2017-01-11 浙江宇视科技有限公司 Method and apparatus of automatically determining position of camera
CN106331639B (en) * 2016-08-31 2019-08-27 浙江宇视科技有限公司 A kind of method and device automatically determining camera position
CN106304335A (en) * 2016-09-22 2017-01-04 宇龙计算机通信科技(深圳)有限公司 Localization method, device and terminal
CN106530008A (en) * 2016-11-10 2017-03-22 广州市沃希信息科技有限公司 Scene-picture-based advertising method and system
CN106530009A (en) * 2016-11-10 2017-03-22 广州市沃希信息科技有限公司 Passenger location-based advertisement method and system
CN106530009B (en) * 2016-11-10 2021-10-26 广州市沃希信息科技有限公司 Advertising method and system based on passenger positioning
CN107123144A (en) * 2017-03-31 2017-09-01 维沃移动通信有限公司 A kind of method and mobile terminal for positioning calibration
WO2019000461A1 (en) * 2017-06-30 2019-01-03 广东欧珀移动通信有限公司 Positioning method and apparatus, storage medium, and server
CN110132274A (en) * 2019-04-26 2019-08-16 中国铁道科学研究院集团有限公司电子计算技术研究所 A kind of indoor orientation method, device, computer equipment and storage medium
CN112113580A (en) * 2019-06-21 2020-12-22 北汽福田汽车股份有限公司 Vehicle positioning method and device and automobile
CN110413719A (en) * 2019-07-25 2019-11-05 Oppo广东移动通信有限公司 Information processing method and device, equipment, storage medium
WO2021013038A1 (en) * 2019-07-25 2021-01-28 Oppo广东移动通信有限公司 Information processing method and apparatus, device and storage medium
CN113014810A (en) * 2021-02-25 2021-06-22 深圳市慧鲤科技有限公司 Positioning method and device, electronic equipment and storage medium

Similar Documents

Publication Publication Date Title
CN105606086A (en) Positioning method and terminal
EP3134870B1 (en) Electronic device localization based on imagery
JP6676082B2 (en) Indoor positioning method and system, and device for creating the indoor map
KR101620299B1 (en) Picture positioning method and system based on mobile terminal navigation function
KR101356192B1 (en) Method and System for Determining Position and Attitude of Smartphone by Image Matching
US7991194B2 (en) Apparatus and method for recognizing position using camera
CN109029444B (en) Indoor navigation system and method based on image matching and space positioning
US20180188033A1 (en) Navigation method and device
CN102714684B (en) Use the image recognition that the position based on track is determined
CN101794316A (en) Real-scene status consulting system and coordinate offset method based on GPS location and direction identification
CN112556685B (en) Navigation route display method and device, storage medium and electronic equipment
EP3241037A1 (en) Changing camera parameters based on wireless signal information
US20100310125A1 (en) Method and Device for Detecting Distance, Identifying Positions of Targets, and Identifying Current Position in Smart Portable Device
CN103245337B (en) A kind of obtain the method for mobile terminal locations, mobile terminal and position detecting system
CN107193820B (en) Position information acquisition method, device and equipment
Steinhoff et al. How computer vision can help in outdoor positioning
JP2014209680A (en) Land boundary display program, method, and terminal device
KR100853379B1 (en) Method for transforming based position image file and service server thereof
US20190272426A1 (en) Localization system and method and computer readable storage medium
JP5562814B2 (en) Map information providing apparatus, map information providing system, map information providing method, and map information providing program
JP6591594B2 (en) Information providing system, server device, and information providing method
CN106951553A (en) A kind of address locating methods and device
TWI631861B (en) Method and system for indoor positioning and device for creating indoor maps thereof
KR20210016757A (en) System and method for managing base station antenna information
CN112533135B (en) Pedestrian positioning method and device, server and storage medium

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20160525