CN104897152A - Navigation method and navigation apparatus - Google Patents

Navigation method and navigation apparatus

Info

Publication number
CN104897152A
CN104897152A (application CN201510192131.9A)
Authority
CN
China
Prior art keywords
latitude
information
destination locations
camera site
longitude information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201510192131.9A
Other languages
Chinese (zh)
Inventor
刘东声
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Coolpad Software Technology Shenzhen Co Ltd
Original Assignee
Coolpad Software Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Coolpad Software Technology Shenzhen Co Ltd filed Critical Coolpad Software Technology Shenzhen Co Ltd
Priority to CN201510192131.9A priority Critical patent/CN104897152A/en
Publication of CN104897152A publication Critical patent/CN104897152A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Navigation (AREA)

Abstract

The present invention belongs to the technical field of communication, and provides a navigation method and a navigation apparatus. The navigation method comprises: recording first latitude and longitude information and angle information of a shooting position; shooting an image of a destination position at the shooting position, and acquiring information on the distance from the shooting position to the destination position; calculating second latitude and longitude information of the destination position according to the distance information and the angle information of the shooting position; and planning a route from the shooting position to the destination position using the first and second latitude and longitude information. With the navigation method and apparatus of the present invention, navigation to an unnamed destination within the user's field of view is achieved.

Description

Navigation method and apparatus
Technical field
The present invention relates to the field of communication technology, and in particular to a navigation method and apparatus.
Background technology
Navigation technology is now widely used in daily life. People commonly use in-vehicle navigation to plan driving routes, or map applications to navigate on foot. However, both in-vehicle navigation and map navigation usually require the name of the destination to be known, for example the name of a particular building, which must be entered before navigation can begin. When a user arrives in an unfamiliar city and can see a building or a hill within his or her field of view but does not know its name, navigation cannot be achieved with the above prior-art techniques, which makes it inconvenient for the user to reach an unnamed destination in an unfamiliar city.
Summary of the invention
In view of the above defect, the object of the present invention is to provide a navigation method and apparatus that enable navigation to an unnamed destination within the user's field of view.
To achieve this object, the present invention provides a navigation method, comprising:
recording first latitude and longitude information of a shooting position and angle information of the shooting position;
shooting an image of a destination position at the shooting position, and acquiring distance information from the shooting position to the destination position;
calculating second latitude and longitude information of the destination position according to the distance information and the angle information of the shooting position; and
planning a route from the shooting position to the destination position using the first latitude and longitude information and the second latitude and longitude information.
According to the method, shooting the image of the destination position at the shooting position and acquiring the distance information comprises:
shooting the image of the destination position at the shooting position with dual cameras, and acquiring three-dimensional coordinate information of the destination position; and
selecting a target point in the image of the destination position, and calculating, from the position of the target point and the three-dimensional coordinate information, the straight-line distance S from the destination position to the dual cameras.
According to the method, the angle information of the shooting position comprises an azimuth θ of the shooting position and an elevation β of the dual-camera shot; the first latitude and longitude information is denoted (X1, Y1) and the second latitude and longitude information is denoted (X2, Y2), where X1 and X2 denote latitude and Y1 and Y2 denote longitude;
calculating the second latitude and longitude information of the destination position according to the distance information and the angle information of the shooting position comprises:
calculating the horizontal straight-line distance H from the shooting position to the destination position, where H = S·cos β; and
calculating the second latitude and longitude information from the first latitude and longitude information, the horizontal straight-line distance H and the azimuth θ;
where X2 = X1 + H·cos(90° - θ)/(111·cos X1) and Y2 = Y1 + H·sin(90° - θ)/111.
According to the method, planning the route from the shooting position to the destination position using the first and second latitude and longitude information comprises:
inputting the first latitude and longitude information (X1, Y1) and the second latitude and longitude information (X2, Y2) into a pre-stored navigation map to obtain route information from the shooting position to the destination position.
According to any of the above methods, the method further comprises navigating from the shooting position to the destination position according to the planned route.
To achieve another object of the invention, the present invention further provides a navigation apparatus, comprising:
a recording module for recording first latitude and longitude information of a shooting position and angle information of the shooting position;
an acquisition module for shooting an image of a destination position at the shooting position and acquiring distance information from the shooting position to the destination position;
a computing module for calculating second latitude and longitude information of the destination position according to the distance information and the angle information of the shooting position; and
a navigation module for planning a route from the shooting position to the destination position using the first and second latitude and longitude information.
According to the apparatus, the acquisition module comprises:
a shooting submodule for shooting the image of the destination position at the shooting position with dual cameras and acquiring three-dimensional coordinate information of the destination position; and
a first calculating submodule for selecting a target point in the image of the destination position and calculating, from the position of the target point and the three-dimensional coordinate information, the straight-line distance S from the destination position to the dual cameras.
According to the apparatus, the angle information of the shooting position comprises an azimuth θ of the shooting position and an elevation β of the dual-camera shot; the first latitude and longitude information is denoted (X1, Y1) and the second latitude and longitude information is denoted (X2, Y2), where X1 and X2 denote latitude and Y1 and Y2 denote longitude;
the computing module comprises:
a second calculating submodule for calculating the horizontal straight-line distance H from the shooting position to the destination position, where H = S·cos β; and
a third calculating submodule for calculating the second latitude and longitude information from the first latitude and longitude information, the horizontal straight-line distance H and the azimuth θ;
where X2 = X1 + H·cos(90° - θ)/(111·cos X1) and Y2 = Y1 + H·sin(90° - θ)/111.
According to the apparatus, the navigation module comprises:
an input submodule for inputting the first latitude and longitude information (X1, Y1) and the second latitude and longitude information (X2, Y2) into a pre-stored navigation map to obtain route information from the shooting position to the destination position; and
a navigation submodule for planning the route from the shooting position to the destination position according to the route information.
According to any of the above apparatuses, the navigation module is further configured to navigate from the shooting position to the destination position according to the planned route.
In the present invention, an image of the destination position is shot at the shooting position to obtain the distance information from the shooting position to the destination position; the first latitude and longitude information of the shooting position and the angle information of the shooting position are recorded; the second latitude and longitude information of the destination position is then calculated from the distance information and the angle information; and finally the route from the shooting position to the destination position is planned from the first and second latitude and longitude information. Positioning is thus assisted by dual-camera photography, enhancing the use of the dual cameras and enabling navigation to an unnamed destination within the user's field of view, while offering the user a new navigation experience.
Brief description of the drawings
Fig. 1 is a structural schematic diagram of the navigation apparatus provided in the first embodiment of the invention;
Fig. 2 is a structural schematic diagram of the navigation apparatus provided in the second, third and fourth embodiments of the invention;
Fig. 3A is a schematic three-dimensional view of the navigation apparatus measuring the destination position, according to an embodiment of the invention;
Fig. 3B is a schematic plan view of the navigation apparatus measuring the destination position, according to an embodiment of the invention;
Fig. 4 is a flowchart of the navigation method provided in the fifth embodiment of the invention.
Detailed description of the embodiments
To make the objects, technical solutions and advantages of the present invention clearer, the invention is further elaborated below with reference to the drawings and embodiments. It should be understood that the specific embodiments described here serve only to explain the invention and are not intended to limit it.
Referring to Fig. 1, the first embodiment of the invention provides a navigation apparatus 100, comprising:
a recording module 10 for recording first latitude and longitude information of a shooting position and angle information of the shooting position;
an acquisition module 20 for shooting an image of a destination position at the shooting position and acquiring distance information from the shooting position to the destination position;
a computing module 30 for calculating second latitude and longitude information of the destination position according to the distance information and the angle information of the shooting position; and
a navigation module 40 for planning a route from the shooting position to the destination position using the first and second latitude and longitude information.
In this embodiment, when the image of the destination position is shot at the shooting position, the acquisition module 20 obtains the distance information from the shooting position to the destination position; this distance information comprises the (non-planar) straight-line distance from the shooting position to the destination position. Meanwhile, the recording module 10 records the first latitude and longitude information of the shooting position and the angle information of the shooting position. The computing module 30 then calculates the second latitude and longitude information of the destination position according to the distance information and the angle information. Finally, the navigation module 40 provides the user with a route from the shooting position to the destination position using the first and second latitude and longitude information. The destination position may be, for example, a building or a mountain. Suppose a user sees a destination in the distance, such as a tower, and photographs it with the navigation apparatus 100. The recording module 10 records the first latitude and longitude information of the user's current shooting position and the angle information of the shooting position; the acquisition module 20 shoots the image of the tower and obtains the distance information from the shooting position to the tower; the computing module 30 calculates the second latitude and longitude information of the tower according to the distance information and the angle information; and the navigation module 40 plans a route using the first latitude and longitude information of the shooting position and the second latitude and longitude information of the tower, so that the user knows how to reach the tower from the shooting position. Navigation to an unnamed destination within the user's field of view is thus achieved, providing the user with a convenient navigation service.
Referring to Fig. 2, in the second embodiment of the invention, the acquisition module 20 comprises:
a shooting submodule 21 for shooting the image of the destination position at the shooting position with dual cameras and acquiring three-dimensional coordinate information of the destination position; and
a first calculating submodule 22 for selecting a target point in the image of the destination position and calculating, from the position of the target point and the three-dimensional coordinate information, the straight-line distance S from the destination position to the dual cameras.
In this embodiment, the navigation apparatus 100 has dual cameras. The shooting submodule 21 shoots the image of the destination position at the shooting position with the dual cameras, and the three-dimensional coordinate information of the destination position can be obtained during shooting. Then, for a target point selected by the user in the image, the first calculating submodule 22 calculates the straight-line distance S from the destination position to the dual cameras according to the position of the target point and the three-dimensional coordinate information. The target point is imaged at a different position in each of the two cameras. Several concrete calculation schemes exist, and the prior art already includes techniques for measuring the distance to a photographed object with dual cameras. In one embodiment of the invention, the dual cameras capture images of the object to be measured; from the difference between the positions at which a point on the object is imaged in the two cameras, the distance from that point to the cameras is determined, and its position in space is determined by calculating its distances along each axis of a three-dimensional coordinate system whose origin is the midpoint between the two cameras. Specifically, from the imaging positions of the measured point in the two cameras, the incidence angles of the rays entering the first camera and the second camera from that point are derived; then, from the distance between the centres of the two cameras and these two incidence angles, the distance between the measured point and the midpoint of the line joining the two camera centres is calculated. This distance is the straight-line distance S. The measurable range of S is affected by factors such as the spacing of the dual cameras, the precision of the hardware and the algorithm used.
Referring to Fig. 2 together with Figs. 3A and 3B, in the third embodiment of the invention, the angle information of the shooting position comprises an azimuth θ of the shooting position and an elevation β of the dual-camera shot; the first latitude and longitude information is denoted (X1, Y1) and the second latitude and longitude information is denoted (X2, Y2), where X1 and X2 denote latitude and Y1 and Y2 denote longitude;
the computing module 30 comprises:
a second calculating submodule 31 for calculating the horizontal straight-line distance H from the shooting position to the destination position, where H = S·cos β; and
a third calculating submodule 32 for calculating the second latitude and longitude information from the first latitude and longitude information, the horizontal straight-line distance H and the azimuth θ;
where X2 = X1 + H·cos(90° - θ)/(111·cos X1) and Y2 = Y1 + H·sin(90° - θ)/111.
In this embodiment, the angle information of the shooting position and the straight-line distance S from the destination position to the dual cameras are used to calculate the second latitude and longitude information of the destination position. Specifically, the second calculating submodule 31 calculates from S and the elevation β the horizontal straight-line distance H from the shooting position to the destination position: H = S·cos β.
The third calculating submodule 32 then calculates the latitude and longitude of the destination position from the known quantities:
X2 = X1 + H·cos(90° - θ)/(111·cos X1), with H in kilometres (about 111·cos X1 km per degree);
Y2 = Y1 + H·sin(90° - θ)/111 (about 111 km per degree).
Referring to Fig. 2, in the fourth embodiment of the invention, the navigation module 40 comprises:
an input submodule 41 for inputting the first latitude and longitude information (X1, Y1) and the second latitude and longitude information (X2, Y2) into a pre-stored navigation map to obtain route information from the shooting position to the destination position; and
a navigation submodule 42 for planning the route from the shooting position to the destination position according to the route information.
In this embodiment, since the second latitude and longitude information of the destination position has already been obtained by the computing module 30, the input submodule 41 inputs the first latitude and longitude information (X1, Y1) and the second latitude and longitude information (X2, Y2) into the pre-stored navigation map, from which the route information from the shooting position to the destination position is obtained; the navigation submodule 42 then plans the route from the shooting position to the destination position according to this route information. Specifically, the navigation module 40 also navigates from the shooting position to the destination position according to the planned route, so the user can follow the planned route and reach the desired destination position smoothly.
In the above embodiments, the navigation apparatus 100 can be applied in communication terminals such as mobile phones, PDAs (Personal Digital Assistants) and tablet computers, and the modules of the navigation apparatus 100 may be software units, hardware units or combined software/hardware units built into the terminal.
Referring to Fig. 4, the fifth embodiment of the invention provides a navigation method, comprising:
step S401: recording first latitude and longitude information of a shooting position and angle information of the shooting position; implemented by the recording module 10;
step S402: shooting an image of a destination position at the shooting position, and acquiring distance information from the shooting position to the destination position; implemented by the acquisition module 20;
step S403: calculating second latitude and longitude information of the destination position according to the distance information and the angle information of the shooting position; implemented by the computing module 30;
step S404: planning a route from the shooting position to the destination position using the first and second latitude and longitude information; implemented by the navigation module 40.
In this embodiment, a user arriving in an unfamiliar city who needs to reach a tall building within sight shoots an image of the destination position, i.e. an image of that building, with the navigation apparatus 100. During shooting, the recording module 10 records the first latitude and longitude information of the user's shooting position and the angle information of the shooting position. In a concrete implementation, the GPS of the navigation apparatus 100 records the first latitude and longitude of the user's current shooting position; a magnetic field sensing module of the apparatus records the azimuth θ at the moment the photo is taken, and a gravity sensing module records the elevation β of the shot; the distance information from the user's current position to the building is acquired, and the three-dimensional information in the photo is recorded at the same time. The magnetic field sensing module may be a gyroscope, with the azimuth θ calculated from its output, and the gravity sensing module may be a gravity sensor. After shooting, the computing module 30 calculates the second latitude and longitude information of the destination position according to the distance information and the angle information of the shooting position; finally, the navigation module 40 plans the route from the shooting position to the destination position using the first and second latitude and longitude information, and navigates from the shooting position to the destination position according to the planned route. The user thus obtains a route from the current shooting position to the building. Specifically, the destination position may be a building or a mountain, or indeed any object the user can photograph from the shooting position, including buildings, slopes and hills. Navigation to an unnamed destination within the user's field of view is thus achieved.
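Steps S401 to S404 can be strung together in one small routine. The sketch below assumes the sensor readings (GPS fix, magnetometer/gyroscope azimuth, gravity-sensor elevation, dual-camera range) have already been captured at the moment of shooting; all names and numeric values are illustrative:

```python
import math

def plan_route_to_target(gps_lat, gps_lon, azimuth_deg, elevation_deg, range_m):
    """S401: record the first latitude/longitude and the angles;
    S402: take the dual-camera straight-line distance S;
    S403: compute the target's second latitude/longitude;
    S404: return both endpoints to hand to the navigation map."""
    s_km = range_m / 1000.0
    h = s_km * math.cos(math.radians(elevation_deg))  # planar distance H
    lat2 = gps_lat + h * math.cos(math.radians(90 - azimuth_deg)) / (111 * math.cos(math.radians(gps_lat)))
    lon2 = gps_lon + h * math.sin(math.radians(90 - azimuth_deg)) / 111
    return (gps_lat, gps_lon), (lat2, lon2)

# Assumed sensor values: fix (22.5, 114.0), azimuth 45 deg,
# elevation 30 deg, stereo range 1000 m.
start, dest = plan_route_to_target(22.5, 114.0, 45.0, 30.0, 1000.0)
print(start, dest)
```

The returned pair of endpoints corresponds to what step S404 feeds into the pre-stored navigation map for route planning.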
In the sixth embodiment of the invention, step S402 comprises:
shooting the image of the destination position at the shooting position with dual cameras, and acquiring three-dimensional coordinate information of the destination position; implemented by the shooting submodule 21;
selecting a target point in the image of the destination position, and calculating, from the position of the target point and the three-dimensional coordinate information, the straight-line distance S from the destination position to the dual cameras; implemented by the first calculating submodule 22.
In this embodiment, the navigation apparatus 100 includes dual cameras; the image of the destination position is shot with the dual cameras at the shooting position, and the three-dimensional coordinate information of the destination position is obtained through the dual cameras. The user can then select a target point in the image, and the straight-line distance S from the destination position to the dual cameras can be measured using the dual-camera photographic ranging method.
In the seventh embodiment of the invention, as above, the angle information of the shooting position comprises an azimuth θ of the shooting position and an elevation β of the dual-camera shot; the first latitude and longitude information is denoted (X1, Y1) and the second latitude and longitude information is denoted (X2, Y2), where X1 and X2 denote latitude and Y1 and Y2 denote longitude;
step S403 comprises:
calculating the horizontal straight-line distance H from the shooting position to the destination position, where H = S·cos β; implemented by the second calculating submodule 31;
calculating the second latitude and longitude information from the first latitude and longitude information, the horizontal straight-line distance H and the azimuth θ; implemented by the third calculating submodule 32;
where X2 = X1 + H·cos(90° - θ)/(111·cos X1) and Y2 = Y1 + H·sin(90° - θ)/111.
In this embodiment, the horizontal straight-line distance H between the user (shooting position) and the target structure (destination position) is calculated from the three-dimensional information and the elevation β; then, from the GPS-located first latitude and longitude (X1, Y1) and the known azimuth θ, the actual latitude and longitude (X2, Y2) of the target structure is calculated.
In the eighth embodiment of the invention, step S404 comprises:
inputting the first latitude and longitude information (X1, Y1) and the second latitude and longitude information (X2, Y2) into a pre-stored navigation map to obtain route information from the shooting position to the destination position; implemented by the input submodule 41;
navigating the route from the shooting position to the destination position according to the route information; implemented by the navigation submodule 42.
In this embodiment, the coordinates (X1, Y1) and (X2, Y2) are imported into the mobile phone's navigation map for route planning. That is, the user's current latitude and longitude and the target's latitude and longitude are finally imported into the navigation map, which plans the navigation path.
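One portable way to hand the computed endpoint to a handset's pre-installed map application is a geo URI (RFC 5870); the exact hand-off is platform-specific and the document does not name one, so this is only one possible sketch, with assumed coordinates:

```python
def destination_geo_uri(lat, lon):
    """Encode the computed second latitude/longitude as a geo URI
    (latitude first, then longitude) that a map app can open to
    start route planning toward the photographed target."""
    return f"geo:{lat:.6f},{lon:.6f}"

print(destination_geo_uri(22.505971, 114.005517))  # geo:22.505971,114.005517
```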
In summary, in the present invention an image of the destination position is shot at the shooting position to obtain the distance information from the shooting position to the destination position; the first latitude and longitude information of the shooting position and the angle information of the shooting position are recorded; the second latitude and longitude information of the destination position is then calculated from the distance information and the angle information; and finally the route from the shooting position to the destination position is planned from the first and second latitude and longitude information. Positioning is thus assisted by dual-camera photography, enhancing the use of the dual cameras and enabling navigation to an unnamed destination within the user's field of view, while offering the user a new navigation experience.
Of course, the present invention may also have various other embodiments. Without departing from the spirit and essence of the invention, those of ordinary skill in the art may make various corresponding changes and variations according to the invention, but all such changes and variations shall fall within the scope of protection of the claims appended to the present invention.

Claims (10)

1. A navigation method, characterized by comprising:
recording first latitude and longitude information of a shooting position and angle information of the shooting position;
shooting an image of a destination position at the shooting position, and acquiring distance information from the shooting position to the destination position;
calculating second latitude and longitude information of the destination position according to the distance information and the angle information of the shooting position; and
planning a route from the shooting position to the destination position using the first latitude and longitude information and the second latitude and longitude information.
2. The method according to claim 1, characterized in that shooting the image of the destination position at the shooting position and acquiring the distance information comprises:
shooting the image of the destination position at the shooting position with dual cameras, and acquiring three-dimensional coordinate information of the destination position; and
selecting a target point in the image of the destination position, and calculating, from the position of the target point and the three-dimensional coordinate information, the straight-line distance S from the destination position to the dual cameras.
3. The method according to claim 1, characterized in that the angle information of the photographing position comprises an azimuth angle θ of the photographing position and an elevation angle β of the dual-camera shot; the first latitude and longitude information is denoted (X1, Y1) and the second latitude and longitude information is denoted (X2, Y2), where X1 and X2 denote latitude and Y1 and Y2 denote longitude;
calculating the second latitude and longitude information of the destination location according to the distance information and the angle information of the photographing position comprises:
calculating a horizontal straight-line distance H from the photographing position to the destination location, where H = S·cos β;
calculating the second latitude and longitude information from the first latitude and longitude information, the horizontal straight-line distance H and the azimuth angle θ;
where X2 = X1 + H·cos(90° − θ)/(111·cos X1) and Y2 = Y1 + H·sin(90° − θ)/111.
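A worked sketch of the claim-3 formulas, applied verbatim: the constant 111 reflects that one degree of latitude spans roughly 111 km, so H is taken in kilometres here; the coordinates, azimuth, and elevation below are illustrative values, not from the specification:

```python
import math

def destination_lat_lon(x1, y1, theta_deg, beta_deg, s_km):
    """Apply the claim-3 formulas (angles in degrees, H in km):
    H  = S * cos(beta)
    X2 = X1 + H * cos(90 - theta) / (111 * cos(X1))
    Y2 = Y1 + H * sin(90 - theta) / 111
    X denotes latitude and Y denotes longitude, as in the claims."""
    h_km = s_km * math.cos(math.radians(beta_deg))
    a = math.radians(90.0 - theta_deg)
    x2 = x1 + h_km * math.cos(a) / (111.0 * math.cos(math.radians(x1)))
    y2 = y1 + h_km * math.sin(a) / 111.0
    return x2, y2

# Photographing position at (22.5 N, 114.0 E), azimuth 90 degrees,
# elevation 0, straight-line distance 1.11 km to the destination:
x2, y2 = destination_lat_lon(22.5, 114.0, theta_deg=90.0,
                             beta_deg=0.0, s_km=1.11)
# x2 is approximately 22.510824 and y2 stays 114.0 for these inputs.
```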
4. The method according to claim 3, characterized in that planning the route from the photographing position to the destination location by means of the first latitude and longitude information and the second latitude and longitude information comprises:
inputting the first latitude and longitude information (X1, Y1) and the second latitude and longitude information (X2, Y2) into a prestored navigation map, so as to obtain route information from the photographing position to the destination location.
5. The method according to any one of claims 1 to 4, characterized in that the method further comprises: navigating from the photographing position to the destination location according to the planned route.
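The claim-4 step simply hands both coordinate pairs to the prestored navigation map. In the sketch below, `plan_route` is a hypothetical stand-in for that map's routing interface (a real device would query its offline routing engine); the coordinates are illustrative:

```python
def plan_route(origin, destination):
    """Hypothetical stand-in for the prestored navigation map of claim 4:
    accepts the first (photographing) and second (destination) latitude
    and longitude pairs and returns route information."""
    (x1, y1), (x2, y2) = origin, destination
    return {
        "from": (x1, y1),
        "to": (x2, y2),
        # A real engine would return road-network waypoints; this
        # placeholder returns the trivial direct route.
        "waypoints": [(x1, y1), (x2, y2)],
    }

route = plan_route((22.5, 114.0), (22.5108, 114.0))
```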
6. A navigation device, characterized by comprising:
a recording module, configured to record first latitude and longitude information of a photographing position and angle information of the photographing position;
an acquisition module, configured to capture an image of a destination location at the photographing position and to obtain distance information from the photographing position to the destination location;
a calculation module, configured to calculate second latitude and longitude information of the destination location according to the distance information and the angle information of the photographing position;
a navigation module, configured to plan a route from the photographing position to the destination location by means of the first latitude and longitude information and the second latitude and longitude information.
7. The device according to claim 6, characterized in that the acquisition module comprises:
a shooting submodule, configured to capture the image of the destination location at the photographing position by using a dual camera and to obtain three-dimensional coordinate information of the destination location;
a first calculation submodule, configured to select a target point in the image of the destination location and to calculate, according to the position of the target point and the three-dimensional coordinate information, the straight-line distance S from the dual camera to the destination location.
8. The device according to claim 6, characterized in that the angle information of the photographing position comprises an azimuth angle θ of the photographing position and an elevation angle β of the dual-camera shot; the first latitude and longitude information is denoted (X1, Y1) and the second latitude and longitude information is denoted (X2, Y2), where X1 and X2 denote latitude and Y1 and Y2 denote longitude;
the calculation module comprises:
a second calculation submodule, configured to calculate a horizontal straight-line distance H from the photographing position to the destination location, where H = S·cos β;
a third calculation submodule, configured to calculate the second latitude and longitude information from the first latitude and longitude information, the horizontal straight-line distance H and the azimuth angle θ;
where X2 = X1 + H·cos(90° − θ)/(111·cos X1) and Y2 = Y1 + H·sin(90° − θ)/111.
9. The device according to claim 8, characterized in that the navigation module comprises:
an input submodule, configured to input the first latitude and longitude information (X1, Y1) and the second latitude and longitude information (X2, Y2) into a prestored navigation map, so as to obtain route information from the photographing position to the destination location;
a navigation submodule, configured to plan the route from the photographing position to the destination location according to the route information.
10. The device according to any one of claims 6 to 9, characterized in that the navigation module is further configured to navigate from the photographing position to the destination location according to the planned route.
CN201510192131.9A 2015-03-30 2015-03-30 Navigation method and navigation apparatus Pending CN104897152A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510192131.9A CN104897152A (en) 2015-03-30 2015-03-30 Navigation method and navigation apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510192131.9A CN104897152A (en) 2015-03-30 2015-03-30 Navigation method and navigation apparatus

Publications (1)

Publication Number Publication Date
CN104897152A true CN104897152A (en) 2015-09-09

Family

ID=54029964

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510192131.9A Pending CN104897152A (en) 2015-03-30 2015-03-30 Navigation method and navigation apparatus

Country Status (1)

Country Link
CN (1) CN104897152A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106705950A (en) * 2015-11-12 2017-05-24 天津三星电子有限公司 Method for determining geographic position of target object, and electronic device
CN107329161A (en) * 2017-06-06 2017-11-07 芜湖航飞科技股份有限公司 Precision monitoring system based on big-dipper satellite
CN107449432A (en) * 2016-05-31 2017-12-08 华为终端(东莞)有限公司 Navigation method, device and terminal using a dual camera
CN111148218A (en) * 2019-12-20 2020-05-12 联想(北京)有限公司 Information processing method and device and computer readable storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1677057A (en) * 2004-03-31 2005-10-05 日本电气株式会社 Portable communication terminal equipped with navigation function and navigation method of portable communication terminal
US20100321489A1 (en) * 2009-06-23 2010-12-23 Xin Chen Determining Geographic Position Information from a Single Image
CN102132557A (en) * 2008-09-03 2011-07-20 三菱电机株式会社 Imaging system for vehicle
CN102620713A (en) * 2012-03-26 2012-08-01 梁寿昌 Method for measuring distance and positioning by utilizing dual camera
CN102997927A (en) * 2011-09-09 2013-03-27 中国电信股份有限公司 Information acquisition and processing method and apparatus
CN103134489A (en) * 2013-01-29 2013-06-05 北京凯华信业科贸有限责任公司 Method of conducting target location based on mobile terminal
CN103884334A (en) * 2014-04-09 2014-06-25 中国人民解放军国防科学技术大学 Moving target positioning method based on wide beam laser ranging and single camera


Similar Documents

Publication Publication Date Title
CN103718062B Method and equipment for ensuring continuity of service of a personal navigation device
US10338228B2 (en) Portable GNSS survey system
CN104380137B Method for indirect distance measurement using an image-assisted angle determination function, and hand-held distance-measuring device
JP5116555B2 (en) LOCATION DEVICE, LOCATION SYSTEM, LOCATION SERVER DEVICE, AND LOCATION METHOD
US7940211B2 (en) Land survey system
CN102435140B (en) Method for constructing geographic coordinate system with laser tracker
CN103017740B (en) Method and system for positioning monitoring target by using video monitoring devices
JP5610870B2 (en) Unmanned traveling vehicle guidance device and unmanned traveling vehicle guidance method
CN105973268B (en) A kind of Transfer Alignment precision quantitative evaluating method based on the installation of cobasis seat
US9411822B2 (en) System and method of generating and using open sky data
CN104897152A (en) Navigation method and navigation apparatus
CN105371827A (en) Full-functional GNSS stereo camera surveying instrument
CN105547282B Method and measuring device for positioning a moving target
CN105571636A (en) Target positioning method and measuring equipment
CN111221020A (en) Indoor and outdoor positioning method, device and system
CN104063499A (en) Space vector POI extracting method based on vehicle-mounted space information collection
WO2014036776A1 (en) Combined gps measuring device
CN105959529B Monocular self-localization method and system based on a panoramic camera
US20150243037A1 (en) Method for a distance measurement
CN111623821B (en) Method for detecting tunnel drilling direction, detecting deviation and determining drilling position
CN207689674U Device for measuring a target position based on sighting
EP2696168A1 (en) Using gravity measurements within a photogrammetric adjustment
Tamimi et al. Performance Assessment of a Mini Mobile Mapping System: Iphone 14 pro Installed on a e-Scooter
TWI632390B (en) Adaptive weighting positioning method
Kim et al. A bimodal approach for land vehicle localization

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20150909