WO2015149455A1 - Positioning and Navigation Method and Apparatus - Google Patents
- Publication number
- WO2015149455A1 (PCT/CN2014/082911)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- coordinates
- user
- relative position
- coordinate
- coordinate point
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/005—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S5/00—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
- G01S5/02—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using radio waves
- G01S5/0257—Hybrid positioning
- G01S5/0263—Hybrid positioning by combining or switching between positions derived from two or more separate positioning systems
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3626—Details of the output of route guidance instructions
- G01C21/3635—Guidance using 3D or perspective road maps
- G01C21/3638—Guidance using 3D or perspective road maps including 3D objects and buildings
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/38—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
- G01S19/39—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/42—Determining position
- G01S19/48—Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system
- G01S19/485—Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system whereby the further system is an optical system or imaging system
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
Definitions
- The present disclosure relates to the field of map positioning, and in particular, to a positioning and navigation method and apparatus. Background Art
- In everyday work and life, people often use map-based applications to determine their current geographic location and the geographic location of their destination, and to find a route to the destination.
- a positioning and navigation method is provided.
- the coordinates of the current position of the user are obtained by using a positioning method such as GPS (Global Positioning System), base stations, or Wi-Fi (Wireless Fidelity);
- receiving destination information input by the user and acquiring the coordinate information of the destination; then determining the route to the destination based on the coordinate information of the current location and the coordinate information of the destination.
- in order to solve the problem that a user with a poor sense of direction cannot distinguish north, south, east and west in the surrounding environment and therefore cannot successfully reach the destination along the provided route, after the above three steps a gyroscope or an electronic compass can also be used to obtain the user's orientation and inform the user. In this way, the user can be guided in the correct direction along the route to reach the destination.
- the embodiment of the present disclosure provides a positioning navigation method and apparatus.
- the technical solution is as follows:
- a positioning navigation method comprising: acquiring initial coordinates of a user;
- acquiring a predetermined number of environment pictures within a predetermined geographic range corresponding to the initial coordinates, and obtaining the absolute coordinates corresponding to each environment picture; for each environment picture, acquiring the relative position between the object in the environment picture and the user; and determining the current geographic location information of the user according to the relative positions and the absolute coordinates.
- the determining the current geographic location information of the user according to the relative location and the absolute coordinates includes:
- when there is one environment picture, determining the current orientation of the user according to the relative position, the absolute coordinates, and the initial coordinates, and determining the initial coordinates as the actual coordinates of the user; when there are two environment pictures, determining the current orientation and the actual coordinates of the user according to the relative positions, the absolute coordinates, and the initial coordinates;
- when there are three or more environment pictures, the current orientation and the actual coordinates of the user are determined based on the relative positions and the absolute coordinates.
- the method includes:
- determining the current orientation and actual coordinates of the user according to the relative position, the absolute coordinate, and the initial coordinate including:
- determining the current orientation and the actual coordinates of the user according to the relative positions and the absolute coordinates when there are three environment pictures includes:
- the actual coordinates are determined based on the orientation and at least one of the relative positions.
- determining the current orientation and the actual coordinates of the user according to the relative positions and the absolute coordinates when there are three environment pictures includes:
- the predetermined condition being that one relative position is that the object is directly in front of the user, another relative position is that the object is directly to the left of the user, and the remaining relative position is that the object is directly to the right of the user;
- the coordinates of the perpendicular foot S are determined as the actual coordinates, and the direction from the foot S to the coordinate point is determined as the orientation.
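Under the predetermined condition above (one landmark directly ahead, one directly left, one directly right), the user stands at the foot of the perpendicular from the front landmark to the line joining the left and right landmarks, facing from that foot toward the front landmark. A minimal planar sketch, assuming simple (x, y) coordinates rather than latitude/longitude; the function and variable names are illustrative, not from the patent:

```python
import math

def locate_by_three_landmarks(front, left, right):
    """Three-picture special case: the user stands at the foot S of the
    perpendicular dropped from the front landmark onto the line B2B3
    joining the left and right landmarks, facing from S toward the front
    landmark. Coordinates are planar (x, y) pairs."""
    bx, by = front
    x2, y2 = left
    x3, y3 = right
    dx, dy = x3 - x2, y3 - y2                      # direction of line B2B3
    # Scalar projection of (front - B2) onto the line direction
    t = ((bx - x2) * dx + (by - y2) * dy) / (dx * dx + dy * dy)
    sx, sy = x2 + t * dx, y2 + t * dy              # foot of perpendicular S
    # Bearing from S to the front landmark, clockwise from +y (north)
    heading = math.degrees(math.atan2(bx - sx, by - sy)) % 360
    return (sx, sy), heading
```

For example, with the front landmark at (0, 10) and the left and right landmarks at (-5, 0) and (5, 0), the user is located at the origin facing heading 0° (due north in this convention).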
- the obtaining an environment image of a predetermined number of sheets in a predetermined geographical range corresponding to the initial coordinates includes:
- sorting the m candidate pictures according to the preset priority to obtain a candidate picture sequence, and selecting the predetermined number n of environment pictures from the candidate picture sequence;
- the obtaining the relative position between the object in the environment picture and the user includes: displaying the environment picture and guiding information, where the guiding information is used to guide the user to face an object in the environment picture, and/or to guide the user to move the environment picture in a corresponding direction according to the relative position between the object in the environment picture and himself, and/or to guide the user to move the environment picture to a corresponding position according to that relative position;
- the method further includes:
- the geographic location information, the destination coordinates, and the route are displayed.
- a positioning navigation apparatus configured to acquire initial coordinates of a user
- a picture acquisition module configured to acquire a predetermined number of environmental pictures in a predetermined geographic range corresponding to the initial coordinates, and obtain corresponding to each environmental picture from a preset correspondence between different environment pictures and different absolute coordinates Absolute coordinates
- a location obtaining module configured to acquire, for each environment picture, a relative position between the object in the environment picture and the user
- a map positioning module configured to determine the current geographic location information of the user according to the relative positions and the absolute coordinates.
- the map positioning module includes: a first positioning unit, and/or a second positioning unit, and/or a third positioning unit;
- the first positioning unit is configured to determine a current orientation of the user according to the relative position, the absolute coordinate, and the initial coordinate when the environment picture is one, and determine the initial coordinate as The actual coordinates of the user;
- the second positioning unit is configured to determine, according to the relative position, the absolute coordinate, and the initial coordinate, a current orientation and an actual coordinate of the user when the environment picture is two;
- the third positioning unit is configured to determine a current orientation and an actual coordinate of the user according to the relative position and the absolute coordinate when the environment picture is three or more.
- the first positioning unit includes: a direction acquiring subunit and an orientation determining subunit; the direction acquiring subunit is configured to acquire the reference direction from the coordinate point A of the initial coordinates to the coordinate point B of the absolute coordinates;
- the orientation determining subunit is configured to determine the orientation according to the reference direction and the relative position.
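In the one-picture case, the reference direction is simply the bearing from the coarse initial coordinates A to the landmark's absolute coordinates B, adjusted by where the object sits relative to the user. A sketch under simplifying assumptions (planar coordinates; bearings measured clockwise from north; relative angle 0° meaning the object is directly in front — these conventions and names are illustrative, not taken from the patent):

```python
import math

def orientation_from_one_landmark(initial, landmark, relative_angle_deg):
    """The bearing from A (initial coordinates) to B (landmark's absolute
    coordinates) gives the reference direction; the user's facing direction
    is that reference rotated back by the object's angle relative to the
    user (0 deg = directly ahead, positive = toward the user's right)."""
    ax, ay = initial
    bx, by = landmark
    reference = math.degrees(math.atan2(bx - ax, by - ay)) % 360
    return (reference - relative_angle_deg) % 360
```

If the landmark is due north of A and the user sees it 45° to his right, the user is facing 315°.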
- the second positioning unit includes: a first positioning subunit; or a second positioning subunit; the first positioning subunit is configured to acquire the straight line AB1 passing through the coordinate point A of the initial coordinates and the first coordinate point B1 in the absolute coordinates; determine the actual coordinates according to the second coordinate point B2 in the absolute coordinates, the straight line AB1, and the two relative positions; and determine the orientation according to the actual coordinates and at least one of the relative positions;
- the second positioning subunit is configured to respectively acquire the straight line AB1 passing through the coordinate point A of the initial coordinates and the first coordinate point B1 in the absolute coordinates, and the straight line AB2 passing through the coordinate point A of the initial coordinates and the second coordinate point B2 in the absolute coordinates; determine the orientation according to the straight line AB1, the straight line AB2 and the two relative positions; and determine the actual coordinates according to the orientation and at least one of the relative positions.
- the third positioning unit includes: a line acquiring subunit, a direction calculating subunit, and a coordinate calculation subunit;
- the straight line acquiring subunit is configured to respectively acquire the straight line AB1 passing through the coordinate point A of the initial coordinates and the first coordinate point B1 in the absolute coordinates, the straight line AB2 passing through the coordinate point A of the initial coordinates and the second coordinate point B2 in the absolute coordinates, and the straight line AB3 passing through the coordinate point A of the initial coordinates and the third coordinate point B3 in the absolute coordinates;
- the orientation calculating subunit is configured to determine the orientation according to the straight line AB1, the straight line AB2, the straight line AB3 and the three relative positions;
- the coordinate calculation subunit is configured to determine the actual coordinate according to the orientation and at least one of the relative positions.
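One plausible way to combine the three lines AB1, AB2, AB3 and the three relative positions into an orientation is to derive one orientation estimate per landmark and average them as circular quantities, which smooths out error in the coarse initial coordinates A. This is a hedged sketch under assumed conventions (planar coordinates; relative angles clockwise from straight ahead; illustrative names), not the patent's exact computation:

```python
import math

def bearing(a, b):
    """Bearing from point a to point b, clockwise from +y (north), planar coords."""
    return math.degrees(math.atan2(b[0] - a[0], b[1] - a[1])) % 360

def orientation_from_three_landmarks(initial, landmarks, relative_angles):
    """Each landmark Bi seen at relative angle ri (clockwise from straight
    ahead) suggests the user faces bearing(A, Bi) - ri. The three estimates
    are averaged via unit vectors so the 0/360 wrap-around is handled."""
    sx = sy = 0.0
    for b, r in zip(landmarks, relative_angles):
        est = math.radians(bearing(initial, b) - r)
        sx += math.sin(est)
        sy += math.cos(est)
    return math.degrees(math.atan2(sx, sy)) % 360
```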
- the third positioning unit includes: a condition detecting subunit, a horizontal connecting subunit, a perpendicular acquiring subunit, and a result determining subunit;
- the condition detecting subunit is configured to detect whether the three relative positions satisfy a predetermined condition, where the predetermined condition is that one relative position is that the object is directly in front of the user, another relative position is that the object is directly to the left of the user, and the remaining relative position is that the object is directly to the right of the user; the horizontal connecting subunit is configured to, when it is detected that the predetermined condition is met, acquire the straight line B2B3 connecting the coordinate point B2 corresponding to the absolute coordinates of the object directly to the left and the coordinate point B3 corresponding to the absolute coordinates of the object directly to the right;
- the perpendicular acquiring subunit is configured to acquire the perpendicular from the coordinate point corresponding to the absolute coordinates of the object directly in front to the straight line B2B3;
- the result determining subunit is configured to determine the coordinates of the perpendicular foot S as the actual coordinates, and to determine the direction from the foot S to the coordinate point as the orientation.
- the picture obtaining module includes: an alternative acquiring unit and an environment acquiring unit;
- the candidate acquiring unit is configured to acquire m candidate pictures in a predetermined geographic range corresponding to the initial coordinates
- the environment obtaining unit is configured to select an environment image of the predetermined number n from the m candidate images;
- the environment obtaining unit includes: an automatic acquiring subunit, or a user selection subunit; the automatic acquiring subunit is configured to sort the m candidate pictures according to a preset priority to obtain a candidate picture sequence, and to select the predetermined number n of environment pictures from the candidate picture sequence;
- the user selection subunit is configured to display part or all of the m candidate pictures, receive a selection signal corresponding to the candidate picture, and determine the predetermined number n of environment pictures according to the selection signal .
- the location acquiring module includes: an information display unit, a signal receiving unit, and a location determining unit;
- the information display unit is configured to display the environment picture and guiding information, where the guiding information is used to guide the user to face an object in the environment picture, and/or to guide the user to move the environment picture in a corresponding direction according to the relative position between the object in the environment picture and himself, and/or to guide the user to move the environment picture to a corresponding position according to that relative position;
- the signal receiving unit is configured to receive an input signal triggered by the user according to the guiding information; the location determining unit is configured to determine the relative position between the object in the environment picture and the user according to the input signal.
- the device further includes:
- a positioning and navigation apparatus including: a processor; and
- a memory for storing instructions executable by the processor;
- wherein the processor is configured to:
- acquire initial coordinates of a user; acquire a predetermined number of environment pictures within a predetermined geographic range corresponding to the initial coordinates, and obtain the absolute coordinates corresponding to each environment picture; for each environment picture, acquire the relative position between the object in the environment picture and the user; and determine the current geographic location information of the user according to the relative positions and the absolute coordinates.
- the technical solution provided by the embodiments of the present disclosure eliminates the need to install hardware components such as a gyroscope or an electronic compass inside the electronic device while still obtaining geographic location information including the user's orientation, thereby reducing the weight and volume of the electronic device and saving production costs.
- FIG. 1 is an exemplary flowchart of a positioning navigation method according to an exemplary embodiment.
- FIG. 2A is an exemplary flowchart of a positioning navigation method according to another exemplary embodiment.
- FIG. 2B is an exemplary schematic diagram involved in acquiring a relative position in a positioning navigation method according to another exemplary embodiment;
- FIG. 2C is another exemplary schematic diagram involved in acquiring a relative position in a positioning navigation method according to another exemplary embodiment
- FIG. 2D is still another exemplary schematic diagram involved in acquiring a relative position in a positioning navigation method according to another exemplary embodiment
- FIG. 2E is still another exemplary schematic diagram involved in acquiring a relative position in a positioning navigation method according to another exemplary embodiment
- FIG. 2F is an exemplary schematic diagram involved in calculating location information in a positioning navigation method according to another exemplary embodiment
- FIG. 2G is another exemplary schematic diagram involved in calculating location information in a positioning navigation method according to another exemplary embodiment
- FIG. 2H is still another exemplary schematic diagram involved in calculating location information in a positioning navigation method according to another exemplary embodiment
- FIG. 2I is still another exemplary schematic diagram involved in calculating location information in a positioning navigation method according to another exemplary embodiment
- FIG. 3 is a schematic diagram of a positioning navigation device according to an exemplary embodiment
- FIG. 4 is a schematic diagram of a positioning navigation device according to another exemplary embodiment
- FIG. 5 is an exemplary block diagram of an apparatus for positioning navigation, according to an exemplary embodiment.
- the embodiments of the present disclosure are illustrated by the above drawings and will be described in more detail below.
- the drawings and their description are not intended to limit the scope of the present disclosure in any way, but rather to illustrate the concepts of the present disclosure to those skilled in the art by reference to specific embodiments. Detailed Description
- FIG. 1 is a flowchart of a positioning and navigation method according to an exemplary embodiment. This embodiment is exemplified by the positioning navigation method for an electronic device.
- the positioning navigation method may include the following steps:
- step 102 the initial coordinates of the user are obtained.
- step 104 a predetermined number of environmental pictures in a predetermined geographical range corresponding to the initial coordinates are obtained, and absolute coordinates corresponding to each environmental picture are obtained from preset correspondences between different environmental pictures and different absolute coordinates. .
- step 106 for each environment picture, the relative position between the object in the environment picture and the user is obtained.
- step 108 the current geographic location information of the user is determined based on the relative position and the absolute coordinates.
- the positioning and navigation method, after acquiring the initial coordinates of the user, obtains a predetermined number of environment pictures within the predetermined geographic range corresponding to the initial coordinates, obtains the absolute coordinates corresponding to each environment picture from the preset correspondences between different environment pictures and different absolute coordinates, obtains the relative position between the object in each environment picture and the user, and determines the user's current geographic location information according to the relative positions and the absolute coordinates;
- this solves the problem that hardware components such as a gyroscope or an electronic compass are needed to obtain the user's orientation, which increases the weight, volume and production cost of the electronic device.
- FIG. 2A is a flowchart of a positioning and navigation method according to another exemplary embodiment. This embodiment is exemplified by the positioning navigation method for an electronic device.
- the positioning navigation method may include the following steps: In step 201, the initial coordinates of the user are obtained.
- the electronic device acquires the initial coordinates of the user through positioning methods such as GPS, base station or Wi-Fi.
- the user's initial coordinates are the coordinates, in the Earth's absolute coordinate system, of the geographic location where the user is currently located. Since all three of the above methods produce errors of different degrees, the initial coordinates acquired by the electronic device should be regarded as relatively coarse values; that is, the initial coordinates do not necessarily coincide exactly with the actual coordinates of the user's real geographic location.
- the initial coordinates of the user acquired by the positioning method such as GPS, base station or Wi-Fi are two-dimensional coordinates, and the altitude of the geographical location where the user is actually located is not considered.
- if three-dimensional initial coordinates are needed, they can be obtained by combining the above three positioning methods with components such as a barometric altimeter.
- step 202 a predetermined number of environmental images within a predetermined geographic range corresponding to the initial coordinates are obtained.
- After acquiring the initial coordinates of the user, the electronic device acquires a predetermined number of environment pictures within the predetermined geographic range corresponding to the initial coordinates.
- This step can include the following sub-steps:
- a plurality of candidate pictures are pre-stored in the electronic device or in a server corresponding to the application providing the positioning and navigation method of this embodiment; the candidate pictures are usually landmarks or scenery of various places, such as mountains, towers, tall buildings, schools, shops, etc.
- the candidate pictures can be collected and obtained by the technicians in advance, or can be obtained by sorting the pictures uploaded by different users.
- the electronic device or the server corresponding to the application providing the positioning and navigation method of this embodiment further stores correspondences between different candidate pictures and different absolute coordinates; the absolute coordinates are the coordinates, in the Earth's absolute coordinate system, of the geographic position of the object in the candidate picture.
- Normally, the absolute coordinates have been corrected and verified over a long period, so the absolute coordinates corresponding to each candidate picture can be considered accurate; that is, the absolute coordinates accurately reflect the actual geographic location of the object in the candidate picture.
- After acquiring the initial coordinates of the user, the electronic device determines a predetermined geographic range according to the initial coordinates, for example, a circular area centered at the initial coordinates with a radius of 500 meters. The electronic device then obtains the m candidate pictures whose absolute coordinates fall within the predetermined geographic range.
- an environment picture of a predetermined number n is selected from the m candidate pictures, m ≥ n > 0.
- the m candidate images are sorted according to a preset priority to obtain an alternate image sequence, and the predetermined number n of environment images are selected from the candidate image sequence.
- the m candidate pictures may be sorted according to the preset priority to obtain a candidate picture sequence. For example, the electronic device sorts the candidate pictures according to the distance between the absolute coordinates of each candidate picture and the initial coordinates, obtaining a candidate picture sequence
- in which the m candidate pictures are ordered from front to back by their distance from the initial coordinates, from near to far.
- the electronic device automatically selects the predetermined number n of environment pictures from the candidate picture sequence. Under normal circumstances, the electronic device selects the n highest-priority candidate pictures as environment pictures.
- the environment image is a picture of the environment around the user's current location.
- the predetermined number n is preset by the developer. According to different algorithms provided in this embodiment, the predetermined number n may be preset to 1, 2 or 3. Of course, this embodiment does not limit other possible values of the predetermined number n.
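The filtering and automatic selection of step 202 can be sketched as follows, assuming latitude/longitude absolute coordinates, great-circle (haversine) distance, and distance-based priority; the function and database names are illustrative only:

```python
import math

def haversine_m(a, b):
    """Great-circle distance in metres between two (lat, lon) points in degrees."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371000 * math.asin(math.sqrt(h))

def pick_environment_pictures(initial, picture_coords, radius_m=500.0, n=3):
    """Keep candidate pictures whose absolute coordinates lie within the
    predetermined range (a circle of radius_m around the initial coordinates),
    sort them near-to-far, and return the first n as environment pictures."""
    in_range = [(pic, haversine_m(initial, coord))
                for pic, coord in picture_coords.items()
                if haversine_m(initial, coord) <= radius_m]
    in_range.sort(key=lambda item: item[1])        # nearest candidates first
    return [pic for pic, _ in in_range[:n]]
```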
- part or all of the m candidate pictures are displayed, a selection signal corresponding to the candidate picture is received, and an environmental picture of a predetermined number n is determined according to the selection signal.
- some or all of the m candidate pictures may also be displayed, that is, the candidate pictures are presented to the user, and the user then selects the predetermined number n of environment pictures from them.
- In this case, the user can select environment pictures corresponding to objects that are clearly visible or closer to himself according to the actual current surroundings, which can improve the accuracy of subsequent positioning to a certain extent and also adds interaction and fun.
- step 203 the absolute coordinates corresponding to each environment picture are obtained from the correspondence between the preset different environment pictures and different absolute coordinates.
- the electronic device or the server corresponding to the application providing the positioning and navigation method of this embodiment further stores the correspondences between different candidate pictures and different absolute coordinates.
- the absolute coordinate is the coordinate corresponding to the geographical position of the object in the candidate picture in the absolute coordinate system of the earth coordinate system. After acquiring the predetermined number of environmental images, the electronic device acquires the absolute coordinates corresponding to each environmental image from the above correspondence.
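Step 203 is essentially a lookup in the stored correspondence. A trivial sketch with a hypothetical in-memory mapping (the entries and names are invented for illustration):

```python
# Hypothetical preset correspondence: environment picture id -> absolute (lat, lon)
CORRESPONDENCE = {
    "tower.jpg": (39.915, 116.404),
    "school.jpg": (39.910, 116.400),
}

def absolute_coords(environment_pictures):
    """Return the absolute coordinates corresponding to each environment picture."""
    return {pic: CORRESPONDENCE[pic] for pic in environment_pictures}
```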
- step 204 for each environment picture, the relative position between the object in the environment picture and the user is obtained.
- the electronic device acquires the relative position between the object in the environmental picture and the user.
- the relative position can be obtained through interaction with the user.
- this step may include the following sub-steps: first, displaying the environment picture and guiding information, the guiding information being used to guide the user to face the object in the environment picture, and/or to guide the user to move the environment picture in a corresponding direction according to the relative position between the object in the environment picture and himself, and/or to guide the user to move the environment picture to a corresponding position according to that relative position.
- the electronic device displays the environment picture 21 and the guidance information 22.
- the guide information 22 is "Can you see and turn to the object in the picture?".
- After viewing the environment picture 21 and the guidance information 22, the user turns to face the object in the environment picture 21 according to the prompt of the guidance information 22, and presses the "confirm" button 23.
- the electronic device acquires the relative position between the object in the environment picture and the user, and the relative position is that the object in the environment picture is directly in front of the user.
- the guidance information 22 may be "Please slide the picture in the corresponding direction according to the relative position between the object in the picture and yourself!"
- the user first determines the relative position between the object in the environment picture 21 and himself, for example, on the right side; the user then slides the environment picture 21 a certain distance to the right along the screen of the electronic device.
- the electronic device determines the relative position between the object in the environment picture and the user according to the sliding trajectory, that is, the object in the environment picture is on the right side of the user.
- the sliding direction can be arbitrary, and each sliding direction uniquely corresponds to a relative position. For example: sliding up corresponds to the object in the environment picture being directly in front of the user; sliding right corresponds to the object being on the right side of the user; sliding left corresponds to the object being on the left side of the user; sliding down corresponds to the object being behind the user; sliding at 45° towards the upper right corresponds to the object being 45° to the right in front of the user; sliding at 30° towards the upper left corresponds to the object being 30° to the left in front of the user; and so on.
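The swipe-direction mapping described above amounts to reading the object's bearing, relative to the user's facing direction, straight off the swipe vector. A minimal Python sketch (the function name and the screen-vector convention are assumptions; the embodiment only requires that each sliding direction correspond uniquely to one relative position):

```python
import math

def swipe_to_relative_bearing(dx, dy):
    """Map a swipe vector (dx right, dy up on screen) to the bearing of the
    pictured object relative to the user's facing direction, in degrees:
    0 = directly ahead, positive = clockwise (towards the user's right).
    Hypothetical helper, not the patent's literal implementation."""
    # atan2 of the swipe vector, measured clockwise from "up" (straight ahead)
    return math.degrees(math.atan2(dx, dy))

# Examples matching the text: up = in front, right = 90 deg to the right,
# a 45-degree upper-right swipe = 45 deg to the right-front.
assert round(swipe_to_relative_bearing(0, 1), 6) == 0.0
assert round(swipe_to_relative_bearing(1, 0), 6) == 90.0
assert round(swipe_to_relative_bearing(1, 1), 6) == 45.0
```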
- the electronic device still displays the environment picture 21 and the guidance information 22.
- the guidance information 22 may be "Please face the object in the first picture, and slide the second picture in the corresponding direction according to the relative position between the object in the second picture and yourself!"
- After viewing the two environment pictures 21 and the guidance information 22, the user, following the prompt of the guidance information 22, first turns to face the object in the first environment picture 21 and then determines the relative position between the object in the second environment picture 21 and himself, for example, on the right side; the user then slides the second environment picture 21 a certain distance to the right along the screen of the electronic device.
- After detecting the user's sliding signal, the electronic device acquires the relative position between the object in each of the two environment pictures and the user: the object in the first environment picture is directly in front of the user, and the object in the second environment picture is on the right side of the user.
- the guidance information 22 may also be "Please slide each picture in the corresponding direction according to the relative position between the object in the picture and yourself!"
- the electronic device can determine the relative position between the objects in the two environment pictures and the user according to the two sliding trajectories.
- the electronic device still displays the environment picture 21 and the guidance information 22.
- the guidance information 22 may be "Please put each picture into the corresponding position according to the relative position between the object in the picture and yourself!"
- Following the prompt of the guidance information 22, the user determines the relative positions between the objects in the three environment pictures 21 and himself, for example, directly in front, on the right side, and on the left side; the user then places the three environment pictures 21 into the corresponding boxes and presses the "confirm" button 23.
- After receiving the confirmation signal triggered by the user pressing the "confirm" button 23, the electronic device acquires the relative position between the object in each environment picture and the user: the object in the first environment picture is directly in front of the user, the object in the second environment picture is on the right side of the user, and the object in the third environment picture is on the left side of the user.
- the three relative positions are not limited to directly in front, the right side, and the left side.
- the relative position between the object in the environment picture and the user may also be obtained from a sliding track at any angle, which is not specifically limited in this embodiment.
- In step 205, the current geographic location information of the user is determined based on the relative positions and the absolute coordinates.
- After acquiring the relative position between the object in each environment picture and the user, as well as the absolute coordinates of the object in each environment picture, the electronic device calculates the user's current geographic location information from the relative positions and the absolute coordinates. The geographic location information includes the user's actual coordinates and orientation.
- when there is one environment picture, the current orientation of the user is determined based on the relative position, the absolute coordinates, and the initial coordinates, and the initial coordinates are determined as the actual coordinates of the user.
- that is, the initial coordinates are directly used as the actual coordinates.
- Although the initial coordinates may be less precise, they can still reflect the user's current geographical location within tolerance; more importantly, this simplifies the algorithm and improves positioning and navigation efficiency.
- the current orientation of the user is determined based on the relative position, the absolute coordinates, and the initial coordinates.
- the second sub-step described above may also include the following two sub-steps:
- Referring to FIG. 2F or FIG. 2G, assume an absolute coordinate system (i.e., the two-dimensional Cartesian coordinate system in the figure, where up is north, down is south, left is west, and right is east).
- From the coordinate point A (x1, y1) of the initial coordinates and the coordinate point B (x2, y2) of the absolute coordinates, the reference direction from coordinate point A (x1, y1) to coordinate point B (x2, y2) is obtained (indicated by the dashed arrow in the figure).
- For example, if the relative position is that the object in the environment picture is on the right side of the user, the orientation of the user is 90° counterclockwise from the reference direction from coordinate point A (x1, y1) to coordinate point B (x2, y2) (indicated by the solid arrow in FIG. 2G).
- if the object in the environment picture is on the left side of the user, the orientation of the user is 90° clockwise from the reference direction from coordinate point A to coordinate point B (not shown); or
- if the relative position is that the object in the environment picture is 30° to the left in front of the user, the orientation of the user is 30° clockwise from the reference direction from coordinate point A to coordinate point B.
- the electronic device can determine the orientation of the user based on the reference direction and the relative position.
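The one-picture case above can be sketched numerically: the user's orientation is the reference direction from A to B rotated by the object's relative bearing. A hypothetical Python illustration (angle conventions assumed: directions measured counterclockwise from east, relative bearings clockwise-positive, matching the 90° examples in the text):

```python
import math

def bearing_east_ccw(ax, ay, bx, by):
    """Angle of the vector A->B, measured counterclockwise from east, in degrees."""
    return math.degrees(math.atan2(by - ay, bx - ax))

def user_orientation(ax, ay, bx, by, relative_bearing_deg):
    """Orientation of a user at A who sees the object at B at the given
    relative bearing (0 = directly ahead, +90 = directly to the right,
    -30 = 30 degrees to the left-front). Rotating the reference direction
    A->B counterclockwise by the relative bearing recovers the facing
    direction."""
    return (bearing_east_ccw(ax, ay, bx, by) + relative_bearing_deg) % 360.0

# Object due north of A and directly in front: the user faces north (90 deg).
assert round(user_orientation(0, 0, 0, 1, 0), 6) == 90.0
# Same object but on the user's right (+90): the user faces 90 deg
# counterclockwise of the A->B reference direction, i.e. west (180 deg).
assert round(user_orientation(0, 0, 0, 1, 90), 6) == 180.0
```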
- The above embodiment is described only by taking, as an example, the angle α between the user's orientation and the eastward direction in the absolute coordinate system.
- In other embodiments, after obtaining the user's orientation, the angle between the orientation and any direction in the absolute coordinate system, including east, north, west, and so on, can be calculated.
- When there are two environment pictures, the current orientation and actual coordinates of the user are determined based on the relative positions, the absolute coordinates, and the initial coordinates.
- Please refer to FIG. 2D and the left side diagram of FIG. 2H.
- First, the straight line AB1 passing through the coordinate point A of the initial coordinates and the first coordinate point B1 in the absolute coordinates is acquired;
- then the actual coordinates are determined from the second coordinate point B2 in the absolute coordinates, the straight line AB1, and the two relative positions;
- finally, the orientation is determined according to the actual coordinates and at least one relative position.
- For example, the relative position between the object corresponding to the first coordinate point B1 (x2, y2) in the absolute coordinates and the user is 30° to the left in front of the user, and the second coordinate point in the absolute coordinates is B2 (x3, y3).
- The coordinates (x0, y0) of the resulting point S are the actual coordinates.
- The selection of the first coordinate point B1 of the two absolute coordinates has a high requirement: the accuracy of the relative position between the corresponding object and the user is directly related to the accuracy of the calculated actual coordinates and orientation. Therefore, the coordinate point corresponding to the object in front of the user is usually selected as the first coordinate point B1.
- the midpoint of the line segment AS shown in the left side diagram of FIG. 2H may also be selected as the coordinate point of the user's actual coordinates, for example when the GPS positioning function of the electronic device is not directly enabled.
- In this embodiment, the point S in the left side diagram of FIG. 2H is selected as the coordinate point of the user's actual coordinates.
- In actual applications, different algorithms can be selected according to actual needs to obtain the user's actual coordinates, which is not specifically limited in this embodiment.
- The left side diagram of FIG. 2H illustrates a first possible calculation method in the case where there are two environment pictures.
- For the second calculation method in the case of two environment pictures, please refer to FIG. 2D and the right side diagram of FIG. 2H.
- Assume that the relative position between the object in the first environment picture and the user acquired by the electronic device in the above step 204 is 30° to the left in front of the user, and the relative position between the object in the second environment picture and the user is 45° to the right in front of the user.
- The straight line AB2 is obtained by connecting the coordinate point B2 (x3, y3) and the coordinate point A (x1, y1) of the initial coordinates (indicated by the broken line in the figure).
- A first alternative orientation is determined based on the straight line AB1 and the relative position between the object in the first environment picture and the user. For example, since the relative position between the object in the first environment picture and the user is 30° to the left in front of the user, with A (x1, y1) as the vertex of an angle and the line AB1 as one side of the angle, an angle of 30° is made counterclockwise; the direction of the resulting ray AC1 is the first alternative orientation.
- A second alternative orientation is determined based on the straight line AB2 and the relative position between the object in the second environment picture and the user. For example, since the relative position between the object in the second environment picture and the user is 45° to the right in front of the user, with A (x1, y1) as the vertex of an angle and the line AB2 as one side of the angle, an angle ∠B2AC2 of 45° is made clockwise.
- The direction of the ray AC2 is the second alternative orientation.
- The direction of the ray AC3, obtained by combining the two alternative orientations, is the orientation of the user.
- The calculation method illustrated in the right side diagram of FIG. 2H considers the positional relationships between the user and the objects in both environment pictures when calculating the user's orientation and actual coordinates, and adopts an averaging algorithm.
- Determining the user's orientation and actual coordinates from the two alternative orientations improves the stability of the calculation result.
- When there are three environment pictures, the straight line AB1 passing through the coordinate point A of the initial coordinates and the first coordinate point B1 in the absolute coordinates, the straight line AB2 passing through the coordinate point A and the second coordinate point B2 in the absolute coordinates, and the straight line AB3 passing through the coordinate point A and the third coordinate point B3 in the absolute coordinates are respectively acquired; the orientation is determined according to the straight line AB1, the straight line AB2, the straight line AB3, and the three relative positions; and the actual coordinates are determined according to the orientation and at least one relative position.
- Assume that the relative position between the object in the first environment picture and the user acquired by the electronic device in the above step 204 is 30° to the left in front of the user, the relative position between the object in the second environment picture and the user is 45° to the right in front of the user, and the relative position between the object in the third environment picture and the user is 120° to the left in front of the user.
- The second coordinate point B2 (x3, y3) and the coordinate point A (x1, y1) of the initial coordinates are connected to obtain the straight line AB2; the third coordinate point B3 (x4, y4) in the absolute coordinates and the coordinate point A (x1, y1) of the initial coordinates are connected to obtain the straight line AB3 (indicated by the dotted lines in the figure).
- (1) A first alternative orientation is determined based on the straight line AB1 and the relative position between the object in the first environment picture and the user. For example, since the relative position between the object in the first environment picture and the user is 30° to the left in front of the user, with A (x1, y1) as the vertex of an angle and the line AB1 as one side of the angle, an angle of 30° is made counterclockwise; the direction of the resulting ray is the first alternative orientation.
- The direction of the ray AC5 is the orientation of the user.
- The angle α5 between the user's orientation and the eastward direction in the absolute coordinate system can then be calculated using the arctangent function.
- The first calculation method when there are three environment pictures is the same as or similar to the second calculation method when there are two environment pictures, and reference may be made to the latter.
- Since the relative position between the object in the second environment picture and the user is 45° to the right in front of the user, and the relative position between the object in the third environment picture and the user is 120° to the left in front of the user, a point S1 and a point S2 are selected on the line AB1 accordingly; please refer to the middle diagram of FIG. 2I.
- The coordinates of the midpoint of the line segment S1S2 may be selected as the user's actual coordinates (not shown in the figure); it is also possible to make a reverse extension line along B3S2 to a point S3 and select the coordinates (x0, y0) of the midpoint S of the line segment S1S3 as the user's actual coordinates.
- the orientation of the user is determined based on at least one relative position.
- The second calculation method when there are three environment pictures is the same as or similar to the first calculation method when there are two environment pictures, and reference may be made to the latter.
- the predetermined condition is that one relative position is that the object is directly in front of the user, another relative position is that the object is directly to the left of the user, and the other relative position is that the object is directly to the right of the user.
- The straight line B2B3 connecting the coordinate point B2, corresponding to the absolute coordinates of the object directly to the left, and the coordinate point B3, corresponding to the absolute coordinates of the object directly to the right, is obtained.
- The calculation method illustrated at the lower end of FIG. 2I does not need to use the initial coordinates when calculating the user's actual coordinates and orientation, so it is particularly suitable for positioning the user's actual geographical location when the acquired initial coordinates are less accurate.
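The special three-picture case (one object directly ahead, one directly left, one directly right) reduces to a foot-of-perpendicular computation: the user stands at the foot S of the perpendicular from the front object onto the left-right line B2B3 and faces from S towards the front object. A sketch under assumed names and coordinate conventions:

```python
import math

def foot_of_perpendicular(px, py, ax, ay, bx, by):
    """Foot of the perpendicular dropped from P onto the line through A and B."""
    abx, aby = bx - ax, by - ay
    t = ((px - ax) * abx + (py - ay) * aby) / (abx * abx + aby * aby)
    return ax + t * abx, ay + t * aby

def locate_user(front, left, right):
    """Sketch of the predetermined-condition case: the object at `front` is
    directly ahead, `left` directly to the user's left, `right` directly to
    the right. The user is at the foot S of the perpendicular from the front
    object onto line B2B3, facing from S towards the front object (degrees
    counterclockwise from east)."""
    sx, sy = foot_of_perpendicular(front[0], front[1],
                                   left[0], left[1], right[0], right[1])
    heading = math.degrees(math.atan2(front[1] - sy, front[0] - sx)) % 360.0
    return (sx, sy), heading

# Front object due north of the (unknown) user, flanking objects east/west:
pos, heading = locate_user(front=(0.0, 5.0), left=(-4.0, 0.0), right=(4.0, 0.0))
assert pos == (0.0, 0.0) and round(heading, 6) == 90.0
```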
- In addition, three-dimensional initial coordinates can be combined with environment pictures at different altitudes to realize positioning and navigation on different floors indoors, which fully improves the application scope and ease of use of the positioning and navigation method provided by this embodiment.
- the electronic device may perform the following steps:
- In step 206, the destination coordinates of the destination that the user needs to reach are obtained.
- the electronic device acquires the destination coordinates of the destination that the user needs to reach.
- the destination name is usually entered by the user, and then the electronic device acquires the destination coordinates of the destination in the absolute coordinate system based on the destination name entered by the user.
- In step 207, at least one route is determined based on the destination coordinates and the geographic location information.
- the electronic device determines at least one route based on the destination coordinates and geographic location information. Since the geographic location information includes the current actual coordinates of the user, after acquiring the destination coordinates and the actual coordinates, the electronic device can determine at least one route from the actual coordinates to the destination coordinate, that is, the current geographic location of the user. The route to the destination that the user needs to arrive at.
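Step 207 can be sketched as selecting, among candidate routes between the actual coordinates and the destination coordinates, the shortest one(s). This toy example assumes the candidate routes are already available (e.g., from map data), which the embodiment does not detail; all names are illustrative:

```python
import math

def route_length(route):
    """Total length of a polyline route given as (x, y) waypoints."""
    return sum(math.dist(a, b) for a, b in zip(route, route[1:]))

def pick_routes(actual, destination, candidate_routes, k=1):
    """Among candidate routes starting at the user's actual coordinates and
    ending at the destination coordinates, return the k shortest."""
    valid = [r for r in candidate_routes
             if r[0] == actual and r[-1] == destination]
    return sorted(valid, key=route_length)[:k]

a, d = (0.0, 0.0), (3.0, 4.0)
routes = [[a, (0.0, 4.0), d],  # detour, length 4 + 3 = 7
          [a, d],              # direct, length 5
          [a, (3.0, 0.0), d]]  # detour, length 3 + 4 = 7
assert pick_routes(a, d, routes) == [[a, d]]
```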
- In step 208, the geographic location information, the destination coordinates, and the route are displayed.
- the electronic device displays the user's orientation, the user's actual coordinates, the destination coordinates, and the route, and guides the user to the destination according to the displayed information.
- In summary, after acquiring the user's initial coordinates, the positioning and navigation method provided by this embodiment obtains a predetermined number of environment pictures within the predetermined geographical range corresponding to the initial coordinates, obtains the absolute coordinates corresponding to each environment picture from the preset correspondence between different environment pictures and different absolute coordinates, and determines the user's geographic location information from the relative positions and the absolute coordinates.
- This solves the problem that obtaining the user's orientation through hardware components such as a gyroscope or an electronic compass increases the weight, volume, and production cost of the electronic device.
- Geographic location information including the user's orientation can be obtained without installing hardware components such as a gyroscope or an electronic compass inside the electronic device, which reduces the weight and volume of the electronic device and saves production cost.
- In addition, this embodiment provides a plurality of methods for calculating the user's orientation and actual coordinates: some algorithms are simple and give high positioning and navigation efficiency, while some algorithms using the averaging method determine the user's orientation and actual coordinates from two or three alternative orientations, which improves the stability of the calculation result; others offer high interaction with the user.
- In actual applications, different calculation methods can be used according to different requirements.
- the following is an embodiment of the apparatus of the present disclosure, which may be used to implement the method embodiments of the present disclosure. For details not disclosed in the embodiments of the disclosed device, please refer to the method embodiments of the present disclosure.
- FIG. 3 is a schematic diagram of a positioning navigation device, which may be implemented as part or all of an electronic device by software, hardware, or a combination of both, according to an exemplary embodiment.
- the positioning and navigation device may include: an initial acquisition module 310, a picture acquisition module 320, a location acquisition module 330, and a map location module 340.
- the initial acquisition module 310 is configured to acquire the initial coordinates of the user.
- the picture acquisition module 320 is configured to acquire a predetermined number of environment pictures within the predetermined geographic range corresponding to the initial coordinates, and to obtain, from the preset correspondence between different environment pictures and different absolute coordinates, the absolute coordinates corresponding to each environment picture.
- the location acquisition module 330 is configured to obtain a relative location between the object in the environment picture and the user for each environmental picture.
- the map location module 340 is configured to determine current geographic location information for the user based on the relative location and the absolute coordinates.
- In summary, after acquiring the user's initial coordinates, the positioning and navigation device provided by this embodiment obtains a predetermined number of environment pictures within the predetermined geographical range corresponding to the initial coordinates together with their corresponding absolute coordinates, and determines the user's geographic location information from the relative positions and the absolute coordinates, so that the user's orientation can be obtained without hardware components such as a gyroscope or an electronic compass.
- FIG. 4 is a schematic diagram of a positioning navigation device, which may be implemented as part or all of an electronic device by software, hardware, or a combination of both, according to another exemplary embodiment.
- the positioning navigation device may include: an initial acquisition module 310, a picture acquisition module 320, a location acquisition module 330, a map location module 340, a destination acquisition module 350, a line determination module 360, and a navigation display module 370.
- the initial acquisition module 310 is configured to acquire the initial coordinates of the user.
- the picture acquisition module 320 is configured to acquire a predetermined number of environment pictures within the predetermined geographic range corresponding to the initial coordinates, and to obtain, from the preset correspondence between different environment pictures and different absolute coordinates, the absolute coordinates corresponding to each environment picture.
- the picture obtaining module 320 includes: an optional obtaining unit 320a and an environment obtaining unit 320b.
- the candidate acquisition unit 320a is configured to acquire m candidate pictures within the predetermined geographic range corresponding to the initial coordinates.
- the environment acquisition unit 320b is configured to select the predetermined number n of environment pictures from the m candidate pictures.
- the environment acquisition unit 320b includes: an automatic acquisition subunit 320b1; or, a user selection subunit 320b2.
- the automatic acquisition subunit 320b1 is configured to sort the m candidate pictures according to a preset priority to obtain a candidate picture sequence, and select the predetermined number n of environment pictures from the candidate picture sequence.
- the user selection subunit 320b2 is configured to display part or all of the m candidate pictures, receive a selection signal corresponding to a candidate picture, and determine the predetermined number n of environment pictures according to the selection signal.
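The automatic acquisition subunit's behavior can be sketched as a sort-then-truncate over the m candidate pictures. The priority criterion used here (distance to the initial coordinates, ascending) and the data layout are assumptions for illustration; the embodiment leaves the preset priority open:

```python
def pick_environment_pictures(candidates, n):
    """Sort the m candidate pictures by an assumed preset priority
    (distance to the initial coordinates, nearest first) and keep the
    first n as environment pictures."""
    ranked = sorted(candidates, key=lambda c: c["distance_m"])
    return [c["name"] for c in ranked[:n]]

candidates = [
    {"name": "fountain", "distance_m": 120},
    {"name": "tower",    "distance_m": 40},
    {"name": "gate",     "distance_m": 75},
]
assert pick_environment_pictures(candidates, 2) == ["tower", "gate"]
```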
- the location acquisition module 330 is configured to obtain a relative location between the object in the environment picture and the user for each environmental picture.
- the location acquiring module 330 includes: an information display unit 330a, a signal receiving unit 330b, and a location determining unit 330c.
- the information display unit 330a is configured to display the environment picture and the guidance information, the guidance information being used to guide the user to face the object in the environment picture, and/or to guide the user to move the environment picture in the direction corresponding to the relative position between the object in the environment picture and the user, and/or to guide the user to move the environment picture to the position corresponding to that relative position.
- the signal receiving unit 330b is configured to receive an input signal that the user triggers according to the guidance information.
- the position determining unit 330c is configured to determine a relative position between the object in the environment picture and the user based on the input signal.
- the map location module 340 is configured to determine current geographic location information for the user based on the relative location and the absolute coordinates.
- the map positioning module 340 includes: a first positioning unit 340a, and/or a second positioning unit 340b, and/or a third positioning unit 340c.
- the first positioning unit 340a is configured to determine a current orientation of the user according to the relative position, the absolute coordinate, and the initial coordinate when the environment picture is one, and determine the initial coordinate Is the actual coordinates of the user.
- the first positioning unit 340a includes: a direction acquiring subunit 340a1 and an orientation determining subunit 340a2.
- the direction acquisition subunit 340a1 is configured to acquire the reference direction from the coordinate point A of the initial coordinates to the coordinate point B of the absolute coordinates.
- the orientation determining sub-unit 340a2 is configured to determine the orientation based on the reference direction and the relative position.
- the second positioning unit 340b is configured to determine the current orientation and actual coordinates of the user according to the relative position, the absolute coordinates, and the initial coordinates when the environment picture is two sheets.
- the second positioning unit 340b includes: a first positioning subunit 340b1; or, a second positioning subunit 340b2.
- the first positioning subunit 340b1 is configured to acquire the straight line AB1 passing through the coordinate point A of the initial coordinates and the first coordinate point B1 in the absolute coordinates; determine the actual coordinates according to the second coordinate point B2 in the absolute coordinates, the straight line AB1, and the two relative positions; and determine the orientation according to the actual coordinates and at least one of the relative positions.
- the second positioning subunit 340b2 is configured to respectively acquire the straight line AB1 passing through the coordinate point A of the initial coordinates and the first coordinate point B1 in the absolute coordinates, and the straight line AB2 passing through the coordinate point A of the initial coordinates and the second coordinate point B2 in the absolute coordinates; determine the orientation according to the straight line AB1, the straight line AB2, and the two relative positions; and determine the actual coordinates according to the orientation and at least one of the relative positions.
- the third positioning unit 340c is configured to determine the current orientation and actual coordinates of the user according to the relative position and the absolute coordinates when the environment picture is 3 or more.
- the third positioning unit 340c includes: a straight line acquisition subunit 340c1, an orientation calculation subunit 340c2, and a coordinate calculation subunit 340c3.
- the straight line acquisition subunit 340c1 is configured to respectively acquire the straight line AB1 passing through the coordinate point A of the initial coordinates and the first coordinate point B1 in the absolute coordinates, the straight line AB2 passing through the coordinate point A of the initial coordinates and the second coordinate point B2 in the absolute coordinates, and the straight line AB3 passing through the coordinate point A of the initial coordinates and the third coordinate point B3 in the absolute coordinates.
- the orientation calculation subunit 340c2 is configured to determine the orientation according to the straight line AB1, the straight line AB2, the straight line AB3, and the three relative positions.
- the coordinate calculation sub-unit 340c3 is configured to determine the actual coordinates based on the orientation and at least one of the relative positions.
- In another possible implementation, the third positioning unit 340c includes: a condition detection subunit 340c4, a lateral connection subunit 340c5, a perpendicular acquisition subunit 340c6, and a result determination subunit 340c7.
- the condition detection subunit 340c4 is configured to detect whether the three relative positions satisfy a predetermined condition, the predetermined condition being that one relative position is that the object is directly in front of the user, another relative position is that the object is directly to the left of the user, and the other relative position is that the object is directly to the right of the user.
- the lateral connection subunit 340c5 is configured to, if it is detected that the predetermined condition is satisfied, acquire the straight line B2B3 connecting the coordinate point B2, corresponding to the absolute coordinates of the object directly to the left, and the coordinate point B3, corresponding to the absolute coordinates of the object directly to the right.
- the perpendicular acquisition subunit 340c6 is configured to acquire the perpendicular from the coordinate point B1, corresponding to the absolute coordinates of the object directly in front, to the straight line B2B3.
- the result determination subunit 340c7 is configured to determine the coordinates of the foot S of the perpendicular as the actual coordinates, and determine the direction from the foot S to the coordinate point B1 as the orientation.
- the destination acquisition module 350 is configured to obtain a destination coordinate that the user needs to reach the destination.
- the line determination module 360 is configured to determine at least one route based on the destination coordinates and the geographic location information.
- the navigation display module 370 is configured to display the geographic location information, the destination coordinates, and the route.
- In summary, after acquiring the user's initial coordinates, the positioning and navigation device provided by this embodiment obtains a predetermined number of environment pictures within the predetermined geographical range corresponding to the initial coordinates together with their corresponding absolute coordinates, and determines the user's geographic location information from the relative positions and the absolute coordinates, so that the user's orientation can be obtained without hardware components such as a gyroscope or an electronic compass.
- In addition, this embodiment provides a plurality of methods for calculating the user's orientation and actual coordinates: some algorithms are simple and give high positioning and navigation efficiency, while some algorithms using the averaging method determine the user's orientation and actual coordinates from two or three alternative orientations, which improves the stability of the calculation result; others offer high interaction with the user.
- In actual applications, different calculation methods can be adopted according to different requirements.
- FIG. 5 is a block diagram of an apparatus for positioning navigation, according to an exemplary embodiment.
- device 500 can be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a gaming console, a tablet device, a medical device, a fitness device, a personal digital assistant, and the like.
- apparatus 500 can include one or more of the following components: processing component 502, memory 504, power component 506, multimedia component 508, audio component 510, input/output (I/O) interface 512, sensor component 514, And a communication component 516.
- Processing component 502 typically controls the overall operation of device 500, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations.
- Processing component 502 can include one or more processors 520 to execute instructions to perform all or part of the steps described above. Additionally, processing component 502 can include one or more modules that facilitate interaction between processing component 502 and other components. For example, processing component 502 can include a multimedia module to facilitate interaction between multimedia component 508 and processing component 502.
- Memory 504 is configured to store various types of data to support operation at device 500. Examples of such data include instructions for any application or method operating on device 500, contact data, phone book data, messages, pictures, videos, and the like.
- the memory 504 can be implemented by any type of volatile or non-volatile storage device, or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk, or optical disk.
- Power component 506 provides power to various components of device 500.
- Power component 506 can include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for device 500.
- the multimedia component 508 includes a screen that provides an output interface between the device 500 and the user.
- the screen can include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen can be implemented as a touch screen to receive input signals from the user.
- the touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensor can sense not only the boundary of the touch or sliding action, but also the duration and pressure associated with the touch or slide operation.
- the multimedia component 508 includes a front camera and/or a rear camera. When the device 500 is in an operation mode, such as a shooting mode or a video mode, the front camera and/or the rear camera can receive external multimedia data. Each of the front camera and the rear camera can be a fixed optical lens system or have focusing and optical zoom capability.
- Audio component 510 is configured to output and/or input audio signals.
- audio component 510 includes a microphone (MIC) that is configured to receive an external audio signal when device 500 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode.
- the received audio signal may be further stored in memory 504 or transmitted via communication component 516.
- audio component 510 also includes a speaker for outputting an audio signal.
- the I/O interface 512 provides an interface between the processing component 502 and the peripheral interface module, which may be a keyboard, a click wheel, a button, or the like. These buttons can include, but are not limited to: Home button, Volume button, Start button, and Lock button.
- Sensor assembly 514 includes one or more sensors for providing status assessments of various aspects of device 500.
- sensor component 514 can detect the open/closed state of device 500 and the relative positioning of components (for example, the display and keypad of device 500), and can also detect a change in position of device 500 or a component of device 500, the presence or absence of user contact with device 500, the orientation or acceleration/deceleration of device 500, and a change in the temperature of device 500.
- Sensor assembly 514 can include a proximity sensor configured to detect the presence of nearby objects without any physical contact.
- Sensor assembly 514 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications.
- the sensor component 514 can also include an acceleration sensor, a gyro sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
- Communication component 516 is configured to facilitate wired or wireless communication between device 500 and other devices.
- the device 500 can access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof.
- communication component 516 receives broadcast signals or broadcast associated information from an external broadcast management system via a broadcast channel.
- the communication component 516 also includes a near field communication (NFC) module to facilitate short range communication.
- the NFC module can be implemented based on radio frequency identification (RFID) technology, infrared data association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
- device 500 may be implemented by one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components, for performing the above methods.
- a non-transitory computer readable storage medium comprising instructions is also provided, such as the memory 504 comprising instructions executable by processor 520 of apparatus 500 to perform the above method.
- the non-transitory computer readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, or an optical data storage device.
- a non-transitory computer readable storage medium, wherein instructions in the storage medium, when executed by a processor of apparatus 500, enable apparatus 500 to perform the positioning and navigation method illustrated in Figure 1 or Figure 2A above.
- Other embodiments of the invention will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein.
- the present application is intended to cover any variations, uses, or adaptations of the invention that are in accordance with its general principles and that include common general knowledge or customary technical means in the art not disclosed in the present disclosure.
- the specification and examples are to be regarded as illustrative only, with the true scope and spirit of the invention being indicated by the appended claims.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016510932A JP6116756B2 (ja) | 2014-03-31 | 2014-07-24 | 測位・ナビゲーション方法、装置、プログラム、及び記録媒体 |
RU2015134187A RU2608971C1 (ru) | 2014-03-31 | 2014-07-24 | Способ и устройство для позиционирования и навигации |
KR1020147026581A KR101639312B1 (ko) | 2014-03-31 | 2014-07-24 | 측위 네비게이션 방법, 장치, 프로그램 및 기록매체 |
MX2014011940A MX350053B (es) | 2014-03-31 | 2014-07-24 | Metodo y aparato de posicionamiento y navegacion. |
US14/543,106 US9818196B2 (en) | 2014-03-31 | 2014-11-17 | Method and device for positioning and navigating |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410126006.3A CN103968846B (zh) | 2014-03-31 | 2014-03-31 | 定位导航方法和装置 |
CN201410126006.3 | 2014-03-31 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/543,106 Continuation US9818196B2 (en) | 2014-03-31 | 2014-11-17 | Method and device for positioning and navigating |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015149455A1 true WO2015149455A1 (zh) | 2015-10-08 |
Family
ID=51238639
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2014/082911 WO2015149455A1 (zh) | 2014-03-31 | 2014-07-24 | 定位导航方法和装置 |
Country Status (7)
Country | Link |
---|---|
EP (1) | EP2927638B1 (zh) |
JP (1) | JP6116756B2 (zh) |
KR (1) | KR101639312B1 (zh) |
CN (1) | CN103968846B (zh) |
MX (1) | MX350053B (zh) |
RU (1) | RU2608971C1 (zh) |
WO (1) | WO2015149455A1 (zh) |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104581637A (zh) * | 2015-01-20 | 2015-04-29 | 北京嘀嘀无限科技发展有限公司 | 定位的方法及设备 |
CN105992148B (zh) * | 2015-02-15 | 2020-07-03 | 索尼公司 | 用于无线通信系统的通信设备和通信方法 |
CN105451179A (zh) * | 2015-12-25 | 2016-03-30 | 百度在线网络技术(北京)有限公司 | 一种定位方法及装置 |
US10168173B2 (en) * | 2016-10-26 | 2019-01-01 | Google Llc | Systems and methods for using visual landmarks in initial navigation |
CN106658409A (zh) * | 2016-12-07 | 2017-05-10 | 雷蕾 | 一种定位方法和系统 |
CN107687854A (zh) * | 2017-07-20 | 2018-02-13 | 努比亚技术有限公司 | 一种室内导航方法、终端和计算机可读存储介质 |
CN107727104B (zh) * | 2017-08-16 | 2019-04-30 | 北京极智嘉科技有限公司 | 结合标识的同时定位和地图创建导航方法、装置及系统 |
CN109146932B (zh) * | 2018-07-17 | 2021-08-24 | 北京旷视科技有限公司 | 确定图像中目标点的世界坐标的方法、装置和系统 |
KR102212825B1 (ko) | 2019-04-08 | 2021-02-08 | 네이버랩스 주식회사 | 이미지를 기반으로 포즈 계산을 위한 지도의 최신성을 유지하는 방법 및 시스템 |
CN110162658A (zh) * | 2019-05-31 | 2019-08-23 | 广东小天才科技有限公司 | 位置信息获取方法、装置、终端及存储介质 |
KR102383567B1 (ko) * | 2019-12-16 | 2022-04-06 | 네이버랩스 주식회사 | 시각 정보 처리 기반의 위치 인식 방법 및 시스템 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101334285A (zh) * | 2007-06-29 | 2008-12-31 | 鸿富锦精密工业(深圳)有限公司 | 车辆导航装置及导航方法 |
US20110118973A1 (en) * | 2009-11-16 | 2011-05-19 | Industrial Technology Research Institute | Image processing method and system |
CN103052151A (zh) * | 2011-10-14 | 2013-04-17 | 中国电信股份有限公司 | 终端定位方法、装置和移动终端 |
CN103249142A (zh) * | 2013-04-26 | 2013-08-14 | 东莞宇龙通信科技有限公司 | 一种定位方法、系统及移动终端 |
Family Cites Families (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
SU1747905A1 (ru) * | 1990-10-31 | 1992-07-15 | Botuz Sergej P | Способ многоканальной регистрации результатов измерений и устройство дл его осуществлени |
US5862511A (en) * | 1995-12-28 | 1999-01-19 | Magellan Dis, Inc. | Vehicle navigation system and method |
US5948043A (en) * | 1996-11-08 | 1999-09-07 | Etak, Inc. | Navigation system using GPS data |
EP1059510A1 (en) * | 1999-06-10 | 2000-12-13 | Texas Instruments Incorporated | Wireless location |
JP4672190B2 (ja) * | 2001-04-26 | 2011-04-20 | 三菱電機株式会社 | 映像ナビゲーション装置 |
US6615135B2 (en) * | 2001-05-24 | 2003-09-02 | Prc Inc. | Satellite based on-board vehicle navigation system including predictive filtering and map-matching to reduce errors in a vehicular position |
US6766245B2 (en) * | 2002-03-14 | 2004-07-20 | Microsoft Corporation | Landmark-based location of users |
US8768617B2 (en) * | 2003-10-06 | 2014-07-01 | Csr Technology Inc. | Method and system for a data interface for aiding a satellite positioning system receiver |
JP4273119B2 (ja) * | 2003-10-21 | 2009-06-03 | 和郎 岩根 | ナビゲーション装置 |
US8942483B2 (en) * | 2009-09-14 | 2015-01-27 | Trimble Navigation Limited | Image-based georeferencing |
KR100855657B1 (ko) * | 2006-09-28 | 2008-09-08 | 부천산업진흥재단 | 단안 줌 카메라를 이용한 이동로봇의 자기위치 추정 시스템및 방법 |
US20080153516A1 (en) * | 2006-12-20 | 2008-06-26 | Via Technologies, Inc. | Visual Positioning System and Method for Mobile User Equipment |
JP5145735B2 (ja) * | 2007-03-02 | 2013-02-20 | 株式会社豊田中央研究所 | 測位装置及び測位システム |
JP2010197209A (ja) * | 2009-02-25 | 2010-09-09 | Toshiba Corp | ナビゲーションシステム装置 |
KR20130089068A (ko) * | 2012-02-01 | 2013-08-09 | 현대모비스 주식회사 | 카메라를 이용한 차량 위치 보정 장치 및 방법 |
KR101339354B1 (ko) * | 2012-03-26 | 2013-12-09 | 한국철도기술연구원 | 영상을 이용한 철도차량의 위치검지 시스템 및 위치검지방법 |
CN102829775A (zh) * | 2012-08-29 | 2012-12-19 | 成都理想境界科技有限公司 | 一种室内导航方法、系统及设备 |
CN102889892B (zh) * | 2012-09-13 | 2015-11-25 | 东莞宇龙通信科技有限公司 | 实景导航的方法及导航终端 |
CN103424113B (zh) * | 2013-08-01 | 2014-12-31 | 毛蔚青 | 移动终端基于图像识别技术的室内定位与导航方法 |
CN103398717B (zh) * | 2013-08-22 | 2016-04-20 | 成都理想境界科技有限公司 | 全景地图数据库采集系统及基于视觉的定位、导航方法 |
- 2014-03-31 CN CN201410126006.3A patent/CN103968846B/zh active Active
- 2014-07-24 KR KR1020147026581A patent/KR101639312B1/ko active IP Right Grant
- 2014-07-24 RU RU2015134187A patent/RU2608971C1/ru active
- 2014-07-24 MX MX2014011940A patent/MX350053B/es active IP Right Grant
- 2014-07-24 WO PCT/CN2014/082911 patent/WO2015149455A1/zh active Application Filing
- 2014-07-24 JP JP2016510932A patent/JP6116756B2/ja active Active
- 2014-12-09 EP EP14197062.4A patent/EP2927638B1/en active Active
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108363603A (zh) * | 2018-01-29 | 2018-08-03 | 上海闻泰电子科技有限公司 | 信息指引方法、装置、移动终端以及存储装置 |
CN108363603B (zh) * | 2018-01-29 | 2022-04-01 | 上海闻泰电子科技有限公司 | 信息指引方法、装置、移动终端以及存储装置 |
Also Published As
Publication number | Publication date |
---|---|
MX2014011940A (es) | 2016-04-26 |
JP2016522895A (ja) | 2016-08-04 |
MX350053B (es) | 2017-08-23 |
RU2608971C1 (ru) | 2017-01-30 |
KR101639312B1 (ko) | 2016-07-13 |
CN103968846B (zh) | 2017-02-08 |
JP6116756B2 (ja) | 2017-04-19 |
EP2927638A1 (en) | 2015-10-07 |
EP2927638B1 (en) | 2017-03-29 |
KR20150124372A (ko) | 2015-11-05 |
CN103968846A (zh) | 2014-08-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2015149455A1 (zh) | 定位导航方法和装置 | |
CN105203100B (zh) | 智能引导用户搭乘电梯的方法及装置 | |
US20170064577A1 (en) | Information Display Method and Apparatus, and Storage Medium | |
US9818196B2 (en) | Method and device for positioning and navigating | |
US20190082289A1 (en) | Portable apparatus and method of controlling location information of portable apparatus | |
US8433244B2 (en) | Orientation based control of mobile device | |
CN104034309B (zh) | 角度测量方法、装置及终端 | |
US10356558B2 (en) | Obstacle locating method and apparatus | |
EP3848773A1 (en) | Smart globe and control method therefor | |
US20140324938A1 (en) | Device and Method for Generating Data for Generating or Modifying a Display Object | |
US20180032152A1 (en) | Mobile terminal and method for determining scrolling speed | |
US9883018B2 (en) | Apparatus for recording conversation and method thereof | |
CN111126697B (zh) | 人员情况预测方法、装置、设备及存储介质 | |
CN103906235A (zh) | 终端定位的方法及终端 | |
EP2998705A1 (en) | Mobile terminal and control method for the mobile terminal | |
CN105892869A (zh) | 图片位置调整方法及装置 | |
CN106231237B (zh) | 视频通话方法及装置 | |
CN109696166A (zh) | 一种导航方法及装置 | |
WO2022237071A1 (zh) | 定位方法及装置、电子设备、存储介质和计算机程序 | |
CN112432636B (zh) | 定位方法及装置、电子设备和存储介质 | |
CN106289161A (zh) | 高度测量方法及装置 | |
WO2022110777A1 (zh) | 定位方法及装置、电子设备、存储介质、计算机程序产品、计算机程序 | |
EP2924568A1 (en) | Execution method and device for program string | |
CN104199640B (zh) | 方向测量方法、装置及终端 | |
CN112860827A (zh) | 设备间交互控制方法、设备间交互控制装置及存储介质 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2016510932 Country of ref document: JP Kind code of ref document: A |
ENP | Entry into the national phase |
Ref document number: 20147026581 Country of ref document: KR Kind code of ref document: A |
WWE | Wipo information: entry into national phase |
Ref document number: MX/A/2014/011940 Country of ref document: MX |
ENP | Entry into the national phase |
Ref document number: 2015134187 Country of ref document: RU Kind code of ref document: A |
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14887788 Country of ref document: EP Kind code of ref document: A1 |
REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: 112014024315 Country of ref document: BR |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 14887788 Country of ref document: EP Kind code of ref document: A1 |
ENP | Entry into the national phase |
Ref document number: 112014024315 Country of ref document: BR Kind code of ref document: A2 Effective date: 20140930 |