US20180357824A1 - Augmented Reality Positioning Method and Apparatus for Location-Based Service LBS


Info

Publication number
US20180357824A1
US20180357824A1
Authority
US
United States
Prior art keywords
information
terminal
location
server
drawn
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US15/991,343
Other versions
US11164379B2 (en)
Inventor
Zhongqin Wu
Miao Yao
Yongjie Zhang
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Baidu Online Network Technology Beijing Co Ltd
Original Assignee
Baidu Online Network Technology Beijing Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Baidu Online Network Technology (Beijing) Co., Ltd.
Assigned to BAIDU ONLINE NETWORK TECHNOLOGY (BEIJING) CO., LTD. Assignors: WU, Zhongqin; YAO, Miao; ZHANG, Yongjie (assignment of assignors' interest; see document for details).
Publication of US20180357824A1
Application granted
Publication of US11164379B2
Legal status: Active

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00 Network arrangements or protocols for supporting network services or applications
    • H04L 67/50 Network services
    • H04L 67/52 Network services specially adapted for the location of the user terminal
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • G06T 19/006 Mixed reality
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S 19/38 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S 19/39 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system, the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S 19/42 Determining position
    • G01S 19/45 Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement
    • G01S 19/46 Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement, the supplementary measurement being of a radio-wave signal type
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00 Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00
    • G01C 21/10 Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00 by using measurements of speed or acceleration
    • G01C 21/12 Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C 21/16 Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C 21/165 Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S 19/38 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S 19/39 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system, the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S 19/42 Determining position
    • G01S 19/45 Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement
    • G01S 19/47 Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement, the supplementary measurement being an inertial measurement, e.g. tightly coupled inertial
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00 Network arrangements or protocols for supporting network services or applications
    • H04L 67/01 Protocols
    • H04L 67/131 Protocols for games, networked simulations or virtual reality
    • H04L 67/18
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/02 Services making use of location information
    • H04W 4/023 Services making use of location information using mutual or relative location information between multiple location based services [LBS] targets or of distance thresholds

Definitions

  • the present disclosure relates to the field of Internet application, and particularly to an augmented reality positioning method and apparatus for location-based service LBS.
  • a positioning module is built in a terminal which may perform positioning via a positioning system such as Global Positioning System (GPS) or a base station to obtain a geographical location of the terminal.
  • the positioning function of the terminal enables a user to acquire his own geographical location even in an unfamiliar environment, so as not to get lost.
  • the three major goals of LBS (Location Based Service) are: where are you, who are you with, and what resources are nearby, wherein "where are you" is the core of LBS.
  • Dual-terminal users may perform positioning via the GPS modules of their mobile terminals, display location information on the interfaces of their terminals, and meanwhile obtain rough prompts of route planning, distance and direction. For example, in some ride-hailing applications, a user and a driver share locations to facilitate accurate acquisition of each other's current locations.
  • however, GPS positioning on a consumer-grade mobile terminal (e.g., a mobile phone or a tablet computer) has drawbacks:
  • the obtained location is usually inaccurate. A location positioned by GPS may deviate by tens of meters due to factors such as the environment, and it is difficult to obtain precise geographical location information, so the navigation route is inaccurate.
  • after the terminals come close to each other and their positioning information is substantially in a coincident or nearly coincident range, relative positioning can no longer be performed on the interfaces of the terminals.
  • a plurality of aspects of the present disclosure provide an augmented reality positioning method and apparatus for location-based service LBS, to help the user quickly locate one or more other terminals to be found.
  • an augmented reality positioning method for location-based service LBS comprising:
  • a first terminal obtains image information captured by a camera, and receives AR information transmitted by a server; the AR information being generated according to location information of a second terminal;
  • the first terminal displays the image information drawn with the AR information.
  • the direction information comprises a tilt angle posture of the terminal.
  • the first terminal obtaining image information captured by the camera comprises: capturing an event that a real scene navigation function is triggered, and activating the camera of the first terminal;
  • the event that the real scene navigation function is triggered comprises: a click of a real scene navigation button, or the tilt angle posture of the first terminal being in a preset range.
  • the method further comprises:
  • the first terminal transmits the location information and direction information to the server;
  • the AR information comprises a 3D model carrying the location and direction information of the first terminal and the location information of the second terminal;
  • the image information drawn with the AR information is drawn by the first terminal.
  • the method further comprises:
  • the first terminal transmits the location information and direction information to the server;
  • the first terminal transmits the image information to the server;
  • the first terminal receiving the AR information transmitted by the server comprises: the first terminal receiving the image information transmitted by the server and drawn with the AR information.
  • the AR information comprises: distance information between the first terminal and the second terminal, and relevant auxiliary prompt information.
  • an augmented reality positioning method for location-based service LBS comprising:
  • the server receives location information and direction information transmitted by terminals, and the terminals comprise a first terminal and a second terminal;
  • the server transmits AR information to the first terminal, and the AR information is generated based on location information of the second terminal so that the first terminal, upon obtaining image information captured by a camera, displays the image information drawn with the AR information.
  • the direction information comprises a tilt angle posture of the terminal.
  • the first terminal obtaining image information captured by the camera comprises: capturing an event that a real scene navigation function is triggered, and activating the camera of the first terminal;
  • the event that the real scene navigation function is triggered comprises: a click of a real scene navigation button, or the tilt angle posture of the first terminal being in a preset range.
  • the method further comprises:
  • the AR information comprises a 3D model carrying the location and direction information of the first terminal and the location information of the second terminal;
  • the image information drawn with the AR information is drawn by the first terminal.
  • the method further comprises:
  • receiving the image information transmitted by the first terminal; the server transmitting the AR information to the first terminal comprises: the server transmitting the image information drawn with the AR information to the first terminal.
  • the AR information comprises: distance information between the first terminal and the second terminal, and relevant auxiliary prompt information.
  • an augmented reality positioning apparatus for location-based service LBS comprising:
  • a positioning module configured to obtain location information and direction information of a terminal
  • a transmitting module configured to transmit the location information and direction information to a server;
  • a receiving module configured to receive AR information transmitted by the server, the AR information being generated according to the location information of the second terminal;
  • a display module configured to display the image information drawn with the AR information.
  • the AR information comprises a 3D model carrying the location and direction information of the first terminal and the location information of the second terminal;
  • the apparatus further comprises a drawing module configured to draw the AR information onto the image information.
  • the transmitting module is further configured to transmit the image information to the server;
  • the receiving the AR information transmitted by the server comprises: receiving the image information transmitted by the server and drawn with the AR information.
  • an augmented reality positioning apparatus for location-based service LBS comprising:
  • a receiving module configured to receive location information and direction information transmitted by terminals, the terminals comprising a first terminal and a second terminal;
  • a transmitting module configured to transmit AR information to the first terminal, the AR information being generated based on the location information and direction information of the second terminal so that the first terminal, upon obtaining image information captured by a camera, displays the image information drawn with the AR information.
  • the AR information comprises a 3D model carrying the location and direction information of the first terminal and the location information of the second terminal;
  • the image information drawn with the AR information is drawn by the first terminal.
  • the receiving module is configured to receive the image information transmitted by the first terminal
  • the transmitting the AR information to the first terminal comprises: transmitting the image information drawn with the AR information to the first terminal.
  • an apparatus wherein the apparatus comprises:
  • one or more processors;
  • a storage device for storing one or more programs
  • when the one or more programs are executed by the one or more processors, the one or more processors are enabled to implement the above-mentioned method.
  • a computer readable storage medium on which a computer program is stored, wherein the program, when executed by a processor, implements the above-mentioned method.
  • the image information captured by the camera is obtained, and the image information drawn with the AR information is displayed, to help the user quickly locate one or more other terminals to be found.
  • FIG. 1 is a flow chart of an augmented reality positioning method for location-based service LBS according to an embodiment of the present disclosure
  • FIG. 2 is a schematic diagram of the use of an augmented reality positioning method for location-based service LBS according to an embodiment of the present disclosure
  • FIG. 3 is a schematic diagram of a human-machine interaction interface of an augmented reality positioning method for location-based service LBS according to an embodiment of the present disclosure
  • FIG. 4 is a flow chart of an augmented reality positioning method for location-based service LBS according to an embodiment of the present disclosure
  • FIG. 5 is a flow chart of an augmented reality positioning method for location-based service LBS according to another embodiment of the present disclosure
  • FIG. 6 is a flow chart of an augmented reality positioning method for location-based service LBS according to another embodiment of the present disclosure.
  • FIG. 7 is a block diagram of an augmented reality positioning apparatus for location-based service LBS according to another embodiment of the present disclosure.
  • FIG. 8 is a block diagram of an augmented reality positioning apparatus for location-based service LBS according to another embodiment of the present disclosure.
  • FIG. 9 is a block diagram of an example computer system/server adapted to implement an embodiment of the present disclosure.
  • the term "and/or" used in the text merely describes an association relationship between associated objects and indicates that three relations may exist; for example, A and/or B may represent three cases: A exists individually, both A and B coexist, and B exists individually.
  • the symbol "/" in the text generally indicates that the objects before and after the symbol are in an "or" relationship.
  • a first terminal and a second terminal are provided.
  • the provided first terminal and second terminal are configured to illustrate embodiments of the present disclosure, and technical ideas of the present disclosure are not limited to this.
  • exemplary embodiments of the present disclosure may be adapted to navigation and positioning scenarios involving a plurality of terminals.
  • the terminal comprises a smart terminal device such as a mobile phone or a tablet computer.
  • Operating systems installed on the smart terminal device comprise but are not limited to iOS, Android, Windows, Linux and Mac OS.
  • the first terminal and the second terminal enter a plane navigation mode according to the user's instruction, and respectively send a navigation request to a server.
  • the navigation request respectively includes location information of the first terminal/second terminal.
  • the server, based on the location information of the first terminal and the second terminal, calculates a first path from the location of the first terminal to the location of the second terminal to navigate the first terminal towards the second terminal, and calculates a second path from the location of the second terminal to the location of the first terminal to navigate the second terminal towards the first terminal.
  • An exemplary embodiment of the present disclosure provides an augmented reality positioning method for location-based service LBS, wherein the augmented reality positioning mode is activated when a relative distance of the first terminal and the second terminal is smaller than a preset threshold.
  • the terminal acquires an image or video of the real-time surrounding environment via a camera, generates virtual AR information carrying the peer's GPS location information, superimposes it on the image or video, and displays the result on the terminal screen.
  • the preset threshold may be set according to actual needs, for example as 20 meters. That is, a judgment is made as to whether the relative distance between the first terminal and the second terminal is smaller than 20 meters; if so, it is determined that the two terminals have entered a short-distance scope.
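  • as a minimal sketch of this threshold check (the haversine distance and all names below are illustrative assumptions; the patent does not prescribe how the relative distance is computed):

```python
import math

AR_MODE_THRESHOLD_M = 20.0  # example preset threshold from the text above

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) fixes."""
    r = 6371000.0  # mean Earth radius in meters
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def should_activate_ar_mode(first_fix, second_fix):
    """True once the relative distance enters the short-distance scope.
    first_fix and second_fix are (lat, lon) tuples."""
    return haversine_m(*first_fix, *second_fix) < AR_MODE_THRESHOLD_M
```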
  • FIG. 1 is a flow chart of an augmented reality positioning method for location-based service LBS according to an embodiment of the present disclosure. As shown in FIG. 1 , the method comprises the following steps:
  • the first terminal transmits location information and direction information to the server;
  • the first terminal is a mobile terminal of a ride-hailing user.
  • the first terminal may acquire its own location information via a satellite navigation system such as GPS, GLONASS or BeiDou, and may acquire its own direction information (e.g., azimuth information or geomagnetic information) via an inertial sensor such as a gyroscope or a magnetometer.
  • the direction information further comprises a tilt angle posture of the terminal, namely, an angle between the terminal and a horizontal plane.
  • the detection information comprises at least one of GPS positioning information, azimuth information and geomagnetic information.
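  • the uploaded report might look like the sketch below (the JSON layout and every field name are illustrative assumptions; the patent does not specify a wire format):

```python
import json
import time

def build_position_report(terminal_id, lat, lon, azimuth_deg, tilt_deg):
    """Assemble the location/direction report a terminal uploads (hypothetical schema)."""
    return json.dumps({
        "terminal_id": terminal_id,
        "timestamp": time.time(),
        "location": {"lat": lat, "lon": lon},  # GPS positioning information
        "azimuth_deg": azimuth_deg,            # heading from gyro/magnetometer
        "tilt_deg": tilt_deg,                  # angle between terminal and horizontal plane
    })
```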
  • the first terminal obtains image information captured by the camera
  • the user holds the first terminal with a hand, and the camera arranged on the back of the first terminal faces towards a direction in which the user advances, whereupon the image information including images or video data is obtained via the camera arranged on the back of the first terminal.
  • the step comprises:
  • the event that the real scene navigation function is triggered comprises:
  • a virtual key for activating the real scene navigation is provided in a plane navigation interface.
  • when triggered, the camera arranged on the back of the first terminal is activated to capture the image or video data at the current location and in the current direction.
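  • in code, the trigger might reduce to a check like the following sketch (the preset tilt range is an illustrative assumption):

```python
TILT_RANGE_DEG = (30.0, 90.0)  # assumed preset range for the tilt angle posture

def real_scene_navigation_triggered(button_clicked: bool, tilt_deg: float) -> bool:
    """Either trigger activates the rear camera: an explicit click of the
    real scene navigation button, or raising the terminal into the preset range."""
    low, high = TILT_RANGE_DEG
    return button_clicked or low <= tilt_deg <= high
```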
  • the first terminal transmits the captured image or video data at the current location and in the current direction to the server;
  • the first terminal receives calibrated location and direction information transmitted by the server, which is calibrated according to the image or video data;
  • a database including many real-scene pictures/three-dimensional model images is pre-arranged in the server, and the real-scene pictures/three-dimensional model images are stored together with their corresponding location and direction information; image frames or video key frames are obtained by processing the image or video data transmitted by the first terminal, and the location and direction information of the first terminal is calibrated by comparing these frames against the real-scene pictures/three-dimensional model images.
  • the location information determined by the positioning system has certain errors but may determine an approximate geographical location range; more accurate location and direction information of the first terminal may be obtained through the above processing.
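  • one way to read this calibration step is sketched below (the descriptor matching and all names are assumptions; the patent only states that key frames are compared against the stored real-scene pictures/three-dimensional model images):

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class ScenePicture:
    descriptor: List[float]         # precomputed visual descriptor (illustrative)
    location: Tuple[float, float]   # calibrated (lat, lon) of the stored picture
    direction_deg: float            # calibrated heading of the stored picture

def similarity(a: List[float], b: List[float]) -> float:
    """Toy similarity score; a real system would use image feature matching."""
    return -sum((x - y) ** 2 for x, y in zip(a, b))

def calibrate_pose(frame_descriptor: List[float], candidates: List[ScenePicture]):
    """Refine the coarse GPS fix: pick the stored picture whose descriptor best
    matches the uploaded key frame, and return its location and direction."""
    best = max(candidates, key=lambda p: similarity(frame_descriptor, p.descriptor))
    return best.location, best.direction_deg
```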
  • the first terminal receives AR information transmitted by the server, and the AR information is generated according to the location information of the second terminal.
  • the AR information comprises: a 3D model carrying the location and direction information of the first terminal and the location information of the second terminal.
  • the AR information further comprises: distance information between the first terminal and the second terminal, and relevant auxiliary prompt information such as user-related personal information and other information that can be further obtained through the account, such as the called taxi driver's information, car model information and car plate number information.
  • the AR information is used to present, in an augmented reality manner, the location information of the second terminal in the image or video data captured by the first terminal.
  • a 3D model of a navigation area is preset in the server.
  • the 3D model carrying the location and direction information of the first terminal and location information of the second terminal is generated according to the location and direction information of the first terminal and the location information of the second terminal.
  • the first terminal draws the AR information on the image information and displays it.
  • the location information of the second terminal is mapped to a 2-dimensional location of a view finder of the camera of the first terminal, and the AR information is displayed at the 2-dimensional location.
  • the AR information is drawn on the image frame or video stream by using computer graphics processing technology.
  • the AR information and the image frame or video stream are subjected to a rendering operation to obtain a final image frame or video stream for output;
  • the rendered image frame or video stream is drawn into a memory buffer;
  • the image frame or video stream drawn in the memory is displayed on a screen of the first terminal.
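  • a minimal sketch of this mapping step under a local flat-earth approximation (valid over the tens-of-meters scope in which AR positioning is active; the field-of-view and screen constants are illustrative assumptions):

```python
import math

H_FOV_DEG, V_FOV_DEG = 60.0, 45.0  # assumed camera field of view
SCREEN_W, SCREEN_H = 1080, 1920    # assumed viewfinder resolution in pixels

def project_peer(lat, lon, heading_deg, tilt_deg, peer_lat, peer_lon):
    """Map the second terminal's location to a 2-D viewfinder pixel, or None
    when the peer lies outside the camera's horizontal field of view."""
    # Local east/north offsets in meters (flat-earth approximation).
    east = math.radians(peer_lon - lon) * 6371000.0 * math.cos(math.radians(lat))
    north = math.radians(peer_lat - lat) * 6371000.0
    # Bearing to the peer relative to the camera heading, wrapped to [-180, 180).
    bearing = math.degrees(math.atan2(east, north))
    rel = (bearing - heading_deg + 180.0) % 360.0 - 180.0
    if abs(rel) > H_FOV_DEG / 2:
        return None
    x = SCREEN_W * (0.5 + rel / H_FOV_DEG)
    # Vertical placement: the peer is assumed at camera height, so the marker
    # sits on the horizon line implied by the pitch (tilt of 90 = held upright).
    pitch_deg = tilt_deg - 90.0
    y = SCREEN_H * (0.5 + pitch_deg / V_FOV_DEG)
    return int(x), int(y)
```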
  • the AR information is displayed in the image frame or video stream in the form of a symbol or icon such as an arrow or a balloon-shaped guide identifier and used to indicate the second terminal.
  • the display content comprises the location of the second terminal and distance information between the first terminal and the second terminal, and may further comprise other relevant auxiliary information such as user-related personal information and other information that can be further obtained through the account.
  • the first terminal further provides a human-machine interaction interface.
  • the symbol or icon may be clicked on the display unit to further display other relevant auxiliary information, such as user-related personal information and other information that can be further obtained through the account.
  • for example, the symbol or icon may be clicked to further display auxiliary information such as the taxi driver's name, car model, car plate number, the distance between the first terminal and the second terminal, and wait duration; a virtual key is further included, which may be clicked to make a phone call.
  • the virtual key may be clicked to invoke a phone function to contact the taxi driver.
  • when the direction information of the first terminal changes, for example when the first terminal moves up, down, leftward, rightward, forward, backward or rotates, the first terminal re-calculates the mapping in real time according to the latest sensor data and updates the AR information in the current image frame or video stream, as sketched below.
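  • the real-time update then reduces to an event handler like the following (a sketch reusing the hypothetical project_peer above; the redraw callback stands in for the display layer):

```python
from dataclasses import dataclass
from typing import Callable, Optional, Tuple

@dataclass
class ARViewState:
    lat: float                  # first terminal's calibrated location
    lon: float
    peer_lat: float             # second terminal's last reported location
    peer_lon: float
    redraw_marker: Callable[[Optional[Tuple[int, int]]], None]  # display-layer hook

def on_direction_changed(state: ARViewState, heading_deg: float, tilt_deg: float):
    """Re-project the peer marker whenever the sensors report a new direction."""
    marker_px = project_peer(state.lat, state.lon, heading_deg, tilt_deg,
                             state.peer_lat, state.peer_lon)
    state.redraw_marker(marker_px)  # None hides the marker (peer off-screen)
```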
  • FIG. 4 is a flow chart of an augmented reality positioning method for location-based service LBS according to another embodiment of the present disclosure. As shown in FIG. 4 , the method comprises:
  • the server receives location information and direction information transmitted by terminals, and the terminals comprise a first terminal and a second terminal;
  • the first terminal is the ride-hailing user's mobile terminal
  • the second terminal is a called taxi driver's mobile terminal.
  • the server transmits AR information to the first terminal.
  • the AR information is generated based on location information of the second terminal so that the first terminal, upon obtaining image information captured by the camera, draws the AR information on the image information and displays it.
  • the AR information comprises: a 3D model carrying the location and direction information of the first terminal and the location information of the second terminal.
  • the AR information further comprises: distance information between the first terminal and the second terminal, and relevant auxiliary prompt information such as user-related personal information and other information that can be further obtained through the account, such as the called taxi driver's information, car model information and car plate number.
  • a 3D model of a navigation area is preset in the server.
  • the 3D model carrying the location and direction information of the first terminal and location information of the second terminal is generated according to the location and direction information of the first terminal and the location information of the second terminal.
  • the server transmits the 3D model to the first terminal so that the first terminal performs spatial calculation according to the 3D model, maps the location information of the second terminal to a 2-dimensional location of a view finder of the camera of the first terminal, and displays the AR information at the 2-dimensional location.
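  • the transmitted AR information might be organized as in the sketch below (the payload layout and auxiliary fields are assumptions; the patent does not specify the 3D model format):

```python
def build_ar_information(first_location, first_direction_deg, second_location,
                         distance_m=None, driver_info=None):
    """Server-side AR information for the terminal-rendered variant (FIG. 4):
    a 3D-model payload carrying both terminals' poses plus auxiliary prompts."""
    return {
        "model": {  # 3D model of the navigation area with both poses embedded
            "first_terminal": {"location": first_location,
                               "direction_deg": first_direction_deg},
            "second_terminal": {"location": second_location},
        },
        "distance_m": distance_m,        # first-to-second distance prompt
        "auxiliary": driver_info or {},  # e.g., driver name, car model, plate number
    }
```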
  • FIG. 5 is a flow chart of an augmented reality positioning method for location-based service LBS according to another embodiment of the present disclosure. As shown in FIG. 5 , the method comprises the following steps:
  • the first terminal transmits location information and direction information to the server;
  • the first terminal is a mobile terminal of a ride-hailing user.
  • the terminal may acquire the location information via a satellite navigation system such as GPS, GLONASS or BeiDou, and may acquire the direction information (e.g., azimuth information or geomagnetic information) via an inertial sensor such as a gyroscope or a magnetometer.
  • the direction information further comprises a tilt angle posture of the terminal, namely, an angle between the terminal and a horizontal plane.
  • the detection information comprises at least one of GPS positioning information, azimuth information and geomagnetic information.
  • the first terminal obtains image information captured by the camera and transmits the obtained image information to the server;
  • the user holds the first terminal with a hand, and the camera arranged on the back of the first terminal faces towards a direction in which the user advances, whereupon the image information including image or video data is obtained via the camera arranged on the back of the first terminal.
  • the step comprises:
  • the event that the real scene navigation function is triggered comprises:
  • a virtual key for activating the real scene navigation is provided in a plane navigation interface.
  • when triggered, the camera arranged on the back of the first terminal is activated to capture the image information at the current location and in the current direction.
  • the first terminal receives AR information transmitted by the server, and the AR information is generated according to the location information of the second terminal.
  • the first terminal receiving the AR information transmitted by the server comprises: the first terminal receiving the image information transmitted by the server and drawn with the AR information.
  • the AR information further comprises: distance information between the first terminal and the second terminal, and relevant auxiliary prompt information such as user-related personal information and other information that can be further obtained through the account, for example, the called taxi driver's information, car model information and car plate number information.
  • the AR information is used to present, in an augmented reality manner, the location information of the second terminal in the image or video data captured by the first terminal.
  • a 3D model of a navigation area is preset in the server.
  • the 3D model carrying the location and direction information of the first terminal and location information of the second terminal is generated according to the location and direction information of the first terminal and the location information of the second terminal.
  • the server performs spatial calculation according to the 3D model, maps the location information of the second terminal to a 2-dimensional location of a view finder of the camera of the first terminal, and displays the AR information at the 2-dimensional location.
  • the AR information is drawn on the image information by using computer graphics processing technology, to obtain the image information drawn with the AR information, which is sent to the first terminal for display.
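  • a sketch of this server-side drawing step, assuming Pillow for the raster work (the patent only says "computer graphics processing technology"; the balloon-style marker follows the guide-identifier description below):

```python
from PIL import Image, ImageDraw  # Pillow; an assumed choice for the drawing step

def draw_ar_marker(frame_path, marker_px, label, out_path):
    """Draw a balloon-style guide identifier at the projected pixel and save the
    frame that the server then sends back to the first terminal for display."""
    img = Image.open(frame_path).convert("RGB")
    draw = ImageDraw.Draw(img)
    x, y = marker_px
    draw.ellipse([x - 12, y - 12, x + 12, y + 12], fill=(230, 57, 70))  # balloon head
    draw.line([x, y + 12, x, y + 40], fill=(230, 57, 70), width=4)      # pointer tail
    draw.text((x + 16, y - 12), label, fill=(255, 255, 255))            # e.g., plate number
    img.save(out_path)
    return out_path
```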
  • the first terminal displays the image information drawn with the AR information on its display module.
  • the AR information is displayed in the image information in the form of a symbol or icon such as an arrow or a balloon-shaped guide identifier and used to indicate the second terminal.
  • the display content comprises the location of the second terminal and distance information between the first terminal and the second terminal, and may further comprise other relevant auxiliary information such as user-related personal information and other information that can be further obtained through the account.
  • the first terminal further provides a human-machine interaction interface.
  • the symbol or icon may be clicked on the display unit to further display other relevant auxiliary information, such as user-related personal information and other information that can be further obtained through the account.
  • for example, the symbol or icon may be clicked to further display auxiliary information such as the taxi driver's name, car model, car plate number, the distance between the first terminal and the second terminal, and wait duration; a virtual key is further included, which may be clicked to make a phone call.
  • the virtual key may be clicked to invoke a phone function to contact the taxi driver.
  • FIG. 6 is a flow chart of an augmented reality positioning method for location-based service LBS according to another embodiment of the present disclosure. As shown in FIG. 6 , the method comprises:
  • the server receives location information and direction information transmitted by terminals, and the terminals comprise a first terminal and a second terminal;
  • the server receives image information transmitted by the first terminal and captured by the camera.
  • the server transmits AR information to the first terminal.
  • the AR information is generated based on location information and direction information of the second terminal so that the first terminal, upon obtaining image information captured by the camera, displays the image information drawn with the AR information.
  • the server sending the AR information to the first terminal comprises: the server transmitting the image information drawn with the AR information to the first terminal.
  • the AR information further comprises: distance information between the first terminal and the second terminal, and relevant auxiliary prompt information such as user-related personal information and other information that can be further obtained through the account, for example, the called taxi driver's information, car model information and car plate number information.
  • a 3D model of a navigation area is preset in the server.
  • the 3D model carrying the location and direction information of the first terminal and location information of the second terminal is generated according to the location and direction information of the first terminal and the location information of the second terminal.
  • the server performs spatial calculation according to the 3D model, maps the location information of the second terminal to a 2-dimensional location of a view finder of the camera of the first terminal, and displays the AR information at the 2-dimensional location.
  • the AR information is drawn on the image information by using computer graphics processing technology, to obtain the image information drawn with the AR information, which is sent to the first terminal for display.
  • the technical solutions provided by the above embodiments avoid the following drawbacks in the prior art: the obtained location is usually inaccurate; when the terminals come close to each other, relative positioning cannot be performed on the interfaces of the terminals; and if the environment is complicated, it is very difficult for the user to make a quick, accurate and direct judgment, and it may even be completely impossible to obtain more accurate mutual location prompts.
  • the technical solutions help the user make a quick positioning judgment; furthermore, they make it possible to more directly combine interactive content or presented information with the real scenario, and can be implemented on mobile terminals such as mobile phones without extra hardware devices.
  • FIG. 7 is a block diagram of an augmented reality positioning apparatus for location-based service LBS according to another embodiment of the present disclosure.
  • the apparatus comprises a positioning module 71, a transmitting module 72, a receiving module 73 and a display module 74; wherein
  • the positioning module is configured to obtain location information and direction information of a terminal
  • the transmitting module is configured to transmit the location information and direction information to a server
  • the receiving module is configured to receive AR information transmitted by the server, the AR information being generated according to the location information of the second terminal;
  • the display module is configured to display the image information drawn with the AR information.
  • the AR information comprises a 3D model carrying the location and direction information of the first terminal and the location information of the second terminal.
  • the apparatus further comprises a drawing module configured to draw the AR information onto the image information.
  • the transmitting module is further configured to transmit the image information to the server; the receiving the AR information transmitted by the server comprises: receiving the image information transmitted by the server and drawn with the AR information.
  • FIG. 8 is a block diagram of an augmented reality positioning apparatus for location-based service LBS according to another embodiment of the present disclosure. As shown in FIG. 8 , the apparatus comprises a receiving module 81 and a transmitting module 82 ; wherein,
  • the receiving module is configured to receive location information and direction information transmitted by terminals, the terminals comprising a first terminal and a second terminal;
  • the transmitting module is configured to transmit AR information to the first terminal.
  • the AR information is generated based on the location information and direction information of the second terminal so that the first terminal, upon obtaining image information captured by the camera, displays the image information drawn with the AR information.
  • the AR information comprises a 3D model carrying the location and direction information of the first terminal and the location information of the second terminal.
  • the image information drawn with the AR information is drawn by the first terminal.
  • the receiving module is configured to receive the image information transmitted by the first terminal; the transmitting the AR information to the first terminal comprises: transmitting the image information drawn with the AR information to the first terminal.
  • the disclosed method and apparatus can be implemented in other ways.
  • the above-described apparatus embodiments are only exemplary; e.g., the division of the units is merely a logical division, and in reality they can be divided in other ways upon implementation.
  • a plurality of units or components may be combined or integrated into another system, or some features may be neglected or not executed.
  • mutual coupling or direct coupling or communicative connection as displayed or discussed may be indirect coupling or communicative connection performed via some interfaces, means or units and may be electrical, mechanical or in other forms.
  • the units described as separate parts may be or may not be physically separated, the parts shown as units may be or may not be physical units, i.e., they can be located in one place, or distributed in a plurality of network units. One can select some or all the units to achieve the purpose of the embodiment according to the actual needs. Further, in the embodiments of the present disclosure, functional units can be integrated in one processing unit, or they can be separate physical presences; or two or more units can be integrated in one unit.
  • the integrated unit described above can be implemented in the form of hardware, or in the form of hardware plus software functional units.
  • FIG. 9 illustrates a block diagram of an example computer system/server 012 adapted to implement an implementation mode of the present disclosure.
  • the computer system/server 012 shown in FIG. 9 is only an example and should not bring about any limitation to the function and scope of use of the embodiments of the present disclosure.
  • the computer system/server 012 is shown in the form of a general-purpose computing device.
  • the components of computer system/server 012 may include, but are not limited to, one or more processors or processing units 016 , a memory 028 , and a bus 018 that couples various system components including system memory 028 and the processor 016 .
  • Bus 018 represents one or more of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures.
  • bus architectures include Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus.
  • Computer system/server 012 typically includes a variety of computer system readable media. Such media may be any available media that is accessible by computer system/server 012 , and it includes both volatile and non-volatile media, removable and non-removable media.
  • Memory 028 can include computer system readable media in the form of volatile memory, such as random access memory (RAM) 030 and/or cache memory 032 .
  • Computer system/server 012 may further include other removable/non-removable, volatile/non-volatile computer system storage media.
  • storage system 034 can be provided for reading from and writing to a non-removable, non-volatile magnetic media (not shown in FIG. 9 and typically called a “hard drive”).
  • a magnetic disk drive for reading from and writing to a removable, non-volatile magnetic disk (e.g., a "floppy disk"), and an optical disk drive for reading from or writing to a removable, non-volatile optical disk such as a CD-ROM, DVD-ROM or other optical media, can also be provided.
  • each drive can be connected to bus 018 by one or more data media interfaces.
  • the memory 028 may include at least one program product having a set (e.g., at least one) of program modules that are configured to carry out the functions of embodiments of the present disclosure.
  • Program/utility 040, having a set (at least one) of program modules 042, may be stored in the system memory 028 by way of example, and not limitation, as well as an operating system, one or more application programs, other program modules, and program data. Each of these examples or a certain combination thereof might include an implementation of a networking environment.
  • Program modules 042 generally carry out the functions and/or methodologies of embodiments of the present disclosure.
  • Computer system/server 012 may also communicate with one or more external devices 014 such as a keyboard, a pointing device, a display 024, etc.; with one or more devices that enable a user to interact with computer system/server 012; and/or with any devices (e.g., network card, modem, etc.) that enable computer system/server 012 to communicate with one or more other computing devices. Such communication can occur via Input/Output (I/O) interfaces 022. Still yet, computer system/server 012 can communicate with one or more networks such as a local area network (LAN), a general wide area network (WAN), and/or a public network (e.g., the Internet) via network adapter 020. As depicted in FIG. 9, network adapter 020 communicates with the other communication modules of computer system/server 012 via bus 018.
  • It should be understood that although not shown, other hardware and/or software modules could be used in conjunction with computer system/server 012. Examples include, but are not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data archival storage systems.
  • the processing unit 016 executes functions and/or methods in the embodiments described in the present disclosure by running programs stored in the memory 028 .
  • the above computer program may be stored in a computer storage medium, i.e., the computer storage medium is encoded with a computer program.
  • the program when executed by one or more computers, enables one or more computers to execute steps of the method and/or operations of the apparatus shown in the above embodiments of the present disclosure.
  • a propagation channel of the computer program is no longer limited to a tangible medium; the program may also be directly downloaded from a network.
  • the computer-readable medium of the present embodiment may employ any combinations of one or more computer-readable media.
  • the machine readable medium may be a machine readable signal medium or a machine readable storage medium.
  • a machine readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing.
  • the machine readable storage medium can be any tangible medium that includes or stores a program for use by or in connection with an instruction execution system, apparatus or device.
  • the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier, carrying computer-readable program code therein. Such a propagated data signal may take many forms, including, but not limited to, an electromagnetic signal, an optical signal, or any suitable combination thereof.
  • the computer-readable signal medium may further be any computer-readable medium besides the computer-readable storage medium, and the computer-readable medium may send, propagate or transmit a program for use by an instruction execution system, apparatus or device or a combination thereof.
  • the program codes included by the computer-readable medium may be transmitted with any suitable medium, including, but not limited to radio, electric wire, optical cable, RF or the like, or any suitable combination thereof.
  • Computer program code for carrying out operations disclosed herein may be written in one or more programming languages or any combination thereof. These programming languages include an object oriented programming language such as Java, Smalltalk, C++ or the like, and conventional procedural programming languages, such as the “C” programming language or similar programming languages.
  • the program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server.
  • the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Automation & Control Theory (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)
  • Navigation (AREA)

Abstract

An augmented reality positioning method and apparatus for location-based service LBS: a first terminal obtains image information captured by a camera and receives AR information transmitted by a server, the AR information being generated according to location information of a second terminal; the first terminal then displays the image information drawn with the AR information. This avoids the following drawbacks in the prior art: the obtained location is inaccurate; when the terminals are close to each other, relative positioning cannot be performed on the interfaces of the terminals; and if the environment is complicated, it is difficult for the user to make a quick, accurate and direct judgment, or even to obtain more accurate mutual location prompts. The apparatus enables the user to make a quick positioning judgment and makes it possible to directly combine more interactive content or presented information with the real scenario, using mobile terminals such as mobile phones without extra hardware.

Description

  • The present application claims the priority of Chinese Patent Application No. 2017104283427, filed on Jun. 8, 2017, with the title of "Augmented reality positioning method and apparatus for location-based service LBS". The disclosure of the above application is incorporated herein by reference in its entirety.
  • FIELD OF THE DISCLOSURE
  • The present disclosure relates to the field of Internet application, and particularly to an augmented reality positioning method and apparatus for location-based service LBS.
  • BACKGROUND OF THE DISCLOSURE
  • Current terminals such as mobile phones mostly have a positioning function. A positioning module is built in a terminal which may perform positioning via a positioning system such as Global Positioning System (GPS) or a base station to obtain a geographical location of the terminal. The positioning function of the terminal enables a user to acquire his own geographical location even if he is in an unfamiliar environment and not to get lost.
  • In real life, the positioning function of the terminal is applied in many aspects, for example, LBS (Location Based Service). Three major goals of LBS are: where are you, who are you with, and what resources are nearby, wherein "where are you" is the core of LBS. Dual-terminal users may perform positioning via the GPS modules of their mobile terminals, display location information on the interfaces of their terminals, and meanwhile obtain rough prompts of route planning, distance and direction. For example, in some ride-hailing applications, a user and a driver share locations to facilitate accurate acquisition of each other's current locations.
  • However, GPS positioning based on a consumer-grade mobile terminal (e.g., a mobile phone or a tablet computer) has the following drawbacks:
  • The obtained location is usually inaccurate. For example, a location positioned by GPS may deviate by tens of meters due to factors such as the environment, and it is difficult to obtain precise geographical location information, so the navigation route is inaccurate.
  • After the terminals come close to each other and their positioning information is substantially in a coincident or nearly coincident range, relative positioning can no longer be performed on the interfaces of the terminals.
  • If the environment is complicated, for example when the outdoor environment is complicated and there are many barriers, it is very difficult for the user to make a quick, accurate and direct judgment, and it may be completely impossible to obtain more accurate mutual location prompts. In some cases, even though the GPS geographical location information of the two terminals is very close or even coincident, an accurate final path judgment cannot be obtained due to various environmental factors (e.g., blocking, dim light, or weather), and the user's experience is seriously affected.
  • SUMMARY OF THE DISCLOSURE
  • A plurality of aspects of the present disclosure provide an augmented reality positioning method and apparatus for location-based service LBS, to help the user quickly locate one or more other terminals to be found.
  • According to an aspect of the present disclosure, there is provided an augmented reality positioning method for location-based service LBS, comprising:
  • a first terminal obtains image information captured by a camera, and receives AR information transmitted by a server; the AR information being generated according to location information of a second terminal;
  • the first terminal displays the image information drawn with the AR information.
  • The above aspect and any possible implementation mode further provide an implementation mode:
  • the direction information comprises a tilt angle posture of the terminal.
  • The above aspect and any possible implementation mode further provide an implementation mode:
  • the first terminal obtaining image information captured by the camera comprises:
  • capturing an event that a real scene navigation function is triggered, and activating the camera of the first terminal; wherein the event that the real scene navigation function is triggered comprises: a click of a real scene navigation button, or the tilt angle posture of the first terminal being in a preset range.
  • The above aspect and any possible implementation mode further provide an implementation mode: the method further comprises:
  • the first terminal transmits the location information and direction information to the server;
  • the AR information comprises a 3D model carrying the location and direction information of the first terminal and the location information of the second terminal;
  • the image information drawn with the AR information is drawn by the first terminal.
  • The above aspect and any possible implementation mode further provide an implementation mode: the method further comprises:
  • the first terminal transmits the location information and direction information to the server;
  • the first terminal transmits the image information to the server;
  • the first terminal receiving the AR information transmitted by the server comprises: the first terminal receiving the image information transmitted by the server and drawn with the AR information.
  • The above aspect and any possible implementation mode further provide an implementation mode: the AR information comprises:
  • distance information between the first terminal and the second terminal, and relevant auxiliary prompt information.
  • According to another aspect of the present disclosure, there is provided an augmented reality positioning method for location-based service LBS, comprising:
  • the server receives location information and direction information transmitted by terminals, and the terminals comprise a first terminal and a second terminal;
  • the server transmits AR information to the first terminal, and the AR information is generated based on location information of the second terminal so that the first terminal, upon obtaining image information captured by a camera, displays the image information drawn with the AR information.
  • The above aspect and any possible implementation mode further provide an implementation mode: the direction information comprises a tilt angle posture of the terminal.
  • The above aspect and any possible implementation mode further provide an implementation mode: the first terminal obtaining image information captured by the camera comprises:
  • capturing an event that a real scene navigation function is triggered, and activating the camera of the first terminal; wherein the event that the real scene navigation function is triggered comprises: a click of a real scene navigation button, or the tilt angle posture of the first terminal being in a preset range.
  • The above aspect and any possible implementation mode further provide an implementation mode: the method further comprises:
  • the AR information comprises a 3D model carrying the location and direction information of the first terminal and the location information of the second terminal;
  • the image information drawn with the AR information is drawn by the first terminal.
  • The above aspect and any possible implementation mode further provide an implementation mode: the method further comprises:
  • receiving the image information transmitted by the first terminal;
  • the server transmitting the AR information to the first terminal comprises: the server transmitting the image information drawn with the AR information to the first terminal.
  • The above aspect and any possible implementation mode further provide an implementation mode: the AR information comprises:
  • distance information between the first terminal and the second terminal, and relevant auxiliary prompt information.
  • According to another aspect of the present disclosure, there is provided an augmented reality positioning apparatus for location-based service LBS, comprising:
  • a positioning module configured to obtain location information and direction information of a terminal;
  • a transmitting module configured to transmit the location information and direction information to a server;
  • a receiving module configured to receive AR information transmitted by the server, the AR information being generated according to the location information of a second terminal;
  • a display module configured to display the image information drawn with the AR information.
  • The above aspect and any possible implementation mode further provide an implementation mode: the AR information comprises a 3D model carrying the location and direction information of the first terminal and the location information of the second terminal;
  • the apparatus further comprises a drawing module configured to draw the AR information to the image information.
  • The above aspect and any possible implementation mode further provide an implementation mode:
  • the transmitting module is further configured to transmit the image information to the server;
  • the receiving the AR information transmitted by the server comprises: receiving, from the server, the image information drawn with the AR information.
  • According to a further aspect of the present disclosure, there is provided an augmented reality positioning apparatus for location-based service LBS, comprising:
  • a receiving module configured to receive location information and direction information transmitted by terminals, the terminals comprising a first terminal and a second terminal;
  • a transmitting module configured to transmit AR information to the first terminal, the AR information being generated based on the location information and direction information of the second terminal so that the first terminal, upon obtaining image information captured by a camera, displays the image information drawn with the AR information.
  • The above aspect and any possible implementation mode further provide an implementation mode:
  • the AR information comprises a 3D model carrying the location and direction information of the first terminal and the location information of the second terminal;
  • the image information drawn with the AR information is drawn by the first terminal.
  • The above aspect and any possible implementation mode further provide an implementation mode:
  • the receiving module is configured to receive the image information transmitted by the first terminal;
  • the transmitting the AR information to the first terminal comprises: transmitting the image information drawn with the AR information to the first terminal.
  • According to a further aspect of the present disclosure, there is provided an apparatus, wherein the apparatus comprises:
  • one or more processors;
  • a storage device for storing one or more programs,
  • when said one or more programs are executed by said one or more processors, said one or more processors are enabled to implement the above-mentioned method.
  • According to a further aspect of the present disclosure, there is provided a computer readable storage medium on which a computer program is stored, wherein the program, when executed by a processor, implements the above-mentioned method.
  • As known from the above technical solutions defined in the embodiments of the present disclosure, the image information captured by the camera is obtained, and the image information drawn with the AR information is displayed, to help the user quickly locate one or more other terminals to be found.
  • BRIEF DESCRIPTION OF DRAWINGS
  • To describe the technical solutions of embodiments of the present disclosure more clearly, the figures used in the embodiments or in the description of the prior art are briefly introduced below. Obviously, the figures described below illustrate only some embodiments of the present disclosure; those having ordinary skill in the art may derive other figures from them without making inventive efforts.
  • FIG. 1 is a flow chart of an augmented reality positioning method for location-based service LBS according to an embodiment of the present disclosure;
  • FIG. 2 is a schematic diagram of the use of an augmented reality positioning method for location-based service LBS according to an embodiment of the present disclosure;
  • FIG. 3 is a schematic diagram of a human-machine interaction interface of an augmented reality positioning method for location-based service LBS according to an embodiment of the present disclosure;
  • FIG. 4 is a flow chart of an augmented reality positioning method for location-based service LBS according to an embodiment of the present disclosure;
  • FIG. 5 is a flow chart of an augmented reality positioning method for location-based service LBS according to another embodiment of the present disclosure;
  • FIG. 6 is a flow chart of an augmented reality positioning method for location-based service LBS according to another embodiment of the present disclosure;
  • FIG. 7 is a block diagram of an augmented reality positioning apparatus for location-based service LBS according to another embodiment of the present disclosure;
  • FIG. 8 is a block diagram of an augmented reality positioning apparatus for location-based service LBS according to another embodiment of the present disclosure;
  • FIG. 9 is a block diagram of an example computer system/server adapted to implement an embodiment of the present disclosure.
  • DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS
  • To make the objectives, technical solutions and advantages of embodiments of the present disclosure clearer, the technical solutions of embodiments of the present disclosure will be described clearly and completely with reference to the figures of those embodiments. Obviously, the embodiments described here are some embodiments of the present disclosure, not all of them. All other embodiments acquired by those having ordinary skill in the art based on the embodiments of the present disclosure, without making any inventive efforts, fall within the protection scope of the present disclosure.
  • In addition, the term “and/or” used in the text merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may represent three cases: A exists individually, both A and B coexist, and B exists individually. In addition, the symbol “/” in the text generally indicates that the associated objects before and after it are in an “or” relationship.
  • In the depictions hereunder, it is assumed that a first terminal and a second terminal are provided. However, the provided first terminal and second terminal are configured to illustrate embodiments of the present disclosure, and technical ideas of the present disclosure are not limited to this. For example, exemplary embodiments of the present disclosure may be adapted to provide navigation and positioning situations to a plurality of terminals.
  • The terminal comprises a smart terminal device such as a mobile phone or a tablet computer. Operating systems installed on the smart terminal device comprise but are not limited to iOS, Android, Windows, Linux and Mac OS.
  • The first terminal and the second terminal, according to the user's instruction, activate and enter a plane navigation mode, and respectively send a navigation request to a server.
  • Each navigation request includes the location information of the respective terminal (first terminal or second terminal).
  • The server, based on the location information of the first terminal and the second terminal, calculates a first path from the location of the first terminal to the location of the second terminal to navigate the first terminal towards the second terminal, and calculates a second path from the location of the second terminal to the location of the first terminal to navigate the second terminal towards the first terminal.
  • An exemplary embodiment of the present disclosure provides an augmented reality positioning method for location-based service LBS, wherein the augmented reality positioning mode is activated when the relative distance between the first terminal and the second terminal is smaller than a preset threshold. The terminal acquires an image or video of the real-time surrounding environment via a camera, generates virtual AR information carrying the peer's GPS location information, superimposes it on the image or video, and displays the result on the terminal screen.
  • The preset threshold may be set according to actual needs, for example as 20 meters. That is, it is judged whether the relative distance between the first terminal and the second terminal is smaller than 20 meters; if so, the first terminal and the second terminal are considered to have entered a short-distance range.
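By way of illustration only, and not as part of the claimed embodiments, the short-distance check might be sketched as follows; the haversine great-circle formula and the helper names (haversine_m, should_activate_ar) are assumptions of this sketch, with the 20-meter value taken from the example above:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two WGS-84 coordinates."""
    R = 6371000.0  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

AR_THRESHOLD_M = 20.0  # illustrative preset threshold from the example above

def should_activate_ar(first_loc, second_loc):
    """Activate the augmented reality positioning mode once the relative
    distance of the two terminals falls below the preset threshold."""
    return haversine_m(*first_loc, *second_loc) < AR_THRESHOLD_M
```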
  • FIG. 1 is a flow chart of an augmented reality positioning method for location-based service LBS according to an embodiment of the present disclosure. As shown in FIG. 1, the method comprises the following steps:
  • In 101, the first terminal transmits location information and direction information to the server;
  • In the present embodiment, the first terminal is a mobile terminal of a ride-hailing user.
  • The first terminal may acquire its own location information via a satellite navigation system such as GPS, GLONASS or BeiDou, and may acquire its own direction information (e.g., azimuth information or geomagnetic information) via an inertial navigation unit such as a gyroscope or a magnetometer. The direction information further comprises a tilt angle posture of the terminal, namely, the angle between the terminal and the horizontal plane.
  • Preferably, rather than constantly transmitting information from the first terminal to the server, information may be transmitted only when the detection information (at least one of GPS positioning information, azimuth information and geomagnetic information) changes, i.e., as the first terminal moves and/or its direction changes.
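A minimal sketch of this change-triggered reporting, reusing the haversine_m helper from the sketch above; the 2-meter and 5-degree thresholds are illustrative assumptions, not values from the disclosure:

```python
def make_reporter(send_to_server, min_move_m=2.0, min_turn_deg=5.0):
    """Send location/direction to the server only when they change
    appreciably, rather than streaming every sensor sample."""
    last = {"loc": None, "heading": None}

    def on_sensor_update(loc, heading_deg):
        moved = last["loc"] is None or haversine_m(*last["loc"], *loc) >= min_move_m
        turned = (last["heading"] is None or
                  abs((heading_deg - last["heading"] + 180) % 360 - 180) >= min_turn_deg)
        if moved or turned:
            send_to_server({"location": loc, "heading": heading_deg})
            last["loc"], last["heading"] = loc, heading_deg

    return on_sensor_update
```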
  • In 102, the first terminal obtains image information captured by the camera;
  • The user holds the first terminal in a hand, with the camera arranged on the back of the first terminal facing the direction in which the user advances, whereupon the image information, including images or video data, is obtained via that camera.
  • Specifically, the step comprises:
  • according to a detected event in which the real scene navigation function is triggered, activating the camera of the first terminal and obtaining the image information at the current location and in the current direction as captured by the camera. The event in which the real scene navigation function is triggered comprises:
  • A virtual key for activating real scene navigation is provided in the plane navigation interface. When the user clicks the virtual key, the camera arranged on the back of the first terminal is activated and captures the image or video data at the current location and in the current direction. Exemplarily, the virtual key may be placed at the upper right corner of the navigation interface.
  • If it is monitored that the first terminal displays the plane navigation interface and the tilt angle posture of the first terminal is in a preset vertical range, namely, the angle between the first terminal and the horizontal plane is within a preset angle interval (exemplarily 70°-90°), the camera arranged on the back of the first terminal is activated to acquire the image or video data at the current location and in the current direction.
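The two trigger events may be checked as in the following sketch; the function and constant names are assumptions, with the 70°-90° interval taken from the example above:

```python
PRESET_TILT_RANGE = (70.0, 90.0)  # angle between terminal and horizontal plane, degrees

def real_scene_navigation_triggered(button_clicked, in_plane_nav_ui, tilt_deg):
    """True when either trigger event occurs: the virtual key is clicked, or
    the terminal is held near-vertical while the plane navigation interface
    is displayed."""
    tilt_trigger = in_plane_nav_ui and (
        PRESET_TILT_RANGE[0] <= tilt_deg <= PRESET_TILT_RANGE[1])
    return button_clicked or tilt_trigger
```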
  • Preferably, the location and direction information of the terminal may be refined through pattern recognition / image comparison over image frames or video key frames; specifically:
  • The first terminal transmits the captured image or video data at the current location and in the current direction to the server;
  • The first terminal receives calibrated location and direction information transmitted by the server, the calibration being performed according to the image or video data;
  • A database containing a large number of real scene pictures / three-dimensional reconstruction images is pre-arranged in the server, each stored together with its corresponding location and direction information. Image frames or video key frames are obtained by processing the image or video data transmitted by the first terminal, and the location and direction information of the first terminal is calibrated by comparing these frames against the stored real scene pictures / three-dimensional reconstruction images.
  • The location information determined by the positioning system contains certain errors but suffices to determine an approximate geographical range; more accurate location and direction information of the first terminal may be obtained through the above processing, as in the sketch below.
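A highly simplified sketch of this server-side calibration, assuming each frame is reduced to a fixed-length feature vector and each stored keyframe carries its known capture pose; a practical system would use robust local-feature matching and restrict the search to the coarse GPS neighborhood:

```python
import numpy as np

def calibrate_pose(frame_descriptor, keyframe_db):
    """Refine a coarse GPS/compass pose by matching the current camera frame
    against pre-stored, geo-tagged real-scene keyframes.

    frame_descriptor: 1-D feature vector of the current frame.
    keyframe_db: iterable of (descriptor, (lat, lon, heading_deg)) pairs.
    Returns the pose of the best-matching keyframe as the calibrated pose.
    """
    best_pose, best_sim = None, -1.0
    for desc, pose in keyframe_db:
        # cosine similarity between descriptors
        sim = float(np.dot(frame_descriptor, desc) /
                    (np.linalg.norm(frame_descriptor) * np.linalg.norm(desc) + 1e-9))
        if sim > best_sim:
            best_sim, best_pose = sim, pose
    return best_pose
```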
  • In 103, the first terminal receives AR information transmitted by the server, and the AR information is generated according to the location information of the second terminal.
  • Specifically,
  • The AR information comprises a 3D model carrying the location and direction information of the first terminal and the location information of the second terminal. The AR information further comprises: distance information between the first terminal and the second terminal, and relevant auxiliary prompt information such as user-related personal information and other information obtainable from the associated account, for example the called taxi driver's information, car model information and license plate number. The AR information is used to present, in an augmented reality manner, the location information of the second terminal in the image or video data captured by the first terminal.
  • A 3D model of a navigation area is preset in the server. The 3D model carrying the location and direction information of the first terminal and location information of the second terminal is generated according to the location and direction information of the first terminal and the location information of the second terminal.
  • In 104, the first terminal draws the AR information on the image information and displays it.
  • Spatial calculation is performed according to the 3D model: the location information of the second terminal is mapped to a 2-dimensional location in the viewfinder of the camera of the first terminal, and the AR information is displayed at that 2-dimensional location (a simplified sketch of this mapping is given after the rendering steps below).
  • Specifically, the AR information is drawn onto the image frame or video stream using computer graphics processing technology:
  • the AR information and the image frame or video stream are subjected to a rendering operation to obtain an output image frame or video stream;
  • the rendered image frame or video stream is drawn into a memory buffer;
  • the image frame or video stream in the memory buffer is displayed on the screen of the first terminal.
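As referenced above, the mapping from the second terminal's location to a 2-dimensional viewfinder position may be sketched as follows. This is a flat-earth, horizon-line simplification rather than the full 3D-model calculation of the embodiment; the 60° horizontal field of view and all names are assumptions:

```python
import math

def project_to_viewfinder(cam_lat, cam_lon, cam_heading_deg,
                          tgt_lat, tgt_lon, screen_w, screen_h, hfov_deg=60.0):
    """Map the second terminal's GPS position to a 2-D screen point:
    horizontal offset from the bearing difference, vertical position
    fixed on the horizon line for simplicity."""
    north = math.radians(tgt_lat - cam_lat)
    east = math.radians(tgt_lon - cam_lon) * math.cos(math.radians(cam_lat))
    bearing = math.degrees(math.atan2(east, north))       # bearing to the target
    rel = (bearing - cam_heading_deg + 180) % 360 - 180   # signed offset from view axis
    if abs(rel) > hfov_deg / 2:
        return None                                       # target outside the viewfinder
    x = screen_w / 2 + (rel / (hfov_deg / 2)) * (screen_w / 2)
    y = screen_h / 2                                      # horizon-line assumption
    return (x, y)
```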
  • Preferably, as shown in FIG. 2, the AR information is displayed in the image frame or video stream in the form of a symbol or icon, such as an arrow or a balloon-shaped guide identifier, used to indicate the second terminal. The displayed content comprises the location of the second terminal and the distance information between the first terminal and the second terminal, and may further comprise other relevant auxiliary information such as user-related personal information and other information obtainable from the associated account.
  • Preferably, the first terminal further provides a human-machine interaction interface. Clicking the symbol or icon on the display unit further displays other relevant auxiliary information, such as user-related personal information and other information obtainable from the associated account. For example, as shown in FIG. 3, the symbol or icon may be clicked to further display auxiliary information such as the taxi driver's name, car model, license plate number, the distance between the first terminal and the second terminal, and wait duration; furthermore, a virtual key for making a phone call is included, which may be clicked to invoke the phone function to contact the taxi driver.
  • Preferably, when the direction information of the first terminal changes, for example when the first terminal moves up, down, leftward, rightward, forward, backward or rotates, the first terminal recalculates the overlay position in real time according to the latest sensor data and updates it in the current image frame or video stream.
  • Correspondingly, FIG. 4 is a flow chart of an augmented reality positioning method for location-based service LBS according to another embodiment of the present disclosure. As shown in FIG. 4, the method comprises:
  • 401: the server receives location information and direction information transmitted by terminals, and the terminals comprise a first terminal and a second terminal;
  • In the present embodiment, the first terminal is the ride-hailing user's mobile terminal, and the second terminal is a called taxi driver's mobile terminal.
  • 402: the server transmits AR information to the first terminal. The AR information is generated based on the location information of the second terminal so that the first terminal, upon obtaining image information captured by the camera, draws the AR information onto the image information and displays it.
  • The AR information comprises a 3D model carrying the location and direction information of the first terminal and the location information of the second terminal. The AR information further comprises: distance information between the first terminal and the second terminal, and relevant auxiliary prompt information such as user-related personal information and other information obtainable from the associated account, for example the called taxi driver's information, car model information and license plate number.
  • A 3D model of a navigation area is preset in the server. The 3D model carrying the location and direction information of the first terminal and location information of the second terminal is generated according to the location and direction information of the first terminal and the location information of the second terminal. The server transmits the 3D model to the first terminal so that the first terminal performs spatial calculation according to the 3D model, maps the location information of the second terminal to a 2-dimensional location of a view finder of the camera of the first terminal, and displays the AR information at the 2-dimensional location.
  • FIG. 5 is a flow chart of an augmented reality positioning method for location-based service LBS according to another embodiment of the present disclosure. As shown in FIG. 5, the method comprises the following steps:
  • In 501, the first terminal transmits location information and direction information to the server;
  • In the present embodiment, the first terminal is a mobile terminal of a ride-hailing user.
  • The terminal may acquire the location information via a satellite navigation system such as GPS, GLONASS or BeiDou, and may acquire the direction information (e.g., azimuth information or geomagnetic information) via an inertial navigation unit such as a gyroscope or a magnetometer. The direction information further comprises a tilt angle posture of the terminal, namely, the angle between the terminal and the horizontal plane.
  • Preferably, rather than constantly transmitting information from the terminal to the server, information may be transmitted only when the detection information (at least one of GPS positioning information, azimuth information and geomagnetic information) changes, i.e., as the terminal moves and/or its direction changes.
  • In 502, the first terminal obtains image information captured by the camera and transmits the obtained image information to the server;
  • The user holds the first terminal in a hand, with the camera arranged on the back of the first terminal facing the direction in which the user advances, whereupon the image information, including image or video data, is obtained via that camera.
  • Specifically, the step comprises:
  • According to a detected event in which the real scene navigation function is triggered, activating the camera of the first terminal and obtaining the image information at the current location and in the current direction as captured by the camera. The event in which the real scene navigation function is triggered comprises:
  • A virtual key for activating real scene navigation is provided in the plane navigation interface. When the user clicks the virtual key, the camera arranged on the back of the first terminal is activated and captures the image information at the current location and in the current direction. Exemplarily, the virtual key may be placed at the upper right corner of the navigation interface.
  • If it is monitored that the first terminal displays the plane navigation interface and the tilt angle posture of the first terminal is in a preset vertical range, namely, the angle between the first terminal and the horizontal plane is within a preset angle interval (exemplarily 70°-90°), the camera arranged on the back of the first terminal is activated to acquire the image information at the current location and in the current direction.
  • In 503, the first terminal receives AR information transmitted by the server, and the AR information is generated according to the location information of the second terminal.
  • Specifically,
  • the first terminal receiving the AR information transmitted by the server comprises: the first terminal receiving, from the server, the image information drawn with the AR information. The AR information further comprises: distance information between the first terminal and the second terminal, and relevant auxiliary prompt information such as user-related personal information and other information obtainable from the associated account, for example the called taxi driver's information, car model information and license plate number. The AR information is used to present, in an augmented reality manner, the location information of the second terminal in the image or video data captured by the first terminal.
  • A 3D model of a navigation area is preset in the server. The 3D model carrying the location and direction information of the first terminal and location information of the second terminal is generated according to the location and direction information of the first terminal and the location information of the second terminal.
  • Specifically, the server performs spatial calculation according to the 3D model and maps the location information of the second terminal to a 2-dimensional location in the viewfinder of the camera of the first terminal. The AR information is then drawn at that 2-dimensional location on the image information using computer graphics processing technology, yielding the image information drawn with the AR information, which is sent to the first terminal for display.
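A minimal sketch of this server-side drawing step, assuming the Pillow imaging library and that a projection such as the earlier sketch already produced the 2-dimensional anchor point; the marker style only loosely imitates the balloon-shaped guide identifier of FIG. 2:

```python
from PIL import Image, ImageDraw

def draw_ar_overlay(frame_path, anchor_xy, label, out_path):
    """Draw a balloon-style marker and a text label (e.g., driver name and
    distance) at the projected 2-D position, then save the annotated frame
    for transmission back to the first terminal."""
    img = Image.open(frame_path).convert("RGB")
    draw = ImageDraw.Draw(img)
    x, y = anchor_xy
    draw.ellipse((x - 12, y - 12, x + 12, y + 12), outline=(255, 0, 0), width=3)
    draw.line((x, y + 12, x, y + 40), fill=(255, 0, 0), width=3)  # balloon tail
    draw.text((x + 16, y - 8), label, fill=(255, 0, 0))
    img.save(out_path, format="JPEG")
    return out_path
```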
  • In 504, the first terminal displays the image information drawn with the AR information on its display module.
  • Preferably, as shown in FIG. 2, the AR information is displayed in the image information in the form of a symbol or icon, such as an arrow or a balloon-shaped guide identifier, used to indicate the second terminal. The displayed content comprises the location of the second terminal and the distance information between the first terminal and the second terminal, and may further comprise other relevant auxiliary information such as user-related personal information and other information obtainable from the associated account.
  • Preferably, the first terminal further provides a human-machine interaction interface. Clicking the symbol or icon on the display unit further displays other relevant auxiliary information, such as user-related personal information and other information obtainable from the associated account. For example, as shown in FIG. 3, the symbol or icon may be clicked to further display auxiliary information such as the taxi driver's name, car model, license plate number, the distance between the first terminal and the second terminal, and wait duration; furthermore, a virtual key for making a phone call is included, which may be clicked to invoke the phone function to contact the taxi driver.
  • Correspondingly, FIG. 6 is a flow chart of an augmented reality positioning method for location-based service LBS according to another embodiment of the present disclosure. As shown in FIG. 6, the method comprises:
  • 601: the server receives location information and direction information transmitted by terminals, and the terminals comprise a first terminal and a second terminal;
  • Specifically, the server receives image information transmitted by the first terminal and captured by the camera.
  • 602: the server transmits AR information to the first terminal. The AR information is generated based on location information and direction information of the second terminal so that the first terminal, upon obtaining image information captured by the camera, displays the image information drawn with the AR information.
  • The server transmitting the AR information to the first terminal comprises: the server transmits the image information drawn with the AR information to the first terminal. The AR information further comprises: distance information between the first terminal and the second terminal, and relevant auxiliary prompt information such as user-related personal information and other information obtainable from the associated account, for example the called taxi driver's information, car model information and license plate number.
  • Specifically,
  • a 3D model of a navigation area is preset in the server. The 3D model carrying the location and direction information of the first terminal and location information of the second terminal is generated according to the location and direction information of the first terminal and the location information of the second terminal.
  • The server performs spatial calculation according to the 3D model, maps the location information of the second terminal to a 2-dimensional location in the viewfinder of the camera of the first terminal, and places the AR information at that 2-dimensional location.
  • Specifically, the AR information is drawn on the image information using computer graphics processing technology, to obtain the image information drawn with the AR information, which is sent to the first terminal for display.
  • The technical solutions provided by the above embodiments avoid the following drawbacks of the prior art: the obtained location is usually inaccurate; when mutual information indicates that the terminals are close to each other, positioning cannot be performed on the terminals' interfaces; and in a complicated environment it is very difficult for the user to make a quick, accurate and direct judgment, or even impossible to obtain more accurate mutually suggestive location information. The technical solutions help the user make a quick positioning judgment; furthermore, more interactive content or presented information can be combined with the real scene more directly, and this may be implemented on mobile terminals such as mobile phones without extra hardware devices.
  • As appreciated, for ease of description, the aforesaid method embodiments are all described as combinations of a series of actions, but those skilled in the art should appreciate that the present disclosure is not limited to the described order of actions, because some steps may be performed in other orders or simultaneously according to the present disclosure. Secondly, those skilled in the art should appreciate that the embodiments described in the description are all preferred embodiments, and the involved actions and modules are not necessarily requisite for the present disclosure.
  • In the above embodiments, different emphasis is placed on respective embodiments, and reference may be made to related depictions in other embodiments for portions not detailed in a certain embodiment.
  • FIG. 7 is a block diagram of an augmented reality positioning apparatus for location-based service LBS according to another embodiment of the present disclosure. As shown in FIG. 7, the apparatus comprises a positioning module 71, a transmitting module 72, a receiving module 73 and a display module 74; wherein
  • the positioning module is configured to obtain location information and direction information of a terminal;
  • the transmitting module is configured to transmit the location information and direction information to a server;
  • the receiving module is configured to receive AR information transmitted by the server, the AR information being generated according to the location information of a second terminal;
  • the display module is configured to display the image information drawn with the AR information.
  • Preferably, the AR information comprises a 3D model carrying the location and direction information of the first terminal and the location information of the second terminal. The apparatus further comprises a drawing module configured to draw the AR information to the image information.
  • Preferably, the transmitting module is further configured to transmit the image information to the server; the receiving the AR information transmitted by the server comprises: receiving, from the server, the image information drawn with the AR information.
  • FIG. 8 is a block diagram of an augmented reality positioning apparatus for location-based service LBS according to another embodiment of the present disclosure. As shown in FIG. 8, the apparatus comprises a receiving module 81 and a transmitting module 82; wherein,
  • the receiving module is configured to receive location information and direction information transmitted by terminals, the terminals comprising a first terminal and a second terminal;
  • the transmitting module is configured to transmit AR information to the first terminal. The AR information is generated based on the location information and direction information of the second terminal so that the first terminal, upon obtaining image information captured by the camera, displays the image information drawn with the AR information.
  • Preferably, the AR information comprises a 3D model carrying the location and direction information of the first terminal and the location information of the second terminal. The image information drawn with the AR information is drawn by the first terminal.
  • Preferably, the receiving module is configured to receive the image information transmitted by the first terminal; the transmitting the AR information to the first terminal comprises: transmitting the image information drawn with the AR information to the first terminal.
  • Those skilled in the art may clearly understand that, for the sake of convenient and brief description, the specific working procedures of the aforesaid terminals and server are not detailed here; reference may be made to the corresponding procedures in the above method embodiments.
  • In the embodiments provided by the present disclosure, it should be understood that the revealed method and apparatus can be implemented in other ways. For example, the above-described apparatus embodiments are only exemplary; e.g., the division of the units is merely a logical division, and in reality they can be divided in other ways upon implementation. For example, a plurality of units or components may be combined or integrated into another system, or some features may be neglected or not executed. In addition, mutual coupling or direct coupling or communicative connection as displayed or discussed may be indirect coupling or communicative connection via some interfaces, means or units, and may be electrical, mechanical or in other forms.
  • The units described as separate parts may or may not be physically separated, and the parts shown as units may or may not be physical units, i.e., they can be located in one place or distributed across a plurality of network units. Some or all of the units may be selected to achieve the purpose of the embodiment according to actual needs. Further, in the embodiments of the present disclosure, functional units may be integrated in one processing unit, may exist as separate physical units, or two or more units may be integrated in one unit. The integrated unit may be implemented in the form of hardware, or in the form of hardware plus software functional units.
  • FIG. 9 illustrates a block diagram of an example computer system/server 012 adapted to implement an implementation mode of the present disclosure. The computer system/server 012 shown in FIG. 9 is only an example and should not bring about any limitation to the function and scope of use of the embodiments of the present disclosure.
  • As shown in FIG. 9, the computer system/server 012 is shown in the form of a general-purpose computing device. The components of computer system/server 012 may include, but are not limited to, one or more processors or processing units 016, a memory 028, and a bus 018 that couples various system components including system memory 028 and the processor 016.
  • Bus 018 represents one or more of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. By way of example, and not limitation, such architectures include Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus.
  • Computer system/server 012 typically includes a variety of computer system readable media. Such media may be any available media that is accessible by computer system/server 012, and it includes both volatile and non-volatile media, removable and non-removable media.
  • Memory 028 can include computer system readable media in the form of volatile memory, such as random access memory (RAM) 030 and/or cache memory 032. Computer system/server 012 may further include other removable/non-removable, volatile/non-volatile computer system storage media. By way of example only, storage system 034 can be provided for reading from and writing to a non-removable, non-volatile magnetic media (not shown in FIG. 9 and typically called a “hard drive”). Although not shown in FIG. 9, a magnetic disk drive for reading from and writing to a removable, non-volatile magnetic disk (e.g., a “floppy disk”), and an optical disk drive for reading from or writing to a removable, non-volatile optical disk such as a CD-ROM, DVD-ROM or other optical media can be provided. In such instances, each drive can be connected to bus 018 by one or more data media interfaces. The memory 028 may include at least one program product having a set (e.g., at least one) of program modules that are configured to carry out the functions of embodiments of the present disclosure.
  • Program/utility 040, having a set (at least one) of program modules 042, may be stored in the system memory 028 by way of example, and not limitation, as may an operating system, one or more application programs, other program modules, and program data. Each of these examples, or a certain combination thereof, might include an implementation of a networking environment. Program modules 042 generally carry out the functions and/or methodologies of embodiments of the present disclosure.
  • Computer system/server 012 may also communicate with one or more external devices 014 such as a keyboard, a pointing device, a display 024, etc.; with one or more devices that enable a user to interact with computer system/server 012; and/or with any devices (e.g., network card, modem, etc.) that enable computer system/server 012 to communicate with one or more other computing devices. Such communication can occur via Input/Output (I/O) interfaces 022. Still yet, computer system/server 012 can communicate with one or more networks such as a local area network (LAN), a general wide area network (WAN), and/or a public network (e.g., the Internet) via network adapter 020. As depicted in FIG. 9, network adapter 020 communicates with the other modules of computer system/server 012 via bus 018. It should be understood that although not shown, other hardware and/or software modules could be used in conjunction with computer system/server 012. Examples include, but are not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data archival storage systems.
  • The processing unit 016 executes functions and/or methods in the embodiments described in the present disclosure by running programs stored in the memory 028.
  • The above computer program may be stored in a computer storage medium, i.e., the computer storage medium is encoded with a computer program. The program, when executed by one or more computers, enables one or more computers to execute steps of the method and/or operations of the apparatus shown in the above embodiments of the present disclosure.
  • As time goes by and technologies develop, the meaning of medium becomes increasingly broad: a propagation channel of the computer program is no longer limited to a tangible medium, and the program may also be directly downloaded from the network. The computer-readable medium of the present embodiment may employ any combination of one or more computer-readable media. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of the machine-readable storage medium include an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the text herein, the computer-readable storage medium can be any tangible medium that contains or stores a program for use by an instruction execution system, apparatus or device, or a combination thereof.
  • The computer-readable signal medium may be a data signal propagated in baseband or as part of a carrier, carrying computer-readable program code therein. Such a propagated data signal may take many forms, including, but not limited to, an electromagnetic signal, an optical signal, or any suitable combination thereof. The computer-readable signal medium may further be any computer-readable medium other than the computer-readable storage medium, and may send, propagate or transmit a program for use by an instruction execution system, apparatus or device, or a combination thereof.
  • The program code included in the computer-readable medium may be transmitted over any suitable medium, including, but not limited to, radio, electric wire, optical cable, RF or the like, or any suitable combination thereof.
  • Computer program code for carrying out operations disclosed herein may be written in one or more programming languages or any combination thereof. These programming languages include an object oriented programming language such as Java, Smalltalk, C++ or the like, and conventional procedural programming languages, such as the “C” programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
  • Finally, it is appreciated that the above embodiments are only used to illustrate the technical solutions of the present disclosure, not to limit it. Although the present disclosure is described in detail with reference to the above embodiments, those having ordinary skill in the art should understand that they may still modify the technical solutions recited in the aforesaid embodiments or equivalently replace some technical features therein; such modifications or substitutions do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present disclosure.

Claims (27)

What is claimed is:
1. An augmented reality positioning method for location-based service LBS, wherein the method executed by a first terminal comprises:
obtaining image information captured by a camera, and receiving AR information transmitted by a server, wherein the AR information is generated according to location information of a second terminal;
displaying the image information drawn with the AR information.
2. The augmented reality positioning method for location-based service LBS according to claim 1, wherein the obtaining image information captured by a camera comprises:
capturing an event that a real scene navigation function is triggered, and activating a camera of the first terminal; wherein the event that real scene navigation function is triggered comprises: click of a real scene navigation button, or the tilt angle posture of the first terminal being in a preset range.
3. The augmented reality positioning method for location-based service LBS according to claim 1, wherein the method further comprises:
transmitting location information and direction information to the server;
wherein the AR information comprises a 3D model carrying the location and direction information of the first terminal and the location information of the second terminal;
the image information drawn with the AR information is drawn by the first terminal.
4. The augmented reality positioning method for location-based service LBS according to claim 1, wherein the method further comprises:
transmitting the location information and direction information to the server;
transmitting the image information to the server;
the receiving AR information transmitted by the server comprises: receiving the image information transmitted by the server and drawn with the AR information.
5. The augmented reality positioning method for location-based service LBS according to claim 1, wherein the AR information comprises:
distance information of the first terminal and second terminal, relevant prompt auxiliary information.
6. An augmented reality positioning method for location-based service LBS, wherein the method executed by a server comprises:
transmitting AR information to a first terminal, and the AR information is generated based on location information of a second terminal so that the first terminal, upon obtaining image information captured by a camera, displays the image information drawn with the AR information.
7. The augmented reality positioning method for location-based service LBS according to claim 6, wherein, the method further comprises:
receiving location information and direction information transmitted by terminals, and the terminals comprise the first terminal and the second terminal;
wherein the AR information comprises a 3D model carrying the location and direction information of the first terminal and the location information of the second terminal;
the image information drawn with the AR information is drawn by the first terminal.
8. The augmented reality positioning method for location-based service LBS according to claim 6, wherein the method further comprises:
receiving location information and direction information transmitted by terminals, and the terminals comprise the first terminal and the second terminal;
receiving the image information transmitted by the first terminal;
the transmitting the AR information to a first terminal comprises: transmitting the image information drawn with the AR information to the first terminal.
9. The augmented reality positioning method for location-based service LBS according to claim 6, wherein the AR information comprises:
distance information of the first terminal and second terminal, relevant prompt auxiliary information.
10. A first terminal, wherein the first terminal comprises:
one or more processors;
a storage device for storing one or more programs,
when said one or more programs are executed by said one or more processors, said one or more processors are enabled to implement the following operation:
obtaining image information captured by a camera, and receiving AR information transmitted by a server, wherein the AR information is generated according to location information of a second terminal;
displaying the image information drawn with the AR information.
11. The first terminal according to claim 10, wherein the obtaining image information captured by a camera comprises:
capturing an event that a real scene navigation function is triggered, and activating a camera of the first terminal; wherein the event that real scene navigation function is triggered comprises: click of a real scene navigation button, or the tilt angle posture of the first terminal being in a preset range.
12. The first terminal according to claim 10, wherein the operation further comprises:
transmitting location information and direction information to the server;
wherein the AR information comprises a 3D model carrying the location and direction information of the first terminal and the location information of the second terminal;
the image information drawn with the AR information is drawn by the first terminal.
13. The first terminal according to claim 10, wherein the operation further comprises:
transmitting the location information and direction information to the server;
transmitting the image information to the server;
the receiving AR information transmitted by the server comprises: receiving the image information transmitted by the server and drawn with the AR information.
14. The first terminal according to claim 10, wherein the AR information comprises:
distance information of the first terminal and second terminal, relevant prompt auxiliary information.
15. A server, wherein the server comprises:
one or more processors;
a storage device for storing one or more programs,
when said one or more programs are executed by said one or more processors, said one or more processors are enabled to implement the following operation:
transmitting AR information to a first terminal, and the AR information is generated based on location information of a second terminal so that the first terminal, upon obtaining image information captured by a camera, displays the image information drawn with the AR information.
16. The server according to claim 15, wherein, the operation further comprises:
receiving location information and direction information transmitted by terminals, and the terminals comprise the first terminal and the second terminal;
wherein the AR information comprises a 3D model carrying the location and direction information of the first terminal and the location information of the second terminal;
the image information drawn with the AR information is drawn by the first terminal.
17. The server according to claim 15, wherein the operation further comprises:
receiving location information and direction information transmitted by terminals, and the terminals comprise the first terminal and the second terminal;
receiving the image information transmitted by the first terminal;
the transmitting the AR information to a first terminal comprises: transmitting the image information drawn with the AR information to the first terminal.
18. The server according to claim 15, wherein the AR information comprises:
distance information of the first terminal and second terminal, relevant prompt auxiliary information.
19. A computer-readable storage medium on which a computer program is stored, wherein the program, when executed by a processor of a first terminal, implements the following operation:
obtaining image information captured by a camera, and receiving AR information transmitted by a server, wherein the AR information is generated according to location information of a second terminal;
displaying the image information drawn with the AR information.
20. The computer-readable storage medium according to claim 19, wherein the obtaining image information captured by a camera comprises:
capturing an event that a real scene navigation function is triggered, and activating a camera of the first terminal; wherein the event that real scene navigation function is triggered comprises: click of a real scene navigation button, or the tilt angle posture of the first terminal being in a preset range.
21. The computer-readable storage medium according to claim 19, wherein the operation further comprises:
transmitting location information and direction information to the server;
wherein the AR information comprises a 3D model carrying the location and direction information of the first terminal and the location information of the second terminal;
the image information drawn with the AR information is drawn by the first terminal.
22. The computer-readable storage medium according to claim 19, wherein the operation further comprises:
transmitting the location information and direction information to the server;
transmitting the image information to the server;
the receiving AR information transmitted by the server comprises: receiving the image information transmitted by the server and drawn with the AR information.
23. The computer-readable storage medium according to claim 19, wherein the AR information comprises:
distance information of the first terminal and second terminal, relevant prompt auxiliary information.
24. A computer-readable storage medium on which a computer program is stored, wherein the program, when executed by a processor, implements the following operation:
transmitting AR information to a first terminal, and the AR information is generated based on location information of a second terminal so that the first terminal, upon obtaining image information captured by a camera, displays the image information drawn with the AR information.
25. The computer-readable storage medium according to claim 24, wherein, the operation further comprises:
receiving location information and direction information transmitted by terminals, and the terminals comprise the first terminal and the second terminal;
wherein the AR information comprises a 3D model carrying the location and direction information of the first terminal and the location information of the second terminal;
the image information drawn with the AR information is drawn by the first terminal.
26. The computer-readable storage medium according to claim 25, wherein the operation further comprises:
receiving location information and direction information transmitted by terminals, and the terminals comprise the first terminal and the second terminal;
receiving the image information transmitted by the first terminal;
the transmitting the AR information to a first terminal comprises: transmitting the image information drawn with the AR information to the first terminal.
27. The computer-readable storage medium according to claim 25, wherein the AR information comprises:
distance information of the first terminal and second terminal, relevant prompt auxiliary information.
US15/991,343 2017-06-08 2018-05-29 Augmented reality positioning method and apparatus for location-based service LBS Active 2038-08-16 US11164379B2 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
CN201710428342.7A CN107450088B (en) 2017-06-08 2017-06-08 Location-based service LBS augmented reality positioning method and device
CN201710428342.7 2017-06-08
CN2017104283427 2017-06-08

Publications (2)

Publication Number Publication Date
US20180357824A1 true US20180357824A1 (en) 2018-12-13
US11164379B2 US11164379B2 (en) 2021-11-02

Family

ID=60486852

Family Applications (1)

Application Number Title Priority Date Filing Date
US15/991,343 Active 2038-08-16 US11164379B2 (en) 2017-06-08 2018-05-29 Augmented reality positioning method and apparatus for location-based service LBS

Country Status (2)

Country Link
US (1) US11164379B2 (en)
CN (1) CN107450088B (en)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109767095A (en) * 2018-12-28 2019-05-17 中交一航局安装工程有限公司 A kind of construction system and its implementation based on BIM model and AR technology
JP2020008561A (en) * 2018-07-06 2020-01-16 本田技研工業株式会社 Device and method for presenting information and program
CN110928413A (en) * 2019-11-22 2020-03-27 北京新势界科技有限公司 Method and device for establishing AR information point through terminal
US20200118338A1 (en) * 2018-10-12 2020-04-16 Mapbox, Inc. Candidate geometry displays for augmented reality
CN111982108A (en) * 2019-05-24 2020-11-24 北京京东尚科信息技术有限公司 Mobile robot positioning method, device, equipment and storage medium
WO2020264013A1 (en) * 2019-06-28 2020-12-30 Snap Inc. Real-time augmented-reality costuming
US11302079B2 (en) * 2017-06-09 2022-04-12 Nearme AR, LLC Systems and methods for displaying and interacting with a dynamic real-world environment
CN114935973A (en) * 2022-04-11 2022-08-23 北京达佳互联信息技术有限公司 Interactive processing method, device, equipment and storage medium
US11461976B2 (en) 2018-10-17 2022-10-04 Mapbox, Inc. Visualization transitions for augmented reality
US20230101411A1 (en) * 2021-09-30 2023-03-30 Gm Cruise Holdings Llc User preview of rideshare service vehicle surroundings
US20230154058A1 (en) * 2021-11-17 2023-05-18 Meta Platforms Technologies, Llc Systems and Methods for Content Sharing Between Artificial-Reality Devices
US11807278B2 (en) 2020-10-21 2023-11-07 Gm Cruise Holdings Llc Autonomous vehicle passenger safety monitoring

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109996032B (en) * 2017-12-29 2020-10-02 杭州海康威视系统技术有限公司 Information display method and device, computer equipment and storage medium
CN111044061B (en) * 2018-10-12 2023-03-28 腾讯大地通途(北京)科技有限公司 Navigation method, device, equipment and computer readable storage medium
CN109348425B (en) * 2018-11-13 2021-01-01 苏州达家迎信息技术有限公司 Positioning information updating method, device, equipment and storage medium
CN109489654B (en) * 2018-11-20 2020-07-28 百度在线网络技术(北京)有限公司 Navigation route presenting method, device, equipment and storage medium
US11397503B2 (en) * 2019-06-28 2022-07-26 Snap Inc. Association of user identifiers to augmented-reality content
CN112432636B (en) * 2020-11-30 2023-04-07 浙江商汤科技开发有限公司 Positioning method and device, electronic equipment and storage medium
CN114167985B (en) * 2021-11-29 2022-08-12 中国科学院计算机网络信息中心 Emergency task augmented reality application method and system based on 5G

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101330807B1 (en) * 2011-08-31 2013-11-18 주식회사 팬택 Apparatus and method for sharing data using augmented reality
KR101892280B1 (en) * 2011-09-21 2018-08-28 엘지전자 주식회사 Mobile terminal and control method thereof
CN102801788B (en) * 2012-07-17 2015-12-16 中兴通讯股份有限公司 A kind of methods, devices and systems realizing augmented reality information sharing
US20140278053A1 (en) * 2013-03-14 2014-09-18 Samsung Electronics Co., Ltd. Navigation system with dynamic update mechanism and method of operation thereof
US20150193982A1 (en) * 2014-01-03 2015-07-09 Google Inc. Augmented reality overlays using position and orientation to facilitate interactions between electronic devices
WO2016168678A1 (en) * 2015-04-15 2016-10-20 Uber Technologies, Inc. Programmatically providing information in connection with location-based services to service providers
CN105973231A (en) * 2016-06-30 2016-09-28 百度在线网络技术(北京)有限公司 Navigation method and navigation device
CN106250187A (en) * 2016-07-29 2016-12-21 宇龙计算机通信科技(深圳)有限公司 The information processing method of a kind of augmented reality AR, Apparatus and system
CN107024980A (en) * 2016-10-26 2017-08-08 阿里巴巴集团控股有限公司 Customer location localization method and device based on augmented reality
CN106767754A (en) * 2016-11-30 2017-05-31 宇龙计算机通信科技(深圳)有限公司 A kind of processing method of navigation information, terminal and server

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020069013A1 (en) * 2000-10-05 2002-06-06 Nassir Navab Method and system for computer assisted localization, site navigation, and data navigation
US20110181598A1 (en) * 2010-01-25 2011-07-28 O'neall Andrew J Displaying Maps of Measured Events
US20180322707A1 (en) * 2013-01-25 2018-11-08 Tencent Technology (Shenzhen) Company Limited Method and system for performing interaction based on augmented reality
US20150269190A1 (en) * 2014-03-18 2015-09-24 Yuan-Ze University Method and system for vehicle identification
US20170103452A1 (en) * 2015-10-13 2017-04-13 Xperiel, Inc. Platform for Providing Customizable User Brand Experiences, Sponsorship Junctions, and Conversion Attribution
CN106982240A (en) * 2016-01-18 2017-07-25 腾讯科技(北京)有限公司 The display methods and device of information

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11302079B2 (en) * 2017-06-09 2022-04-12 Nearme AR, LLC Systems and methods for displaying and interacting with a dynamic real-world environment
US12086374B2 (en) 2017-06-09 2024-09-10 Nearme AR, LLC Systems and methods for displaying and interacting with a dynamic real-world environment
JP2020008561A (en) * 2018-07-06 2020-01-16 本田技研工業株式会社 Device and method for presenting information and program
JP7361486B2 (en) 2018-07-06 2023-10-16 本田技研工業株式会社 Information presentation device, information presentation method, and program
US20200118338A1 (en) * 2018-10-12 2020-04-16 Mapbox, Inc. Candidate geometry displays for augmented reality
US10964112B2 (en) * 2018-10-12 2021-03-30 Mapbox, Inc. Candidate geometry displays for augmented reality
US11461976B2 (en) 2018-10-17 2022-10-04 Mapbox, Inc. Visualization transitions for augmented reality
CN109767095A (en) * 2018-12-28 2019-05-17 中交一航局安装工程有限公司 A kind of construction system and its implementation based on BIM model and AR technology
CN111982108A (en) * 2019-05-24 2020-11-24 北京京东尚科信息技术有限公司 Mobile robot positioning method, device, equipment and storage medium
WO2020264013A1 (en) * 2019-06-28 2020-12-30 Snap Inc. Real-time augmented-reality costuming
CN110928413A (en) * 2019-11-22 2020-03-27 北京新势界科技有限公司 Method and device for establishing AR information point through terminal
US11807278B2 (en) 2020-10-21 2023-11-07 Gm Cruise Holdings Llc Autonomous vehicle passenger safety monitoring
US20230101411A1 (en) * 2021-09-30 2023-03-30 Gm Cruise Holdings Llc User preview of rideshare service vehicle surroundings
US20230116185A1 (en) * 2021-09-30 2023-04-13 Gm Cruise Holdings Llc User preview of rideshare service vehicle surroundings
US11859995B2 (en) * 2021-09-30 2024-01-02 Gm Cruise Holdings Llc User preview of rideshare service vehicle surroundings
US11761781B2 (en) * 2021-09-30 2023-09-19 Gm Cruise Holdings Llc User preview of rideshare service vehicle surroundings
US20230154058A1 (en) * 2021-11-17 2023-05-18 Meta Platforms Technologies, Llc Systems and Methods for Content Sharing Between Artificial-Reality Devices
US11941725B2 (en) * 2021-11-17 2024-03-26 Meta Platforms Technologies, Llc Systems and methods for content sharing between artificial-reality devices
CN114935973A (en) * 2022-04-11 2022-08-23 北京达佳互联信息技术有限公司 Interactive processing method, device, equipment and storage medium

Also Published As

Publication number Publication date
CN107450088B (en) 2021-05-14
US11164379B2 (en) 2021-11-02
CN107450088A (en) 2017-12-08

Similar Documents

Publication Title
US11164379B2 (en) Augmented reality positioning method and apparatus for location-based service LBS
US10445945B2 (en) Directional and X-ray view techniques for navigation using a mobile device
US10677596B2 (en) Image processing device, image processing method, and program
US9074899B2 (en) Object guiding method, mobile viewing system and augmented reality system
JP6116756B2 (en) Positioning / navigation method, apparatus, program, and recording medium
TWI694298B (en) Information display method, device and terminal
CN107480173B (en) POI information display method and device, equipment and readable medium
KR20190018243A (en) Method and system for navigation using video call
CN109489654B (en) Navigation route presenting method, device, equipment and storage medium
CN112307363A (en) Virtual-real fusion display method and device, electronic equipment and storage medium
CN112432636B (en) Positioning method and device, electronic equipment and storage medium
CN114187509B (en) Object positioning method and device, electronic equipment and storage medium
WO2022237071A1 (en) Locating method and apparatus, and electronic device, storage medium and computer program
WO2022110777A1 (en) Positioning method and apparatus, electronic device, storage medium, computer program product, and computer program
CN113452842B (en) Flight AR display method, system, computer equipment and storage medium
KR20180019861A (en) Street view translation supplying apparatus and method for mobile
CN112689114B (en) Method, apparatus, device and medium for determining target position of vehicle
CN112965652A (en) Information display method and device, electronic equipment and storage medium
CN108897841A (en) Panorama sketch searching method, device, equipment, server and storage medium

Legal Events

Date Code Title Description
FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

AS Assignment

Owner name: BAIDU ONLINE NETWORK TECHNOLOGY (BEIJING) CO., LTD., CHINA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WU, ZHONGQIN;YAO, MIAO;ZHANG, YONGJIE;REEL/FRAME:045977/0149

Effective date: 20180517

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: ADVISORY ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: ADVISORY ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF Information on status: patent grant

Free format text: PATENTED CASE