CN111238495A - Method for positioning vehicle and terminal equipment - Google Patents

Method for positioning vehicle and terminal equipment

Info

Publication number
CN111238495A
Authority
CN
China
Prior art keywords
live-action map
target vehicle
user
target
Prior art date
Legal status
Pending
Application number
CN202010010221.2A
Other languages
Chinese (zh)
Inventor
何君尧
钟钰
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN202010010221.2A
Publication of CN111238495A
Priority to PCT/CN2020/141166
Status: Pending

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00
    • G01C21/26 - Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00 specially adapted for navigation in a road network
    • G01C21/28 - Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G01C21/30 - Map- or contour-matching
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00 - Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38 - Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39 - Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system, the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42 - Determining position

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Automation & Control Theory (AREA)
  • Navigation (AREA)

Abstract

An embodiment of the invention provides a method for positioning a vehicle and a terminal device. The method includes: determining live-action information within a target range according to the current position of a user; determining a target route from the current position of a target vehicle to the current position of the user; drawing a live-action map according to the live-action information within the target range and the target route, where the live-action map is drawn from the perspective of the user's current position; and displaying the current position of the target vehicle in the live-action map. The method and terminal device provided by the embodiment of the invention display the position of the target vehicle on a live-action map, so that the target vehicle can be located conveniently and quickly.

Description

Method for positioning vehicle and terminal equipment
Technical Field
The invention relates to the technical field of terminal devices, and in particular to a method for positioning a vehicle and a terminal device.
Background
In some scenarios, a user cannot conveniently and quickly locate a target vehicle. For example, when waiting for a ride-hailing vehicle, the user can only look for the car by means of the navigation map on the terminal device and real-time observation of the road; for another example, in a large garage, the user can only find a vehicle by observation and memory. In both cases the efficiency of locating the vehicle is low, and the user's time is wasted.
Disclosure of Invention
Embodiments of the invention aim to provide a method for positioning a vehicle and a terminal device, which can locate a target vehicle conveniently and quickly.
To solve the above technical problem, the embodiments of the present invention provide the following aspects.
In a first aspect, an embodiment of the present invention provides a method for locating a vehicle, including: determining live-action information within a target range according to the current position of a user; determining a target route from the current position of a target vehicle to the current position of the user; drawing a live-action map according to the live-action information within the target range and the target route, where the live-action map is drawn from the perspective of the user's current position; and displaying the current position of the target vehicle in the live-action map.
In a second aspect, an embodiment of the present invention provides a terminal device, including: a processing module for determining live-action information within a target range according to the current position of a user; a determination module for determining a target route from the current position of a target vehicle to the current position of the user; an obtaining module for drawing a live-action map according to the live-action information within the target range and the target route, where the live-action map is drawn from the perspective of the user's current position; and a display module for displaying the current position of the target vehicle in the live-action map.
In a third aspect, an embodiment of the present invention provides a terminal device, including: a memory, a processor, and computer-executable instructions stored on the memory and executable on the processor, where the computer-executable instructions, when executed by the processor, implement the steps of the method for locating a vehicle described in the first aspect above.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium for storing computer-executable instructions which, when executed by a processor, implement the steps of the method for locating a vehicle described in the first aspect above.
According to the method for positioning a vehicle and the terminal device provided by the embodiments of the invention, live-action information within a target range is determined according to the current position of the user; a target route from the current position of a target vehicle to the current position of the user is determined; a live-action map is drawn according to the live-action information within the target range and the target route, from the perspective of the user's current position; and the current position of the target vehicle is displayed in the live-action map. By presenting the position of the target vehicle on a live-action map, the target vehicle can be located conveniently and quickly.
Drawings
To illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings used in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present invention, and those skilled in the art can derive other drawings from them without creative effort.
FIG. 1 is a schematic flow chart diagram illustrating a method for locating a vehicle according to an embodiment of the present invention;
FIG. 2 shows a schematic diagram of a live-action map;
FIG. 3 is a schematic flow chart illustrating a method for locating a vehicle according to an embodiment of the present invention;
FIG. 4 is a schematic flow chart illustrating a method for locating a vehicle according to an embodiment of the present invention;
FIG. 5 is a schematic structural diagram of a terminal device according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of the hardware structure of a terminal device for implementing an embodiment of the present invention.
Detailed Description
To help those skilled in the art better understand the technical solution of the present invention, the technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only a part, rather than all, of the embodiments of the present invention. All other embodiments obtained by a person skilled in the art from the embodiments given herein without creative effort shall fall within the protection scope of the present invention.
FIG. 1 shows a flowchart of a method for locating a vehicle according to an embodiment of the present invention; the method may be executed by a terminal device. In other words, the method may be performed by software or hardware installed in the terminal device. As shown, the method may include the following steps.
S102: and determining the live-action information in the target range according to the current position of the user.
In this step, in one implementation, the target range may be determined dynamically from the positions of the user and the target vehicle. For example, the area between the user and the current position of the target vehicle is taken as the target range, and the live-action information within that range is determined. Suppose the user is currently at place A and the target vehicle is currently at place B; a circle can then be drawn whose center is the midpoint between place A and place B and whose radius is the distance from that midpoint to place A (or place B), and the live-action information within this circle is determined. Of course, a similar method may be used to determine the live-action information within a rectangular range.
In another implementation, a preset range may be set as the target range according to the current position of the user. For example, the range 200 meters ahead of the user's current position may be used as the target range, or a circle with the user's current position as its center and 200 meters as its radius may be used as the preset range. The live-action information within the target range is then determined based on the target range.
This implementation is suitable when the distance between the user's current position A and the target vehicle's current position B far exceeds the range the user can observe with the naked eye. Retrieving live-action information over such a large range would impose unnecessary data processing on the terminal device; in this case, a preset range matched to the human visual range can be set as the target range, which both facilitates the user's observation and reduces the data processing load of the terminal device.
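To make the two range strategies concrete, the following minimal Python sketch is offered for illustration only; it assumes positions have already been projected into a local planar coordinate system measured in meters, and the Point and CircularRange types, the target_range helper, and the use of 200 meters as the visual-range cutoff are assumptions of this sketch, not details taken from the embodiment.

    import math
    from dataclasses import dataclass

    @dataclass
    class Point:
        x: float  # meters east of a local origin (planar approximation)
        y: float  # meters north of a local origin

    @dataclass
    class CircularRange:
        center: Point
        radius_m: float

    def target_range(user: Point, vehicle: Point,
                     visual_range_m: float = 200.0) -> CircularRange:
        """Choose the target range for step S102: a dynamic circle through
        both positions when the vehicle is near, otherwise a preset circle
        around the user that caps the amount of live-action data fetched."""
        dist = math.hypot(vehicle.x - user.x, vehicle.y - user.y)
        if dist > visual_range_m:
            return CircularRange(center=user, radius_m=visual_range_m)
        midpoint = Point((user.x + vehicle.x) / 2, (user.y + vehicle.y) / 2)
        return CircularRange(center=midpoint, radius_m=dist / 2)

    # A vehicle 120 m away yields the dynamic midpoint circle; one 600 m
    # away falls back to the preset 200 m circle centered on the user.
    print(target_range(Point(0, 0), Point(120, 0)))
    print(target_range(Point(0, 0), Point(600, 0)))

The same midpoint-and-radius construction extends straightforwardly to the rectangular range mentioned above.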
S104: a target route is determined from a current location of a target vehicle to a current location of the user.
Based on the current position of the user and the current position of the target vehicle, a target route from the current position of the target vehicle to the current position of the user may be determined using existing GPS positioning or navigation techniques.
S106: and drawing a live-action map according to the live-action information in the target range and the target route, wherein the live-action map is drawn by taking the current position of the user as a visual angle.
FIG. 2 is a schematic diagram of a live-action map, drawn according to the live-action information within the target range and the target route, from the perspective of the user's current position.
S108: displaying the current position of the target vehicle in the live-action map.
In a ride-hailing scenario, for example, the user can conveniently and quickly find the target vehicle through the live-action map. In a large-garage scenario, likewise, the user can conveniently and quickly find the target vehicle through the live-action map without having to search for it from memory.
Therefore, the method for positioning a vehicle provided by this embodiment of the invention determines live-action information within the target range according to the user's current position; determines a target route from the current position of the target vehicle to the current position of the user; draws a live-action map according to the live-action information within the target range and the target route, from the perspective of the user's current position; and displays the current position of the target vehicle in the live-action map. By presenting the position of the target vehicle on a live-action map, the target vehicle can be located conveniently and quickly.
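For orientation, the sketch below strings steps S102-S108 together, reusing the Point, CircularRange, and target_range definitions from the sketch above; fetch_live_action_info and plan_route are hypothetical placeholders for the street-view and navigation services the embodiment relies on, not real APIs.

    from dataclasses import dataclass, field

    @dataclass
    class LiveActionMap:
        viewpoint: Point   # the map is drawn from the user's current position
        scene: dict        # live-action information within the target range
        route: list        # target route from the vehicle to the user
        markers: list = field(default_factory=list)

        def mark_vehicle(self, label: str) -> None:
            self.markers.append(label)  # S108: show the vehicle in the map

    def fetch_live_action_info(rng: CircularRange) -> dict:
        # Placeholder for retrieving street-level scene data within the range.
        return {"center": rng.center, "radius_m": rng.radius_m}

    def plan_route(start: Point, end: Point) -> list:
        # Placeholder for an external GPS/navigation routing result.
        return [start, end]

    def locate_vehicle(user: Point, vehicle: Point) -> LiveActionMap:
        rng = target_range(user, vehicle)             # S102: choose the range
        scene = fetch_live_action_info(rng)           # S102: live-action info
        route = plan_route(vehicle, user)             # S104: vehicle to user
        live_map = LiveActionMap(user, scene, route)  # S106: draw the map
        live_map.mark_vehicle("target vehicle")       # S108: mark the vehicle
        return live_map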
FIG. 3 is a flowchart of a method for locating a vehicle according to an embodiment of the present invention; the method may be performed by a terminal device. In other words, the method may be performed by software or hardware installed in the terminal device. As shown, the method may include the following steps.
S302: and determining the live-action information in the target range according to the current position of the user.
S304: a target route is determined from a current location of a target vehicle to a current location of the user.
S306: and drawing a live-action map according to the live-action information in the target range and the target route, wherein the live-action map is drawn by taking the current position of the user as a visual angle.
Steps S302-S306 may be similar to steps S102-S106 of the embodiment of FIG. 1 and are not repeated here.
S308: and displaying the current position of the target vehicle in the live-action map, namely marking the current position of the target vehicle in the live-action map, wherein a starting end and a tail end are marked in the live-action map, and at least one preset unit distance is marked between the starting end and the tail end.
In one implementation, as shown in FIG. 2, the starting end and the tail end of the live-action map can be determined by two-way confirmation: the tail end is chosen so that it can be seen in the live-action view from the starting end, and, conversely, the starting end can be seen in the live-action view from the tail end. The starting end and the tail end are marked on the live-action map, and the distance between them is output as the farthest distance the user can recognize through the live-action map from the starting point; optionally, this farthest distance may be used as the preset range in step S102 or S302.
In addition, to help the user judge distances, at least one preset unit distance, such as the unit distance N, may be marked between the starting end and the tail end as a scale. In one implementation, since a vehicle body plus the gap to the vehicle ahead occupies at least 7 meters of a lane, the road live-action distance N can be marked in the live-action map using 7 meters as the scale.
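As a small illustration of this scale marking, the sketch below computes tick positions between the starting end and the tail end; scale_marks and the 60-meter span in the example are assumptions of this sketch, with only the 7-meter unit distance taken from the text above.

    def scale_marks(span_m: float, unit_m: float = 7.0) -> list:
        """Positions, in meters from the starting end, of the unit-distance
        ticks drawn up to the tail end; 7 m approximates one vehicle body
        plus its following gap, as described above."""
        if span_m < 0 or unit_m <= 0:
            raise ValueError("unit distance must be positive, span non-negative")
        n = int(span_m // unit_m)
        return [i * unit_m for i in range(1, n + 1)]

    # A 60 m recognizable span yields ticks at 7, 14, ..., 56 m, letting the
    # user count roughly how many car lengths away the target vehicle is.
    print(scale_marks(60.0))  # [7.0, 14.0, 21.0, 28.0, 35.0, 42.0, 49.0, 56.0]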
In one implementation, the identification information of the target vehicle may also be marked. For example, the vehicle may be identified with a first preset mark, such as the star in FIG. 2, and its identification information, such as one or more of the license plate number, vehicle type, and color, may be marked as well.
In one implementation, the real-time distance between the target vehicle and the user may also be marked.
S310: and acquiring information of other vehicles between the starting end and the tail end, and marking the information of the other vehicles in the live-action map.
On the basis of marking the target vehicle in the above steps, information of other nearby vehicles, for example other vehicles between the starting end and the tail end, can further be acquired through Internet of Vehicles and GPS positioning technology, and this information can be marked in the live-action map, so that the user can use the other vehicles as reference objects and find the target vehicle more efficiently.
In one implementation, information of other vehicles in the same-direction lane between the starting end and the tail end can be acquired and marked in the live-action map. Other vehicles in the same-direction lane are easier to recognize, which makes it more convenient for the user to find the target vehicle.
In one implementation, the information of the other vehicles may include, for example, one or more of the license plate number, color, and vehicle type. When this information is marked in the live-action map, the other vehicles may be marked with a second preset mark that is different from the first preset mark, making it easy for the user to distinguish the other vehicles from the target vehicle.
For example, if another vehicle marked in the live-action map is a red Audi located ahead of the target vehicle, then when the red Audi passes by, the user can conveniently and quickly infer that the target vehicle is approaching.
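A sketch of how such markers might be assembled follows; VehicleMarker, build_markers, the ASCII symbols, and the example distances are hypothetical, with only the first-mark/second-mark distinction and the red-Audi scenario taken from the text above.

    from dataclasses import dataclass

    @dataclass
    class VehicleMarker:
        distance_m: float  # distance from the starting end along the route
        symbol: str        # first preset mark for target, second for others
        label: str         # e.g. license plate number, color, vehicle type

    def build_markers(target: dict, others: list) -> list:
        """The target vehicle gets the first preset mark ('*'); same-lane
        reference vehicles get a visually distinct second mark ('o')."""
        markers = [VehicleMarker(target["dist"], "*", target["label"])]
        markers += [VehicleMarker(o["dist"], "o", o["label"]) for o in others]
        return sorted(markers, key=lambda m: m.distance_m)

    # The red Audi of the example sits ahead of the target vehicle; when it
    # passes the user, the target vehicle is known to be close behind.
    for m in build_markers({"dist": 45.0, "label": "target vehicle"},
                           [{"dist": 38.0, "label": "red Audi"}]):
        print(f"{m.symbol} {m.label} @ {m.distance_m:.0f} m")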
In a ride-hailing scenario, for example, the user can conveniently and quickly find the target vehicle through the live-action map. In a large-garage scenario, likewise, the user can conveniently and quickly find the target vehicle through the live-action map without having to search for it from memory.
Therefore, the method for positioning a vehicle provided by this embodiment of the invention determines live-action information within the target range according to the user's current position; determines a target route from the current position of the target vehicle to the current position of the user; draws a live-action map according to the live-action information within the target range and the target route, from the perspective of the user's current position; and displays the current position of the target vehicle in the live-action map. By presenting the position of the target vehicle on a live-action map, the target vehicle can be located conveniently and quickly.
In addition, in the method for positioning a vehicle provided by this embodiment, a starting end and a tail end are marked in the live-action map, and at least one preset unit distance is marked between them, so the preset unit distance can serve as a scale, making it convenient for the user to check the position of the target vehicle.
In addition, the method acquires information of other vehicles between the starting end and the tail end and marks this information in the live-action map, which provides the user with reference objects and makes it convenient to check the position of the target vehicle.
In addition, the method marks the current position of the target vehicle in the live-action map by marking the identification information and/or the real-time distance of the target vehicle, where the real-time distance is the real-time distance between the target vehicle and the user, so the user can conveniently and accurately check the position of the target vehicle.
FIG. 4 is a flowchart of a method for locating a vehicle according to an embodiment of the present invention; the method may be performed by a terminal device. In other words, the method may be performed by software or hardware installed in the terminal device. As shown, the method may include the following steps.
S402: and determining the live-action information in the target range according to the current position of the user.
S404: a target route is determined from a current location of a target vehicle to a current location of the user.
S406: and drawing a live-action map according to the live-action information in the target range and the target route, wherein the live-action map is drawn by taking the current position of the user as a visual angle.
Steps S402-S406 may be similar to steps S102-S106 of the embodiment of FIG. 1 and are not repeated here.
S408: Mark the current position of the target vehicle at the tail end if the current position of the target vehicle is not within the target range.
The live-action map is marked with a starting end and a tail end, and at least one preset unit distance is marked between the starting end and the tail end.
In one implementation, the starting end and the tail end of the live-action map are determined by two-way confirmation: the tail end is chosen so that it can be seen in the live-action view from the starting end, and, conversely, the starting end can be seen in the live-action view from the tail end. The starting end and the tail end are marked on the live-action map, and the distance between them is output as the farthest distance the user can recognize through the live-action map from the starting point.
In addition, to help the user judge distances, at least one preset unit distance, such as the unit distance N, may be marked between the starting end and the tail end as a scale. In one implementation, since a vehicle body plus the gap to the vehicle ahead occupies at least 7 meters of a lane, the road live-action distance N can be marked in the live-action map using 7 meters as the scale.
In this embodiment, because the target vehicle is far from the user, it may be beyond the user's observation range; in that case the current position of the target vehicle is marked at the tail end of the target range, such as the dot mark at the tail end of the target range in FIG. 2. The dot mark shown in the figure merely distinguishes it from the star mark of the previous embodiment and does not imply that embodiments of the invention require distinct marks for the target vehicle; in a specific implementation, a person skilled in the art may choose whether or not to use a distinct mark for the target vehicle according to actual needs.
In one implementation, the identification information of the vehicle, such as one or more of the license plate number, vehicle type, and color, may also be marked. In one implementation, the real-time distance between the target vehicle and the user may also be marked.
When the current position of the target vehicle is beyond the user's observation range, its current position can be marked at the tail end of the target range, so that the user can conveniently learn the direction and approximate position of the target vehicle.
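A final illustrative fragment, under the same assumptions as the sketches above, shows the clamping this paragraph describes; marker_distance and the example numbers are hypothetical.

    def marker_distance(vehicle_dist_m: float, tail_end_m: float) -> float:
        """Where to draw the target-vehicle mark: at its true distance when
        it lies within the target range, otherwise pinned at the tail end so
        the user still sees its direction and approximate position (S408)."""
        return min(vehicle_dist_m, tail_end_m)

    print(marker_distance(42.0, 60.0))   # in range: drawn at its true 42 m
    print(marker_distance(350.0, 60.0))  # out of range: pinned at the tail end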
In addition, following the description of step S310 in the previous embodiment, information of other vehicles between the starting end and the tail end may also be acquired and marked in the live-action map, so that the user can use the other vehicles as reference objects and find the target vehicle more efficiently. The specific implementation is similar to the corresponding steps in the embodiment of FIG. 3 and is not repeated here.
Therefore, the method for positioning a vehicle provided by this embodiment of the invention determines live-action information within the target range according to the user's current position; determines a target route from the current position of the target vehicle to the current position of the user; draws a live-action map according to the live-action information within the target range and the target route, from the perspective of the user's current position; and displays the current position of the target vehicle in the live-action map. By presenting the position of the target vehicle on a live-action map, the target vehicle can be located conveniently and quickly.
In addition, in the method for locating a vehicle provided by this embodiment, the current position of the target vehicle is marked at the tail end when it is not within the target range, so that the user can conveniently determine the direction and approximate position of the target vehicle even when it is far away.
FIG. 5 is a schematic structural diagram of a terminal device provided in an embodiment of the present invention, where the terminal device 500 includes: a processing module 510, a determining module 520, an obtaining module 530, and a display module 540.
The processing module 510 is configured to determine live-action information within the target range according to the current position of the user. The determining module 520 is configured to determine a target route from the current position of the target vehicle to the current position of the user. The obtaining module 530 is configured to draw a live-action map according to the live-action information within the target range and the target route, where the live-action map is drawn from the perspective of the user's current position. The display module 540 is configured to display the current position of the target vehicle in the live-action map.
In one implementation, the display module 540 is configured to mark the current position of the target vehicle in the live-action map, where a starting end and a tail end are marked in the live-action map and at least one preset unit distance is marked between the starting end and the tail end.
In one implementation, the display module 540 is configured to, after displaying the current position of the target vehicle in the live-action map, acquire information of other vehicles between the starting end and the tail end and mark the information of the other vehicles in the live-action map.
In one implementation, the display module 540 is configured to mark the current position of the target vehicle in the live-action map by marking the target vehicle identification information and/or a real-time distance, where the real-time distance is the real-time distance between the target vehicle and the user.
In one implementation, the display module 540 is configured to mark the current position of the target vehicle at the tail end if the current position of the target vehicle is not within the target range.
The terminal device 500 provided in the embodiment of the present invention may execute the method for positioning a vehicle described in the foregoing method embodiment, and implement the functions and beneficial effects of the methods described in the foregoing method embodiment, which are not described herein again.
FIG. 6 is a schematic diagram of the hardware structure of a terminal device for executing the method for locating a vehicle according to an embodiment of the invention. As shown, the terminal device 600 includes, but is not limited to: a radio frequency unit 601, a network module 602, an audio output unit 603, an input unit 604, a sensor 605, a display unit 606, a user input unit 607, an interface unit 608, a memory 609, a processor 610, and a power supply 611. Those skilled in the art will appreciate that the terminal device configuration shown in the figure is not limiting; the terminal device may include more or fewer components than shown, some components may be combined, or the components may be arranged differently. In embodiments of the present invention, the terminal device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted terminal device, a wearable device, a pedometer, and the like.
The processor 610 is configured to: determine live-action information within a target range according to the current position of the user; determine a target route from the current position of the target vehicle to the current position of the user; draw a live-action map according to the live-action information within the target range and the target route, where the live-action map is drawn from the perspective of the user's current position; and display the current position of the target vehicle in the live-action map.
The terminal device according to the embodiment of the present invention may execute the method for positioning a vehicle described in the foregoing method embodiment, and implement the functions and beneficial effects of the methods described in the foregoing method embodiment, which are not described herein again.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 601 may be used for receiving and sending signals during message transmission and reception or during a call; specifically, it receives downlink data from a base station and passes it to the processor 610 for processing, and it sends uplink data to the base station. In general, the radio frequency unit 601 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, a duplexer, and the like. Furthermore, the radio frequency unit 601 may also communicate with a network and other devices through a wireless communication system.
The terminal device provides the user with wireless broadband internet access through the network module 602, such as helping the user send and receive e-mails, browse webpages, access streaming media, and the like.
The audio output unit 603 may convert audio data received by the radio frequency unit 601 or the network module 602 or stored in the memory 609 into an audio signal and output as sound. Also, the audio output unit 603 can also provide audio output related to a specific function performed by the terminal apparatus 600 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 603 includes a speaker, a buzzer, a receiver, and the like.
The input unit 604 is used to receive audio or video signals. The input unit 604 may include a graphics processing unit (GPU) 6041 and a microphone 6042. The graphics processor 6041 processes image data of still pictures or video obtained by an image capturing apparatus (such as a camera) in a video capture mode or an image capture mode. The processed image frames may be displayed on the display unit 606, stored in the memory 609 (or another storage medium), or transmitted via the radio frequency unit 601 or the network module 602. The microphone 6042 can receive sound and process it into audio data; in the phone call mode, the processed audio data may be converted into a format transmittable to a mobile communication base station via the radio frequency unit 601 and then output.
The terminal device 600 further comprises at least one sensor 605, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the luminance of the display panel 6061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 6061 and/or the backlight when the terminal apparatus 600 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the terminal device posture (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration identification related functions (such as pedometer, tapping), and the like; the sensor 605 may further include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, etc., wherein the infrared sensor can measure a distance between an object and a terminal device by emitting and receiving infrared light, which is not described herein again. The pressure sensors may include 2 pressure sensors respectively disposed on the front screen and the back screen of the terminal device to respectively detect touch operations from the front screen and the back screen of the terminal device.
The display unit 606 is used to display information input by the user or information provided to the user. The display unit 606 may include a display panel 6061, and the display panel 6061 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or the like.
The user input unit 607 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the terminal device. Specifically, the user input unit 607 includes a touch panel 6071 and other input devices 6072. The touch panel 6071, also referred to as a touch screen, may collect touch operations by the user on or near it (for example, operations performed on or near the touch panel 6071 with a finger, a stylus, or any suitable object or accessory). The touch panel 6071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the position of the user's touch and the signal generated by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 610, and receives and executes commands from the processor 610. In addition, the touch panel 6071 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 6071, the user input unit 607 may include other input devices 6072, which may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, and a joystick; details are not repeated here.
Further, the touch panel 6071 can be overlaid on the display panel 6061, and when the touch panel 6071 detects a touch operation on or near the touch panel 6071, the touch operation is transmitted to the processor 610 to determine the type of the touch event, and then the processor 610 provides a corresponding visual output on the display panel 6061 according to the type of the touch event. Although in fig. 6, the touch panel 6071 and the display panel 6061 are two independent components to implement the input and output functions of the terminal device, in some embodiments, the touch panel 6071 and the display panel 6061 may be integrated to implement the input and output functions of the terminal device, and this is not limited here.
The interface unit 608 is an interface for connecting an external device to the terminal apparatus 600. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 608 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the terminal apparatus 600 or may be used to transmit data between the terminal apparatus 600 and an external device.
The memory 609 may be used to store software programs as well as various data. The memory 609 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 609 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The processor 610 is a control center of the terminal device, connects various parts of the entire terminal device by using various interfaces and lines, and performs various functions of the terminal device and processes data by running or executing software programs and/or modules stored in the memory 609 and calling data stored in the memory 609, thereby performing overall monitoring of the terminal device. Processor 610 may include one or more processing units; preferably, the processor 610 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 610.
The terminal device 600 may further include a power supply 611 (such as a battery) for supplying power to various components, and preferably, the power supply 611 may be logically connected to the processor 610 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system.
In addition, the terminal device 600 includes some functional modules that are not shown, and are not described in detail herein.
Optionally, an embodiment of the present invention further provides a terminal device, which includes a processor 610, a memory 609, and a computer program that is stored in the memory 609 and can be run on the processor 610, and when being executed by the processor 610, the computer program implements each process of the foregoing method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
An embodiment of the present invention also provides a computer-readable storage medium storing one or more programs that, when executed by a terminal device including a plurality of application programs, cause the terminal device to perform the following operations: determining the live-action information in the target range according to the current position of the user; determining a target route from a current location of a target vehicle to a current location of the user; drawing a live-action map according to the live-action information in the target range and the target route, wherein the live-action map is drawn by taking the current position of the user as a visual angle; displaying the current position of the target vehicle in the live-action map.
The computer-readable storage medium includes a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
When executed by a processor, the computer program implements the processes of the method described above and can achieve the same technical effects; to avoid repetition, details are not described here again.
Further, an embodiment of the present invention also provides a computer program product, the computer program product comprising a computer program stored on a non-transitory computer readable storage medium, the computer program comprising program instructions, which when executed by a computer, implement the following process: determining the live-action information in the target range according to the current position of the user; determining a target route from a current location of a target vehicle to a current location of the user; drawing a live-action map according to the live-action information in the target range and the target route, wherein the live-action map is drawn by taking the current position of the user as a visual angle; displaying the current position of the target vehicle in the live-action map.
When executed by a processor, the computer program implements the processes of the method described above and can achieve the same technical effects; to avoid repetition, details are not described here again.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such a process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "comprising a ..." does not exclude the presence of additional identical elements in the process, method, article, or apparatus that comprises the element.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The above description is only an example of the present invention, and is not intended to limit the present invention. Various modifications and alterations to this invention will become apparent to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present invention should be included in the scope of the claims of the present invention.

Claims (10)

1. A method of locating a vehicle, comprising:
determining live-action information within a target range according to a current position of a user;
determining a target route from a current position of a target vehicle to the current position of the user;
drawing a live-action map according to the live-action information within the target range and the target route, wherein the live-action map is drawn from the perspective of the current position of the user; and
displaying the current position of the target vehicle in the live-action map.
2. The method of claim 1, wherein said displaying the current position of the target vehicle in the live-action map comprises:
marking the current position of the target vehicle in the live-action map, wherein a starting end and a tail end are marked in the live-action map, and at least one preset unit distance is marked between the starting end and the tail end.
3. The method of claim 2, wherein after displaying the current position of the target vehicle in the live-action map, the method further comprises:
acquiring information of other vehicles between the starting end and the tail end;
and marking the information of the other vehicles in the live-action map.
4. The method of claim 2, wherein said marking the current position of the target vehicle in the live-action map comprises:
marking the current position of the target vehicle in the live-action map by marking the target vehicle identification information and/or a real-time distance, wherein the real-time distance is a real-time distance between the target vehicle and the user.
5. The method of claim 2, wherein said displaying the current position of the target vehicle in the live-action map comprises:
marking the current position of the target vehicle at the tail end if the current position of the target vehicle is not within the target range.
6. A terminal device, comprising:
a processing module, configured to determine live-action information within a target range according to a current position of a user;
a determination module, configured to determine a target route from a current position of a target vehicle to the current position of the user;
an obtaining module, configured to draw a live-action map according to the live-action information within the target range and the target route, wherein the live-action map is drawn from the perspective of the current position of the user; and
a display module, configured to display the current position of the target vehicle in the live-action map.
7. The terminal device of claim 6, wherein the display module is configured to: mark the current position of the target vehicle in the live-action map, wherein a starting end and a tail end are marked in the live-action map, and at least one preset unit distance is marked between the starting end and the tail end.
8. The terminal device of claim 7, wherein the display module is configured to: after the current position of the target vehicle is displayed in the live-action map, acquire information of other vehicles between the starting end and the tail end; and mark the information of the other vehicles in the live-action map.
9. The terminal device of claim 7, wherein the display module is configured to: mark the current position of the target vehicle in the live-action map by marking the target vehicle identification information and/or a real-time distance, wherein the real-time distance is a real-time distance between the target vehicle and the user.
10. The terminal device of claim 7, wherein the display module is configured to: mark the current position of the target vehicle at the tail end if the current position of the target vehicle is not within the target range.
CN202010010221.2A 2020-01-06 2020-01-06 Method for positioning vehicle and terminal equipment Pending CN111238495A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202010010221.2A CN111238495A (en) 2020-01-06 2020-01-06 Method for positioning vehicle and terminal equipment
PCT/CN2020/141166 WO2021139574A1 (en) 2020-01-06 2020-12-30 Vehicle locating method and terminal device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010010221.2A CN111238495A (en) 2020-01-06 2020-01-06 Method for positioning vehicle and terminal equipment

Publications (1)

Publication Number Publication Date
CN111238495A true CN111238495A (en) 2020-06-05

Family

ID=70869251

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010010221.2A Pending CN111238495A (en) 2020-01-06 2020-01-06 Method for positioning vehicle and terminal equipment

Country Status (2)

Country Link
CN (1) CN111238495A (en)
WO (1) WO2021139574A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021139574A1 (en) * 2020-01-06 2021-07-15 维沃移动通信有限公司 Vehicle locating method and terminal device

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103606301A (en) * 2013-11-27 2014-02-26 南通芯迎设计服务有限公司 Parking and vehicle locating method based on intelligent terminal
US9146125B2 (en) * 2012-06-05 2015-09-29 Apple Inc. Navigation application with adaptive display of graphical directional indicators
CN106658415A (en) * 2017-02-21 2017-05-10 上海量明科技发展有限公司 Method for seeking shared vehicle in realistic view, vehicle-booking and system
CN107084740A (en) * 2017-03-27 2017-08-22 宇龙计算机通信科技(深圳)有限公司 A kind of air navigation aid and device
CN107452220A (en) * 2016-05-30 2017-12-08 长城汽车股份有限公司 A kind of car-mounted terminal and intelligent vehicle-tracing system
CN207909351U (en) * 2018-03-16 2018-09-25 上海芭比信息技术服务有限公司 A kind of guiding of parking stall and reverse vehicle searching system
CN108627159A (en) * 2017-03-16 2018-10-09 北京嘀嘀无限科技发展有限公司 A kind of method and device for assisting user's positioning vehicle
CN109547925A (en) * 2018-12-07 2019-03-29 纳恩博(北京)科技有限公司 Location updating method, the display methods of position and navigation routine, vehicle and system
CN109872556A (en) * 2018-12-27 2019-06-11 福建农林大学 Vehicle system is sought in a kind of parking garage based on augmented reality

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103220415B (en) * 2013-03-28 2015-01-07 东软集团(上海)有限公司 One-to-one cellphone live-action position trailing method and system
US10332292B1 (en) * 2017-01-17 2019-06-25 Zoox, Inc. Vision augmentation for supplementing a person's view
CN111238495A (en) * 2020-01-06 2020-06-05 维沃移动通信有限公司 Method for positioning vehicle and terminal equipment

Also Published As

Publication number Publication date
WO2021139574A1 (en) 2021-07-15

Similar Documents

Publication Publication Date Title
CN109164477B (en) Application positioning method and mobile terminal
CN109862504B (en) Display method and terminal equipment
CN107826109B (en) Lane keeping method and apparatus
US10636228B2 (en) Method, device, and system for processing vehicle diagnosis and information
CN108051010B (en) Method for determining time of arrival at destination and mobile terminal
CN109993234B (en) Unmanned driving training data classification method and device and electronic equipment
CN110456395B (en) Positioning method and terminal equipment
CN108551525B (en) State determination method of movement track and mobile terminal
CN111399792B (en) Content sharing method and electronic equipment
CN111107219B (en) Control method and electronic equipment
CN110536236B (en) Communication method, terminal equipment and network equipment
CN109784234B (en) Right-angled bend identification method based on forward fisheye lens and vehicle-mounted equipment
CN109618055B (en) Position sharing method and mobile terminal
CN111196281A (en) Page layout control method and device for vehicle display interface
CN112927539B (en) Mapping method and device for automatic parking
CN108196663B (en) Face recognition method and mobile terminal
CN109618278B (en) Positioning method and mobile terminal
CN110148167B (en) Distance measuring method and terminal equipment
CN111238495A (en) Method for positioning vehicle and terminal equipment
CN111148180A (en) Method for connecting base station and terminal equipment
CN110824516B (en) Positioning method and electronic equipment
CN114598992A (en) Information interaction method, device, equipment and computer readable storage medium
CN110046569B (en) Unmanned driving data processing method and device and electronic equipment
CN110440825B (en) Distance display method and terminal
CN110995816B (en) Sharing method and electronic device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
Application publication date: 20200605