WO2022206179A1 - Positioning method and apparatus - Google Patents

Positioning method and apparatus

Info

Publication number
WO2022206179A1
Authority
WO
WIPO (PCT)
Prior art keywords
server
terminal device
target
lane
camera
Prior art date
Application number
PCT/CN2022/075723
Other languages
English (en)
French (fr)
Inventor
龙祁峰
Original Assignee
荣耀终端有限公司
Application filed by 荣耀终端有限公司
Publication of WO2022206179A1

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/09 Arrangements for giving variable traffic instructions
    • G08G1/0962 Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0968 Systems involving transmission of navigation instructions to the vehicle
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/3407 Route searching; Route guidance specially adapted for specific applications
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42 Determining position
    • G01S19/45 Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/01 Detecting movement of traffic to be counted or controlled
    • G08G1/017 Detecting movement of traffic to be counted or controlled identifying vehicles
    • G08G1/0175 Detecting movement of traffic to be counted or controlled identifying vehicles by photographing vehicles, e.g. when violating traffic rules

Definitions

  • the present application relates to the technical field of terminals, and in particular, to a positioning method and apparatus.
  • For example, an overpass may contain multiple layers of pavement.
  • The navigation cannot distinguish which layer of the overpass the current user is located on, and thus cannot provide an accurate navigation route according to the correct pavement layer; in particular, when the user drives the wrong way relative to the navigation route, the navigation software cannot identify in time which layer of the overpass the user is currently on, and thus cannot update to the correct navigation route in time, which causes great trouble for the user when driving on the overpass.
  • RFID (radio frequency identification)
  • RFID tags can be laid on the road surface, and based on the signals transmitted by the RFID received when the vehicle passes the road surface, the road surface layer where the vehicle is located can be positioned and identified.
  • The vehicle may not be able to recognize the signal sent by an RFID tag on the road when the vehicle is far away from the tag, so the road surface layer where the vehicle is located on the overpass cannot be precisely positioned. In addition, RFID tags laid on the road surface of an overpass are easily crushed by vehicles, and laying them requires breaking up the existing road surface, which involves a large amount of engineering work.
  • Embodiments of the present application provide a positioning method and apparatus, which can accurately determine which lane of multiple lanes a vehicle is located in, and then a terminal device that receives information about the lane can navigate based on the accurate lane.
  • an embodiment of the present application provides a positioning method, which is applied to a positioning system.
  • The positioning system includes a terminal device and a first server. The terminal device sends, to the first server, the identifier of the target object that needs to be navigated, the starting position, and the destination; the first server sends the first navigation route to the terminal device according to the starting position and the destination; in the process of the terminal device traveling according to the first navigation route, the terminal device reports the location information of the terminal device to the first server; when the location information reflects that the terminal device is about to enter the intersection, the first server obtains the target lane where the terminal device is located. Multiple cameras are set at the intersection and are used to photograph objects in different lanes of the intersection; the target lane is determined by the first server based on the content captured by the cameras, or the target lane is determined by the first server based on the information received from the second server. The first server sends indication information for indicating the target lane to the terminal device, and the terminal device prompts the user with the target lane according to the indication information. In this way, the terminal device that receives the lane information can navigate based on the accurate lane.
  • the lane can be understood as pavement layer or road layer;
  • the first server can be a navigation server;
  • the second server can be a traffic platform server;
  • the intersection can be understood as the intersection of multi-layer roads, or the intersection of multiple roads;
  • the first navigation route may be understood as a navigation route obtained based on the GPS positioning of the terminal device.
  • the terminal device can be a mobile phone or a vehicle.
  • The first server obtains the target lane where the terminal device is located, including: the first server photographs objects in multiple lanes of the intersection based on the multiple cameras and obtains multiple first association relationships, where any one of the first association relationships includes an image and the identification of the camera that captured the image; when the first server recognizes the identification of the target object in one of the multiple images, the first server determines the target camera corresponding to the target image that includes the identification of the target object; the first server determines, according to the second association relationship, the target lane where the target camera is located; the second association relationship includes the correspondence between the cameras and the lanes.
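  • To make the two association relationships above concrete, the following is a minimal sketch assuming a simple in-memory lookup; the names FirstAssociation, camera_to_lane, and find_target_lane are hypothetical illustrations, not structures defined by this application.
```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class FirstAssociation:
    image_id: str      # an image captured at the intersection
    camera_id: str     # identification of the camera that captured the image
    plate_number: str  # identifier recognized in the image (e.g. a license plate)

# Second association relationship: correspondence between cameras and lanes.
camera_to_lane = {
    "cam-108": "lane-1",
    "cam-110": "lane-2",
    "cam-112": "lane-3",
}

def find_target_lane(associations: List[FirstAssociation],
                     target_plate: str) -> Optional[str]:
    """Return the lane of the camera whose image contains the target identifier."""
    for assoc in associations:
        if assoc.plate_number == target_plate:
            # Target camera found; map it to a lane via the second association.
            return camera_to_lane.get(assoc.camera_id)
    return None  # the target object was not seen by any camera here

# Example: images reported by three cameras; the target plate appears in cam-110's image.
reports = [
    FirstAssociation("img-1", "cam-108", "A12345"),
    FirstAssociation("img-2", "cam-110", "B67890"),
    FirstAssociation("img-3", "cam-112", "C24680"),
]
print(find_target_lane(reports, "B67890"))  # -> lane-2
```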
  • the first server can accurately determine which lane the user is in based on the corresponding relationship between the camera and the lane, and then the terminal device can navigate based on the accurate lane sent by the first server.
  • the object in the lane can be the license plate of the vehicle in the lane;
  • the image can be the license plate photo containing the license plate information;
  • the identification of the camera can be the camera number;
  • the identification corresponding to the target can be the license plate number;
  • Acquiring the target lane where the terminal device is located by the first server includes: the first server sends a query request to the second server, where the query request includes the identifier of the target object and any one of the following: the location information of the target object or the identification of the intersection; the first server receives the identification of the target lane from the second server.
  • the second server saves the correspondence between the camera and the lane
  • The second server can accurately determine which lane the user is in based on the correspondence between the camera and the lane and send the lane information to the first server, and then the terminal device may navigate based on the accurate lane sent by the first server.
  • Acquiring the target lane where the terminal device is located by the first server includes: the first server sends a query request to the second server, where the query request includes the identifier of the target object and any one of the following: the location information of the target object or the identification of the intersection; the first server receives the identification of the target camera from the second server, where the target camera is the camera that captured the target object; the first server determines, according to the second association relationship, the target lane where the target camera is located; the second association relationship includes the correspondence between the cameras and the lanes.
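  • The two query variants above (the second server returning a lane identifier directly, or returning only a camera identifier that the first server maps itself) can be sketched as follows; the message fields and the second_server_query stub are assumptions for illustration, not a message format defined by this application.
```python
from typing import Optional

camera_to_lane = {"cam-108": "lane-1", "cam-110": "lane-2", "cam-112": "lane-3"}

def second_server_query(request: dict) -> dict:
    """Stand-in for the traffic platform (second) server. It may answer with the
    lane directly, or only with the camera that captured the target object."""
    return {"camera_id": "cam-110"}     # variant: camera identifier
    # return {"lane_id": "lane-2"}      # variant: lane identifier

def resolve_target_lane(target_id: str, intersection_id: str) -> Optional[str]:
    request = {
        "target_id": target_id,               # e.g. a license plate number
        "intersection_id": intersection_id,   # or the target's location information
    }
    response = second_server_query(request)
    if "lane_id" in response:                 # second server holds camera->lane
        return response["lane_id"]
    if "camera_id" in response:               # first server applies its own
        return camera_to_lane.get(response["camera_id"])   # second association
    return None

print(resolve_target_lane("B67890", "overpass-100"))  # -> lane-2
```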
  • When the first server stores the correspondence between the camera and the lane, the first server can accurately determine which lane the user is in based on that correspondence, and then the terminal device can navigate based on the accurate lane sent by the first server.
  • the method further includes: when the target lane is different from the lane indicated in the first navigation route, the first server sends the second navigation route to the terminal device according to the target lane and the destination. In this way, the terminal device can provide the user with a more accurate navigation route according to the different lanes determined in different scenarios.
  • the second navigation route may be a navigation route corresponding to the target lane obtained based on the correspondence between the camera and the lane.
  • The method further includes: when the first server receives the location information from the terminal device within the first time period, the first server continues to navigate the terminal device according to the second navigation route within the first time period; when the first server receives the location information from the terminal device after the first time period, the first server navigates the terminal device according to the location information of the terminal device received after the first time period. In this way, more accurate lane information can be obtained under different time conditions, and then the navigation software can provide the user with a more accurate navigation route based on the accurate lane information.
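  • As a rough illustration of this time-window rule, the sketch below keeps the camera-based route while the lane fix is still within an assumed first time period and otherwise replans from the newest reported position; the 30-second window and the helper names are assumptions, not values from this application.
```python
import time

FIRST_TIME_PERIOD_S = 30.0  # assumed length of the "first time period"

def choose_route(lane_fix_time: float, report_time: float,
                 second_route: str, replan) -> str:
    """Keep the camera-based (second) route while the lane fix is fresh;
    otherwise replan from the newly reported position."""
    if report_time - lane_fix_time <= FIRST_TIME_PERIOD_S:
        return second_route          # report received within the first time period
    return replan(report_time)       # report received after the first time period

fix_time = time.time()
print(choose_route(fix_time, fix_time + 10, "route-2", lambda t: "route-from-gps"))  # route-2
print(choose_route(fix_time, fix_time + 60, "route-2", lambda t: "route-from-gps"))  # route-from-gps
```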
  • The method further includes: the first server sets a first weight for the lane indicated in the first navigation route, and sets a second weight for the target lane according to the environment information; when the environment information indicates that the environment is not conducive to image recognition, the first weight is greater than the second weight; when the environment information indicates that the environment does not affect image recognition, the first weight is less than the second weight; when the target lane is different from the lane indicated in the first navigation route, the first server sends the second navigation route to the terminal device according to the lane with the greater weight between the target lane and the lane indicated in the first navigation route, and the destination. In this way, by setting different weights for the lanes determined by different devices, more accurate lane information can be obtained according to the weights under different conditions, and then the terminal device can provide the user with an accurate navigation route based on the accurate lane information.
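  • A minimal sketch of this weighting rule follows; the weight values and the is_bad_for_image_recognition check are illustrative assumptions, not values or functions from this application.
```python
def is_bad_for_image_recognition(environment: dict) -> bool:
    # e.g. a thunderstorm or haze day, or an environment with low visibility
    return environment.get("weather") in {"thunderstorm", "haze"} or \
           environment.get("visibility_m", 10_000) < 200

def pick_lane(route_lane: str, target_lane: str, environment: dict) -> str:
    """route_lane: lane indicated in the first navigation route (first weight);
    target_lane: camera-derived target lane (second weight)."""
    if is_bad_for_image_recognition(environment):
        first_weight, second_weight = 0.7, 0.3   # trust the route-indicated lane more
    else:
        first_weight, second_weight = 0.3, 0.7   # trust the camera-derived lane more
    return route_lane if first_weight > second_weight else target_lane

print(pick_lane("lane-3", "lane-2", {"weather": "clear", "visibility_m": 5000}))  # lane-2
print(pick_lane("lane-3", "lane-2", {"weather": "haze"}))                         # lane-3
```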
  • The environment that affects image recognition may be bad weather such as a thunderstorm or haze, or an environment with low visibility.
  • The method further includes: when the target lane is different from the lane indicated in the first navigation route, and the distance between the target lane and the lane indicated in the first navigation route is greater than the distance threshold, the first server continues to navigate the terminal device according to the first navigation route. In this way, by judging the distance between the lanes obtained in different scenarios, more accurate lane information can be obtained, and then the navigation software can provide the user with an accurate navigation route based on the accurate lane information.
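  • The distance-threshold check can be sketched as below; the 50 m threshold is an assumption, and the branch for lanes within the threshold follows the earlier item about sending the second navigation route when the target lane differs.
```python
DISTANCE_THRESHOLD_M = 50.0  # assumed threshold

def route_after_mismatch(route_lane: str, target_lane: str,
                         lane_distance_m: float,
                         first_route: str, second_route: str) -> str:
    if target_lane == route_lane:
        return first_route            # no mismatch: keep the current route
    if lane_distance_m > DISTANCE_THRESHOLD_M:
        # Lanes are far apart, so the camera-based fix is treated as unreliable:
        # continue navigating along the first navigation route.
        return first_route
    return second_route               # nearby lanes: switch to the re-planned route

print(route_after_mismatch("lane-3", "lane-2", 8.0, "route-1", "route-2"))   # route-2
print(route_after_mismatch("lane-3", "lane-2", 80.0, "route-1", "route-2"))  # route-1
```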
  • The first navigation route can be used by the terminal device for navigation.
  • the identifier of the target object is a license plate number
  • the terminal device is a mobile phone or a vehicle.
  • an embodiment of the present application provides a positioning method.
  • The method includes: a first server receives, from a terminal device, the identifier of a target object that needs to be navigated, a starting position, and a destination; the first server sends the first navigation route to the terminal device according to the starting position and the destination; the first server receives the location information of the terminal device in the process of traveling along the first navigation route; when the location information reflects that the terminal device is about to enter the intersection, the first server obtains the target lane where the terminal device is located. Multiple cameras are set at the intersection and are used to photograph objects in different lanes at the intersection; the target lane is determined by the first server based on the content captured by the cameras, or the target lane is determined by the first server according to the information received from the second server; the first server sends indication information for indicating the target lane to the terminal device.
  • the terminal device can accurately determine which lane the user is in based on the corresponding relationship between the camera and the lane, and then the terminal device that receives the lane information can navigate based on the accurate lane.
  • The first server obtains the target lane where the terminal device is located, including: the first server photographs objects in multiple lanes of the intersection based on the multiple cameras and obtains multiple first association relationships, where any one of the first association relationships includes an image and the identification of the camera that captured the image; when the first server recognizes the identification of the target object in one of the multiple images, the first server determines the target camera corresponding to the target image that includes the identification of the target object; the first server determines, according to the second association relationship, the target lane where the target camera is located; the second association relationship includes the correspondence between the cameras and the lanes.
  • the first server can accurately determine which lane the user is in based on the corresponding relationship between the camera and the lane, and then the terminal device can navigate based on the accurate lane sent by the first server.
  • Acquiring the target lane where the terminal device is located by the first server includes: the first server sends a query request to the second server, where the query request includes the identifier of the target object and any one of the following: the location information of the target object or the identification of the intersection; the first server receives the identification of the target lane from the second server.
  • the second server saves the correspondence between the camera and the lane
  • The second server can accurately determine which lane the user is in based on the correspondence between the camera and the lane and send the lane information to the first server, and then the terminal device may navigate based on the accurate lane sent by the first server.
  • Acquiring the target lane where the terminal device is located by the first server includes: the first server sends a query request to the second server, where the query request includes the identifier of the target object and any one of the following: the location information of the target object or the identification of the intersection; the first server receives the identification of the target camera from the second server, where the target camera is the camera that captured the target object; the first server determines, according to the second association relationship, the target lane where the target camera is located; the second association relationship includes the correspondence between the cameras and the lanes.
  • When the first server stores the correspondence between the camera and the lane, the first server can accurately determine which lane the user is in based on that correspondence, and then the terminal device can navigate based on the accurate lane sent by the first server.
  • the method further includes: when the target lane is different from the lane indicated in the first navigation route, the first server sends the second navigation route to the terminal device according to the target lane and the destination. In this way, the terminal device can provide the user with a more accurate navigation route according to the different lanes determined in different scenarios.
  • The method further includes: when the first server receives the location information from the terminal device within the first time period, the first server continues to navigate the terminal device according to the second navigation route within the first time period; when the first server receives the location information from the terminal device after the first time period, the first server navigates the terminal device according to the location information of the terminal device received after the first time period. In this way, more accurate lane information can be obtained under different time conditions, and then the navigation software can provide the user with an accurate navigation route based on the accurate lane information.
  • The method further includes: the first server sets a first weight for the lane indicated in the first navigation route, and sets a second weight for the target lane according to the environment information; when the environment information indicates that the environment is not conducive to image recognition, the first weight is greater than the second weight; when the environment information indicates that the environment does not affect image recognition, the first weight is less than the second weight; when the target lane is different from the lane indicated in the first navigation route, the first server sends the second navigation route to the terminal device according to the lane with the greater weight between the target lane and the lane indicated in the first navigation route, and the destination. In this way, by setting different weights for the lanes determined by different devices, more accurate lane information can be obtained according to the weights under different conditions, and then the terminal device can provide the user with an accurate navigation route based on the accurate lane information.
  • The method further includes: when the target lane is different from the lane indicated in the first navigation route, and the distance between the target lane and the lane indicated in the first navigation route is greater than the distance threshold, the first server continues to navigate the terminal device according to the first navigation route. In this way, by judging the distance between the lanes obtained in different scenarios, more accurate lane information can be obtained, and then the navigation software can provide the user with an accurate navigation route based on the accurate lane information.
  • the identifier of the target object is a license plate number
  • the terminal device is a mobile phone or a vehicle.
  • an embodiment of the present application provides a positioning method.
  • The method includes: a terminal device sends, to a first server, the identifier of a target object that needs to be navigated, a starting position, and a destination; the terminal device receives a first navigation route from the first server, where the first navigation route is related to the starting position and the destination; in the process of the terminal device traveling according to the first navigation route, the terminal device reports the location information of the terminal device to the first server; when the location information reflects that the terminal device is about to enter the intersection, the terminal device sends prompt information to the first server, where the prompt information is used to indicate that the terminal device is about to enter the intersection; the terminal device receives, from the first server, indication information for indicating the target lane; and the terminal device prompts the user with the target lane according to the indication information.
  • the terminal device can accurately determine which lane the user is in based on the corresponding relationship between the camera and the lane, and then the terminal device that receives the lane information can navigate based on the accurate lane.
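  • The terminal-side steps can be sketched as follows, with the transport to the first server replaced by hypothetical send() and receive() helpers; none of these function names or message fields come from this application.
```python
def send(server: str, message: dict) -> None:
    # Placeholder transport to the first server.
    print(f"-> {server}: {message}")

def receive(server: str) -> dict:
    # Placeholder for indication information pushed by the first server.
    return {"type": "lane_indication", "target_lane": "lane-2"}

def terminal_device_flow(plate: str, start: str, destination: str) -> None:
    # 1. Request a route, carrying the identifier of the target object (license plate).
    send("first-server", {"type": "route_request", "plate": plate,
                          "start": start, "destination": destination})
    # 2. While driving along the first navigation route, report the current position.
    send("first-server", {"type": "position_report", "position": (31.23, 121.47)})
    # 3. When the position shows the intersection is near, send prompt information.
    send("first-server", {"type": "prompt", "event": "entering_intersection"})
    # 4. Receive the indication information and prompt the user with the target lane.
    indication = receive("first-server")
    print(f"You are in {indication['target_lane']}; the route is updated accordingly.")

terminal_device_flow("B67890", "start-position", "destination")
```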
  • an embodiment of the present application provides a positioning device, which is applied to a positioning system.
  • The positioning system includes a terminal device and a first server, and the device includes: a communication unit, configured to send, to the first server, the identifier of a target object that needs to be navigated, a starting position, and a destination; the communication unit is also used to send the first navigation route to the terminal device according to the starting position and the destination; during the process of the terminal device traveling according to the first navigation route, the communication unit is also used to report the location information of the terminal device to the first server; when the location information reflects that the terminal device is about to enter the intersection, a processing unit is used to obtain the target lane where the terminal device is located. Multiple cameras are set at the intersection and are used to photograph objects in different lanes at the intersection; the target lane is determined by the first server based on the content captured by the cameras, or the target lane is determined by the first server based on information received from the second server; the communication unit is also used to send indication information for indicating the target lane to the terminal device; the processing unit is further configured to prompt the user with the target lane according to the indication information.
  • The processing unit is specifically configured to photograph objects in multiple lanes of the intersection based on the multiple cameras and obtain multiple first association relationships, where any first association relationship includes an image and the identification of the camera that captured the image; the processing unit is also specifically configured to, when the identification of the target object is recognized in the multiple images, determine the target camera corresponding to the target image that includes the identification of the target object; the processing unit is also specifically configured to determine, according to the second association relationship, the target lane where the target camera is located; the second association relationship includes the correspondence between the cameras and the lanes.
  • The communication unit is specifically configured to send a query request to the second server, where the query request includes the identifier of the target object and any one of the following: the location information of the target object or the identification of the intersection; the communication unit is also specifically configured to receive the identification of the target lane from the second server.
  • The communication unit is specifically configured to send a query request to the second server, where the query request includes the identifier of the target object and any one of the following: the location information of the target object or the identification of the intersection; the communication unit is also specifically configured to receive the identification of the target camera from the second server, where the target camera is the camera that captured the target object; the processing unit is specifically configured to determine, according to the second association relationship, the target lane where the target camera is located; the second association relationship includes the correspondence between the cameras and the lanes.
  • the communication unit is further configured to send the second navigation route to the terminal device according to the target lane and the destination.
  • When the first server receives the location information from the terminal device within the first time period, the processing unit is specifically configured to continue navigating the terminal device according to the second navigation route within the first time period; when the first server receives the location information from the terminal device after the first time period, the processing unit is further specifically configured to navigate the terminal device according to the location information of the terminal device received after the first time period.
  • The processing unit is further configured to set a first weight for the lane indicated in the first navigation route and set a second weight for the target lane according to the environment information; when the environment information indicates that the environment is not conducive to image recognition, the first weight is greater than the second weight; when the environment information indicates that the environment does not affect image recognition, the first weight is less than the second weight; when the target lane is different from the lane indicated in the first navigation route, the processing unit is also used to send the second navigation route to the terminal device according to the lane with the greater weight between the target lane and the lane indicated in the first navigation route, and the destination.
  • The processing unit is further configured to continue navigating the terminal device according to the second navigation route.
  • the identifier of the target object is a license plate number
  • the terminal device is a mobile phone or a vehicle.
  • An embodiment of the present application provides a positioning apparatus, the apparatus including: a communication unit, configured to receive, from a terminal device, the identifier of a target object that needs to be navigated, a starting position, and a destination; the communication unit is also used to send the first navigation route to the terminal device according to the starting position and the destination; the communication unit is also used to receive the location information of the terminal device in the process of traveling along the first navigation route; when the location information reflects that the terminal device is about to enter the intersection, a processing unit is used to obtain the target lane where the terminal device is located. Multiple cameras are arranged at the intersection and are used to photograph objects in different lanes at the intersection; the target lane is determined by the first server based on the content captured by the cameras, or the target lane is determined by the first server according to the information received from the second server; the communication unit is further configured to send indication information for indicating the target lane to the terminal device.
  • The processing unit is specifically configured to photograph objects in multiple lanes of the intersection based on the multiple cameras and obtain multiple first association relationships, where any first association relationship includes an image and the identification of the camera that captured the image; when the first server recognizes the identification of the target object in the multiple images, the processing unit is also specifically configured to determine the target camera corresponding to the target image that includes the identification of the target object; the processing unit is also specifically configured to determine, according to the second association relationship, the target lane where the target camera is located; the second association relationship includes the correspondence between the cameras and the lanes.
  • The communication unit is specifically configured to send a query request to the second server, where the query request includes the identifier of the target object and any one of the following: the location information of the target object or the identification of the intersection; the communication unit is specifically configured to receive the identification of the target lane from the second server.
  • The communication unit is specifically configured to send a query request to the second server, where the query request includes the identifier of the target object and any one of the following: the location information of the target object or the identification of the intersection; the communication unit is also specifically configured to receive the identification of the target camera from the second server, where the target camera is the camera that captured the target object; the processing unit is specifically configured to determine, according to the second association relationship, the target lane where the target camera is located; the second association relationship includes the correspondence between the cameras and the lanes.
  • the communication unit is further configured to send the second navigation route to the terminal device according to the target lane and the destination.
  • When the first server receives the location information from the terminal device within the first time period, the processing unit is specifically configured to continue navigating the terminal device according to the second navigation route within the first time period; when the first server receives the location information from the terminal device after the first time period, the processing unit is further specifically configured to navigate the terminal device according to the location information of the terminal device received after the first time period.
  • The processing unit is further configured to set a first weight for the lane indicated in the first navigation route and set a second weight for the target lane according to the environment information; when the environment information indicates that the environment is not conducive to image recognition, the first weight is greater than the second weight; when the environment information indicates that the environment does not affect image recognition, the first weight is less than the second weight; when the target lane is different from the lane indicated in the first navigation route, the communication unit is also used to send the second navigation route to the terminal device according to the lane with the greater weight between the target lane and the lane indicated in the first navigation route, and the destination.
  • The processing unit is further configured to continue navigating the terminal device according to the second navigation route.
  • the identifier of the target object is a license plate number
  • the terminal device is a mobile phone or a vehicle.
  • An embodiment of the present application provides a positioning device, the device including: a communication unit, configured to send, to a first server, the identifier of a target object that needs to be navigated, a starting position, and a destination; the communication unit is further configured to receive the first navigation route from the first server, where the first navigation route is related to the starting position and the destination; in the process of the terminal device traveling according to the first navigation route, the communication unit is also used to report the location information of the terminal device to the first server; when the location information reflects that the terminal device is about to enter the intersection, the communication unit is also used to send prompt information to the first server, where the prompt information is used to indicate that the terminal device is about to enter the intersection; the communication unit is also used to receive, from the first server, indication information for indicating the target lane; and a processing unit is configured to prompt the user with the target lane according to the indication information.
  • An embodiment of the present application provides a positioning apparatus, including a processor and a memory, where the memory is used for storing code instructions and the processor is used for running the code instructions, so that the electronic device executes the positioning method described in the first aspect or any implementation manner of the first aspect, the positioning method described in the second aspect or any implementation manner of the second aspect, or the positioning method described in the third aspect or any implementation manner of the third aspect.
  • An embodiment of the present application provides a computer-readable storage medium, where the computer-readable storage medium stores instructions, and when the instructions are executed, a computer is caused to execute the positioning method described in the first aspect or any implementation manner of the first aspect, the positioning method described in the second aspect or any implementation manner of the second aspect, or the positioning method described in the third aspect or any implementation manner of the third aspect.
  • In a ninth aspect, an embodiment of the present application provides a computer program product, comprising a computer program that, when executed, causes a computer to perform the positioning method described in the first aspect or any implementation manner of the first aspect, the positioning method described in the second aspect or any implementation manner of the second aspect, or the positioning method described in the third aspect or any implementation manner of the third aspect.
  • FIG. 1 is a schematic diagram of a scenario provided by an embodiment of the present application.
  • FIG. 2 is a schematic frame diagram of a terminal device 200 provided by an embodiment of the present application.
  • FIG. 3 is a schematic frame diagram of a navigation system 300 provided by an embodiment of the present application.
  • FIG. 4 is a schematic diagram of a scenario based on navigation server positioning provided by an embodiment of the present application.
  • FIG. 5 is a schematic diagram of a scenario based on positioning of a navigation server and a transportation platform server according to an embodiment of the present application.
  • FIG. 6 is a schematic flowchart of a navigation server-based positioning provided by an embodiment of the present application.
  • FIG. 7 is a schematic diagram of an interface for inputting license plate information according to an embodiment of the present application.
  • FIG. 8 is a schematic diagram of an interface for displaying a pavement layer according to an embodiment of the present application.
  • FIG. 9 is another schematic diagram of an interface for displaying a pavement layer provided by an embodiment of the present application.
  • FIG. 10 is a schematic diagram of a user reporting interface provided by an embodiment of the present application.
  • FIG. 11 is a schematic flowchart of positioning based on a navigation server and a transportation platform server according to an embodiment of the present application.
  • FIG. 12 is a schematic structural diagram of a positioning device according to an embodiment of the present application.
  • FIG. 13 is a schematic diagram of a hardware structure of a control device provided by an embodiment of the application.
  • FIG. 14 is a schematic structural diagram of a chip according to an embodiment of the present application.
  • words such as “first” and “second” are used to distinguish the same or similar items with basically the same function and effect.
  • the first value and the second value are only used to distinguish different values, and do not limit their order.
  • the words “first”, “second” and the like do not limit the quantity and execution order, and the words “first”, “second” and the like are not necessarily different.
  • "At least one" means one or more;
  • "plural" means two or more.
  • "And/or" describes the association relationship of associated objects and indicates that three kinds of relationships can exist; for example, A and/or B can indicate: A exists alone, both A and B exist, or B exists alone, where A and B can be singular or plural.
  • the character “/” generally indicates that the associated objects are an “or” relationship.
  • "At least one of the following items" or similar expressions refer to any combination of these items, including any combination of a single item or plural items.
  • "At least one of a, b, or c" can represent: a, b, c, a-b, a-c, b-c, or a-b-c, where a, b, and c may each be single or multiple.
  • With the development of urban transportation, the number of overpasses and tunnels is increasing, and their construction is becoming more and more complex; for example, an overpass may contain multiple layers of pavement. While the overpass brings convenience to traffic, it also brings challenges to road navigation. Normally, when a user drives onto an overpass using navigation, since the overpass has multiple layers of road at the same position, the navigation cannot distinguish which layer of the overpass the user is currently on. It may happen that the user has driven onto the wrong road layer, but the navigation still indicates the route as if the user were on the correct road layer.
  • the overpass in the embodiment of the present application can also be replaced with a road including a main road and a side road.
  • the main road and the side road can be on the same layer or on different layers.
  • In this case, navigation cannot distinguish between the main road and the side road, resulting in the inability to provide accurate navigation for users.
  • the pavement layer may also be referred to as a road layer.
  • The pavement layer can be used to represent pavements of different layers in a multi-layer pavement; or, the pavement layer can also represent different roads within the same layer of pavement, for example, the main road and the auxiliary road among adjacent roads can be represented by different pavement layers.
  • The overpass and the pavement layer are used as examples in the following description, and the description does not constitute a specific limitation on the scenario.
  • FIG. 1 is a schematic diagram of a scenario provided by an embodiment of the present application.
  • the scene includes an overpass 100 .
  • The overpass 100 includes a plurality of roads, e.g., road 101, road 102, and road 103.
  • the road 101 may be a road on the first floor for entering the overpass;
  • the road 102 may be a road on the second floor;
  • the road 103 may be a road on the third floor.
  • the vehicle 104 may enter the overpass 100 along the road 101, the vehicle 104 may enter the road 102 in the direction indicated by arrow 2, or the vehicle 104 may enter the road 103 in the direction indicated by arrow 1.
  • the vehicle 104 may travel based on the route indicated by the navigation 106 .
  • the navigation 106 may be: vehicle navigation or mobile phone navigation.
  • the overpass 100 may include a plurality of cameras, such as a camera 112 arranged on a special camera fixing pole, a camera 108 arranged on a street lamp, a camera 110 arranged on a billboard, or a camera 111 arranged under the bridge. It can be understood that the camera may be set at other positions according to the actual scene, which is not limited in this embodiment of the present application.
  • the user drives the vehicle 104 along the road 101 into the overpass 100 according to the route indicated by the navigation 106 .
  • The user should drive the vehicle 104 on the road 103 according to the route indicated by the navigation 106, but the user instead drives the vehicle 104 onto the road 102.
  • Ideally, the navigation 106 would discover in time that the user's driving route is wrong, and then the navigation 106 could re-plan the route for the user according to the road 102.
  • However, because the navigation 106 cannot distinguish which layer of the overpass the user is currently on, it cannot recognize that the user has taken the wrong route: the roads of different layers are at almost the same position in GPS positioning, so the navigation cannot recognize that the user is not driving on the road 103 indicated by the navigation, and thus cannot provide the user with an accurate route based on the correct road surface.
  • an RFID-based intelligent navigation method for an overpass is provided in the prior art.
  • An RFID tag can be set on the overpass, and when the vehicle determines that the current road is an overpass road (or another road with multiple road layers), a radio frequency antenna on the vehicle can receive the signal sent by the RFID tag set on the overpass; the signal can be a radio frequency signal carrying the current road number, so that the vehicle can accurately know the road layer where it is located and navigate based on the acquired GPS positioning signal, the road number, and the driving route.
  • The above methods have the following problems. First, when GPS is used for positioning, since the positioning accuracy of GPS positioning signals is about ten meters, inaccurate navigation or even navigation errors will occur on dense overpasses. Second, the method of setting the radio frequency antenna at the bottom of the vehicle and setting the RFID tag under the road surface requires laying the RFID on the road surface of the overpass; in this case, the RFID is not only easy to be crushed by passing vehicles, but a large amount of engineering work is also required to lay it. Third, the identification range of RFID is limited; when the vehicle is far away from the RFID tag on the road, the vehicle may not be able to recognize the signal sent by the tag.
  • The embodiment of the present application provides a positioning method, which can make full use of the cameras arranged on a multi-layer road and accurately determine which pavement layer of the multi-layer road the user is located on, based on the correspondence between the cameras on the multi-layer road and the pavement layers, and then the terminal device that receives the pavement layer information can navigate based on the accurate pavement layer.
  • the terminal device may be a vehicle with navigation capability or a device such as a mobile phone.
  • the above-mentioned terminal device may also be referred to as a terminal (terminal), user equipment (user equipment, UE), a mobile station (mobile station, MS), a mobile terminal (mobile terminal, MT), and the like.
  • The terminal device can be a mobile phone, a smart TV, a wearable device, a tablet computer (Pad), a computer with a wireless transceiver function, a virtual reality (VR) terminal device, an augmented reality (AR) terminal device, a wireless terminal in industrial control, a wireless terminal in self-driving, a wireless terminal in remote medical surgery, a wireless terminal in a smart grid, a wireless terminal in transportation safety, a wireless terminal in a smart city, a wireless terminal in a smart home, and so on.
  • the embodiments of the present application do not limit the specific technology and specific device form adopted by the terminal device.
  • FIG. 2 is a schematic structural diagram of a terminal device 200 according to an embodiment of the present application.
  • the terminal device 200 includes a GPS positioning module 180L, and the positioning module 180L corresponds to the navigation software in the terminal device.
  • the GPS positioning module 180L can locate the current position of the terminal device, and the navigation software can present the positioning result to the user.
  • the GPS positioning module and navigation software may be a vehicle-mounted GPS positioning module and navigation software, or may be a user mobile terminal GPS positioning module and navigation software.
  • the navigation software may include: Baidu navigation software or AutoNavi navigation software, etc.
  • the terminal device 200 may include a processor 110, an external memory interface 120, an internal memory 121, a power management module 141, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, a sensor module 180, a key 190, a camera 193 and a display screen 194, etc.
  • the sensor module 180 may include: a pressure sensor 180A, an acceleration sensor 180E, a fingerprint sensor 180H, a touch sensor 180K, a positioning module 180L, and the like.
  • The structures illustrated in the embodiments of the present application do not constitute a specific limitation on the terminal device 200. It can be understood that the terminal device 200 may include more or fewer components than shown, or some components may be combined, some components may be split, or the components may be arranged differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
  • The processor 110 may include one or more processing units; for example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent devices, or may be integrated in one or more processors.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in processor 110 is cache memory. This memory may hold instructions or data that have just been used or recycled by the processor 110 .
  • the processor 110 may include one or more interfaces.
  • The interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, and/or a universal serial bus (USB) interface, etc.
  • the power management module 141 receives input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, the internal memory 121, the display screen 194, the camera 193, and the wireless communication module 160.
  • the wireless communication function of the terminal device 200 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modulation and demodulation processor, the baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Antennas in terminal device 200 may be used to cover single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • the antenna 1 can be multiplexed as a diversity antenna of the wireless local area network.
  • the mobile communication module 150 may provide a wireless communication solution including 2G/3G/4G/5G and the like applied on the terminal device 200 .
  • The wireless communication module 160 can provide wireless communication solutions applied on the terminal device 200, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and other wireless communication solutions.
  • the antenna 1 of the terminal device 200 is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the terminal device 200 can communicate with the network and other devices through wireless communication technology.
  • Wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc.
  • the terminal device 200 realizes the display function through the display screen 194 .
  • Display screen 194 is used to display images, videos, and the like.
  • Display screen 194 includes a display panel.
  • the terminal device 200 may include one or N display screens 194 , where N is a positive integer greater than one.
  • the terminal device 200 can realize the shooting function through the camera 193 and the like.
  • Camera 193 is used to capture still images or video.
  • the terminal device 200 may include 1 or N cameras 193 , where N is a positive integer greater than 1.
  • the external memory interface 120 can be used to connect an external memory card to expand the storage capacity of the terminal device 200 .
  • the external memory card communicates with the processor 110 through the external memory interface 120 to realize the data storage function.
  • Internal memory 121 may be used to store computer executable program code, which includes instructions.
  • the internal memory 121 may include a storage program area and a storage data area.
  • the storage program area can store an operating system, an application program required for at least one function (such as a sound playback function, an image playback function, etc.), and the like.
  • the pressure sensor 180A is used to sense pressure signals, and can convert the pressure signals into electrical signals.
  • the pressure sensor 180A may be provided on the display screen 194 .
  • the acceleration sensor 180E can detect the magnitude of the acceleration of the terminal device 200 in various directions (generally three axes).
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the terminal device 200 can use the collected fingerprint characteristics to realize fingerprint unlocking, accessing application locks, taking pictures with fingerprints, answering incoming calls with fingerprints, and the like.
  • The touch sensor 180K is also called a "touch device".
  • the touch sensor 180K may be disposed on the display screen 194 , and the touch sensor 180K and the display screen 194 form a touch screen, also called a “touch screen”.
  • the touch sensor 180K is used to detect a touch operation on or near it.
  • The keys 190 include volume keys and the like. The keys 190 may be mechanical keys or touch keys.
  • the terminal device 200 may receive key input and generate key signal input related to user settings and function control of the terminal device 200 .
  • the sensor module 180 may also include a positioning module 180L.
  • the positioning module may be positioned based on the GPS system, or may be positioned based on the Beidou system or other positioning systems.
  • the positioning module 180L can be used to estimate the geographic location of the terminal device 200 .
  • FIG. 3 provides a schematic structural diagram of a navigation system 300 according to an embodiment of the present application.
  • the navigation system 300 may include: a camera 301 and a navigation server 302 .
  • the navigation system 300 may also include devices such as the transportation platform server 303 .
  • the camera 301 may be used to take pictures of the vehicle. Specifically, the camera 301 can photograph a vehicle driving on a multi-layer road, and can also perform image processing on the photo of the vehicle, thereby identifying the license plate information of the vehicle, and uploading the license plate information to the server. Optionally, when the camera 301 does not have the image processing function, the camera 301 can also upload the photograph of the vehicle to the server, and then the server performs image processing on the photograph of the vehicle to identify the license plate information.
  • the camera 301 may be a camera set on the road for capturing and detecting violations, and the number of the cameras may be one or more.
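  • As a rough sketch of the two camera behaviours above (on-camera plate recognition versus uploading the raw photo for server-side recognition), consider the following; recognize_plate() and upload() are stubs, since the patent does not specify a recognition algorithm or an upload protocol.
```python
from typing import Optional

def recognize_plate(photo: bytes) -> Optional[str]:
    # Placeholder for on-camera image processing; a real camera would run OCR here.
    return "B67890"

def upload(server: str, payload: dict) -> None:
    print(f"upload to {server}: {sorted(payload.keys())}")

def camera_report(camera_id: str, photo: bytes, has_image_processing: bool) -> None:
    if has_image_processing:
        plate = recognize_plate(photo)
        if plate is not None:
            # The camera uploads only the recognized license plate information.
            upload("server", {"camera_id": camera_id, "plate": plate})
            return
    # Otherwise the raw photo is uploaded and the server performs the recognition.
    upload("server", {"camera_id": camera_id, "photo": photo})

camera_report("cam-110", b"...jpeg bytes...", has_image_processing=True)
camera_report("cam-111", b"...jpeg bytes...", has_image_processing=False)
```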
  • the navigation server 302 may be used to implement functions such as storage, processing, reception and transmission of navigation-related data.
  • the navigation server 302 may be a server belonging to a navigation software company, such as Baidu or AutoNavi.
  • the navigation server 302 may store the correspondence between the camera numbers in the multi-layer road and the pavement layers, and determine the pavement layer where the vehicle is located according to the correspondence between the camera numbers and the pavement layers.
  • the transportation platform server 303 can be used to collect and store the photos taken by the camera 301 , for example, the transportation platform server 303 can be a server belonging to the transportation management department. Specifically, the traffic platform server 303 may also store the correspondence between the camera numbers in the multi-layer road and the pavement layers, and determine the pavement layer where the vehicle is located according to the correspondence between the camera numbers and the pavement layers.
  • navigation system 300 may include other contents according to the actual scene, which is not limited in this embodiment of the present application.
  • the positioning method provided in this embodiment of the present application may be applied to various scenarios.
  • The multiple scenarios may include scenario 1: a scenario where positioning is implemented based on the navigation server 302 (as shown in FIG. 4); and scenario 2: a scenario where positioning is implemented based on the navigation server 302 and the transportation platform server 303 (as shown in FIG. 5); and so on.
  • Scenario 1: a scenario in which positioning is implemented based on the navigation server 302.
  • FIG. 4 is a schematic diagram of a scenario based on navigation server positioning provided by an embodiment of the present application.
  • the scene may include: multi-layer pavement, such as pavement layer 1, pavement layer 2, and pavement layer 3.
  • the scene may further include: the vehicle 401 , the GPS positioning module 402 of the vehicle 401 , the navigation server 302 , the collection module 403 , the collection module 404 , the collection module 405 , and the like.
  • the vehicle 401 includes a license plate, and the license plate is used to identify the vehicle 401 .
  • the collection module 403 collects the license plate information in the pavement layer 1, the collection module 404 collects the license plate information in the pavement layer 2, and the collection module 405 collects the license plate information in the pavement layer 3.
  • the GPS positioning module 402 of the vehicle 401 may recognize that the vehicle will enter the multi-layer road and upload the location information of the vehicle 401 to the navigation server 302; alternatively, the vehicle 401 can report the location information of the vehicle 401 to the navigation server, and the navigation server can recognize that the vehicle will enter a multi-layer road.
  • the vehicle 401 continues to drive.
  • the collection module 404 can take a photo of the vehicle 401 and identify the license plate information corresponding to the vehicle 401.
  • the collection module 404 can upload the number corresponding to the collection module 404 and the license plate information corresponding to the vehicle 401 to the navigation server 302.
  • the navigation server 302 can determine the road surface layer on which the vehicle 401 is located according to the number corresponding to the acquisition module 404, such as road surface layer 2, and send the information of the road surface layer to the navigation software corresponding to the vehicle 401; the navigation software can then update the navigation route according to the accurate road surface layer, for example, the navigation software indicates that the vehicle 401 can drive in the direction indicated by arrow 4 or in the direction indicated by arrow 5.
  • Scenario 2: a scenario in which positioning is implemented based on the navigation server 302 and the transportation platform server 303.
  • FIG. 5 is a schematic diagram of a scenario based on positioning of a navigation server and a transportation platform server according to an embodiment of the present application.
  • the scene may include: multi-layer pavement, such as pavement layer 1, pavement layer 2, and pavement layer 3.
  • the scene may further include: a vehicle 401 , a GPS positioning module 402 of the vehicle 401 , a navigation server 302 , a transportation platform server 303 , a collection module 403 , a collection module 404 , and a collection module 405 , and the like.
  • the vehicle 401 includes a license plate.
  • the collection module 403 collects the license plate information in the pavement layer 1
  • the collection module 404 collects the license plate information in the pavement layer 2
  • the collection module 405 collects the license plate information in the pavement layer 3.
  • the GPS positioning module 402 of the vehicle 401 can recognize that the vehicle is about to enter the multi-layer road, or the vehicle 401 can report the location information of the vehicle 401 to the navigation server.
  • the navigation server can recognize that the vehicle will enter a multi-layer road, and trigger the navigation server 302 to send a query request to the transportation platform server 303 .
  • the transportation platform server 303 receives the query request, obtains the license plate sequence collected by the acquisition modules of the multi-layer road within a period of time (the license plate sequence is the set of collected license plates), and obtains the number of the collection module that captured the license plate of the vehicle 401, such as the collection module 404.
  • the traffic platform server 303 can determine the pavement layer where the vehicle 401 is located according to the number corresponding to the acquisition module 404, such as pavement layer 2, and send the information of the pavement layer to the navigation software corresponding to the vehicle 401; the navigation software can then update the navigation route according to the information of the road surface layer, for example, the vehicle 401 can drive in the direction indicated by arrow 4 or in the direction indicated by arrow 5.
  • FIG. 6 is a schematic flowchart of a navigation server-based positioning provided by an embodiment of the present application.
  • the collection module is taken as the camera for illustration.
  • the acquisition module 403 in FIG. 4 can also be understood as the camera 403
  • the acquisition module 404 in FIG. 4 can also be understood as the camera 404
  • the acquisition module 405 in FIG. 4 can also be understood as the camera 405,
  • the vehicle 401 in FIG. 4 can also be understood as the terminal 200.
  • the method based on navigation server positioning may include the following steps:
  • the terminal 200 acquires the license plate information, and sends the license plate information to the navigation server 302 .
  • the navigation server 302 may receive the license plate information sent by the terminal 200 .
  • the license plate information may be a license plate number, etc.; the terminal 200 may be a vehicle, or a device such as a mobile phone.
  • the terminal 200 includes navigation software.
  • the method for the terminal 200 to acquire the license plate information may be as follows: the terminal 200 acquires the license plate information entered by the user into the navigation software in the terminal 200 .
  • FIG. 7 is a schematic diagram of an interface for inputting license plate information provided by an embodiment of the present application.
  • the terminal 200 is a mobile phone as an example for illustration, and this example does not constitute a limitation to the embodiment of the present application.
  • the mobile phone may display an interface as shown in a in FIG. 7 , and the interface may include a license plate setting control 701 for inputting license plate information and the like.
  • the navigation software can jump from the interface shown in a in FIG. 7 to the interface shown in b in FIG. 7 .
  • the user can input the license plate information in the "Please fill in your license plate" field 702.
  • the mobile phone may receive the license plate information input by the user, and send the license plate information to the navigation server 302 .
  • the license plate information may be, for example, a license plate number such as A 12345 with a province prefix.
  • the terminal 200 sends the GPS positioning information to the navigation server 302 .
  • the navigation server 302 may receive the GPS positioning information sent by the terminal 200 .
  • the GPS positioning information may be used to identify the location where the terminal 200 is located, or may be used to determine the location of the multi-layer road where the terminal 200 is located, and the GPS positioning information may be generated by a GPS positioning module in the terminal 200 .
  • the terminal 200 can send the GPS positioning information acquired in real time to the navigation server 302; correspondingly, the navigation server 302 can locate the terminal 200 in real time, and then determine the location of the terminal 200.
  • the navigation server 302 may store the corresponding relationship between the license plate information obtained from S601 and the GPS positioning information obtained from S602.
  • the GPS positioning module in the terminal 200 can recognize that the vehicle has entered the multi-layer road, and send the location of the vehicle to the navigation server 302 in real time. At this time, the vehicle can continue to travel according to the route indicated by the navigation software in the terminal 200.
  • the navigation server 302 acquires the photo of the license plate captured by the camera, and the camera number corresponding to the photo of the license plate.
  • the license plate photo can be used to identify the vehicle.
  • the license plate photo may be a photo taken by a camera and containing license plate information.
  • the camera can be at least one camera in a multi-layer road.
  • the navigation server 302 may obtain license plate photos from multiple cameras (for example, in the scene shown in FIG. 4 , the camera 403, the camera 404, and the camera 405), and the camera number corresponding to the license plate photo.
  • the camera in the embodiment of the present application may have image recognition capability, or may not have image recognition capability, wherein the image recognition capability is used to perform image recognition on the photographed license plate photos to recognize accurate license plate information.
  • the navigation server 302 may control the camera to take pictures of the license plate, or the navigation server 302 may obtain the required photos among the photos continuously captured by the camera and uploaded to the navigation server 302 .
  • the camera on the multi-layer road can take a plurality of license plate photos, identify the license plate information in the license plate photos based on the image processing module in the camera, and upload the license plate information and the camera number corresponding to the license plate information to the navigation server 302; the navigation server 302 can subsequently execute the step shown in S605.
  • the camera on the multi-layer road can take a plurality of license plate photos, and upload the license plate photos and the camera numbers corresponding to the license plate photos to the navigation server 302. In this case, the navigation server 302 has image recognition capability, and the navigation server 302 can subsequently perform the step shown in S604.
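The two upload paths above differ only in whether the camera or the navigation server performs image recognition. As a rough illustration only (the embodiment does not specify message formats; all field names here are hypothetical), a camera-side upload might look like the following sketch:

```python
# Minimal sketch (hypothetical message formats): what a camera might upload
# to the navigation server in the two cases described above.
import json
import time

def build_upload_with_recognition(camera_id: str, plate_number: str) -> str:
    """Camera has image recognition: upload the recognized plate and its own number (S605 path)."""
    return json.dumps({
        "camera_id": camera_id,          # e.g. "camera_404" (illustrative identifier)
        "plate_number": plate_number,    # recognized on the camera itself
        "timestamp": time.time(),
    })

def build_upload_without_recognition(camera_id: str, photo_bytes: bytes) -> dict:
    """Camera has no image recognition: upload the raw photo; the server runs S604."""
    return {
        "camera_id": camera_id,
        "photo": photo_bytes,            # the server performs image processing on this
        "timestamp": time.time(),
    }

if __name__ == "__main__":
    print(build_upload_with_recognition("camera_404", "A12345"))
```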
  • the navigation server 302 performs image processing on the license plate photo to obtain license plate information.
  • the navigation server 302 determines the road surface layer corresponding to the vehicle according to the license plate information and the camera number.
  • the navigation server 302 may store the correspondence between the camera number and the road surface layer.
  • the corresponding relationship between the camera and the pavement layer is shown in Table 1 below:
  • Table 1: the camera 403 corresponds to pavement layer 1; the camera 404 corresponds to pavement layer 2; the camera 405 corresponds to pavement layer 3.
  • the navigation server 302 may acquire the correspondences between multiple sets of license plate information and camera numbers. Further, according to the license plate information uploaded in the step shown in S601, such as A 12345, the navigation server 302 can determine, from the correspondences between the multiple sets of license plate information and camera numbers, the camera number corresponding to A 12345, such as the camera 404. Furthermore, as shown in Table 1, the navigation server can determine according to the camera 404 that the terminal 200 is located on pavement layer 2.
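As an illustration of the lookup described in S605, the following minimal sketch assumes the correspondences are held in in-memory dictionaries; the identifiers and data structures are placeholders, not part of the embodiment:

```python
# Minimal sketch of the S605 matching: plate -> camera number -> pavement layer.
from typing import Dict, Optional

# Table 1: correspondence between camera number and pavement layer
CAMERA_TO_LAYER: Dict[str, int] = {
    "camera_403": 1,
    "camera_404": 2,
    "camera_405": 3,
}

def locate_layer(reported_plate: str, plate_to_camera: Dict[str, str]) -> Optional[int]:
    """Return the pavement layer for the plate reported in S601, or None if not seen."""
    camera_id = plate_to_camera.get(reported_plate)   # which camera captured this plate
    if camera_id is None:
        return None
    return CAMERA_TO_LAYER.get(camera_id)

# Example: the cameras reported these (plate, camera) pairs
observations = {"A12345": "camera_404", "B67890": "camera_403"}
assert locate_layer("A12345", observations) == 2      # terminal 200 is on pavement layer 2
```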
  • the navigation server 302 sends the road surface layer information to the terminal 200 .
  • the terminal 200 may determine whether the road surface layer information sent by the navigation server 302 is received.
  • When the road surface layer information is received, the navigation software in the terminal 200 can display the pavement layer information.
  • the pavement layer information is used to indicate the level at which the pavement is located in the multi-layer pavement, and the pavement layer information may be in other forms such as pavement layer numbers, such as pavement layer 2.
  • FIG. 8 is a schematic diagram of an interface for displaying a pavement layer according to an embodiment of the present application.
  • the terminal 200 is a mobile phone as an example for illustration.
  • the navigation software in the mobile phone can display the interface shown in FIG. 8 .
  • the indication information 801 on the left half screen of the mobile phone can display the information that the current vehicle is located on the road surface layer 2, and the route corresponding to the road surface layer 2 can be displayed on the right half screen of the mobile phone.
  • When the road surface layer information is not received, the navigation software in the terminal 200 may continue to navigate according to the original navigation algorithm.
  • FIG. 9 is another schematic diagram of an interface for displaying a pavement layer provided by an embodiment of the present application.
  • the terminal 200 is a mobile phone as an example for illustration.
  • the navigation software in the mobile phone can display the interface of the original navigation route as shown in Figure 9, such as the navigation route indicated along the pavement layer 1.
  • the left half screen of the mobile phone can display the current driving direction and the description of the driving direction
  • the right half screen of the mobile phone can display the route corresponding to the road surface layer 1 indicated by the original navigation algorithm.
  • In this way, the navigation server can realize accurate positioning of the road surface layer according to the corresponding relationship between the camera and the road surface layer, thereby providing a more accurate navigation route for the user; moreover, the embodiment of the present application can make full use of the existing functions of the navigation server, so there is no need to build a new server, and the implementation cost is reduced.
  • the pavement layer may be updated based on the following methods, or the current pavement layer may be maintained.
  • the navigation software may update the pavement layer based on the weight of the pavement layer, or maintain the current pavement layer.
  • For example, the navigation server can set a higher weight for the road surface layer determined by the camera; when the GPS positioning information of the terminal indicates that the vehicle where the terminal is located is on another road surface layer, the navigation server may set a lower weight for the road surface layer indicated by the GPS positioning information.
  • When the navigation software receives the pavement layer determined based on the camera and then receives the pavement layer indicated by the GPS positioning information, because the weight of the pavement layer determined by the GPS positioning information is lower than the weight of the pavement layer determined by the camera, the navigation software may subsequently take the pavement layer information determined by the camera as the criterion and ignore the pavement layer information indicated by the GPS positioning information.
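A minimal sketch of this weight rule follows, with illustrative numeric weights (the embodiment only states that the camera-determined layer receives the higher weight):

```python
# Minimal sketch: keep the current layer unless a candidate with a strictly higher weight arrives.
CAMERA_WEIGHT = 0.8   # layer determined by the camera: higher weight (assumed value)
GPS_WEIGHT = 0.2      # layer indicated by GPS positioning: lower weight (assumed value)

def choose_layer(current_layer, candidate_layer, current_weight, candidate_weight):
    """Return the (layer, weight) pair that should be used after receiving a new report."""
    if candidate_weight > current_weight:
        return candidate_layer, candidate_weight
    return current_layer, current_weight

# Camera-based layer arrives first, then a conflicting GPS-based layer:
layer, weight = 2, CAMERA_WEIGHT                    # camera says layer 2
layer, weight = choose_layer(layer, 3, weight, GPS_WEIGHT)
assert layer == 2                                   # the GPS-indicated layer is ignored
```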
  • the navigation software may update the pavement layer based on time, or maintain the current pavement layer.
  • For example, the navigation software may take pavement layer 1 as the criterion within a certain time threshold, during which pavement layer 1 is not updated.
  • After the time threshold, the navigation software can re-request the information of the road surface layer on which the vehicle is located.
  • Optionally, the navigation server may set an effective time for a road surface layer when the road surface layer is obtained; for example, the effective time set for pavement layer 1 may be 1 minute.
  • When the navigation software of the terminal obtains the information, determined based on the camera, that the vehicle is currently located on pavement layer 1, the navigation software may not update pavement layer 1 within the 1-minute valid time of pavement layer 1.
  • the navigation software can take the new pavement layer information received as the criterion.
  • the navigation software may receive the road surface layer information indicated by the GPS positioning module, and may also receive the road surface layer information determined based on the camera. Due to transmission delay, the navigation software may first receive the pavement layer information indicated by the GPS positioning module, such as pavement layer 2, and update the original pavement layer to pavement layer 2; it may then receive the pavement layer information determined based on the camera. At this time, since the vehicle may have traveled a certain distance, the received camera-based road surface layer information may no longer be accurate enough; therefore, the road surface layer information determined by the camera can be discarded and the current road surface layer maintained.
  • In this way, by setting different times for the pavement layers determined by different devices, more accurate pavement layer information can be obtained for the corresponding period of time under different time conditions, and the navigation software can then provide the user with a more accurate navigation route based on this accurate pavement layer information.
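A minimal sketch of the validity-time rule described above, using the 1-minute example; the class and field names are illustrative assumptions:

```python
# Minimal sketch: a layer determined by the camera is kept for its valid time;
# later reports arriving within that window are ignored.
import time
from typing import Optional

class LayerState:
    def __init__(self):
        self.layer: Optional[int] = None
        self.expires_at: float = 0.0

    def update(self, layer: int, valid_seconds: float, now: Optional[float] = None) -> bool:
        """Accept the new layer only if the current one has expired; return True if updated."""
        now = time.time() if now is None else now
        if now < self.expires_at:
            return False                      # keep the current layer, discard the report
        self.layer = layer
        self.expires_at = now + valid_seconds
        return True

state = LayerState()
state.update(layer=1, valid_seconds=60, now=0)                     # camera-based layer 1, valid 1 minute
assert state.update(layer=2, valid_seconds=60, now=30) is False    # ignored within the window
assert state.update(layer=2, valid_seconds=60, now=90) is True     # accepted after expiry
```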
  • the navigation software may update the pavement layer based on the pavement layer reported by the user on the terminal.
  • FIG. 10 is a schematic diagram of a user reporting interface provided by an embodiment of the present application.
  • the terminal 200 is a mobile phone as an example for illustration. Because the user can determine which layer of the multi-layer road surface the driving vehicle is located on, when the GPS positioning module in the mobile phone detects that the vehicle has entered a multi-layer road, it can send prompt information to the user.
  • In the interface shown in a in FIG. 10, the indication information 1001 on the left half screen of the mobile phone can prompt the user to select the current road surface layer, and the right half screen of the mobile phone can display the navigation route corresponding to road surface layer 2 indicated by the navigation software before the road surface layer information reported by the user is received.
  • the navigation software in the mobile phone can switch from the interface shown in a in FIG. 10 to the interface shown in b in FIG. 10 .
  • the indication information 1002 on the left half screen of the mobile phone can display the information "The route corresponding to road surface layer 1 has been switched for you", and the right half screen of the mobile phone can display the navigation route corresponding to pavement layer 1 indicated by the navigation software after the reported road surface layer is received.
  • the navigation software can obtain the current more accurate road surface layer information based on the road surface layer information reported by the user in different scenarios, and then the navigation software can provide the user with a more accurate navigation route.
  • FIG. 11 is a schematic flowchart of positioning based on a navigation server and a transportation platform server according to an embodiment of the present application.
  • the acquisition module is used as a camera for illustration.
  • the acquisition module 403 in FIG. 5 can also be understood as the camera 403
  • the acquisition module 404 in FIG. 5 can also be understood as the camera 404
  • the acquisition module 405 in FIG. 5 can also be understood as the camera 405,
  • the vehicle 401 in FIG. 5 can also be understood as the terminal 200.
  • the method for positioning based on the navigation server and the traffic platform server may include the following steps:
  • the terminal 200 acquires the license plate information, and sends the license plate information to the navigation server 302 .
  • S1101 is similar to the steps shown in S601 in the embodiment corresponding to FIG. 6 , and details are not repeated here.
  • the terminal 200 sends the GPS positioning information to the navigation server 302 .
  • the navigation server 302 may determine the location of the terminal 200 or the location of the vehicle according to the positioning information, for example, on which overpass, tunnel or multi-layer road the vehicle is located.
  • the navigation software can also set the initial time based on the moment when the vehicle reaches a position N m away from the multi-layer road.
  • N can be a positive integer.
  • the navigation software in the terminal 200 can recognize through the GPS positioning module that the vehicle is only 100 meters away from the camera position of the multi-level road intersection, and set t0 at this time, and send a trigger signal to the navigation server 302 .
  • the navigation software in the terminal 200 may trigger the navigation server 302 to send a query request to the transportation platform server 303 .
  • the traffic platform server 303 receives the query request sent by the navigation server 302 .
  • the query request may include vehicle license plate information and/or GPS positioning information and the like.
  • the query request may also include the location information of the multi-layer pavement or the number of the multi-layer pavement (or the identifier of the overpass) and the like.
  • the navigation server 302 may determine the location of the multi-layer road surface or the number of the multi-layer road surface based on the GPS positioning information of the terminal 200 .
  • the query request may also include the camera number.
  • For example, the navigation server 302 can, according to the GPS positioning information, directionally query which camera in the multi-layer road captured the license plate photo or license plate sequence, and obtain the number of this camera.
  • the transportation platform server 303 retrieves, according to the location information of the multi-layer road, the license plate sequence A captured by all cameras at the position of the multi-layer road within (t0+3)s to (t0+9)s, compares the obtained license plate information with the license plate sequence A, and obtains the camera number or the road surface layer information corresponding to the license plate information.
  • the license plate sequence A may be a sequence corresponding to multiple pieces of license plate information obtained after image processing is performed on the license plate photos captured, within (t0+3)s to (t0+9)s, by all cameras at the position of the multi-layer road. For example, in the scenario corresponding to FIG. 5, the transportation platform server 303 may acquire the license plate sequence A captured by the camera 403, the camera 404, and the camera 405.
  • the method for determining the time range may be as follows: for example, the vehicle travels at a speed of 60 km/h (about 16.7 m/s); when the vehicle is at a position 100 m in front of the camera at time t0, the vehicle reaches the location of the overpass camera about 6 seconds later, at which point the camera completes the shooting. Because the vehicle speed may vary, some margin can be reserved before and after this moment to ensure that the camera can capture the vehicle in most cases, so the time range can be set to (t0+3)s to (t0+9)s. It can be understood that the time range may take other values according to the actual scenario, which is not limited in this embodiment of the present application.
  • the time range can also be (t0+4)s-(t0+8)s, etc.
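The arithmetic behind this window can be sketched as follows, assuming the 60 km/h, 100 m, and plus-or-minus 3 s figures from the example above:

```python
# Minimal sketch: compute the query window around the estimated arrival time at the camera.
def capture_window(t0: float, distance_m: float = 100.0,
                   speed_kmh: float = 60.0, margin_s: float = 3.0):
    speed_ms = speed_kmh / 3.6                 # 60 km/h is about 16.7 m/s
    eta = distance_m / speed_ms                # about 6 s to reach the camera
    return t0 + eta - margin_s, t0 + eta + margin_s

start, end = capture_window(t0=0.0)
print(round(start), round(end))                # -> 3 9, i.e. (t0+3)s to (t0+9)s
```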
  • the traffic platform server 303 or the navigation server 302 may acquire the road surface layer information of the terminal 200 .
  • when the traffic platform server 303 stores the correspondence between camera numbers and road surface layers, the method for obtaining the road surface layer information may be as follows: the traffic platform server 303 may compare the license plate sequence A with the license plate information obtained in S1104; if the license plate information is found in the license plate sequence A, the traffic platform server 303 can query the camera number corresponding to the license plate information, determine the road surface layer information corresponding to the camera number based on the correspondence between camera numbers and road surface layers stored by the traffic platform server 303, and subsequently perform the step shown in S1106 to send the pavement layer information to the navigation server 302.
  • Alternatively, when the navigation server 302 stores the correspondence between camera numbers and road surface layers, the method for obtaining the road surface layer information may be as follows: the traffic platform server 303 may compare the license plate sequence A with the license plate information obtained in S1104; if the license plate information is found in the license plate sequence A, the traffic platform server 303 can query the camera number corresponding to the license plate information, and then perform the step shown in S1106 to send the camera number to the navigation server 302.
  • the navigation server 302 can determine the road surface layer where the vehicle is located based on the correspondence between the camera number and the road surface layer, and can subsequently send the road surface layer information to the terminal 200 .
  • When the traffic platform server 303 compares the license plate sequence A with the license plate information obtained in S1104 and does not find the license plate information in the license plate sequence A, it can return empty information to the terminal 200.
  • the navigation software in the terminal 200 can continue to navigate according to the original navigation algorithm.
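A minimal sketch of the comparison performed by the traffic platform server in S1105 follows, assuming the license plate sequence A is represented as a list of (license plate, camera number) records and that the server holds the Table 1 correspondence; all names are illustrative:

```python
# Minimal sketch: look the queried plate up in sequence A and resolve its camera to a layer.
from typing import List, Optional, Tuple

CAMERA_TO_LAYER = {"camera_403": 1, "camera_404": 2, "camera_405": 3}

def match_plate(sequence_a: List[Tuple[str, str]], queried_plate: str) -> Optional[int]:
    """Return the pavement layer for the queried plate, or None (empty result) if absent."""
    for plate, camera_id in sequence_a:
        if plate == queried_plate:
            # Found: resolve the camera number to a pavement layer (S1106/S1107 path)
            return CAMERA_TO_LAYER.get(camera_id)
    return None                                  # not found: return empty information

sequence_a = [("B67890", "camera_403"), ("A12345", "camera_404")]
assert match_plate(sequence_a, "A12345") == 2
assert match_plate(sequence_a, "C00000") is None
```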
  • the traffic platform server 303 sends the camera number or road surface layer information to the navigation server 302 .
  • the navigation server 302 may receive the camera number or road surface layer information sent by the transportation platform server 303 .
  • when the navigation server 302 receives the camera number sent by the traffic platform server 303, the navigation server can determine the road surface layer information corresponding to the camera number based on the correspondence between camera numbers and road surface layers stored by itself, and can subsequently perform the step shown in S1107.
  • the traffic platform server 303 may determine the pavement layer information based on the correspondence between the camera number and the pavement layer stored by itself.
  • When the navigation server 302 receives the pavement layer information sent by the traffic platform server 303, it can subsequently execute the step shown in S1107.
  • the navigation server 302 sends the road surface layer information to the terminal 200 .
  • the terminal 200 receives the pavement layer information sent by the navigation server 302, and displays the pavement layer information in the navigation software.
  • In this way, the traffic platform server and the navigation server can realize precise positioning of the road layer according to the corresponding relationship between the camera and the road layer, thereby providing users with more accurate navigation routes; moreover, the existing functions of devices such as the navigation server or the transportation platform server can be fully utilized, so there is no need to build a new server, and the implementation cost is reduced.
  • the photo of the license plate taken by the camera may be blurred due to the weather, so that the camera cannot recognize the license plate information in the captured license plate photo, or an error occurs when the camera recognizes the license plate information in the captured license plate photo. Therefore, when the license plate information captured by the camera is inaccurate, the error can be corrected in the following ways.
  • the navigation software can perform error correction based on the weight of the road surface layer. For example, in a scene with bad weather, since the license plate information captured by the camera may be inaccurate, the navigation software can perform error correction based on the weights of the road surface layer determined by the camera and the road surface layer indicated by the GPS positioning information.
  • For example, the navigation server may set a lower weight for the road surface layer determined by the camera; when the GPS positioning information of the terminal indicates that the vehicle on which the terminal is located is on another road surface layer, the navigation server can set a higher weight for the road surface layer indicated by the GPS positioning information.
  • When the navigation software receives the pavement layer determined based on the camera and also receives the pavement layer indicated by the GPS positioning information, the navigation software can take the pavement layer information indicated by the GPS positioning information as the criterion and perform error correction on the pavement layer information determined based on the camera.
  • the navigation software can obtain more accurate pavement layer information according to the weight in different scenarios, and then the navigation software can provide the user with a more accurate navigation route based on the accurate pavement layer information.
  • error correction may be performed based on the distance between the road layers.
  • the navigation software can perform error correction based on the distance between the road surface layer determined by the camera and the road surface layer indicated by the GPS positioning information.
  • For example, when the road surface layer determined based on the camera is pavement layer 1 and the road surface layer indicated by the GPS positioning information is pavement layer 2, the navigation software can determine, based on the distance between pavement layer 1 and pavement layer 2, whether pavement layer 1 needs to be updated. For example, when the navigation software determines that the distance between pavement layer 1 and pavement layer 2 exceeds a certain distance threshold, it can be determined that pavement layer 1 is inaccurate; at this time, the navigation software can take pavement layer 2 indicated by the GPS positioning information as the criterion and correct the error of the original pavement layer 1. When the distance between pavement layer 1 and pavement layer 2 does not exceed the distance threshold, it can be determined that pavement layer 1 is accurate, and the navigation software may not correct the original pavement layer 1.
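One way to read the distance check above is sketched below, assuming the "distance between the pavement layers" is measured between the two reported positions and assuming an illustrative 50 m threshold (the embodiment does not specify the threshold value):

```python
# Minimal sketch: if the two reported positions are far apart, trust the GPS-indicated layer.
import math

def corrected_layer(camera_layer, gps_layer, camera_pos, gps_pos, threshold_m=50.0):
    """Return the layer to use after distance-based error correction."""
    distance = math.dist(camera_pos, gps_pos)    # distance between the two reported positions
    if distance > threshold_m:
        return gps_layer                         # camera result judged inaccurate
    return camera_layer                          # keep the camera-determined layer

# Camera says layer 1 at (0, 0); GPS places the vehicle 80 m away on layer 2:
assert corrected_layer(1, 2, (0.0, 0.0), (80.0, 0.0)) == 2
assert corrected_layer(1, 2, (0.0, 0.0), (10.0, 0.0)) == 1
```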
  • the navigation software can obtain more accurate pavement layer information according to the pavement layers determined by different devices and the distance between the pavement layers, and then the navigation software can be based on the accurate pavement layer information. Users provide more accurate navigation routes.
  • the navigation software can perform error correction based on the road surface layer input by the user.
  • For example, when the navigation software displays that the pavement layer determined based on the camera is pavement layer 2, the user can change the pavement layer information in the navigation software, thereby correcting the pavement layer determined based on the camera.
  • the navigation software can obtain relatively accurate pavement layers according to user input, and then the navigation software can provide users with more accurate navigation routes based on the accurate pavement layer information.
  • the interface diagram provided in the embodiment of the present application is only used as an example, and not as a limitation of the embodiment of the present application.
  • FIG. 12 is a schematic structural diagram of a positioning apparatus 120 provided by an embodiment of the present application.
  • the positioning apparatus 120 may be used in communication equipment, circuits, hardware components or chips, and the positioning apparatus includes: Processing unit 1201 and communication unit 1202.
  • the processing unit 1201 is configured to support the positioning device to perform the steps of information processing;
  • the communication unit 1202 is configured to support the positioning device to perform the steps of data transmission or reception.
  • the positioning apparatus 120 may be a positioning system, a terminal device, or a first server in this embodiment of the present application.
  • when the positioning apparatus 120 is a positioning system, an embodiment of the present application provides a positioning apparatus, which is applied to the positioning system.
  • the positioning system includes: a terminal device and a first server, and the apparatus includes: a communication unit 1202, configured to send, to the first server, the identification, starting position, and destination of the target object to be navigated; the communication unit 1202 is further configured to send the first navigation route to the terminal device according to the starting position and the destination;
  • in the process of the terminal device traveling according to the first navigation route, the communication unit 1202 is also used to report the location information of the terminal device to the first server; when the location information reflects that the terminal device is about to enter the intersection, the processing unit 1201 is used to obtain the target lane where the terminal device is located, wherein multiple cameras are set at the intersection and are used to photograph objects in different lanes at the intersection; the target lane is determined by the first server based on the content captured by the cameras, or the target lane is determined by the first server according to the information received from the second server; the communication unit 1202 is further configured to send, to the terminal device, indication information for indicating the target lane; and the processing unit 1201 is further configured to prompt, according to the indication information, the target lane in which the user is located.
  • the processing unit 1201 is specifically configured to photograph objects in multiple lanes of the intersection based on the multiple cameras to obtain multiple first association relationships, where any first association relationship includes an image and the identification of the camera that captured the image; the processing unit 1201 is also specifically configured to, when the identification of the target object is identified in the multiple images, determine the target camera corresponding to the target image that includes the identification of the target object; the processing unit 1201 is also specifically configured to determine, according to the second association relationship, the target lane where the target camera is located; the second association relationship includes the corresponding relationship between the camera and the lane.
  • the communication unit 1202 is specifically configured to send a query request to the second server, where the query request includes the identifier of the target object and any one of the following: the location information of the target object or the identifier of the intersection;
  • the communication unit 1202 is further specifically configured to receive the identification of the target lane from the second server.
  • the communication unit 1202 is specifically configured to send a query request to the second server, where the query request includes the identifier of the target object and any one of the following: the location information of the target object or the identifier of the intersection;
  • the communication unit 1202 is also specifically used to receive the identification of the target camera from the second server, and the target camera is the camera that captures the target object;
  • the communication unit 1202 is specifically used to determine the target lane where the target camera is located according to the second association relationship;
  • the second association relationship includes the corresponding relationship between the camera and the lane.
  • the communication unit 1202 is further configured to send the second navigation route to the terminal device according to the target lane and the destination.
  • when the first server receives the location information from the terminal device within the first time period, the processing unit 1201 is specifically configured to continue to navigate the terminal device according to the second navigation route within the first time period; when the first server receives the location information from the terminal device after the first time period, the processing unit 1201 is further specifically configured to navigate the terminal device according to the location information of the terminal device received after the first time period.
  • the processing unit 1201 is further configured to set a first weight for the lane indicated in the first navigation route, and set a second weight for the target lane according to the environment information; wherein, when the environment information indicates that the environment is not conducive to image recognition, the first weight is greater than the second weight; when the environment information indicates that the environment does not affect image recognition, the first weight is less than the second weight; when the target lane is different from the lane indicated in the first navigation route, the processing unit 1201 is also configured to send the second navigation route to the terminal device according to the lane with the greater weight between the target lane and the lane indicated in the first navigation route, and the destination.
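A minimal sketch of this environment-dependent weighting follows, with illustrative numeric weights (the embodiment only specifies which weight is greater):

```python
# Minimal sketch: pick the lane with the greater weight when the two lanes disagree.
def pick_lane(route_lane: str, target_lane: str, env_bad_for_recognition: bool) -> str:
    first_weight = 0.7 if env_bad_for_recognition else 0.3   # lane in the first navigation route
    second_weight = 0.3 if env_bad_for_recognition else 0.7  # camera-determined target lane
    if target_lane == route_lane:
        return route_lane
    return route_lane if first_weight > second_weight else target_lane

assert pick_lane("lane_1", "lane_2", env_bad_for_recognition=True) == "lane_1"
assert pick_lane("lane_1", "lane_2", env_bad_for_recognition=False) == "lane_2"
```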
  • when the target lane is different from the lane indicated in the first navigation route, and the distance between the target lane and the lane indicated in the first navigation route is greater than the distance threshold, the processing unit 1201 is further configured to continue to navigate the terminal device according to the second navigation route.
  • the identifier of the target object is a license plate number
  • the terminal device is a mobile phone or a vehicle.
  • when the positioning apparatus 120 is the first server, an embodiment of the present application provides a positioning apparatus, the apparatus includes: a communication unit 1202, configured to receive, from a terminal device, the identifier, starting position, and destination of the target object that needs to be navigated; the communication unit 1202 is further configured to send the first navigation route to the terminal device according to the starting position and the destination; the communication unit 1202 is further configured to receive the position information of the terminal device in the process of traveling along the first navigation route; when the location information reflects that the terminal device is about to enter the intersection, the processing unit 1201 is used to obtain the target lane where the terminal device is located; wherein, a plurality of cameras are set at the intersection, and the plurality of cameras are used to photograph objects in different lanes at the intersection.
  • the target lane is determined by the first server based on the content captured by the camera, or the target lane is determined by the first server according to the information received from the second server; the communication unit 1202 is also configured to send, to the terminal device, indication information for indicating the target lane.
  • the processing unit 1201 is specifically configured to photograph objects in multiple lanes of the intersection based on the multiple cameras to obtain multiple first association relationships, where any first association relationship includes an image and the identification of the camera that captured the image; the processing unit 1201 is also specifically configured to, when the identification of the target object is identified in the multiple images, determine the target camera corresponding to the target image that includes the identification of the target object; the processing unit 1201 is further specifically configured to determine, according to the second association relationship, the target lane where the target camera is located; the second association relationship includes the corresponding relationship between the camera and the lane.
  • the communication unit 1202 is specifically configured to send a query request to the second server, where the query request includes the identifier of the target object and any one of the following: the location information of the target object or the identifier of the intersection;
  • the communication unit 1202 is specifically configured to receive the identification of the target lane from the second server.
  • the communication unit 1202 is specifically configured to send a query request to the second server, where the query request includes the identifier of the target object and any one of the following: the location information of the target object or the identifier of the intersection;
  • the communication unit 1202 is also specifically configured to receive the identification of the target camera from the second server, and the target camera is the camera that captured the target object;
  • the processing unit 1201 is specifically configured to determine the target lane where the target camera is located according to the second association relationship;
  • the second association relationship includes the corresponding relationship between the camera and the lane.
  • the communication unit 1202 is further configured to send the second navigation route to the terminal device according to the target lane and the destination.
  • when the first server receives the location information from the terminal device within the first time period, the processing unit 1201 is specifically configured to continue to navigate the terminal device according to the second navigation route within the first time period; when the first server receives the location information from the terminal device after the first time period, the processing unit 1201 is further specifically configured to navigate the terminal device according to the location information of the terminal device received after the first time period.
  • the processing unit 1201 is further configured to set a first weight for the lane indicated in the first navigation route, and set a second weight for the target lane according to the environment information; wherein, when the environment information indicates that the environment is not conducive to image recognition, the first weight is greater than the second weight; when the environment information indicates that the environment does not affect image recognition, the first weight is less than the second weight; when the target lane is different from the lane indicated in the first navigation route, the communication unit 1202 is also configured to send the second navigation route to the terminal device according to the lane with the greater weight between the target lane and the lane indicated in the first navigation route, and the destination.
  • when the target lane is different from the lane indicated in the first navigation route, and the distance between the target lane and the lane indicated in the first navigation route is greater than the distance threshold, the processing unit 1201 is further configured to continue to navigate the terminal device according to the second navigation route.
  • the identifier of the target object is a license plate number
  • the terminal device is a mobile phone or a vehicle.
  • when the positioning apparatus 120 is a terminal device, an embodiment of the present application provides a positioning apparatus, the apparatus includes: a communication unit 1202, configured to send, to the first server, the identification, starting position, and destination of the target object that needs to be navigated; the communication unit 1202 is further configured to receive the first navigation route from the first server, where the first navigation route is related to the starting position and the destination; in the process of the terminal device traveling according to the first navigation route, the communication unit 1202 is also used to report the location information of the terminal device to the first server; when the location information reflects that the terminal device is about to enter the intersection, the communication unit 1202 is also used to send prompt information to the first server, where the prompt information is used to prompt that the terminal device is about to enter the intersection; the communication unit 1202 is further configured to receive, from the first server, indication information for indicating the target lane; and the processing unit 1201 is configured to prompt, according to the indication information, the target lane in which the user is located.
  • the positioning apparatus 120 of each of the above solutions has the function of implementing the corresponding steps performed by the positioning system, the first server or the terminal device in the above method.
  • the positioning device 120 may further include: a storage unit 1203 .
  • the processing unit 1201 and the storage unit 1203 are connected through a communication line.
  • the storage unit 1203 may include one or more memories, and the memories may be devices in one or more devices or circuits for storing programs or data.
  • the storage unit 1203 may exist independently, and is connected to the processing unit 1201 of the positioning device through a communication line.
  • the storage unit 1203 can also be integrated with the processing unit 1201 .
  • the communication unit 1202 may be an input or output interface, a pin or a circuit, or the like.
  • the storage unit 1203 may store computer-executable instructions of the methods in the foregoing embodiments, so that the processing unit 1201 executes the methods in the foregoing embodiments.
  • the storage unit 1203 may be a register, a cache or a RAM, etc., and the storage unit 1203 may be integrated with the processing unit 1201 .
  • the storage unit 1203 may be a ROM or other types of static storage devices that may store static information and instructions, and the storage unit 1203 may be independent of the processing unit 1201 .
  • FIG. 13 is a schematic diagram of the hardware structure of a control device provided by an embodiment of the present application.
  • the control device includes a processor 1301, a communication line 1304, and at least one communication interface (the communication interface 1303 in FIG. 13 is used as an example for description).
  • the processor 1301 may be a general-purpose central processing unit (CPU), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits for controlling the execution of the programs of the solutions of the present application.
  • Communication lines 1304 may include circuitry that communicates information between the aforementioned components.
  • the communication interface 1303 is configured to communicate with other devices or communication networks, such as Ethernet, wireless local area networks (WLAN), and the like.
  • the control device may also include a memory 1302.
  • the memory 1302 may be a read-only memory (ROM) or other type of static storage device that can store static information and instructions, a random access memory (RAM) or other type of dynamic storage device that can store information and instructions, an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disc storage (including compact discs, laser discs, optical discs, digital versatile discs, Blu-ray discs, etc.), a magnetic disk storage medium or other magnetic storage device, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer, without limitation.
  • the memory may exist independently and be connected to the processor through communication line 1304 .
  • the memory can also be integrated with the processor.
  • the memory 1302 is used for storing computer-executed instructions for executing the solution of the present application, and the execution is controlled by the processor 1301 .
  • the processor 1301 is configured to execute the computer-executed instructions stored in the memory 1302, thereby implementing the positioning method provided by the embodiments of the present application.
  • the computer-executed instructions in the embodiments of the present application may also be referred to as application code, which is not specifically limited in the embodiments of the present application.
  • the processor 1301 may include one or more CPUs, such as CPU0 and CPU1 in FIG. 13 .
  • the control device may include multiple processors, such as the processor 1301 and the processor 1305 in FIG. 13.
  • Each of these processors can be a single-core (single-CPU) processor or a multi-core (multi-CPU) processor.
  • a processor herein may refer to one or more devices, circuits, and/or processing cores for processing data (eg, computer program instructions).
  • FIG. 14 is a schematic structural diagram of a chip provided by an embodiment of the present application.
  • the chip 140 includes one or more (including two) processors 1410 and a communication interface 1430 .
  • memory 1440 stores the following elements: executable modules or data structures, or a subset thereof, or an extended set of them.
  • the memory 1440 may include a read-only memory and a random access memory, and provides instructions and data to the processor 1410 .
  • a portion of memory 1440 may also include non-volatile random access memory (NVRAM).
  • the processor 1410, the communication interface 1430, and the memory 1440 are coupled together through the bus system 1420.
  • the bus system 1420 may also include a power bus, a control bus, a status signal bus, and the like in addition to the data bus.
  • the various buses are designated as bus system 1420 in FIG. 14 .
  • the methods described in the foregoing embodiments of the present application may be applied to the processor 1410 or implemented by the processor 1410 .
  • the processor 1410 may be an integrated circuit chip with signal processing capability.
  • each step of the above-mentioned method can be completed by an integrated logic circuit of hardware in the processor 1410 or an instruction in the form of software.
  • the above-mentioned processor 1410 can be a general-purpose processor (for example, a microprocessor or a conventional processor), a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate, a transistor logic device, or a discrete hardware component; the processor 1410 can implement or execute the methods, steps, and logic block diagrams disclosed in the embodiments of the present application.
  • the steps of the method disclosed in conjunction with the embodiments of the present application may be directly embodied as executed by a hardware decoding processor, or executed by a combination of hardware and software modules in the decoding processor.
  • the software module may be located in a storage medium mature in the field, such as random access memory, read-only memory, programmable read-only memory, or electrically erasable programmable read only memory (EEPROM).
  • the storage medium is located in the memory 1440, and the processor 1410 reads the information in the memory 1440, and completes the steps of the above method in combination with its hardware.
  • the instructions stored by the memory for execution by the processor may be implemented in the form of a computer program product.
  • the computer program product may be written in the memory in advance, or downloaded and installed in the memory in the form of software.
  • a computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the procedures or functions according to the embodiments of the present application are generated in whole or in part.
  • the computer may be a general purpose computer, special purpose computer, computer network, or other programmable device.
  • Computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired (for example, coaxial cable, optical fiber, or digital subscriber line (DSL)) or wireless (for example, infrared, radio, or microwave) manner.
  • the computer-readable storage medium may be any available medium accessible to a computer, or a data storage device such as a server or a data center integrated with one or more available media.
  • available media may include magnetic media (eg, floppy disks, hard disks, or tapes), optical media (eg, Digital versatile disc (digital versatile disc, DVD)), or semiconductor media (for example, solid state disk (solid state disk, SSD)), etc.
  • Embodiments of the present application also provide a computer-readable storage medium.
  • the methods described in the above embodiments may be implemented in whole or in part by software, hardware, firmware or any combination thereof.
  • Computer-readable media can include both computer storage media and communication media and also include any medium that can transfer a computer program from one place to another.
  • the storage medium can be any target medium that can be accessed by a computer.
  • the computer-readable medium may include a compact disc read-only memory (CD-ROM), RAM, ROM, EEPROM, or other optical disk storage; the computer-readable medium may include magnetic disk memory or other magnetic disk storage devices.
  • any connection line is properly termed a computer-readable medium.
  • Disk and disc includes compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and blu-ray disc, where disks usually reproduce data magnetically, while discs use lasers to optically reproduce data.

Abstract

A positioning method and apparatus. The positioning system includes a terminal device and a first server. The method includes: the terminal device sends, to the first server, the identifier, starting position, and destination of the target object to be navigated; the first server sends a first navigation route to the terminal device according to the starting position and the destination; while the terminal device travels along the first navigation route, the terminal device reports the location information of the terminal device to the first server; when the location information reflects that the terminal device is about to enter an intersection, the first server obtains the target lane where the terminal device is located; the first server sends, to the terminal device, indication information for indicating the target lane; and the terminal device prompts, according to the indication information, the target lane in which the user is located. Based on the correspondence between cameras and lanes, the terminal device can accurately determine which lane the user is in.

Description

Positioning method and apparatus
This application claims priority to Chinese patent application No. 202110342456.6, entitled "Positioning Method and Apparatus" and filed with the China Patent Office on March 30, 2021, the entire contents of which are incorporated herein by reference.
Technical Field
This application relates to the field of terminal technologies, and in particular, to a positioning method and apparatus.
Background
With the development of urban traffic, overpasses are becoming more numerous and more complex to build, for example, overpasses containing multiple layers of pavement. When a user drives onto an overpass while using navigation, because the overpass has multiple pavement layers at the same location, the navigation cannot distinguish which layer of the overpass the user is currently on, and therefore cannot provide an accurate navigation route based on the correct pavement layer. In particular, after the user takes a wrong turn relative to the navigation route, the navigation software cannot identify in time which layer of the overpass the user is currently on, and thus cannot update the correct navigation route in time, which causes great trouble for the user while driving on the overpass.
In a possible existing design, radio frequency identification (RFID) can be used to locate the pavement layer of an overpass. For example, RFID tags can be laid on the pavement, and the pavement layer where a vehicle is located can be identified based on the signal transmitted by the RFID and received when the vehicle passes over that pavement.
However, because the identification width of RFID is limited, a vehicle in the road that is far from the RFID tag may fail to detect the signal sent by the RFID tag on the pavement, so the pavement layer of the overpass where the vehicle is located cannot be precisely determined. In addition, laying RFID on the overpass pavement is easily crushed by vehicles on the one hand, and on the other hand requires breaking up the existing pavement for installation, which involves a large amount of engineering work.
Summary of the Invention
Embodiments of this application provide a positioning method and apparatus, which can accurately determine which of multiple lanes a vehicle is in, so that a terminal device receiving the lane information can navigate based on the accurate lane.
According to a first aspect, an embodiment of this application provides a positioning method, applied to a positioning system. The positioning system includes a terminal device and a first server. The method includes: the terminal device sends, to the first server, the identifier, starting position, and destination of the target object to be navigated; the first server sends a first navigation route to the terminal device according to the starting position and the destination; while the terminal device travels along the first navigation route, the terminal device reports the location information of the terminal device to the first server; when the location information reflects that the terminal device is about to enter an intersection, the first server obtains the target lane where the terminal device is located, where multiple cameras are arranged at the intersection and are used to photograph objects in different lanes at the intersection; the target lane is determined by the first server based on the content captured by the cameras, or the target lane is determined by the first server according to information received from a second server; the first server sends, to the terminal device, indication information for indicating the target lane; and the terminal device prompts, according to the indication information, the target lane in which the user is located. In this way, the terminal device can accurately determine, based on the correspondence between cameras and lanes, which lane the user is in, and the terminal device that receives the lane information can navigate based on the accurate lane.
The lane may be understood as a pavement layer or road layer; the first server may be a navigation server; the second server may be a traffic platform server; the intersection may be understood as an intersection of multi-layer pavement or an intersection of multiple roads; the first navigation route may be understood as a navigation route obtained based on the GPS positioning of the terminal device. The terminal device may be a mobile phone or a vehicle.
In a possible implementation, the first server obtaining the target lane where the terminal device is located includes: the first server photographs objects in multiple lanes at the intersection based on the multiple cameras to obtain multiple first association relationships, where any first association relationship includes an image and the identifier of the camera that captured the image; when the first server identifies the identifier of the target object in the multiple images, the first server determines the target camera corresponding to the target image that includes the identifier of the target object; the first server determines, according to a second association relationship, the target lane where the target camera is located, where the second association relationship includes the correspondence between cameras and lanes. In this way, the first server can accurately determine, based on the correspondence between cameras and lanes, which lane the user is in, and the terminal device can navigate based on the accurate lane sent by the first server.
The object in a lane may be the license plate of a vehicle in that lane; the image may be a license plate photo containing license plate information; the identifier of the camera may be a camera number; the identifier of the target object may be a license plate number; and the target lane may be the lane obtained based on the correspondence between cameras and lanes.
In a possible implementation, the first server obtaining the target lane where the terminal device is located includes: the first server sends a query request to the second server, where the query request includes the identifier of the target object and either of the following: the location information of the target object or the identifier of the intersection; and the first server receives the identifier of the target lane from the second server. In this way, when the second server stores the correspondence between cameras and lanes, the second server can accurately determine, based on that correspondence, which lane the user is in and send the lane information to the first server, so that the terminal device can navigate based on the accurate lane sent by the first server.
In a possible implementation, the first server obtaining the target lane where the terminal device is located includes: the first server sends a query request to the second server, where the query request includes the identifier of the target object and either of the following: the location information of the target object or the identifier of the intersection; the first server receives the identifier of the target camera from the second server, where the target camera is the camera that captured the target object; and the first server determines, according to the second association relationship, the target lane where the target camera is located, where the second association relationship includes the correspondence between cameras and lanes. In this way, when the first server stores the correspondence between cameras and lanes, the first server can accurately determine, based on that correspondence, which lane the user is in, and the terminal device can navigate based on the accurate lane sent by the first server.
在一种可能的实现方式中,方法还包括:在目标车道与第一导航路线中指示的车道不同时,第一服务器根据目标车道和目的地,向终端设备发送第二导航路线。这样,终端设备可以根据在不同的场景中,确定的不同的车道,为用户提供更加准确的导航路线。
其中,该第二导航路线可以为基于摄像头与车道的对应关系,得到的目标车道对应的导航路线。
在一种可能的实现方式中,第一服务器根据目标车道和目的地,向终端设备发送第二导航路线之后,还包括:在第一服务器在第一时间段内接收到来自终端设备的位置信息时,第一服务器在第一时间段内持续依据第二导航路线为终端设备导航;在第一服务器在第一时间段之后接收到来自终端设备的位置信息时,第一服务器根据第一时间段之后接收的终端设备的位置信息为终端设备导航。这样,可以在不同的时间条件下,获得该段时间内更准确的车道信息,进而导航软件可以基于该准确的车道信息,为用户提供更准确的导航路线。
在一种可能的实现方式中,方法还包括:第一服务器为第一导航路线中指示的车道设置第一权重,以及根据环境信息为目标车道设置第二权重;其中,在环境信息表示环境不利于图像识别时,第一权重大于第二权重;在环境信息表示环境不影响图像识别时,第一权重小于第二权重;当目标车道与第一导航路线中指示的车道不同时,第一服务器根据目标车道和第一导航路线中指示的车道中权重大的车道,以及目的地,向终端设备发送第二导航路线。这样,对不同设备确定的车道,设置不同的权重,可以在不同的适用条件下,根据该权重得到更准确的车道信息,进而终端设备可以基于该准确的车道信息,为用户提供准确的导航路线。
其中,该影响图像识别的环境可以为雷雨天或雾霾天等,天气恶劣,或者能见度较低的环境。
在一种可能的实现方式中,方法还包括:当目标车道与第一导航路线中指示的车道不同,且目标车道与第一导航路线中指示的车道之间的距离大于距离阈值时,第一服务器持续依据第一导航路线为终端设备导航。这样,终端设备可以对不同的场景中得到的车道之间的距离进行判断,并得到更准确的车道信息,进而导航软件可以基于该准确的车道信息,为用户提供准确的导航路线。
其中，目标车道与第一导航路线中指示的车道之间的距离大于距离阈值时可以理解为，基于摄像头与车道的对应关系确定的目标车道可能不够准确，此时可以依据第一导航路线为终端设备导航。
在一种可能的实现方式中,目标对象的标识为车牌号,终端设备为手机或车辆。
第二方面,本申请实施例提供一种定位方法,方法包括:第一服务器接收来自终端设备的需要进行导航的目标对象的标识、起始位置和目的地;第一服务器根据起始位置和目的地,向终端设备发送第一导航路线;第一服务器接收终端设备在第一导航路线行驶的过程中的位置信息;在位置信息反映终端设备即将驶入路口时,第一服务器获取终端设备所在的目标车道;其中,路口中设置有多个摄像头,多个摄像头用于拍摄路口中不同车道中的对象;目标车道为第一服务器基于摄像头摄像得到内容确定的,或者,目标车道为第一服务器根据从第二服务器接收的信息确定的;第一服务器向终端设备发送用于指示目标车道的指示信息。这样,终端设备可以基于摄像头与车道的对应关系,精准判断用户位于哪一条车道,进而接收到该车道信息的终端设备可以基于准确的车道进行导航。
在一种可能的实现方式中,第一服务器获取终端设备所在的目标车道,包括:第一服务器基于多个摄像头拍摄路口的多个车道中的对象,得到多个第一关联关系,任一个第一关联关系包括图像以及拍摄图像的摄像头的标识;第一服务器在多个图像中识别到目标对象的标识时,第一服务器确定包括目标对象的标识的目标图像对应的目标摄像头;第一服务器根据第二关联关系确定目标摄像头所在的目标车道;第二关联关系包括摄像头与车道的对应关系。这样,第一服务器可以基于摄像头与车道的对应关系,精准判断用户位于哪一条车道,进而终端设备可以基于第一服务器发送的准确的车道进行导航。
在一种可能的实现方式中,第一服务器获取终端设备所在的目标车道,包括:第一服务器向第二服务器发送查询请求,查询请求包括目标对象的标识、以及下述的任一种:目标对象的位置信息或路口的标识;第一服务器接收来自第二服务器的目标车道的标识。这样,当第二服务器保存有摄像头与车道的对应关系时,第二服务器可以基于摄像头与车道 的对应关系,精准判断用户位于哪一条车道,并将该车道的信息发送至第一服务器,进而终端设备可以基于第一服务器发送的准确的车道进行导航。
在一种可能的实现方式中,第一服务器获取终端设备所在的目标车道,包括:第一服务器向第二服务器发送查询请求,查询请求包括目标对象的标识、以及下述的任一种:目标对象的位置信息或路口的标识;第一服务器接收来自第二服务器的目标摄像头的标识,目标摄像头为拍摄到目标对象的摄像头;第一服务器根据第二关联关系确定目标摄像头所在的目标车道;第二关联关系包括摄像头与车道的对应关系。这样,当第一服务器保存有摄像头与车道的对应关系时,第一服务器可以基于摄像头与车道的对应关系,精准判断用户位于哪一条车道,进而终端设备可以基于第一服务器发送的准确的车道进行导航。
在一种可能的实现方式中,方法还包括:在目标车道与第一导航路线中指示的车道不同时,第一服务器根据目标车道和目的地,向终端设备发送第二导航路线。这样,终端设备可以根据在不同的场景中,确定的不同的车道,为用户提供更加准确的导航路线。
在一种可能的实现方式中,第一服务器根据目标车道和目的地,向终端设备发送第二导航路线之后,还包括:在第一服务器在第一时间段内接收到来自终端设备的位置信息时,第一服务器在第一时间段内持续依据第二导航路线为终端设备导航;在第一服务器在第一时间段之后接收到来自终端设备的位置信息时,第一服务器根据第一时间段之后接收的终端设备的位置信息为终端设备导航。这样,可以在不同的时间条件下,获得该段时间内更准确的车道信息,进而导航软件可以基于该准确的车道信息,为用户提供准确的导航路线。
在一种可能的实现方式中,方法还包括:第一服务器为第一导航路线中指示的车道设置第一权重,以及根据环境信息为目标车道设置第二权重;其中,在环境信息表示环境不利于图像识别时,第一权重大于第二权重;在环境信息表示环境不影响图像识别时,第一权重小于第二权重;当目标车道与第一导航路线中指示的车道不同时,第一服务器根据目标车道和第一导航路线中指示的车道中权重大的车道,以及目的地,向终端设备发送第二导航路线。这样,对不同设备确定的车道,设置不同的权重,可以在不同的适用条件下,根据该权重得到更准确的车道信息,进而终端设备可以基于该准确的车道信息,为用户提供准确的导航路线。
在一种可能的实现方式中,方法还包括:当目标车道与第一导航路线中指示的车道不同,且目标车道与第一导航路线中指示的车道之间的距离大于距离阈值时,第一服务器持续依据第一导航路线为终端设备导航。这样,终端设备可以对不同的场景中得到的车道之间的距离进行判断,并得到更准确的车道信息,进而导航软件可以基于该准确的车道信息,为用户提供准确的导航路线。
在一种可能的实现方式中,目标对象的标识为车牌号,终端设备为手机或车辆。
第三方面,本申请实施例提供一种定位方法,方法包括:终端设备向第一服务器发送需要进行导航的目标对象的标识、起始位置和目的地;终端设备接收来自第一服务器的第一导航路线;第一导航路线与起始位置和目的地有关;在终端设备根据第一导航路线行驶的过程中,终端设备向第一服务器上报终端设备的位置信息;在位置信息反映终端设备即将驶入路口时,终端设备向第一服务器发送提示信息;提示信息用于提示终端设备即将驶入路口;终端设备接收来自第一服务器的用于指示目标车道的指示信息;终端设备根据指示信息提示用户处于目标车道。这样,终端设备可以基于 摄像头与车道的对应关系,精准判断用户位于哪一条车道,进而接收到该车道信息的终端设备可以基于准确的车道进行导航。
第四方面,本申请实施例提供一种定位装置,应用于定位系统,定位系统包括:终端设备和第一服务器,装置包括:通信单元,用于向第一服务器发送需要进行导航的目标对象的标识、起始位置和目的地;通信单元,还用于根据起始位置和目的地,向终端设备发送第一导航路线;在终端设备根据第一导航路线行驶的过程中,通信单元,还用于向第一服务器上报终端设备的位置信息;在位置信息反映终端设备即将驶入路口时,处理单元,用于获取终端设备所在的目标车道;其中,路口中设置有多个摄像头,多个摄像头用于拍摄路口中不同车道中的对象;目标车道为第一服务器基于摄像头摄像得到内容确定的,或者,目标车道为第一服务器根据从第二服务器接收的信息确定的;通信单元,还用于向终端设备发送用于指示目标车道的指示信息;处理单元,还用于根据指示信息提示用户处于的目标车道。
在一种可能的实现方式中,处理单元,具体用于基于多个摄像头拍摄路口的多个车道中的对象,得到多个第一关联关系,任一个第一关联关系包括图像以及拍摄图像的摄像头的标识;处理单元,还具体用于在多个图像中识别到目标对象的标识时,第一服务器确定包括目标对象的标识的目标图像对应的目标摄像头;处理单元,还具体用于根据第二关联关系确定目标摄像头所在的目标车道;第二关联关系包括摄像头与车道的对应关系。
在一种可能的实现方式中,通信单元,具体用于向第二服务器发送查询请求,查询请求包括目标对象的标识、以及下述的任一种:目标对象的位置信息或路口的标识;通信单元,还具体用于接收来自第二服务器的目标车道的标识。
在一种可能的实现方式中,通信单元,具体用于向第二服务器发送查询请求,查询请求包括目标对象的标识、以及下述的任一种:目标对象的位置信息或路口的标识;通信单元,还具体用于接收来自第二服务器的目标摄像头的标识,目标摄像头为拍摄到目标对象的摄像头;通信单元,具体用于根据第二关联关系确定目标摄像头所在的目标车道;第二关联关系包括摄像头与车道的对应关系。
在一种可能的实现方式中,在目标车道与第一导航路线中指示的车道不同时,通信单元,还用于根据目标车道和目的地,向终端设备发送第二导航路线。
在一种可能的实现方式中,在第一服务器在第一时间段内接收到来自终端设备的位置信息时,处理单元,具体用于在第一时间段内持续依据第二导航路线为终端设备导航;在第一服务器在第一时间段之后接收到来自终端设备的位置信息时,处理单元,还具体用于根据第一时间段之后接收的终端设备的位置信息为终端设备导航。
在一种可能的实现方式中,处理单元,还用于为第一导航路线中指示的车道设置第一权重,以及根据环境信息为目标车道设置第二权重;其中,在环境信息表示环境不利于图像识别时,第一权重大于第二权重;在环境信息表示环境不影响图像识别时,第一权重小于第二权重;当目标车道与第一导航路线中指示的车道不同时,处理单元,还用于根据目标车道和第一导航路线中指示的车道中权重大的车道,以及目的地,向终端设备发送第二导航路线。
在一种可能的实现方式中，当目标车道与第一导航路线中指示的车道不同，且目标车道与第一导航路线中指示的车道之间的距离大于距离阈值时，处理单元，还用于持续依据第一导航路线为终端设备导航。
在一种可能的实现方式中,目标对象的标识为车牌号,终端设备为手机或车辆。
第五方面,本申请实施例提供一种定位装置,装置包括:通信单元,用于接收来自终端设备的需要进行导航的目标对象的标识、起始位置和目的地;通信单元,还用于根据起始位置和目的地,向终端设备发送第一导航路线;通信单元,还用于接收终端设备在第一导航路线行驶的过程中的位置信息;在位置信息反映终端设备即将驶入路口时,处理单元,用于获取终端设备所在的目标车道;其中,路口中设置有多个摄像头,多个摄像头用于拍摄路口中不同车道中的对象;目标车道为第一服务器基于摄像头摄像得到内容确定的,或者,目标车道为第一服务器根据从第二服务器接收的信息确定的;通信单元,还用于向终端设备发送用于指示目标车道的指示信息。
在一种可能的实现方式中,处理单元,具体用于基于多个摄像头拍摄路口的多个车道中的对象,得到多个第一关联关系,任一个第一关联关系包括图像以及拍摄图像的摄像头的标识;第一服务器在多个图像中识别到目标对象的标识时,处理单元,还具体用于确定包括目标对象的标识的目标图像对应的目标摄像头;处理单元,还具体用于根据第二关联关系确定目标摄像头所在的目标车道;第二关联关系包括摄像头与车道的对应关系。
在一种可能的实现方式中,通信单元,具体用于向第二服务器发送查询请求,查询请求包括目标对象的标识、以及下述的任一种:目标对象的位置信息或路口的标识;通信单元,具体用于接收来自第二服务器的目标车道的标识。
在一种可能的实现方式中,通信单元,具体用于向第二服务器发送查询请求,查询请求包括目标对象的标识、以及下述的任一种:目标对象的位置信息或路口的标识;通信单元,还具体用于接收来自第二服务器的目标摄像头的标识,目标摄像头为拍摄到目标对象的摄像头;处理单元,具体用于根据第二关联关系确定目标摄像头所在的目标车道;第二关联关系包括摄像头与车道的对应关系。
在一种可能的实现方式中,在目标车道与第一导航路线中指示的车道不同时,通信单元,还用于根据目标车道和目的地,向终端设备发送第二导航路线。
在一种可能的实现方式中,在第一服务器在第一时间段内接收到来自终端设备的位置信息时,处理单元,具体用于在第一时间段内持续依据第二导航路线为终端设备导航;在第一服务器在第一时间段之后接收到来自终端设备的位置信息时,处理单元,还具体用于根据第一时间段之后接收的终端设备的位置信息为终端设备导航。
在一种可能的实现方式中,处理单元,还用于为第一导航路线中指示的车道设置第一权重,以及根据环境信息为目标车道设置第二权重;其中,在环境信息表示环境不利于图像识别时,第一权重大于第二权重;在环境信息表示环境不影响图像识别时,第一权重小于第二权重;当目标车道与第一导航路线中指示的车道不同时,通信单元,还用于根据目标车道和第一导航路线中指示的车道中权重大的车道,以及目的地,向终端设备发送第二导航路线。
在一种可能的实现方式中，当目标车道与第一导航路线中指示的车道不同，且目标车道与第一导航路线中指示的车道之间的距离大于距离阈值时，处理单元，还用于持续依据第一导航路线为终端设备导航。
在一种可能的实现方式中,目标对象的标识为车牌号,终端设备为手机或车辆。
第六方面,本申请实施例提供一种定位装置,装置包括:通信单元,用于向第一服务器发送需要进行导航的目标对象的标识、起始位置和目的地;通信单元,还用于接收来自第一服务器的第一导航路线;第一导航路线与起始位置和目的地有关;在终端设备根据第一导航路线行驶的过程中,通信单元,还用于向第一服务器上报终端设备的位置信息;在位置信息反映终端设备即将驶入路口时,通信单元,还用于向第一服务器发送提示信息;提示信息用于提示终端设备即将驶入路口;通信单元,还用于接收来自第一服务器的用于指示目标车道的指示信息;处理单元,用于根据指示信息提示用户处于目标车道。
第七方面,本申请实施例提供一种定位装置,包括处理器和存储器,存储器用于存储代码指令;处理器用于运行代码指令,使得电子设备以执行如第一方面或第一方面的任一种实现方式中描述的定位方法,如第二方面或第二方面的任一种实现方式中描述的定位方法,或如第三方面或第三方面的任一种实现方式中描述的定位方法。
第八方面,本申请实施例提供一种计算机可读存储介质,计算机可读存储介质存储有指令,当指令被执行时,使得计算机执行如第一方面或第一方面的任一种实现方式中描述的定位方法,如第二方面或第二方面的任一种实现方式中描述的定位方法,或如第三方面或第三方面的任一种实现方式中描述的定位方法。
第九方面,一种计算机程序产品,包括计算机程序,当计算机程序被运行时,使得计算机执行如第一方面或第一方面的任一种实现方式中描述的定位方法,如第二方面或第二方面的任一种实现方式中描述的定位方法,或如第三方面或第三方面的任一种实现方式中描述的定位方法。
应当理解的是,本申请的第四方面至第九方面与本申请的第一方面至第三方面的技术方案相对应,各方面及对应的可行实施方式所取得的有益效果相似,不再赘述。
附图说明
图1为本申请实施例提供的一种场景示意图;
图2为本申请实施例提供的一种终端设备200的框架示意图;
图3为本申请实施例提供的一种导航系统300的框架示意图;
图4为本申请实施例提供的一种基于导航服务器定位的场景示意图;
图5为本申请实施例提供的一种基于导航服务器和交通平台服务器定位的场景示意图;
图6为本申请实施例提供的一种基于导航服务器定位的流程示意图;
图7为本申请实施例提供的一种输入车牌信息的界面示意图；
图8为本申请实施例提供的一种显示路面层的界面示意图；
图9为本申请实施例提供的另一种显示路面层的界面示意图；
图10为本申请实施例提供的一种用户上报的界面示意图;
图11为本申请实施例提供的一种基于导航服务器和交通平台服务器定位的流程示意图;
图12为本申请实施例提供的一种定位装置的结构示意图;
图13为本申请实施例提供的一种控制设备的硬件结构示意图;
图14为本申请实施例提供的一种芯片的结构示意图。
具体实施方式
为了便于清楚描述本申请实施例的技术方案,在本申请的实施例中,采用了“第一”、“第二”等字样对功能和作用基本相同的相同项或相似项进行区分。例如,第一值和第二值仅仅是为了区分不同的值,并不对其先后顺序进行限定。本领域技术人员可以理解“第一”、“第二”等字样并不对数量和执行次序进行限定,并且“第一”、“第二”等字样也并不限定一定不同。
需要说明的是,本申请中,“示例性的”或者“例如”等词用于表示作例子、例证或说明。本申请中被描述为“示例性的”或者“例如”的任何实施例或设计方案不应被解释为比其他实施例或设计方案更优选或更具优势。确切而言,使用“示例性的”或者“例如”等词旨在以具体方式呈现相关概念。
本申请中,“至少一个”是指一个或者多个,“多个”是指两个或两个以上。“和/或”,描述关联对象的关联关系,表示可以存在三种关系,例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B的情况,其中A,B可以是单数或者复数。字符“/”一般表示前后关联对象是一种“或”的关系。“以下至少一项(个)”或其类似表达,是指的这些项中的任意组合,包括单项(个)或复数项(个)的任意组合。例如,a,b,或c中的至少一项(个),可以表示:a,b,c,a-b,a-c,b-c,或a-b-c,其中a,b,c可以是单个,也可以是多个。
随着城市交通的发展,立交桥和隧道的数量越来越多,其建造也越来越复杂。例如包含多层路面的立交桥。立交桥给交通带来便利的同时,也给路面导航带来了挑战。通常情况下,当用户使用导航驶入立交桥时,由于立交桥在同一位置设有多层路面,这就使得导航无法区分当前用户位于立交桥的哪一层。可能会出现,用户已经驶入了错误的路面层,但导航仍按照用户行驶在正确的路面层上指示路线。
可以理解的是,本申请实施例的立交桥也可以替换为包括主路和辅路的道路,该主路和辅路可以在同一层,也可以在不同的层,通常情况下,导航无法区分主路和辅路,导致也不能为用户实现精准导航。适应的,该路面层也可以称为道路层。该路面层可以用于表示多层路面中不同层的路面;或者,该路面层也可以表示同一层路面中不同的道路,例如,相邻的道路中的主路和辅路,可以用不同的路面层表示。
为便于描述，后续均以立交桥和路面层进行描述，该描述并不构成对场景的具体限定。
示例性的,图1为本申请实施例提供的一种场景示意图。如图1所示,该场景中包括立交桥100。该立交桥100中包括多个道路:例如道路101、道路102和道路103。其中,道路101可以为一层的道路,用于驶入立交桥;道路102可以为二层的道路;道路103可以为三层的道路。示例性的,车辆104可以沿道路101驶入立交桥100,该车辆104可以沿箭头②所指示的方向驶入道路102,或者该车辆104也可以沿箭头①所指示的方向驶入道路103。
在道路101上,车辆104可以基于导航106所指示的路线行驶。其中,该导航106可以为:车载导航或手机导航等。
该立交桥100中可以包括多个摄像头,例如设置在专门的摄像头固定杆上的摄像头112,设置在路灯上的摄像头108,设置在广告牌上的摄像头110,或者设置在桥下的摄像头111等。可以理解的是,该摄像头可以根据实际场景设置在其他位置上,本申请实施例中对此不做限定。
示例性的,在图1对应的场景中,用户按照导航106指示的路线,驾驶车辆104沿道路101驶入立交桥100,用户本应该根据导航106指示的路线驾驶车辆104行驶在道路103上,但用户驾驶车辆104行驶到了道路102上。导航106本应该及时发现用户行驶路线错误,进而导航106可以根据道路102为用户重新规划路线。但是,由于导航106无法区分当前用户位于立交桥的哪一层,也就无法识别用户已经走错了路线,例如,用户已经驾驶车辆104行驶到了道路102上,因为道路102和道路103在经纬度定位或GPS定位中,位置几乎一致,所以导航不能识别到用户不在导航所指示的道路103上行驶,进而导航也就无法基于正确的路面层给用户提供准确的路线。
现有技术中提供了一种基于RFID的立交桥智能导航方法。具体的,可以在立交桥上设置RFID标签,当车辆判断当前道路为立交桥道路(或其他多层路面的道路)时,可以使用射频天线接收设置于立交桥上的RFID标签发送的信息,该信号可以为带有当前所在的道路编号的射频信号,进而车辆可以准确的知道自身所在的路面层,并基于获取的GPS定位信号、道路编号和行车路线进行导航。
然而，上述方法存在以下问题：其一，使用GPS进行定位时，由于GPS定位信号的定位精度在十米左右，在立交桥密集处就会发生导航不准甚至导航错误的情况；其二，将射频天线设置在车辆的底部，且将RFID标签设置在道路地面下方的方法，需要将RFID铺设到立交桥路面上，这种情况下，RFID不仅容易被来往的车辆压坏，而且铺设RFID的工程量大，也很容易破坏现有路面，成本较高；其三，RFID的识别宽度有限，当车辆在道路中距离RFID标签较远时，可能出现车辆无法识别到路面的RFID标签发送信号的情况。
有鉴于此,本申请实施例提供一种定位方法,可以充分利用设置在多层路面中的摄像头,并基于多层道路中的摄像头与路面层的对应关系,精准判断用户位于该多层道路中的哪一层,进而接收到该路面层信息的终端设备可以基于准确的路面层进行导航。其中,该终端设备可以为具有导航能力的车辆或手机等设备。
可以理解的是,上述终端设备也可以称为终端,(terminal)、用户设备(user equipment,UE)、移动台(mobile station,MS)、移动终端(mobile terminal,MT)等。终端设备可以是手机(mobile phone)、智能电视、穿戴式设备、平板电脑(Pad)、带无线收发功能的电脑、虚拟现实(virtual reality,VR)终端设备、增强现实(augmented reality,AR)终端设备、工业控制(industrial control)中的无线终端、无人驾驶(self-driving)中的无线终端、远程手术(remote medical surgery)中的无线终端、智能电网(smart grid)中的无线终端、运输安全(transportation safety)中的无线终端、智慧城市(smart city)中的无线终端、智慧家庭(smart home)中的无线终端等等。本申请的实施例对终端设备所采用的具体技术和具体设备形态不做限定。
为了能够更好地理解本申请实施例,下面对本申请实施例的终端设备的结构进行介绍。
示例性的,图2为本申请实施例提供的一种终端设备200的结构示意图。
本申请实施例中,该终端设备200中包含GPS定位模块180L,该定位模块180L与终端设备中的导航软件相对应。例如,GPS定位模块180L可以定位终端设备当前的位置,导航软件可以将该定位的结果呈现给用户。其中,该GPS定位模块和导航软件可以是车载的GPS定位模块和导航软件,也可以是用户移动终端GPS定位模块和导航软件。该导航软件可以包括:百度导航软件或高德导航软件等。
如图2所示,终端设备200可以包括处理器110,外部存储器接口120,内部存储器121,电源管理模块141,天线1,天线2,移动通信模块150,无线通信模块160,传感器模块180,按键190,摄像头193以及显示屏194等。其中传感器模块180可以包括:压力传感器180A,加速度传感器180E,指纹传感器180H,触摸传感器180K以及定位模块180L等。
可以理解的是,本申请实施例示意的结构并不构成对终端设备200的具体限定。可以理解的是,终端设备200可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
处理器110可以包括一个或多个处理单元，例如：处理器110可以包括应用处理器(application processor，AP)，调制解调处理器，图形处理器(graphics processing unit，GPU)，图像信号处理器(image signal processor，ISP)，控制器，视频编解码器，数字信号处理器(digital signal processor，DSP)，基带处理器，和/或神经网络处理器(neural-network processing unit，NPU)等。其中，不同的处理单元可以是独立的器件，也可以集成在一个或多个处理器中。
处理器110中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器110中的存储器为高速缓冲存储器。该存储器可以保存处理器110刚用过或循环使用的指令或数据。
在一些实施例中，处理器110可以包括一个或多个接口。接口可以包括集成电路(inter-integrated circuit，I2C)接口，集成电路内置音频(inter-integrated circuit sound，I2S)接口，脉冲编码调制(pulse code modulation，PCM)接口，和/或通用串行总线(universal serial bus，USB)接口等。
电源管理模块141接收电池142和/或充电管理模块140的输入,为处理器110,内部存储器121,显示屏194,摄像头193,和无线通信模块160等供电。
终端设备200的无线通信功能可以通过天线1,天线2,移动通信模块150,无线通信模块160,调制解调处理器以及基带处理器等实现。
天线1和天线2用于发射和接收电磁波信号。终端设备200中的天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。例如:可以将天线1复用为无线局域网的分集天线。
移动通信模块150可以提供应用在终端设备200上的包括2G/3G/4G/5G等无线通信的解决方案。其中，无线通信模块160可以提供应用在终端设备200上的包括无线局域网(wireless local area networks，WLAN)(如无线保真(wireless fidelity，Wi-Fi)网络)，蓝牙(bluetooth，BT)，全球导航卫星系统(global navigation satellite system，GNSS)，调频(frequency modulation，FM)，近距离无线通信技术(near field communication，NFC)，红外技术(infrared，IR)等无线通信的解决方案。
在一些实施例中，终端设备200的天线1和移动通信模块150耦合，天线2和无线通信模块160耦合，使得终端设备200可以通过无线通信技术与网络以及其他设备通信。无线通信技术可以包括全球移动通讯系统(global system for mobile communications，GSM)，通用分组无线服务(general packet radio service，GPRS)，码分多址接入(code division multiple access，CDMA)，长期演进(long term evolution，LTE)，BT，GNSS，WLAN，NFC，FM，和/或IR技术等。
终端设备200通过显示屏194实现显示功能。显示屏194用于显示图像,视频等。显示屏194包括显示面板。在一些实施例中,终端设备200可以包括1个或N个显示屏194,N为大于1的正整数。
终端设备200可以通过摄像头193等实现拍摄功能。摄像头193用于捕获静态图像或视频。在一些实施例中,终端设备200可以包括1个或N个摄像头193,N为大于1的正整数。
外部存储器接口120可以用于连接外部存储卡,实现扩展终端设备200的存储能力。外部存储卡通过外部存储器接口120与处理器110通信,实现数据存储功能。
内部存储器121可以用于存储计算机可执行程序代码,可执行程序代码包括指令。内部存储器121可以包括存储程序区和存储数据区。其中,存储程序区可存储操作系统,至少一个功能所需的应用程序(比如声音播放功能,图像播放功能等)等。
压力传感器180A用于感受压力信号,可以将压力信号转换成电信号。在一些实施例中,压力传感器180A可以设置于显示屏194。
加速度传感器180E可检测终端设备200在各个方向上(一般为三轴)加速度的大小。
指纹传感器180H用于采集指纹。终端设备200可以利用采集的指纹特性实现指纹解锁,访问应用锁,指纹拍照,指纹接听来电等。
触摸传感器180K,也称“触控器件”。触摸传感器180K可以设置于显示屏194,由触摸传感器180K与显示屏194组成触摸屏,也称“触控屏”。触摸传感器180K用于检测作用于其上或附近的触摸操作。
按键190包括音量键等。按键190可以是机械按键。也可以是触摸式按键。终端设备200可以接收按键输入,产生与终端设备200的用户设置以及功能控制有关的键信号输入。
传感器模块180也可以包括定位模块180L。例如,该定位模块可以基于GPS系统定位,也可以基于北斗系统或者其他定位系统进行定位。该定位模块180L可用于估计终端设备200的地理位置。
示例性的,图3为本申请实施例提供一种导航系统300的结构示意图。如图3所示,该导航系统300中可以包括:摄像头301、导航服务器302。可选的,该导航系统300中也可以包括交通平台服务器303等设备。
摄像头301可以用于拍摄车辆的照片。具体的，该摄像头301可以拍摄行驶在多层路面中的车辆，也可以对该车辆的照片进行图像处理，进而识别出该车辆的车牌信息，并将该车牌信息上传至服务器。可选的，当该摄像头301不具备图像处理功能时，该摄像头301也可以将拍摄的车辆的照片上传至服务器，进而由服务器对该车辆的照片进行图像处理，识别出车牌信息。
其中,该摄像头301可以为路面中设置的用于抓拍并检测违规事项的摄像头,该摄像头的数量可以为一个或多个。
导航服务器302可以用于实现导航相关数据的存储、处理、接收和发送等功能。例如,该导航服务器302可以为属于导航软件公司的服务器,该导航软件公司如百度或高德等。具体的,该导航服务器302可以保存有多层路面中的摄像头编号与路面层的对应关系,并根据摄像头编号与路面层的对应关系,确定车辆所在的路面层。
该交通平台服务器303可以用于收集和存储由摄像头301拍摄的照片,例如该交通平台服务器303可以为属于交通管理部门的服务器。具体的,交通平台服务器303也可以保存有多层路面中的摄像头编号与路面层的对应关系,并根据摄像头编号与路面层的对应关系,确定车辆所在的路面层。
可以理解的是,该导航系统300可以根据实际场景包括其他内容,本申请实施例中对此不做限定。
为了更好的理解本申请实施例的方法,下面首先对本申请实施例适用的应用场景进行描述。
可能的实现方式中,本申请实施例提供的定位方法可以应用于多种场景。该多种场景可以包括,场景一:基于导航服务器302实现定位的场景(如图4对应的场景);和场景二:基于导航服务器302和交通平台服务器303实现定位的场景(如图5对应的场景)等。
场景一:基于导航服务器302实现定位的场景。
示例性的,图4为本申请实施例提供的一种基于导航服务器定位的场景示意图。如图4所示,该场景中可以包括:多层路面,如路面层①、路面层②以及路面层③等。该场景中还可以包括:车辆401、该车辆401的GPS定位模块402、导航服务器302、采集模块403、采集模块404以及采集模块405等。其中,该车辆401中包含车牌,该车牌用于标识车辆401。其中,采集模块403采集路面层①中的车牌信息,采集模块404采集路面层②中的车牌信息,以及采集模块405采集路面层③中的车牌信息。
示例性的,当车辆401行驶到该多层路面的路口时,车辆401的GPS定位模块402可以识别到车辆将进入到多层路面,并将该车辆401的位置信息上传至导航服务器302,或者,车辆401可以向导航服务器上报车辆401的位置信息,导航服务器可以识别到车辆将进入到多层路面。车辆401继续行驶,当车辆401行驶到路面层②上时,采集模块404可以拍摄该车辆401的照片,并识别出该车辆401对应的车牌信息,采集模块404可以将该采集模块404对应的编号以及车辆401对应的车牌信息,上传至导航服务器302。进而,导航服务器302可以根据该采集模块404对应的编号确定车辆401所在的路面层,例如路面层②,并将该路面层的信息发送至车辆401对应的导航软件中,进而导航软件可以根据准确的路面层更新导航路线,例如导航软件指示车辆401可以按照箭头④指示的方向行驶,或者按照箭头⑤指示的方向行驶。
场景二:基于导航服务器302和交通平台服务器303实现定位的场景。
示例性的,图5为本申请实施例提供的一种基于导航服务器和交通平台服务器定位的场景示意图。如图5所示,该场景中可以包括:多层路面,如路面层①、路面层②以及路面层③等。该场景中还可以包括:车辆401、该车辆401的GPS定位模块402、导航服务器302、交通平台服务器303、采集模块403、采集模块404以及采集模块405等。其中,该车辆401中包含车牌。其中,采集模块403采集路面层①中的车牌信息,采集模块404采集路面层②中的车牌信息,以及采集模块405采集路面层③中的车牌信息。
示例性的，当车辆401行驶到该多层路面的路口的前100m时，车辆401的GPS定位模块402可以识别到车辆即将驶入到多层路面，或者，车辆401可以向导航服务器上报车辆401的位置信息，导航服务器可以识别到车辆将进入到多层路面，并触发导航服务器302将查询请求发送给交通平台服务器303。交通平台服务器303接收到该查询请求，并获取一段时间内该多层路面的采集模块采集得到的车牌序列(车牌序列为采集的车牌集合)，并获取该车牌序列中采集车辆401车牌所对应的采集模块的编号，例如采集模块404。交通平台服务器303可以根据该采集模块404对应的编号确定车辆401所在的路面层，例如路面层②，并将该路面层的信息发送至车辆401对应的导航软件中，进而导航软件可以根据准确的路面层的信息更新导航路线，例如车辆401可以按照箭头④指示的方向行驶，或者按照箭头⑤指示的方向行驶。
对于场景一,示例性的,图6为本申请实施例提供的一种基于导航服务器定位的流程示意图。如图6对应的实施例中,以采集模块为摄像头进行示例说明。其中,图4中的采集模块403也可以理解为摄像头403,图4中的采集模块404也可以理解为摄像头404,图4中的采集模块405也可以理解为摄像头405,图4中的车辆401也可以理解为终端200。示例性的,基于导航服务器定位的方法可以包括如下步骤:
S601、终端200获取车牌信息,并将该车牌信息发送至导航服务器302。
适应的,导航服务器302可以接收到终端200发送的车牌信息。
本申请实施例中,该车牌信息可以为车牌号码等;该终端200可以为车辆,或者手机等设备。其中,该终端200中包含导航软件。终端200获取车牌信息的方法可以为,终端200获取用户输入至终端200内的导航软件中的车牌信息。
示例性的,图7为本申请实施例提供的一种输入车牌信息的界面示意图。在图7对应的实施例中,以终端200为手机为例进行示例说明,该示例并不构成对本申请实施例的限定。
示例性的，当用户打开手机中的导航软件时，手机可以显示如图7中的a所示的界面，该界面中可以包括用于输入车牌信息的设置车牌控件701等。当用户在如图7中的a所示的界面中，触发该设置车牌控件701时，导航软件可以由图7中的a所示的界面跳转至如图7中的b所示的界面。在如图7中的b所示的界面中，用户可以在"请填写您的车牌"702中输入车牌信息。响应于用户输入车牌信息的操作，手机可以接收到该用户输入的车牌信息，并将该车牌信息发送至导航服务器302。该车牌信息例如：某A 12345。
S602、终端200将GPS定位信息发送至导航服务器302。
适应的,导航服务器302可以接收到终端200发送的GPS定位信息。
本申请实施例中,GPS定位信息可以用于标识终端200所在的位置,或者用于确定终端200所在的多层路面的位置,该GPS定位信息可以由终端200中的GPS定位模块生成。例如,终端200可以将实时获取的GPS定位信息发送至导航服务器302;相应的,导航服务器302可以实时对终端200进行定位,进而确定终端200所在的位置。
可以理解的是,此时导航服务器302中可以保存有从S601中获取的车牌信息,以及从S602中获取的GPS定位信息的对应关系。
示例性的,如图4所示的场景中,当终端200驶入多层路面的路口时,终端200中的GPS定位模块可以识别到车辆进入多层路面,并实时将该车辆所在的位置发送至导航服务器302。此时车辆可以按照终端200中的导航软件指示的路线继续行驶。
S603、导航服务器302获取摄像头拍摄的车牌照片,以及该车牌照片对应的摄像头编号。
本申请实施例中,该车牌照片可以用于识别车辆。该车牌照片可以为摄像头拍摄的包含车牌信息的照片。其中,该摄像头可以为多层路面中的至少一个摄像头。
示例性的,导航服务器302中可以获取来自多个摄像头(例如图4所示的场景中的,摄像头403、摄像头404以及摄像头405)拍摄的车牌照片,以及该车牌照片对应的摄像头编号。本申请实施例中的摄像头可以具有图像识别能力,或者,不具有图像识别能力,其中图像识别能力用于将拍摄的车牌照片进行图像识别,识别出准确的车牌信息。其中,导航服务器302可以控制摄像头拍摄车牌照片,或者导航服务器302可以获取连续拍摄的摄像头上传至导航服务器302的照片中,所需的照片。
一种实现中,当该摄像头具有图像识别功能时,该多层路面的摄像头可以拍摄多张车牌照片,并基于摄像头中的图像处理模块识别出该车牌照片中的车牌信息,并将该车牌信息,以及该车牌信息对应的摄像头编号上传至导航服务器302,后续导航服务器302可以执行S605所示的步骤。
另一种实现中,当该摄像头不具有图像识别功能时,该多层路面的摄像头可以拍摄多张车牌照片,并将该车牌照片,以及该车牌照片对应的摄像头编号上传至导航服务器302,导航服务器302具有图像识别能力,后续导航服务器302可以执行S604所示的步骤。
S604、导航服务器302对车牌照片进行图像处理,得到车牌信息。
S605、导航服务器302根据车牌信息和摄像头编号,确定该车辆对应的路面层。
本申请实施例中,导航服务器302中可以保存有摄像头编号与路面层的对应关系。如图4所示的场景中,该摄像头与路面层的对应关系如下表1所示:
表1 摄像头编号与路面层对应示意表
摄像头编号    路面层
摄像头403    路面层①
摄像头404    路面层②
摄像头405    路面层③
其中,摄像头403与路面层①对应,摄像头404与路面层②对应,摄像头405与路面层③对应。
示例性的，导航服务器302可以获取多组车牌信息和摄像头编号的对应关系。进一步的，导航服务器302可以根据S601所示的步骤中上传的车牌信息，如某A 12345，在该多组车牌信息和摄像头编号的对应关系中，确定该某A 12345对应的摄像头编号，如摄像头404。进而，如表1所示，导航服务器可以根据摄像头404，确定该终端200位于路面层②。
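结合表1与上述示例，这里给出一段示意性的Python代码，说明"根据车牌信息匹配摄像头编号，再按摄像头编号与路面层的对应关系查表"的过程。该代码仅为在若干假设下的最小示意，其中的函数名、数据结构与示例数据均为说明性假设，并非导航服务器302的实际实现：

```python
# 第二关联关系：摄像头编号 -> 路面层（对应表1，示例数据）
CAMERA_TO_LAYER = {
    "摄像头403": "路面层①",
    "摄像头404": "路面层②",
    "摄像头405": "路面层③",
}

def locate_layer(plate_records, target_plate):
    """plate_records: 多组(车牌信息, 摄像头编号)的对应关系；
    target_plate: S601中终端上报的车牌信息。
    找到目标车牌后，按摄像头编号与路面层的对应关系返回路面层；未找到时返回None。"""
    for plate, camera_id in plate_records:
        if plate == target_plate:
            return CAMERA_TO_LAYER.get(camera_id)
    return None

records = [("某B 67890", "摄像头403"), ("某A 12345", "摄像头404")]
print(locate_layer(records, "某A 12345"))  # 输出: 路面层②
```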
S606、导航服务器302将路面层信息发送至终端200。
适应的,终端200可以确定是否接收到该导航服务器302发送的路面层信息。
一种实现中,若终端200接收到该路面层信息,则终端200中的导航软件可以显示该路面层信息。该路面层信息用于指示该路面位于多层路面中的层级,该路面层信息可以为路面层编号等其他形式,例如路面层②。
示例性的，图8为本申请实施例提供的一种显示路面层的界面示意图。在图8对应的实施例中，以终端200为手机为例进行示例说明。当手机接收到导航服务器发送的该手机位于路面层②的信息时，手机中的导航软件可以显示如图8所示的界面。如图8所示，可以在手机左半屏的指示信息801中显示当前车辆位于路面层②的信息，以及在手机右半屏中显示路面层②对应的路线。
另一种实现中,若终端200未接收到该路面层信息,则终端200中的导航软件可以继续按照原导航算法继续导航。
示例性的，图9为本申请实施例提供的另一种显示路面层的界面示意图。在图9对应的实施例中，以终端200为手机为例进行示例说明。当手机未接收到路面层信息时，手机中的导航软件可以显示如图9所示的原导航路线的界面，如沿路面层①指示的导航路线。如图9所示，可以在手机左半屏中显示当前车辆正在行驶的方向和行驶方向的说明，以及在手机右半屏中显示原导航算法指示的路面层①对应的路线。
基于此,在多层路面的场景中,导航服务器可以根据摄像头与路面层的对应关系,实现路面层的精准定位,进而为用户提供更准确的导航路线,且本申请实施例可以充分利用导航服务器等设备现有的功能,无需搭建新的服务器,降低实施成本。
在图6对应的实施例的基础上,可能的实现方式中,在S605确定路面层后,可以基于以下几种方法实现路面层的更新,或者维持当前的路面层。
一种实现中,导航软件可以基于路面层的权重,实现路面层的更新,或者维持当前的路面层。
示例性的,当利用摄像头对终端所在的路面层进行定位后,导航服务器可以为该摄像头确定的路面层设置较高的权重;当终端的GPS定位信息指示该终端所在的车辆位于其他路面层时,导航服务器可以为该GPS定位信息指示的路面层设置较低的权重。当导航软件接收到基于摄像头确定的路面层后,接收到该GPS定位信息指示的路面层,由于该GPS定位信息确定的路面层的权重低于摄像头确定的路面层的权重,因此,后续导航软件可以以摄像头确定的路面层信息为准,且可以忽略该GPS定位信息指示的路面层信息。
基于此，对不同设备确定的路面层，设置不同的权重，可以在不同的适用条件下，根据该权重得到更准确的路面层信息，进而导航软件可以基于该准确的路面层信息，为用户提供更准确的导航路线。
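下面用一段示意性的Python代码说明上述基于权重选择路面层的思路，其中同时体现了后文所述"环境不利于图像识别时调换权重"的情形。权重取值、函数名等均为说明性假设，并非实际实现：

```python
def choose_layer(camera_layer, gps_layer, environment_ok=True):
    """camera_layer: 基于摄像头确定的路面层；gps_layer: GPS定位信息指示的路面层。
    environment_ok为True表示环境不影响图像识别，摄像头结果权重较高；
    为False表示雷雨、雾霾等不利于图像识别的环境，GPS结果权重较高。"""
    camera_weight = 0.8 if environment_ok else 0.2  # 示例权重，实际取值可按需设定
    gps_weight = 1.0 - camera_weight
    # 以权重较大的路面层为准
    return camera_layer if camera_weight > gps_weight else gps_layer

print(choose_layer("路面层②", "路面层①", environment_ok=True))   # 输出: 路面层②
print(choose_layer("路面层②", "路面层①", environment_ok=False))  # 输出: 路面层①
```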
另一种实现中,导航软件可以基于时间,实现路面层的更新,或者维持当前的路面层。
示例性的,当利用摄像头对终端所在的路面层,例如路面层①进行定位后,导航软件可以在一定时间阈值内,以该路面层①为准,且不对路面层①进行更新。当超过该时间阈值时,导航软件可以重新请求获取该车辆所在的路面层的信息。例如,导航服务器可以在得到路面层时,为该路面层设置有效时间。如为该路面层①设置有效时间,1分钟。当终端的导航软件获取基于摄像头确定的当前位于路面层①的信息时,在该路面层①的1分钟的有效时间内,导航软件可以不对该路面层①进行更新。可以理解为,在该1分钟之内,即使导航软件接收到其他设备发送的路面层信息,也不对该路面层①进行更新。当超出路面层①的有效时间时,导航软件可以以接收到的新的路面层信息为准。
又例如,导航软件可以接收到GPS定位模块指示的路面层信息,以及接收到基于摄像头确定的路面层信息。由于传输延迟,当导航软件先接收到GPS定位模块指示的路面层信息为路面层②,并将原路面层更新至路面层②;随后接收到基于摄像头确定的路面层信息为路面层②,此时由于车辆可能已经行驶了一段距离,这时接收到的基于摄像头确定的路面层信息可能不够准确了,因此可以丢弃该摄像头确定的路面层信息,维持当前的路面层。
基于此,对不同设备确定的路面层,设置不同的时间,可以在不同的时间条件下,获得该段时间内更准确的路面层信息,进而导航软件可以基于该准确的路面层信息,为用户提供更准确的导航路线。
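作为对上述"在有效时间内维持路面层、超时后再更新"思路的示意，下面给出一段极简的Python代码。有效时间取1分钟仅为沿用上文示例，类名与方法名均为说明性假设，并非实际实现：

```python
class LayerState:
    """记录当前路面层及其确定时刻，在有效时间内忽略新的路面层信息。"""
    def __init__(self, valid_seconds=60):
        self.valid_seconds = valid_seconds
        self.layer = None
        self.timestamp = None

    def update(self, new_layer, now):
        """now为当前时刻(秒)。有效期内维持原路面层，超时后更新为new_layer。"""
        if self.layer is not None and now - self.timestamp < self.valid_seconds:
            return self.layer
        self.layer, self.timestamp = new_layer, now
        return self.layer

state = LayerState()
print(state.update("路面层①", now=0))    # 初次定位: 路面层①
print(state.update("路面层②", now=30))   # 仍在有效期内: 路面层①
print(state.update("路面层②", now=90))   # 超过有效期: 路面层②
```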
又一种实现中,导航软件可以基于用户在终端上报的路面层,实现路面层的更新。
示例性的，图10为本申请实施例提供的一种用户上报的界面示意图。在图10对应的实施例中，以终端200为手机为例进行示例说明。由于用户可以确定驾驶车辆位于多层路面中的哪一层，因此，当手机中的GPS定位模块检测到车辆驶入多层路面后，可以向用户发送提示信息，如图10中的a所示的界面，可以在手机左半屏的指示信息1001中显示"请选择当前行驶路面层"，并在手机右半屏中显示未接收到用户上报路面层信息前导航软件指示的路面层②对应的导航路线。当用户触发路面层①对应的控件时，响应于用户的触发操作，手机中的导航软件可以由图10中的a所示的界面切换至如图10中的b所示的界面。在如图10中的b所示的界面中，手机左半屏的指示信息1002中可以显示"已为您切换至路面层①对应的路线"的信息，并在手机右半屏中显示接收到用户上报路面层信息后导航软件指示的路面层①对应的导航路线。
基于此,导航软件可以在不同的场景中,基于用户上报的路面层信息,获得当前较为准确的路面层信息,进而导航软件可以为用户提供更准确的导航路线。
对于场景二,示例性的,图11为本申请实施例提供的一种基于导航服务器和交通平台服务器定位的流程示意图。如图11对应的实施例中,以采集模块为摄像头进行示例说明。其中,图5中的采集模块403也可以理解为摄像头403,图5中的采集 模块404也可以理解为摄像头404,图5中的采集模块405也可以理解为摄像头405,图5中的车辆401也可以理解为终端200。示例性的,基于导航服务器和交通平台服务器定位的方法可以包括如下步骤:
S1101、终端200获取车牌信息,并将该车牌信息发送至导航服务器302。
可以理解的是,S1101与图6对应的实施例中S601所示的步骤类似,在此不再赘述。
S1102、终端200将GPS定位信息发送至导航服务器302。
导航服务器302可以根据该定位信息确定终端200的位置,或者车辆所在的位置,例如车辆位于哪个立交桥、隧道或多层路面上。
S1103、终端200中的导航软件识别到车辆行驶至距多层路面的路口100米(m)时，设置初始时刻为t0。
其中，导航软件也可以在车辆距多层路面的路口N米时，设置初始时刻。其中，N可以为正整数。
示例性的,终端200中的导航软件可以通过GPS定位模块识别到车辆距离多层路面路口的摄像头位置仅有100米,此时设为t0,并发送触发信号给导航服务器302。
S1104、终端200中的导航软件可以触发导航服务器302向交通平台服务器303发送查询请求。
适应的,交通平台服务器303接收到导航服务器302发送的查询请求。其中,该查询请求中可以包括车辆的车牌信息和/或GPS定位信息等。
可能的实现方式中,该查询请求中也可以包含多层路面的位置信息或多层路面编号(或称为立交桥的标识)等。示例性的,导航服务器302可以基于终端200的GPS定位信息,确定多层路面所在的位置,或者多层路面编号。
可能的实现方式中,该查询请求中也可以包含摄像头编号。示例性的,当导航服务器302中保存有多层路面中的摄像头编号时,导航服务器302可以根据GPS定位信息,定向查询该多层路面中的哪个摄像头拍摄得到的车牌照片或车牌序列等,得到该摄像头的编号。
S1105、交通平台服务器303根据多层路面的位置信息,调取(t0+3)s-(t0+9)s之内该多层路面的位置上所有摄像头拍摄得到的车牌序列A,并将得到的车牌信息与车牌序列A进行对比,获取该车牌信息对应的摄像头编号或路面层信息。
其中,该车牌序列A可以为,多层路面的位置中所有摄像头,在(t0+3)s-(t0+9)s之内,对拍摄得到车牌照片进行图像处理后,得到的多个车牌信息对应的序列。例如,在如图5对应的场景中,交通平台服务器303可以获取摄像头403、摄像头404和摄像头405拍摄得到的车牌序列A。
本申请实施例中,该时间范围的确定方法可以为,例如车辆按照60千米/小时(或16.7米/秒)速度行进,当t0时刻车辆在摄像头前方100m位置时,再过6s,车辆可以行驶到立交桥摄像头所在位置,此时,(t0+6)s时刻,摄像头完成拍摄;由于车辆速度有快慢,因此,可以在前后预留一部分余量,保证大部分情况下摄像头能拍摄到,因此可以将时间范围设置为(t0+3)s-(t0+9)s。可以理解的是,该时间范围可以根据实际场景包括其他内容,本申请实施例中对此不做限定。例如,该时间范围也可以为(t0+4) s-(t0+8)s等。
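上述时间范围的推导可以用下面这段简单的计算加以示意，其中车速、距离与前后余量均沿用上文示例取值，仅作说明，并非固定参数：

```python
distance_m = 100                     # t0时刻车辆距路口摄像头约100米
speed_mps = 60 * 1000 / 3600         # 60千米/小时 ≈ 16.7米/秒
arrival_s = distance_m / speed_mps   # 约6秒后行驶到摄像头所在位置
margin_s = 3                         # 前后各预留3秒余量
window = (arrival_s - margin_s, arrival_s + margin_s)
print(arrival_s, window)             # 输出: 6.0 (3.0, 9.0)，即(t0+3)s-(t0+9)s
```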
本申请实施例中,可以由交通平台服务器303,或者导航服务器302获取终端200的路面层信息。
一种实现中,当交通平台服务器303保存有摄像头编号与路面层的对应关系时,获取路面层信息的方法可以为,交通平台服务器303可以将车牌序列A与S1104中获取车牌信息进行对比,如果从车牌序列A中找到该车牌信息,则交通平台服务器303可以查询拍摄该车牌信息对应的摄像头编号,并基于交通平台服务器303保存的摄像头编号与路面层的对应关系,确定该摄像头编号对应的路面层信息,后续可以执行S1106所示的步骤中向导航服务器302发送该路面层信息。
另一种实现中,当导航服务器302保存有摄像头编号与路面层的对应关系时,获取路面层信息的方法可以为,交通平台服务器303可以将车牌序列A与S1104中获取车牌信息进行对比,如果从车牌序列A中找到该车牌信息,则交通平台服务器303可以查询拍摄该车牌信息对应的摄像头编号,后续可以执行S1106所示的步骤向导航服务器302发送该摄像头编号。
进而,导航服务器302可以基于摄像头编号和路面层的对应关系确定车辆所在路面层,后续可以将该路面层信息发送至终端200。
其中,若交通平台服务器303将车牌序列A与S1104中获取车牌信息进行对比,且未从车牌序列A中找到该车牌信息时,可以将空信息返回给终端200。适应的,当终端200接收到该空信息后,该终端200中的导航软件可以继续按照原导航算法继续导航。
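下面用一段示意性的Python代码概括交通平台服务器303对查询请求的上述处理过程：在车牌序列A中查找目标车牌，命中时返回对应的摄像头编号(或进一步换算为路面层)，未命中时返回空。其中的数据结构与函数名均为说明性假设，并非实际接口：

```python
def handle_query(target_plate, sequence_a):
    """sequence_a: (t0+3)s-(t0+9)s内该多层路面所有摄像头识别出的(车牌信息, 摄像头编号)集合。
    命中目标车牌时返回对应摄像头编号；未命中时返回None，终端收到空信息后按原导航算法继续导航。"""
    for plate, camera_id in sequence_a:
        if plate == target_plate:
            return camera_id
    return None

sequence_a = [("某B 67890", "摄像头403"), ("某A 12345", "摄像头404")]
print(handle_query("某A 12345", sequence_a))  # 输出: 摄像头404
print(handle_query("某C 00000", sequence_a))  # 输出: None
```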
S1106、交通平台服务器303将摄像头编号或路面层信息发送至导航服务器302。
适应的,导航服务器302可以接收到该交通平台服务器303发送的摄像头编号或路面层信息。
一种实现中,当导航服务器302接收到该交通平台服务器303发送的摄像头编号时,导航服务器可以基于自身保存的摄像头编号与路面层的对应关系,确定该摄像头编号对应的路面层信息,后续可以执行S1107所示的步骤。
另一种实现中,交通平台服务器303可以基于自身保存的摄像头编号与路面层的对应关系确定路面层信息,当导航服务器302接收到该交通平台服务器303发送的路面层信息时,后续可以执行S1107所示的步骤。
S1107、导航服务器302将路面层信息发送至终端200。
适应的,终端200中接收到该导航服务器302发送的路面层信息,并在导航软件中显示该路面层信息。
基于此,在多层路面的场景中,交通平台服务器和导航服务器可以根据摄像头与路面层的对应关系,实现路面层的精准定位,进而为用户提供更准确的导航路线,且可以充分利用导航服务器或交通平台服务器等设备现有的功能,无需搭建新的服务器,降低实施成本。
在图6对应的实施例或者图11对应的实施例的基础上,可能的方式中,若当前处于雷雨天或雾霾天等,天气恶劣,或者能见度较低的场景中,可能会出现基于摄像头拍摄的车牌照片,识别得到的车牌信息不准确的情况。
示例性的,在下雨天时,摄像头由于天气原因,拍摄的车牌照片可能比较模糊,进而出现摄像头无法识别出该拍摄的车牌照片中的车牌信息,或者,摄像头识别该拍摄的车牌照片中的车牌信息出现错误等情况。因此,当出现该摄像头拍摄得到的车牌信息不准确时,可以通过以下方式,进行纠错。
一种实现中,当处于该天气恶劣的场景中,由于该摄像头拍摄得到的车牌信息可能不准确,因此导航软件可以基于路面层的权重进行纠错。例如,当处于该天气恶劣的场景中,由于该摄像头拍摄得到的车牌信息可能不准确,因此导航软件可以基于摄像头确定的路面层,以及基于GPS定位信息指示的路面层,该两个路面层的权重进行纠错。
示例性的,当利用摄像头对车辆所在的路面层进行定位后,导航服务器可以为该摄像头确定的路面层设置较低的权重;当终端的GPS定位信息指示该终端所在的车辆位于其他路面层时,导航服务器可以为该GPS定位信息指示的路面层设置较高的权重。当导航软件接收到基于摄像头确定的路面层后,接收到该GPS定位信息指示的路面层,由于该GPS定位信息确定的路面层的权重高于摄像头确定的路面层的权重,因此导航软件可以以该GPS定位信息指示的路面层信息为准,对基于摄像头确定的路面层信息进行纠错,并以基于该GPS定位信息指示的路面层信息为准。
基于此,导航软件可以在不同的场景中,根据该权重得到更准确的路面层信息,进而导航软件可以基于该准确的路面层信息,为用户提供更准确的导航路线。
另一种实现中,当处于该天气恶劣的场景中,由于该摄像头拍摄得到的车牌信息可能不准确,因此可以基于路面层间的距离进行纠错。例如,当处于该天气恶劣的场景中,由于该摄像头拍摄得到的车牌信息可能不准确,因此导航软件可以利用基于摄像头确定的路面层,以及基于GPS定位信息指示的路面层,该两个路面层之间的距离进行纠错。
示例性的,当利用摄像头对终端所在的路面层,例如路面层①进行定位后,当终端的GPS定位信息指示该终端所在的车辆位于其他路面层,例如路面层②时,导航软件可以基于路面层①与路面层②之间的距离,判断路面层①是否需要更新。例如,当导航软件确定路面层①与路面层②之间的距离超过某一距离阈值时,可以确定该路面层①不准确,此时导航软件可以以GPS定位信息指示的路面层②为准,对原路面层①进行纠错;当路面层①与路面层②之间的距离未超过某一距离阈值时,可以确定该路面层①准确,此时导航软件可以不对原路面层①进行纠错。
基于此，在不同的场景中，导航软件可以根据不同设备确定的路面层之间的距离，得到更准确的路面层信息，进而导航软件可以基于该准确的路面层信息，为用户提供更准确的导航路线。
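下面给出一段关于上述距离阈值判断逻辑的示意性Python代码，其中各路面层参考点的坐标与距离阈值均为说明性假设，仅用于展示判断流程，并非实际实现：

```python
import math

def correct_layer(camera_layer, gps_layer, layer_positions, threshold_m=30):
    """layer_positions: 各路面层参考点的平面坐标(单位:米)。
    两个路面层间距超过阈值时，认为摄像头确定的路面层可能不准确，以GPS指示的路面层为准；
    否则维持基于摄像头确定的路面层。"""
    (x1, y1), (x2, y2) = layer_positions[camera_layer], layer_positions[gps_layer]
    distance = math.hypot(x1 - x2, y1 - y2)
    return gps_layer if distance > threshold_m else camera_layer

positions = {"路面层①": (0.0, 0.0), "路面层②": (0.0, 50.0)}
print(correct_layer("路面层①", "路面层②", positions))  # 间距50米大于阈值30米，输出: 路面层②
```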
又一种实现中,当处于该天气恶劣的场景中,由于该摄像头拍摄得到的车牌信息可能不准确,因此导航软件可以利用基于用户输入的路面层进行纠错。
示例性的,当用户驾驶车辆行驶在路面层①中,但导航软件显示基于摄像头确定的路面层为路面层②时,用户可以通过更改该导航软件中的路面层信息,对基于摄像头确定的路面层进行纠错。
基于此，在不同的场景中，导航软件可以根据用户输入得到相对准确的路面层，进而导航软件可以基于该准确的路面层信息，为用户提供更准确的导航路线。可以理解的是，本申请实施例提供的界面图只作为一种示例，并不作为本申请实施例的限定。
上面结合图6-图11,对本申请实施例提供的方法进行了说明,下面对本申请实施例提供的执行上述方法的装置进行描述。
示例性的,图12为本申请实施例提供的一种定位装置120的结构示意图,如图12所示,定位装置120可以用于通信设备、电路、硬件组件或者芯片中,该定位装置包括:处理单元1201和通信单元1202。其中,处理单元1201用于支持定位装置执行信息处理的步骤;通信单元1202用于支持定位装置执行数据发送或接收的步骤。其中,该定位装置120可以是本申请实施例中的定位系统、终端设备或第一服务器。
具体的,当该定位装置120为定位系统时,本申请实施例提供一种定位装置,应用于定位系统,定位系统包括:终端设备和第一服务器,装置包括:通信单元1202,用于向第一服务器发送需要进行导航的目标对象的标识、起始位置和目的地;通信单元1202,还用于根据起始位置和目的地,向终端设备发送第一导航路线;在终端设备根据第一导航路线行驶的过程中,通信单元1202,还用于向第一服务器上报终端设备的位置信息;在位置信息反映终端设备即将驶入路口时,处理单元1201,用于获取终端设备所在的目标车道;其中,路口中设置有多个摄像头,多个摄像头用于拍摄路口中不同车道中的对象;目标车道为第一服务器基于摄像头摄像得到内容确定的,或者,目标车道为第一服务器根据从第二服务器接收的信息确定的;通信单元1202,还用于向终端设备发送用于指示目标车道的指示信息;处理单元1201,还用于根据指示信息提示用户处于的目标车道。
在一种可能的实现方式中,处理单元1201,具体用于基于多个摄像头拍摄路口的多个车道中的对象,得到多个第一关联关系,任一个第一关联关系包括图像以及拍摄图像的摄像头的标识;处理单元1201,还具体用于在多个图像中识别到目标对象的标识时,第一服务器确定包括目标对象的标识的目标图像对应的目标摄像头;处理单元1201,还具体用于根据第二关联关系确定目标摄像头所在的目标车道;第二关联关系包括摄像头与车道的对应关系。
在一种可能的实现方式中,通信单元1202,具体用于向第二服务器发送查询请求,查询请求包括目标对象的标识、以及下述的任一种:目标对象的位置信息或路口的标识;通信单元1202,还具体用于接收来自第二服务器的目标车道的标识。
在一种可能的实现方式中,通信单元1202,具体用于向第二服务器发送查询请求,查询请求包括目标对象的标识、以及下述的任一种:目标对象的位置信息或路口的标识;通信单元1202,还具体用于接收来自第二服务器的目标摄像头的标识,目标摄像头为拍摄到目标对象的摄像头;通信单元1202,具体用于根据第二关联关系确定目标摄像头所在的目标车道;第二关联关系包括摄像头与车道的对应关系。
在一种可能的实现方式中,在目标车道与第一导航路线中指示的车道不同时,通信单元1202,还用于根据目标车道和目的地,向终端设备发送第二导航路线。
在一种可能的实现方式中,在第一服务器在第一时间段内接收到来自终端设备的位置信息时,处理单元1201,具体用于在第一时间段内持续依据第二导航路线为终端设备导 航;在第一服务器在第一时间段之后接收到来自终端设备的位置信息时,处理单元1201,还具体用于根据第一时间段之后接收的终端设备的位置信息为终端设备导航。
在一种可能的实现方式中,处理单元1201,还用于为第一导航路线中指示的车道设置第一权重,以及根据环境信息为目标车道设置第二权重;其中,在环境信息表示环境不利于图像识别时,第一权重大于第二权重;在环境信息表示环境不影响图像识别时,第一权重小于第二权重;当目标车道与第一导航路线中指示的车道不同时,处理单元1201,还用于根据目标车道和第一导航路线中指示的车道中权重大的车道,以及目的地,向终端设备发送第二导航路线。
在一种可能的实现方式中，当目标车道与第一导航路线中指示的车道不同，且目标车道与第一导航路线中指示的车道之间的距离大于距离阈值时，处理单元1201，还用于持续依据第一导航路线为终端设备导航。
在一种可能的实现方式中,目标对象的标识为车牌号,终端设备为手机或车辆。
具体的,当该定位装置120为第一服务器时,本申请实施例提供一种定位装置,装置包括:通信单元1202,用于接收来自终端设备的需要进行导航的目标对象的标识、起始位置和目的地;通信单元1202,还用于根据起始位置和目的地,向终端设备发送第一导航路线;通信单元1202,还用于接收终端设备在第一导航路线行驶的过程中的位置信息;在位置信息反映终端设备即将驶入路口时,处理单元1201,用于获取终端设备所在的目标车道;其中,路口中设置有多个摄像头,多个摄像头用于拍摄路口中不同车道中的对象;目标车道为第一服务器基于摄像头摄像得到内容确定的,或者,目标车道为第一服务器根据从第二服务器接收的信息确定的;通信单元1202,还用于向终端设备发送用于指示目标车道的指示信息。
在一种可能的实现方式中,处理单元1201,具体用于基于多个摄像头拍摄路口的多个车道中的对象,得到多个第一关联关系,任一个第一关联关系包括图像以及拍摄图像的摄像头的标识;第一服务器在多个图像中识别到目标对象的标识时,处理单元1201,还具体用于确定包括目标对象的标识的目标图像对应的目标摄像头;处理单元1201,还具体用于根据第二关联关系确定目标摄像头所在的目标车道;第二关联关系包括摄像头与车道的对应关系。
在一种可能的实现方式中,通信单元1202,具体用于向第二服务器发送查询请求,查询请求包括目标对象的标识、以及下述的任一种:目标对象的位置信息或路口的标识;通信单元1202,具体用于接收来自第二服务器的目标车道的标识。
在一种可能的实现方式中,通信单元1202,具体用于向第二服务器发送查询请求,查询请求包括目标对象的标识、以及下述的任一种:目标对象的位置信息或路口的标识;通信单元1202,还具体用于接收来自第二服务器的目标摄像头的标识,目标摄像头为拍摄到目标对象的摄像头;处理单元1201,具体用于根据第二关联关系确定目标摄像头所在的目标车道;第二关联关系包括摄像头与车道的对应关系。
在一种可能的实现方式中,在目标车道与第一导航路线中指示的车道不同时,通信单元1202,还用于根据目标车道和目的地,向终端设备发送第二导航路线。
在一种可能的实现方式中,在第一服务器在第一时间段内接收到来自终端设备的位置信息时,处理单元1201,具体用于在第一时间段内持续依据第二导航路线为终端设备导 航;在第一服务器在第一时间段之后接收到来自终端设备的位置信息时,处理单元1201,还具体用于根据第一时间段之后接收的终端设备的位置信息为终端设备导航。
在一种可能的实现方式中,处理单元1201,还用于为第一导航路线中指示的车道设置第一权重,以及根据环境信息为目标车道设置第二权重;其中,在环境信息表示环境不利于图像识别时,第一权重大于第二权重;在环境信息表示环境不影响图像识别时,第一权重小于第二权重;当目标车道与第一导航路线中指示的车道不同时,通信单元1202,还用于根据目标车道和第一导航路线中指示的车道中权重大的车道,以及目的地,向终端设备发送第二导航路线。
在一种可能的实现方式中，当目标车道与第一导航路线中指示的车道不同，且目标车道与第一导航路线中指示的车道之间的距离大于距离阈值时，处理单元1201，还用于持续依据第一导航路线为终端设备导航。
在一种可能的实现方式中,目标对象的标识为车牌号,终端设备为手机或车辆。
具体的,当该定位装置120为终端设备时,本申请实施例提供一种定位装置,装置包括:通信单元1202,用于向第一服务器发送需要进行导航的目标对象的标识、起始位置和目的地;通信单元1202,还用于接收来自第一服务器的第一导航路线;第一导航路线与起始位置和目的地有关;在终端设备根据第一导航路线行驶的过程中,通信单元1202,还用于向第一服务器上报终端设备的位置信息;在位置信息反映终端设备即将驶入路口时,通信单元1202,还用于向第一服务器发送提示信息;提示信息用于提示终端设备即将驶入路口;通信单元1202,还用于接收来自第一服务器的用于指示目标车道的指示信息;处理单元1201,用于根据指示信息提示用户处于目标车道。
可以理解的是,上述各个方案的定位装置120具有实现上述方法中定位系统、第一服务器或终端设备执行的相应步骤的功能。
在一种可能的实施例中,定位装置120还可以包括:存储单元1203。处理单元1201和存储单元1203通过通信线路相连。
存储单元1203可以包括一个或者多个存储器,存储器可以是一个或者多个设备、电路中用于存储程序或者数据的器件。
存储单元1203可以独立存在,通过通信线路与定位装置具有的处理单元1201相连。存储单元1203也可以和处理单元1201集成在一起。
其中，通信单元1202可以是输入或者输出接口、管脚或者电路等。示例性的，存储单元1203可以存储定位方法的计算机执行指令，以使处理单元1201执行上述实施例中的定位方法。存储单元1203可以是寄存器、缓存或者RAM等，存储单元1203可以和处理单元1201集成在一起。存储单元1203可以是ROM或者可存储静态信息和指令的其他类型的静态存储设备，存储单元1203可以与处理单元1201相独立。
示例性的,图13为本申请实施例提供的一种控制设备的硬件结构示意图,如图13所示,该控制设备包括处理器1301,通信线路1304以及至少一个通信接口(图13中示例性的以通信接口1303为例进行说明)。
处理器1301可以是一个通用中央处理器(central processing unit,CPU),微处理器,特定应用集成电路(application-specific integrated circuit,ASIC),或一个或 多个用于控制本申请方案程序执行的集成电路。
通信线路1304可包括在上述组件之间传送信息的电路。
通信接口1303,使用任何收发器一类的装置,用于与其他设备或通信网络通信,如以太网,无线局域网(wireless local area networks,WLAN)等。
可能的,该控制设备还可以包括存储器1302。
存储器1302可以是只读存储器(read-only memory,ROM)或可存储静态信息和指令的其他类型的静态存储设备,随机存取存储器(random access memory,RAM)或者可存储信息和指令的其他类型的动态存储设备,也可以是电可擦可编程只读存储器(electrically erasable programmable read-only memory,EEPROM)、只读光盘(compact disc read-only memory,CD-ROM)或其他光盘存储、光碟存储(包括压缩光碟、激光碟、光碟、数字通用光碟、蓝光光碟等)、磁盘存储介质或者其他磁存储设备、或者能够用于携带或存储具有指令或数据结构形式的期望的程序代码并能够由计算机存取的任何其他介质,但不限于此。存储器可以是独立存在,通过通信线路1304与处理器相连接。存储器也可以和处理器集成在一起。
其中,存储器1302用于存储执行本申请方案的计算机执行指令,并由处理器1301来控制执行。处理器1301用于执行存储器1302中存储的计算机执行指令,从而实现本申请实施例所提供的定位方法。
可能的,本申请实施例中的计算机执行指令也可以称之为应用程序代码,本申请实施例对此不作具体限定。
在具体实现中,作为一种实施例,处理器1301可以包括一个或多个CPU,例如图13中的CPU0和CPU1。
在具体实现中,作为一种实施例,控制设备可以包括多个处理器,例如图13中的处理器1301和处理器1305。这些处理器中的每一个可以是一个单核(single-CPU)处理器,也可以是一个多核(multi-CPU)处理器。这里的处理器可以指一个或多个设备、电路、和/或用于处理数据(例如计算机程序指令)的处理核。
示例性的,图14为本申请实施例提供的一种芯片的结构示意图。芯片140包括一个或两个以上(包括两个)处理器1410和通信接口1430。
在一些实施方式中,存储器1440存储了如下的元素:可执行模块或者数据结构,或者他们的子集,或者他们的扩展集。
本申请实施例中,存储器1440可以包括只读存储器和随机存取存储器,并向处理器1410提供指令和数据。存储器1440的一部分还可以包括非易失性随机存取存储器(non-volatile random access memory,NVRAM)。
本申请实施例中,存储器1440、通信接口1430以及存储器1440通过总线系统1420耦合在一起。其中,总线系统1420除包括数据总线之外,还可以包括电源总线、控制总线和状态信号总线等。为了便于描述,在图14中将各种总线都标为总线系统1420。
上述本申请实施例描述的方法可以应用于处理器1410中,或者由处理器1410实现。处理器1410可能是一种集成电路芯片,具有信号的处理能力。在实现过程中,上述方法的各步骤可以通过处理器1410中的硬件的集成逻辑电路或者软件形式的指 令完成。上述的处理器1410可以是通用处理器(例如,微处理器或常规处理器)、数字信号处理器(digital signal processing,DSP)、专用集成电路(application specific integrated circuit,ASIC)、现成可编程门阵列(field-programmable gate array,FPGA)或者其他可编程逻辑器件、分立门、晶体管逻辑器件或分立硬件组件,处理器1410可以实现或者执行本发明实施例中的公开的各方法、步骤及逻辑框图。
结合本申请实施例所公开的方法的步骤可以直接体现为硬件译码处理器执行完成,或者用译码处理器中的硬件及软件模块组合执行完成。其中,软件模块可以位于随机存储器、只读存储器、可编程只读存储器或带电可擦写可编程存储器(electrically erasable programmable read only memory,EEPROM)等本领域成熟的存储介质中。该存储介质位于存储器1440,处理器1410读取存储器1440中的信息,结合其硬件完成上述方法的步骤。
在上述实施例中,存储器存储的供处理器执行的指令可以以计算机程序产品的形式实现。其中,计算机程序产品可以是事先写入在存储器中,也可以是以软件形式下载并安装在存储器中。
计算机程序产品包括一个或多个计算机指令。在计算机上加载和执行计算机程序指令时,全部或部分地产生按照本申请实施例的流程或功能。计算机可以是通用计算机、专用计算机、计算机网络或者其他可编程装置。计算机指令可以存储在计算机可读存储介质中,或者从一个计算机可读存储介质向另一计算机可读存储介质传输,例如,计算机指令可以从一个网站站点、计算机、服务器或数据中心通过有线(例如同轴电缆、光纤、数字用户线(digital subscriber line,DSL)或无线(例如红外、无线、微波等)方式向另一个网站站点、计算机、服务器或数据中心进行传输。计算机可读存储介质可以是计算机能够存储的任何可用介质或者是包括一个或多个可用介质集成的服务器、数据中心等数据存储设备。例如,可用介质可以包括磁性介质(例如,软盘、硬盘或磁带)、光介质(例如,数字通用光盘(digital versatile disc,DVD))、或者半导体介质(例如,固态硬盘(solid state disk,SSD))等。
本申请实施例还提供了一种计算机可读存储介质。上述实施例中描述的方法可以全部或部分地通过软件、硬件、固件或者其任意组合来实现。计算机可读介质可以包括计算机存储介质和通信介质,还可以包括任何可以将计算机程序从一个地方传送到另一个地方的介质。存储介质可以是可由计算机访问的任何目标介质。
作为一种可能的设计,计算机可读介质可以包括紧凑型光盘只读储存器(compact disc read-only memory,CD-ROM)、RAM、ROM、EEPROM或其它光盘存储器;计算机可读介质可以包括磁盘存储器或其它磁盘存储设备。而且,任何连接线也可以被适当地称为计算机可读介质。例如,如果使用同轴电缆,光纤电缆,双绞线,DSL或无线技术(如红外,无线电和微波)从网站,服务器或其它远程源传输软件,则同轴电缆,光纤电缆,双绞线,DSL或诸如红外,无线电和微波之类的无线技术包括在介质的定义中。如本文所使用的磁盘和光盘包括光盘(CD),激光盘,光盘,数字通用光盘(digital versatile disc,DVD),软盘和蓝光盘,其中磁盘通常以磁性方式再现数据,而光盘利用激光光学地再现数据。
上述的组合也应包括在计算机可读介质的范围内。以上,仅为本发明的具体实施 方式,但本发明的保护范围并不局限于此,任何熟悉本技术领域的技术人员在本发明揭露的技术范围内,可轻易想到变化或替换,都应涵盖在本发明的保护范围之内。因此,本发明的保护范围应以权利要求的保护范围为准。

Claims (20)

  1. 一种定位方法,其特征在于,应用于定位系统,所述定位系统包括:终端设备和第一服务器,所述方法包括:
    所述终端设备向所述第一服务器发送需要进行导航的目标对象的标识、起始位置和目的地;
    所述第一服务器根据所述起始位置和所述目的地,向所述终端设备发送第一导航路线;
    在所述终端设备根据第一导航路线行驶的过程中,所述终端设备向所述第一服务器上报所述终端设备的位置信息;
    在所述位置信息反映所述终端设备即将驶入路口时,所述第一服务器获取所述终端设备所在的目标车道;其中,所述路口中设置有多个摄像头,所述多个摄像头用于拍摄路口中不同车道中的对象;所述目标车道为所述第一服务器基于摄像头摄像得到内容确定的,或者,所述目标车道为所述第一服务器根据从第二服务器接收的信息确定的;
    所述第一服务器向所述终端设备发送用于指示所述目标车道的指示信息;
    所述终端设备根据所述指示信息提示用户处于的所述目标车道;
    所述第一服务器获取所述终端设备所在的目标车道,包括:
    所述第一服务器基于所述多个摄像头拍摄所述路口的多个车道中的对象,得到多个第一关联关系,任一个所述第一关联关系包括图像以及拍摄所述图像的摄像头的标识;
    所述第一服务器在多个所述图像中识别到所述目标对象的标识时,所述第一服务器确定包括所述目标对象的标识的目标图像对应的目标摄像头;
    所述第一服务器根据第二关联关系确定所述目标摄像头所在的目标车道;所述第二关联关系包括摄像头与车道的对应关系。
  2. 根据权利要求1所述的方法,其特征在于,所述第一服务器获取所述终端设备所在的目标车道,包括:
    所述第一服务器向所述第二服务器发送查询请求,所述查询请求包括所述目标对象的标识、以及下述的任一种:所述目标对象的位置信息或所述路口的标识;
    所述第一服务器接收来自所述第二服务器的目标车道的标识。
  3. 根据权利要求1所述的方法,其特征在于,所述第一服务器获取所述终端设备所在的目标车道,包括:
    所述第一服务器向所述第二服务器发送查询请求,所述查询请求包括所述目标对象的标识、以及下述的任一种:所述目标对象的位置信息或所述路口的标识;
    所述第一服务器接收来自所述第二服务器的目标摄像头的标识,所述目标摄像头为拍摄到所述目标对象的摄像头;
    所述第一服务器根据第二关联关系确定所述目标摄像头所在的目标车道;所述第二关联关系包括摄像头与车道的对应关系。
  4. 根据权利要求1-3任一项所述的方法,其特征在于,所述方法还包括:
    在所述目标车道与所述第一导航路线中指示的车道不同时,所述第一服务器根据所述目标车道和所述目的地,向所述终端设备发送第二导航路线。
  5. 根据权利要求4所述的方法，其特征在于，所述第一服务器根据所述目标车道和所述目的地，向所述终端设备发送第二导航路线之后，还包括：
    在所述第一服务器在第一时间段内接收到来自所述终端设备的位置信息时,所述第一服务器在所述第一时间段内持续依据所述第二导航路线为所述终端设备导航;
    在所述第一服务器在所述第一时间段之后接收到来自所述终端设备的位置信息时,所述第一服务器根据所述第一时间段之后接收的所述终端设备的位置信息为所述终端设备导航。
  6. 根据权利要求1-3任一项所述的方法,其特征在于,所述方法还包括:
    所述第一服务器为所述第一导航路线中指示的车道设置第一权重,以及根据环境信息为所述目标车道设置第二权重;其中,在所述环境信息表示所述环境不利于图像识别时,所述第一权重大于所述第二权重;在所述环境信息表示所述环境不影响图像识别时,所述第一权重小于所述第二权重;
    当所述目标车道与所述第一导航路线中指示的车道不同时,所述第一服务器根据所述目标车道和所述第一导航路线中指示的车道中权重大的车道,以及所述目的地,向所述终端设备发送第二导航路线。
  7. 根据权利要求1-3任一项所述的方法,其特征在于,所述方法还包括:
    当所述目标车道与所述第一导航路线中指示的车道不同,且所述目标车道与所述第一导航路线中指示的车道之间的距离大于距离阈值时,所述第一服务器持续依据所述第一导航路线为所述终端设备导航。
  8. 根据权利要求1-3任一项所述的方法,其特征在于,所述目标对象的标识为车牌号,所述终端设备为手机或车辆。
  9. 一种定位方法,其特征在于,所述方法包括:
    第一服务器接收来自终端设备的需要进行导航的目标对象的标识、起始位置和目的地;
    所述第一服务器根据所述起始位置和所述目的地,向所述终端设备发送第一导航路线;
    所述第一服务器接收所述终端设备在第一导航路线行驶的过程中的位置信息;
    在所述位置信息反映所述终端设备即将驶入路口时,所述第一服务器获取所述终端设备所在的目标车道;其中,所述路口中设置有多个摄像头,所述多个摄像头用于拍摄路口中不同车道中的对象;所述目标车道为所述第一服务器基于摄像头摄像得到内容确定的,或者,所述目标车道为所述第一服务器根据从第二服务器接收的信息确定的;
    所述第一服务器向所述终端设备发送用于指示所述目标车道的指示信息;
    所述第一服务器获取所述终端设备所在的目标车道,包括:
    所述第一服务器基于所述多个摄像头拍摄所述路口的多个车道中的对象,得到多个第一关联关系,任一个所述第一关联关系包括图像以及拍摄所述图像的摄像头的标识;
    所述第一服务器在多个所述图像中识别到所述目标对象的标识时,所述第一服务器确定包括所述目标对象的标识的目标图像对应的目标摄像头;
    所述第一服务器根据第二关联关系确定所述目标摄像头所在的目标车道;所述第二关联关系包括摄像头与车道的对应关系。
  10. 根据权利要求9所述的方法，其特征在于，所述第一服务器获取所述终端设备所在的目标车道，包括：
    所述第一服务器向所述第二服务器发送查询请求,所述查询请求包括所述目标对象的标识、以及下述的任一种:所述目标对象的位置信息或所述路口的标识;
    所述第一服务器接收来自所述第二服务器的目标车道的标识。
  11. 根据权利要求9所述的方法,其特征在于,所述第一服务器获取所述终端设备所在的目标车道,包括:
    所述第一服务器向所述第二服务器发送查询请求,所述查询请求包括所述目标对象的标识、以及下述的任一种:所述目标对象的位置信息或所述路口的标识;
    所述第一服务器接收来自所述第二服务器的目标摄像头的标识,所述目标摄像头为拍摄到所述目标对象的摄像头;
    所述第一服务器根据第二关联关系确定所述目标摄像头所在的目标车道;所述第二关联关系包括摄像头与车道的对应关系。
  12. 根据权利要求9-11任一项所述的方法,其特征在于,所述方法还包括:
    在所述目标车道与所述第一导航路线中指示的车道不同时,所述第一服务器根据所述目标车道和所述目的地,向所述终端设备发送第二导航路线。
  13. 根据权利要求12所述的方法,其特征在于,所述第一服务器根据所述目标车道和所述目的地,向所述终端设备发送第二导航路线之后,还包括:
    在所述第一服务器在第一时间段内接收到来自所述终端设备的位置信息时,所述第一服务器在所述第一时间段内持续依据所述第二导航路线为所述终端设备导航;
    在所述第一服务器在所述第一时间段之后接收到来自所述终端设备的位置信息时,所述第一服务器根据所述第一时间段之后接收的所述终端设备的位置信息为所述终端设备导航。
  14. 根据权利要求9-11任一项所述的方法,其特征在于,所述方法还包括:
    所述第一服务器为所述第一导航路线中指示的车道设置第一权重,以及根据环境信息为所述目标车道设置第二权重;其中,在所述环境信息表示所述环境不利于图像识别时,所述第一权重大于所述第二权重;在所述环境信息表示所述环境不影响图像识别时,所述第一权重小于所述第二权重;
    当所述目标车道与所述第一导航路线中指示的车道不同时,所述第一服务器根据所述目标车道和所述第一导航路线中指示的车道中权重大的车道,以及所述目的地,向所述终端设备发送第二导航路线。
  15. 根据权利要求9-11任一项所述的方法,其特征在于,所述方法还包括:
    当所述目标车道与所述第一导航路线中指示的车道不同,且所述目标车道与所述第一导航路线中指示的车道之间的距离大于距离阈值时,所述第一服务器持续依据所述第一导航路线为所述终端设备导航。
  16. 根据权利要求9-11任一项所述的方法,其特征在于,所述目标对象的标识为车牌号,所述终端设备为手机或车辆。
  17. 一种定位方法,其特征在于,所述方法包括:
    终端设备向第一服务器发送需要进行导航的目标对象的标识、起始位置和目的地;
    所述终端设备接收来自所述第一服务器的第一导航路线；所述第一导航路线与所述起始位置和所述目的地有关；
    在所述终端设备根据第一导航路线行驶的过程中,所述终端设备向所述第一服务器上报所述终端设备的位置信息;
    在所述位置信息反映所述终端设备即将驶入路口时,所述终端设备向所述第一服务器发送提示信息;所述提示信息用于提示所述终端设备即将驶入路口;其中,所述路口中设置有多个摄像头,所述多个摄像头用于拍摄路口中不同车道中的对象;
    所述终端设备接收来自所述第一服务器的用于指示所述目标车道的指示信息;
    所述终端设备根据所述指示信息提示用户处于所述目标车道;
    其中,所述目标车道是所述第一服务器根据第二关联关系确定的目标摄像头所在的目标车道;所述第二关联关系包括摄像头与车道的对应关系;
    所述目标摄像头是所述第一服务器基于所述多个摄像头拍摄所述路口的多个车道中的对象,得到多个第一关联关系,任一个所述第一关联关系包括图像以及拍摄所述图像的摄像头的标识;所述第一服务器在多个图像中识别到目标对象的标识时,所述第一服务器确定的包括所述目标对象的标识的目标图像对应的目标摄像头。
  18. 一种电子设备,包括存储器、处理器以及存储在所述存储器中并可在所述处理器上运行的计算机程序,其特征在于,所述处理器执行所述计算机程序时,使得所述电子设备执行如权利要求1至8任一项所述的方法,或如权利要求9至16任一项所述的方法,或如权利要求17所述的方法。
  19. 一种计算机可读存储介质,所述计算机可读存储介质存储有计算机程序,其特征在于,所述计算机程序被处理器执行时,使得计算机执行如权利要求1至8任一项所述的方法,或如权利要求9至16任一项所述的方法,或如权利要求17所述的方法。
  20. 一种计算机程序产品,其特征在于,包括计算机程序,当所述计算机程序被运行时,使得计算机执行如权利要求1至8任一项所述的方法,或如权利要求9至16任一项所述的方法,或如权利要求17所述的方法。
PCT/CN2022/075723 2021-03-30 2022-02-09 定位方法和装置 WO2022206179A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110342456.6 2021-03-30
CN202110342456.6A CN113269976B (zh) 2021-03-30 2021-03-30 定位方法和装置

Publications (1)

Publication Number Publication Date
WO2022206179A1 true WO2022206179A1 (zh) 2022-10-06

Family

ID=77228276

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/075723 WO2022206179A1 (zh) 2021-03-30 2022-02-09 定位方法和装置

Country Status (2)

Country Link
CN (1) CN113269976B (zh)
WO (1) WO2022206179A1 (zh)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113269976B (zh) * 2021-03-30 2022-08-23 荣耀终端有限公司 定位方法和装置
CN113660611B (zh) * 2021-08-18 2023-04-18 荣耀终端有限公司 定位方法和装置
CN114509068A (zh) * 2022-01-04 2022-05-17 海信集团控股股份有限公司 一种多层道路上的车辆的位置判断方法及装置

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006023278A (ja) * 2004-06-07 2006-01-26 Nissan Motor Co Ltd 車載用ナビゲーション装置とこれに用いる車線位置推定装置
CN104422462A (zh) * 2013-09-06 2015-03-18 上海博泰悦臻电子设备制造有限公司 一种车辆导航的方法、装置
CN104880193A (zh) * 2015-05-06 2015-09-02 石立公 一种车道级导航系统及其车道级导航方法
CN105588576A (zh) * 2015-12-15 2016-05-18 重庆云途交通科技有限公司 一种车道级导航方法及系统
CN107192396A (zh) * 2017-02-13 2017-09-22 问众智能信息科技(北京)有限公司 汽车精确导航方法和装置
CN108303103A (zh) * 2017-02-07 2018-07-20 腾讯科技(深圳)有限公司 目标车道的确定方法和装置
CN109141464A (zh) * 2018-09-30 2019-01-04 百度在线网络技术(北京)有限公司 导航变道提示方法和装置
CN110488825A (zh) * 2019-08-19 2019-11-22 中国第一汽车股份有限公司 一种自动驾驶的匝道口识别方法及车辆
CN110853360A (zh) * 2019-08-05 2020-02-28 中国第一汽车股份有限公司 一种车辆定位系统和方法
CN113269976A (zh) * 2021-03-30 2021-08-17 荣耀终端有限公司 定位方法和装置

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104821089A (zh) * 2015-05-18 2015-08-05 深圳市骄冠科技实业有限公司 一种基于具有通讯功能射频车牌的分车道车辆定位系统
CN110375764A (zh) * 2019-07-16 2019-10-25 中国第一汽车股份有限公司 变道提示方法、系统、车辆及存储介质

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006023278A (ja) * 2004-06-07 2006-01-26 Nissan Motor Co Ltd 車載用ナビゲーション装置とこれに用いる車線位置推定装置
CN104422462A (zh) * 2013-09-06 2015-03-18 上海博泰悦臻电子设备制造有限公司 一种车辆导航的方法、装置
CN104880193A (zh) * 2015-05-06 2015-09-02 石立公 一种车道级导航系统及其车道级导航方法
CN105588576A (zh) * 2015-12-15 2016-05-18 重庆云途交通科技有限公司 一种车道级导航方法及系统
CN108303103A (zh) * 2017-02-07 2018-07-20 腾讯科技(深圳)有限公司 目标车道的确定方法和装置
CN107192396A (zh) * 2017-02-13 2017-09-22 问众智能信息科技(北京)有限公司 汽车精确导航方法和装置
CN109141464A (zh) * 2018-09-30 2019-01-04 百度在线网络技术(北京)有限公司 导航变道提示方法和装置
CN110853360A (zh) * 2019-08-05 2020-02-28 中国第一汽车股份有限公司 一种车辆定位系统和方法
CN110488825A (zh) * 2019-08-19 2019-11-22 中国第一汽车股份有限公司 一种自动驾驶的匝道口识别方法及车辆
CN113269976A (zh) * 2021-03-30 2021-08-17 荣耀终端有限公司 定位方法和装置

Also Published As

Publication number Publication date
CN113269976A (zh) 2021-08-17
CN113269976B (zh) 2022-08-23

Similar Documents

Publication Publication Date Title
WO2022206179A1 (zh) 定位方法和装置
JP7050683B2 (ja) 三次元情報処理方法及び三次元情報処理装置
WO2021196052A1 (zh) 驾驶数据采集方法及装置
CN109817022B (zh) 一种获取目标对象位置的方法、终端、汽车及系统
CN109974734A (zh) 一种用于ar导航的事件上报方法、装置、终端及存储介质
CN110164135B (zh) 一种定位方法、定位装置及定位系统
CN107526090A (zh) 位置定位方法、设备及系统、计算机可读存储介质
EP2682925B1 (en) Vehicle monitoring method and system
CN102436737A (zh) 一种基于无线网络和照片的路况分享系统及方法
JP2001184593A (ja) 道路交通システム
CN101556738A (zh) 基于球机控制方式的违法停车取证系统
US11645913B2 (en) System and method for location data fusion and filtering
JP2023516502A (ja) 画像ベースの場所決定及び駐車モニタリングのためのシステム及び方法
JP2015210713A (ja) ドライブレコーダおよびこれを用いたクラウド型道路情報等運用システム
CN113077627B (zh) 检测车辆的超限源头的方法、装置及计算机存储介质
KR101070882B1 (ko) 리얼 맵 정보 제공 방법 및 그 시스템
KR100957605B1 (ko) 도로 영상 제공 시스템
US20230136925A1 (en) Mobile roadway sensing
CN115359671A (zh) 一种路口车辆协同控制方法及相关设备
CN114422936A (zh) 隧道交通管理方法、装置及存储介质
US20150052567A1 (en) Apparatus for requesting black box images over digital multimedia broadcasting network, and apparatus and method for searching black box images
US20240135718A1 (en) Method and system for gathering image training data for a machine learning model
JP7147791B2 (ja) タグ付与システム、キャッシュサーバ、およびキャッシュサーバの制御方法
JP5951188B2 (ja) 情報提供方法、情報提供プログラム、情報提供サーバ、記憶媒体
CN112365738B (zh) 一种智能社区网络停车系统

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22778366

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22778366

Country of ref document: EP

Kind code of ref document: A1