CN114184197A - Navigation voice broadcasting method, equipment, system and storage medium - Google Patents

Info

Publication number
CN114184197A
CN114184197A (application number CN202010969645.1A)
Authority
CN
China
Prior art keywords
time
broadcasting
navigation event
broadcast
navigation
Prior art date
Legal status
Pending
Application number
CN202010969645.1A
Other languages
Chinese (zh)
Inventor
邱堋星
赵瑞
闫青永
Current Assignee
Alibaba Group Holding Ltd
Original Assignee
Alibaba Group Holding Ltd
Priority date
Filing date
Publication date
Application filed by Alibaba Group Holding Ltd
Priority to CN202010969645.1A
Publication of CN114184197A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C 21/26 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C 21/28 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G01C 21/34 - Route searching; Route guidance
    • G01C 21/36 - Input/output arrangements for on-board computers
    • G01C 21/3626 - Details of the output of route guidance instructions
    • G01C 21/3629 - Guidance using speech or audio output, e.g. text-to-speech
    • G01C 21/3697 - Output of additional, non-guidance related information, e.g. low fuel level

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Navigation (AREA)

Abstract

The application provides a navigation voice broadcasting method, device, system and storage medium. In the navigation voice broadcasting method, the response duration required to handle a navigation event is used as a basis for determining the broadcast timing, and the position and speed of the navigated object are fully taken into account. This makes it possible to dynamically calculate the broadcast timing of the navigation event according to the actual driving condition of the navigated object, solves the problem of untimely broadcasting of navigation events, and improves safety during driving.

Description

Navigation voice broadcasting method, equipment, system and storage medium
Technical Field
The present application relates to the field of navigation technologies, and in particular, to a navigation voice broadcasting method, device, system, and storage medium.
Background
Map navigation applications play an important role in travel scenarios. For example, when a user drives a vehicle, a map navigation application can be used for navigation, and during navigation it can broadcast navigation events to the user to help the user drive safely. Such navigation events include road construction, road speed limits, road congestion and the like.
In the prior art, a map navigation application generally broadcasts voice information related to a navigation event when the distance from the user's real-time position to the position of the navigation event reaches a predetermined broadcast distance. However, because this predetermined distance is usually fixed, the inventors have found that not all users can effectively cope with the navigation event within that fixed distance. Existing navigation voice broadcasting technology therefore needs continuous optimization and improvement.
Disclosure of Invention
Aspects of the present application provide a navigation voice broadcasting method, device, system, and storage medium, so as to solve the problem that the existing navigation voice broadcasting method is not timely in broadcasting.
The embodiment of the present application further provides a navigation voice broadcasting method, which is applicable to a server, and includes: determining a first navigation event in a road network; calculating the response time required for processing the first navigation event according to the historical travel track data corresponding to the first navigation event; and sending the first navigation event and the response time length to terminal equipment corresponding to the navigated object, so that the terminal equipment broadcasts the first navigation event to the navigated object according to the response time length.
The embodiment of the application further provides a navigation voice broadcasting method, which includes: acquiring the real-time driving speed and the real-time positioning position of a navigated object; determining a broadcast time of a first navigation event based on the real-time driving speed and the real-time positioning position of the navigated object and the event information of the first navigation event received by the navigated object; wherein the event information includes: broadcasting contents and responding time; and when the running condition of the navigation object is determined to meet the broadcasting opportunity of the first navigation event, the broadcasting content is broadcasted in a voice mode.
The embodiment of the present application further provides a navigation voice broadcasting method, which is applicable to a server, and includes: determining a navigation event contained in a navigation path of a navigated object; acquiring event information of the navigation event, wherein the event information comprises broadcast contents and response time required for processing the navigation event; and sending the navigation event and the event information to terminal equipment corresponding to the navigated object so that the terminal equipment determines the broadcasting time of the navigation event according to the event information.
An embodiment of the present application further provides a server, including a memory, a processor, and a communication component; the memory is configured to store one or more computer instructions, and the processor is configured to execute the one or more computer instructions to perform the steps of the navigation voice broadcasting method provided by the embodiments of the present application.
An embodiment of the present application further provides a terminal device, including a memory, a processor, a communication component, and a display component; the memory is configured to store one or more computer instructions, and the processor is configured to execute the one or more computer instructions to perform the steps of the navigation voice broadcasting method provided by the embodiments of the present application.
An embodiment of the present application further provides a computer-readable storage medium storing a computer program which, when executed, implements the steps of the navigation voice broadcasting method provided by the embodiments of the present application.
An embodiment of the present application further provides a navigation voice broadcasting system, including a server and a terminal device of a navigated object. The server is configured to: determine a navigation event contained in the navigation path of the navigated object; acquire event information of the navigation event; and issue the navigation event and the event information to the terminal device. The terminal device is configured to: acquire the real-time driving speed and real-time position of the navigated object; determine the broadcast timing of the navigation event based on the real-time driving speed and real-time position of the navigated object and the received event information of the navigation event, the event information including the broadcast content and the response duration; and broadcast the broadcast content by voice when it is determined that the driving condition of the navigated object meets the broadcast timing of the navigation event.
In the navigation voice broadcasting system provided by the embodiment of the application, the server mines the response duration required by a navigation event from the historical travel track data corresponding to that event and sends the navigation event and its response duration to the terminal device of the navigated object, thereby providing the terminal device with an effective basis for determining the broadcast timing. During navigation, the terminal device calculates in real time the driving duration required for the navigated object to reach the position of the navigation event, based on the real-time position and real-time driving speed of the navigated object, and determines the broadcast timing of the navigation event from that driving duration and the response duration required to handle the event. In this broadcasting mode, the response duration required to handle the navigation event serves as the basis for determining the broadcast timing, the position and speed of the navigated object are fully taken into account, and the broadcast timing is dynamically calculated according to the actual driving condition of the navigated object, which solves the problem of untimely broadcasting and improves safety during driving.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1 is a schematic diagram of voice broadcasting of a navigation event based on a distance advance;
fig. 2 is a schematic structural diagram of a navigation voice broadcast system according to an exemplary embodiment of the present application;
FIG. 3 is a schematic flow chart illustrating a process of calculating a response duration corresponding to an intersection navigation event according to the present application;
fig. 4 is a schematic diagram of calculating a broadcast time corresponding to a navigation event according to the present application;
fig. 5 is a block diagram of a navigation voice broadcast system according to an embodiment of an application scenario of the present application;
fig. 6a is a schematic flow chart of a navigation voice broadcasting method according to an exemplary embodiment of the present application;
fig. 6b is a schematic flowchart of a navigation voice broadcasting method according to another exemplary embodiment of the present application;
fig. 7 is a schematic structural diagram of a terminal device according to an exemplary embodiment of the present application;
fig. 8 is a schematic structural diagram of a server according to another exemplary embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the technical solutions of the present application will be described in detail and completely with reference to the following specific embodiments of the present application and the accompanying drawings. It should be apparent that the described embodiments are only some of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
An existing navigation voice broadcasting mode is shown in fig. 1. This mode broadcasts navigation events based on a distance lead: during navigation, the distance between the navigation event position and the user's real-time position is dynamically calculated, and a voice prompt is given to the user a certain distance in advance. For example, in fig. 1 the navigation event corresponds to a preset broadcast distance (the distance lead). When the distance between the vehicle and the navigation event reaches the broadcast distance, that is, when the real-time position of the vehicle reaches the broadcast position shown in fig. 1, the navigation event is broadcast by voice.
The existing broadcast distance is generally configured as a fixed value. However, for the same navigation event, different users may be driving vehicles in different states, and broadcasting at a fixed distance cannot guarantee that every user can take effective countermeasures. For example, for a voice prompt about a 70 km/h speed limit, a vehicle traveling at 120 km/h and a vehicle traveling at 80 km/h inevitably need different distances to take effective safety measures. Moreover, different events require different countermeasures from the user. It is therefore necessary to improve navigation voice broadcasting technology to provide better navigation guidance, improve user experience, and ensure safe driving.
In view of the above technical problems, in some embodiments of the present application, a solution is provided, and the technical solutions provided by the embodiments of the present application are described in detail below with reference to the accompanying drawings.
Fig. 2 is a schematic structural diagram of a navigation voice broadcast system according to an exemplary embodiment of the present application, and as shown in fig. 2, the navigation voice broadcast system 200 includes: server 201 and terminal device 202.
The server 201 may be implemented as a server device such as a conventional server, a cloud host, a virtual center, and the like, which is not limited in this embodiment. The server device mainly includes a processor, a hard disk, a memory, a system bus, and the like, and is similar to a general computer architecture, and is not described in detail.
The terminal device 202 may be implemented as a mobile phone on the user side, a tablet computer, an intelligent wearable device, an in-vehicle terminal, or an intelligent sound box.
In some exemplary embodiments, the terminal device 202 and the server 201 may communicate wirelessly via a mobile network. The network standard of the mobile network may be any one of 2G (GSM), 2.5G (GPRS), 3G (WCDMA, TD-SCDMA, CDMA2000, UMTS), 4G (LTE), 4G+ (LTE+), 5G, WiMax, and so on.
The terminal device 202 runs a map navigation application, and the map navigation application includes a voice broadcast engine, and is used for providing navigation service to the navigated object and simultaneously broadcasting a navigation message to the navigated object based on an audio component on the terminal device. In the navigation voice broadcast system 200, one or more terminal apparatuses 202 may be included. Each terminal device 202 may be held by the object being navigated, for example, may be mounted on the vehicle being navigated, or may be held by the driver of the vehicle.
The navigated object may be any vehicle or pedestrian traveling in the road network, and the embodiment is not limited thereto.
The navigation voice broadcasting system 200 may be configured to voice-broadcast a plurality of navigation events on the navigation path of a navigated object as it travels in the road network. The navigation events may include, but are not limited to: speed-camera (commonly called "electronic eye") navigation events, intersection navigation events, speed-limit navigation events, expressway entrance/exit navigation events, road-facility navigation events, road merge-point navigation events, navigation events for densely populated areas such as campuses or villages, and the like.
In the navigation voice broadcast system 200, on the one hand the server 201 may analyze traffic data of the road network to mine navigation events that occur or are likely to occur in the road network, and on the other hand the server 201 may mine the response duration corresponding to a navigation event from historical travel track data. For example, a response duration may be mined for each type of navigation event, or a response duration may be mined separately for each individual navigation event.
The historical travel track data corresponding to a navigation event may include travel track data of a plurality of users handling that type of navigation event, or that particular navigation event, during a historical time period. Such historical tracks reflect the users' response behavior, response time and so on when dealing with the navigation event. For example, when the navigation event is an electronic-eye navigation event, the corresponding historical travel track data may include, for vehicles passing the electronic-eye position during the historical period: the vehicle speed, the time at which deceleration started, the magnitude of acceleration, the length of time spent passing the electronic-eye position, and so on. For another example, when the navigation event is an intersection-turning navigation event, the corresponding historical travel track data may include: the direction angle of vehicles passing through the intersection, the time at which turning started, the turning speed, the time taken to pass through the intersection, and so on during the historical period.
Wherein, the response duration refers to: the length of time required for a user to safely handle a navigation event while driving a vehicle includes the brain reaction time required to safely handle the navigation event and the time required for actual driving operations.
For example, when a user deals with a right turn navigation event on a certain road, it takes 20 seconds to drive the vehicle to a right turn lane and start deceleration until the vehicle safely passes through the intersection. In this example, 20 seconds is the response time duration for a right turn navigation event. For another example, when the user deals with a slow-down navigation event at a campus doorway, it takes 60 seconds to slow down the speed of the vehicle and to pass through the campus doorway slowly. In this example, 60 seconds is the response duration of the campus navigation event.
An optional implementation of the server 201 mining the response time length corresponding to the navigation event according to the historical travel track data will be described in the following embodiments, which are not described herein.
When the navigated object uses the map navigation application, a navigation starting position can be provided so that the server 201 or the terminal device 202 can plan a navigation path for the navigated object. The navigation path may include one or more navigation events. Any one of the navigation events in the navigation path will be exemplified below, and for convenience of description, the any one of the navigation events will be described as the first navigation event.
After determining a first navigation event included in the navigation path of the navigated object, the server 201 may obtain event information of the first navigation event; wherein the event information of the first navigation event comprises: the broadcast content corresponding to the first navigation event and the response duration corresponding to the first navigation event. When the response duration is obtained, the server 201 may query the response duration corresponding to the first navigation event from the response durations corresponding to the plurality of navigation events mined in advance. Next, the server 201 may issue the first navigation event and the corresponding event information to the terminal device 202 corresponding to the navigated object.
In some cases, the navigation path is planned by the server 201, and the server 201 issues the navigation path to the terminal device 202. When the server 201 issues the navigation path to the terminal device 202, it may issue each navigation event included in the navigation path and corresponding event information together. In other cases, in the process of navigating the navigated object based on the navigation path, the server 201 may dynamically issue the navigation event to the terminal device 202 according to the real-time driving state of the navigated object and the event occurring in real time on the navigation path, which is not limited in this embodiment.
In the navigation voice broadcast system 200, the terminal device 202 is mainly used for: acquiring the real-time driving speed and real-time position of the navigated object; determining the broadcast timing of the first navigation event based on the real-time driving speed and real-time position of the navigated object and the received event information of the first navigation event; and, when it is determined that the driving condition of the navigated object meets the broadcast timing of the first navigation event, broadcasting the broadcast content in the event information of the first navigation event by voice.
In this embodiment, when a navigation event is broadcast, the response duration required to handle the event is used as the basis for determining the broadcast timing, and the real-time driving speed and real-time position of the navigated object are fully taken into account. This allows the broadcast timing of the navigation event to be calculated dynamically according to the actual driving condition of the navigated object, solves the problem of untimely broadcasting, and improves safety during driving.
It should be noted that the terminal device 202 may dynamically calculate the broadcast time of each navigation event in real time during the navigation process, so as to avoid late broadcast or early broadcast. For example, the terminal device 202 may use 1 second or 3 seconds or other shorter time as a period for dynamically calculating the broadcast timing, and in each period, calculate the broadcast timing of the navigation event to be broadcast. For the method for calculating the broadcast time of the navigation event in each period, reference may be made to the above and following descriptions of the embodiments of the present application.
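As an illustration of this periodic recalculation on the terminal side, a minimal sketch is given below (Python; the function and field names, the 1-second period, and the event representation are assumptions for illustration, not the application's actual code).
```python
import time

RECALC_PERIOD_S = 1.0  # recompute broadcast timings every second (could also be 3 s or another short period)

def navigation_loop(get_position, get_speed, pending_events, broadcast):
    """Periodically recompute the broadcast timing of every not-yet-broadcast event.

    get_position/get_speed: callables returning the navigated object's position (meters
    along the route) and speed (m/s); pending_events: list of dicts with keys
    'position', 'content' and 'advance_duration' (the broadcast advance duration T2, in
    seconds); broadcast: callable that plays the voice content. All names are illustrative.
    """
    while pending_events:
        pos, speed = get_position(), get_speed()
        for event in list(pending_events):
            if speed <= 0:
                continue  # cannot estimate the travel duration while the object is stopped
            travel_duration = (event["position"] - pos) / speed        # T1
            remaining = travel_duration - event["advance_duration"]    # ΔT = T1 - T2
            if remaining <= 0:                                         # broadcast timing reached
                broadcast(event["content"])
                pending_events.remove(event)
        time.sleep(RECALC_PERIOD_S)
```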
It should be noted that, in the navigation voice broadcast system 200, the server 201 may update the navigation event and the response time thereof in the road network according to a specified period, so as to provide the latest navigation service to the navigated object. The specified period may be 6 hours, 12 hours, one day or one week, and the embodiment is not limited.
In some exemplary embodiments, the response duration that the server 201 calculates from the historical travel track data corresponding to the first navigation event may include a shortest response duration and a longest response duration required to process the first navigation event, where the longest response duration is greater than the shortest response duration.
The longest response duration refers to a time length from a time when the navigated object can start to perform a response action for the navigation event at the earliest time to a time when the navigated object completes processing the navigation event. The shortest response duration refers to a time length from a time when the navigated object can make a response action at the latest with respect to the navigation event to a time when the navigated object completes processing the navigation event.
In other exemplary embodiments, the response duration that the server 201 calculates from the historical travel track data corresponding to the first navigation event may include a shortest response duration, a longest response duration, and a safety response duration. The safety response duration is the time the navigated object needs to respond safely and unhurriedly until the navigation event has been handled; it is less than the longest response duration and greater than the shortest response duration. Taking a vehicle as the navigated object and referring to fig. 3, the following embodiments describe in detail an optional implementation in which the server 201 mines the longest response duration, the shortest response duration and the safety response duration from the historical travel track data.
Generally, when a user drives a vehicle to handle a navigation event, the user starts preparing once the vehicle is within a certain range of the event's position. For example, when turning, a driver usually starts decelerating and switches on the turn signal some distance in advance. For another example, when driving through a construction section, a driver usually starts decelerating some distance ahead and steers away from the construction section.
Based on the above, optionally, in some embodiments, the server 201 may calculate at least one of speed data, acceleration data, and direction angle of the vehicle within a set distance range of the location of the first navigation event according to the historical travel track data. For example, the set distance corresponding to the exit/entrance navigation event of the expressway may be 500 meters, and the set distance corresponding to the intersection navigation event may be 50 meters, 80 meters, or 100 meters, and the like.
Wherein, based on the speed data and the acceleration data, the time of vehicle deceleration and the time of vehicle acceleration can be calculated; the direction angle of the vehicle can be used for judging whether the vehicle turns, changes lanes and the like. At least one of the vehicle speed data, the acceleration data and the direction angle of the vehicle within the set distance range may be obtained based on a vehicle-mounted positioning device, and the embodiment is not limited.
Next, the server 201 may determine smooth coping data and emergency coping data corresponding to the first navigation event from the historical driving trace data according to at least one of the speed data, the acceleration data, and the direction angle.
The stable coping data refers to historical driving track data generated by coping with a navigation event in a relatively stable state when a user drives a vehicle; the emergency response data is history travel track data generated when a user responds to a navigation event in an emergency state while driving a vehicle. For example, historical travel track data generated when a user drives a vehicle to make a sharp turn at an intersection belongs to emergency response data corresponding to the intersection; and the other user generates historical driving track data when the intersection turns stably, and the historical driving track data belongs to stable coping data corresponding to the intersection.
Optionally, when distinguishing smooth coping data from emergency coping data based on the speed and acceleration data, the server 201 may calculate, from those data, the time at which the vehicle starts decelerating and the time at which it starts accelerating again. The moment the vehicle starts decelerating may be regarded as the moment it starts handling the navigation event, and the moment it starts accelerating may be regarded as the moment the navigation event is finished. If the time difference between these two moments is smaller than a set coping-time threshold, the coping operation of the vehicle can be regarded as an emergency coping operation, and the resulting track data belongs to the emergency coping data; conversely, if the time difference is greater than or equal to the set coping-time threshold, the coping operation is regarded as a smooth coping operation, and the resulting track data belongs to the smooth coping data.
Alternatively, when the smooth coping data and the emergency coping data are distinguished according to the acceleration data, the server 201 may determine whether a certain vehicle is decelerated urgently or smoothly according to the magnitude of the acceleration value of the vehicle. If the direction of the acceleration of the vehicle is opposite to the running direction of the vehicle and the value of the acceleration is larger than the set acceleration threshold value, the vehicle can be considered to be in emergency deceleration, at the moment, the corresponding operation of the vehicle at this time can be considered to be an emergency corresponding operation, and the generated track data belongs to emergency corresponding data; on the contrary, if the direction of the acceleration of the vehicle is opposite to the traveling direction of the vehicle and the value of the acceleration is smaller than or equal to the set acceleration threshold, the vehicle can be considered to be decelerating steadily, and at this time, the handling operation of the vehicle at this time can be considered to be a smooth handling operation, and the generated trajectory data belongs to smooth handling data. The acceleration is a vector, and includes a direction and a magnitude, and the value of the acceleration refers to the magnitude of the acceleration.
Alternatively, when the smooth coping data and the emergency coping data are distinguished according to the direction angle, the server 201 may determine whether a certain vehicle makes an emergency turn or makes a smooth turn according to the change speed of the direction angle of the vehicle.
Specifically, the server 201 may determine whether the historical travel track data of a vehicle is emergency coping data or smooth coping data according to the rate of change of the vehicle's direction angle. If the direction angle changes faster than the set change threshold, the vehicle is considered to be turning urgently; its coping operation is an emergency operation, and the generated trajectory data belongs to the emergency coping data. If the direction angle changes no faster than the set change threshold, the vehicle is considered to be turning smoothly; its coping operation is a smooth operation, and the generated trajectory data belongs to the smooth coping data.
It should be appreciated that the server 201 may distinguish emergency response data and smooth response data from historical travel track data corresponding to the first navigation event based on one or more of the above embodiments in combination.
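As a rough sketch of this classification step (Python; the thresholds, field names and the simplified heading handling are assumptions for illustration, the application does not specify concrete values), the three criteria above could be combined as follows.
```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class TrackPoint:
    t: float        # timestamp (s)
    speed: float    # m/s
    accel: float    # signed acceleration along the travel direction (m/s^2), negative = decelerating
    heading: float  # direction angle (degrees); wrap-around at 360 ignored in this sketch

# Illustrative thresholds only.
COPING_TIME_THRESHOLD_S = 8.0   # decel-to-accel gap below this counts as emergency coping
HARD_DECEL_THRESHOLD = 3.0      # m/s^2; braking stronger than this counts as emergency deceleration
HEADING_RATE_THRESHOLD = 15.0   # deg/s; turning faster than this counts as an emergency turn

def classify_trajectory(points: List[TrackPoint]) -> str:
    """Label one vehicle's track near a navigation event as 'emergency' or 'smooth' coping data."""
    decel_start: Optional[float] = None
    accel_start: Optional[float] = None
    for prev, cur in zip(points, points[1:]):
        dt = cur.t - prev.t
        if cur.accel < 0 and decel_start is None:
            decel_start = cur.t              # moment the vehicle starts handling the event
        if cur.accel > 0 and decel_start is not None and accel_start is None:
            accel_start = cur.t              # moment the handling of the event is finished
        if cur.accel < -HARD_DECEL_THRESHOLD:
            return "emergency"               # hard braking
        if dt > 0 and abs(cur.heading - prev.heading) / dt > HEADING_RATE_THRESHOLD:
            return "emergency"               # sharp turn
    if decel_start is not None and accel_start is not None:
        if accel_start - decel_start < COPING_TIME_THRESHOLD_S:
            return "emergency"               # coping window shorter than the coping-time threshold
    return "smooth"
```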
Next, the server 201 may calculate a safe response time period required to process the first navigation event according to the smooth coping data, and may calculate a shortest response time period required to process the first navigation event according to the emergency coping data.
Optionally, an alternative embodiment of calculating the safety response duration of the first navigation event according to the smooth handling data may include: determining a first moment when the vehicle starts to continuously decelerate within a set distance range of the position of the first navigation event and a second moment when the vehicle leaves the position of the first navigation event from the smooth coping data; and calculating to obtain the safety response time length required for processing the first navigation event according to the first time and the second time.
Optionally, an alternative embodiment of calculating the shortest response time for the first navigation event according to the emergency handling data may include: determining a third moment when the vehicle starts to continuously decelerate within a set distance range of the position of the first navigation event and a fourth moment when the vehicle leaves the position of the first navigation event from the emergency response data; and calculating to obtain the shortest response time required for processing the first navigation event according to the third time and the fourth time.
Taking the first navigation event as an intersection as an example, the server 201 may obtain the position information of the intersection, such as the longitude and latitude and the direction angle of the intersection, based on the road topology data. Next, historical travel track data within a set distance (e.g., 1km) from the intersection, including data on the vehicle speed, direction angle, time, etc., of the vehicle within 1km of the intersection, may be obtained.
After the server 201 acquires the historical travel track data, the historical travel track data may be further cleaned and filtered. For example, the interference data may be filtered by combining the historical traffic congestion state and congestion data, traffic event data, data of yaw occurring corresponding to the navigation event, and the like, which are not described herein again.
Then, the magnitude of the acceleration during deceleration and the rate of change of the direction angle are calculated from the vehicle speed data. Based on the vehicle speed, the acceleration during deceleration and the rate of change of the direction angle, steady (safe) steering data and emergency steering data are mined from the historical travel track data corresponding to the intersection.
From the steady steering data, the time t_i0 at which a vehicle passes the intersection position can be calculated, and the time t_i1 at which the vehicle begins to decelerate continuously can be found, giving the duration Δt_i = t_i0 - t_i1 required for the vehicle to turn safely. When the safety response duration Tsafe is calculated from the trajectory data of a plurality of vehicles, the following formula may be used:
Tsafe = (1/n) × Σ(i=1..n) (t_i0 - t_i1)    (Formula 1)
In Formula 1, i denotes the serial number of a vehicle, n denotes the total number of vehicles participating in the calculation, and n is a positive integer.
Accordingly, from the emergency steering data, the time t_i0 at which a vehicle passes the intersection position and the time t_i1 at which it begins to decelerate continuously can be determined, and substituting these into Formula 1 yields the shortest response duration Tmin, as shown in fig. 3.
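A minimal sketch of Formula 1 (Python; it assumes each trajectory already provides the pass-through time t_i0 and the continuous-deceleration start time t_i1, and the names are illustrative):
```python
from typing import List, Tuple

def average_response_duration(trajectories: List[Tuple[float, float]]) -> float:
    """Average of (t_i0 - t_i1) over the n participating vehicles, as in Formula 1.

    Each tuple is (t_i0, t_i1): the time the vehicle passes the event position and
    the time it starts to decelerate continuously.
    """
    if not trajectories:
        raise ValueError("n must be a positive integer: at least one trajectory is required")
    return sum(t_i0 - t_i1 for t_i0, t_i1 in trajectories) / len(trajectories)

# Applying the same formula to the two subsets of historical data:
# Tsafe = average_response_duration(steady_steering_data)     # safety response duration
# Tmin  = average_response_duration(emergency_steering_data)  # shortest response duration
```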
Further optionally, in some embodiments, the longest response time of the first navigation event may be further calculated according to a topology of the road network. Optionally, a distance between the first navigation event and a last navigation event of the first navigation event may be determined from the road network; calculating the average travel time from the position of the last navigation event to the position of the first navigation event according to the distance between the first navigation event and the last navigation event; based on the average travel time and the grade of the road on which the first navigation event is located, a maximum response time period required to process the first navigation event may be determined.
Taking intersection A as an example, the server 201 can determine from the road network the intersection B immediately preceding intersection A and, from the distance between them, calculate the average time t0 required for a vehicle to travel from intersection B to intersection A. Meanwhile, an empirical value t1 of the time required to travel from intersection B to intersection A can be determined according to the road grade. The server 201 may then take the minimum of the two, i.e. min(t0, t1), as the longest response duration Tmax corresponding to the intersection.
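For the longest response duration, a short sketch under the stated assumptions (the per-road-grade empirical values and the use of an average speed are illustrative):
```python
# Empirical travel-time values per road grade (seconds); purely illustrative numbers.
EMPIRICAL_TIME_BY_GRADE_S = {"expressway": 60.0, "arterial": 90.0, "local": 120.0}

def longest_response_duration(distance_to_prev_event_m: float,
                              avg_speed_mps: float,
                              road_grade: str) -> float:
    """Tmax = min(average travel time from the previous event, empirical value for the road grade)."""
    t0 = distance_to_prev_event_m / avg_speed_mps       # average travel time between the two events
    t1 = EMPIRICAL_TIME_BY_GRADE_S[road_grade]          # empirical value determined by the road grade
    return min(t0, t1)
```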
It should be understood that, when the server 201 calculates the above-mentioned safety response time, the shortest response time, and the longest response time, it may perform multiple calculations by combining historical travel track data of multiple vehicles when processing navigation events in different time periods, and may obtain an average value of multiple calculation results, which is not described again.
Based on the above embodiment, the server 201 may respectively calculate the response duration of each navigation event in the road network, and issue each navigation event and the corresponding response duration thereof to the terminal device 202. It should be understood that, when the navigation voice broadcasting system 200 includes a plurality of terminal devices, the server 201 may respectively issue each navigation event in the road network and the corresponding response duration to each terminal device; alternatively, the server 201 may also issue each navigation event in the road network and the corresponding response duration to the terminal device initiating the request when the terminal device initiates the navigation event acquisition request, which is not described in detail again.
In some exemplary embodiments, when determining the broadcast timing of the first navigation event based on the real-time running speed and the real-time positioning position of the navigated object and the event information of the first navigation event received by the navigated object, the terminal device 202 may first determine a running duration required by the navigated object to run to the location of the first navigation event according to the real-time running speed and the real-time positioning position of the navigated object.
When the server 201 issues the first navigation event, the location of the first navigation event may be issued together. Based on the position of the first navigation event and the real-time position of the navigated object, the real-time distance between the first navigation event and the navigated object can be calculated, thereby calculating the travel time required for the navigated object to travel to the first navigation event.
For example, when the navigated object is a vehicle, denote the real-time travel speed of the vehicle as V and its real-time position as D1, and assume the position of the first navigation event is D2; the travel duration required for the vehicle to reach the first navigation event is then T1 = (D2 - D1)/V.
In particular, when the driving duration is shorter than the shortest response duration of the first navigation event, the broadcast may be considered invalid; in that case, the first navigation event may not be voice-broadcast and the broadcast-timing calculation may be skipped, so as to reduce the computational load on the terminal device 202.
The broadcast may be considered valid when the travel duration is greater than or equal to the shortest response duration of the first navigation event. In that case, the terminal device 202 may determine the broadcast advance duration T2 of the first navigation event based on its response duration. The broadcast advance duration is the period of time reserved in advance for broadcasting the first navigation event. Next, the terminal device 202 may calculate the time difference between the travel duration and the broadcast advance duration as the remaining drivable time of the navigated object before the navigation event is broadcast, i.e. ΔT = T1 - T2. Based on this time difference, the terminal device 202 may determine the broadcast timing of the first navigation event.
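Putting the validity check and the time difference together, a hedged sketch (Python; the variable names follow the notation in the text, everything else is illustrative):
```python
from typing import Optional

def remaining_time_before_broadcast(speed_mps: float,
                                    position_m: float,
                                    event_position_m: float,
                                    shortest_response_s: float,
                                    broadcast_advance_s: float) -> Optional[float]:
    """Return ΔT = T1 - T2, or None when broadcasting would already be invalid.

    T1 is the travel duration to the event position; T2 is the broadcast advance
    duration chosen from the response duration (e.g. Tmax, see below).
    """
    if speed_mps <= 0:
        return None
    t1 = (event_position_m - position_m) / speed_mps   # T1 = (D2 - D1) / V
    if t1 < shortest_response_s:
        return None                                    # too late: skip broadcasting and further calculation
    return t1 - broadcast_advance_s                    # ΔT, remaining drivable time before the broadcast
```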
In some optional embodiments, the broadcast advance time length may be determined according to the response time length. Optionally, the broadcast advance time may be determined according to the longest response time Tmax. That is, the broadcast advance time period T2 is Tmax. Generally, the longest response time is longer, and the longest response time is used as the broadcast advance time, so that the navigated object can still have enough remaining time to respond to the navigation event after listening to the voice broadcast content.
In other optional embodiments, the broadcast advance time may be determined according to the response time and the broadcast time. And the broadcasting duration is obtained by calculation according to the broadcasting content and the broadcasting speed.
In some embodiments, the map navigation application may provide a plurality of broadcast modes to the user, each broadcast mode being available for broadcast by a different broadcaster. The navigated object can select the broadcast mode according to the preference. Based on the broadcast mode selected by the navigation object, the broadcast speed can be determined, and based on the broadcast speed and the broadcast content, the broadcast time length is calculated.
Optionally, when the server 201 issues the broadcast content for the navigation event, multiple broadcast texts corresponding to the broadcast content may be issued, and different broadcast texts may have different text lengths, so as to correspond to different broadcast durations.
The multiple broadcast texts may correspond to different broadcast trigger conditions, so that different content is broadcast to the user under different actual driving conditions of the navigated object.
For example, when the first navigation event is an electronic eye navigation event, the broadcast text corresponding to the electronic eye navigation event may include:
text 11: Speed camera ahead, please take note.
text 12: Speed camera ahead; you are speeding, please slow down.
Wherein, the broadcast trigger condition that above-mentioned text 11 corresponds is: the vehicle does not overspeed, and the broadcast triggering conditions corresponding to the text 12 are as follows: the vehicle has already overspeed.
For another example, when the first navigation event is an intersection navigation event, the broadcast text corresponding to the intersection navigation event may include:
text 21: The intersection ahead is congested; please slow down before turning right.
text 22: Turn right at the intersection ahead; the road is slippery in rainy weather, please slow down and yield to pedestrians.
text 23: After the right turn at the intersection ahead there is an accident about 300 m on, and the vehicle involved has not yet been moved; please be prepared to avoid it.
Wherein, the broadcast trigger condition that above-mentioned text 21 corresponds is: the traffic flow at the intersection is greater than a set value; the broadcast triggering condition corresponding to the text 22 is as follows: rainy weather; the broadcast trigger condition corresponding to the text 23 is as follows: traffic accidents occur after the intersection turns right.
Optionally, the terminal device 202 may select a target broadcast text from a plurality of broadcast texts corresponding to the broadcast content according to the actual driving condition data of the navigated object. Wherein the actual driving condition data may include: current driving state data of the navigated object and/or current driving environment data of the navigated object.
The current driving state data of the navigated object may include: the speed, acceleration, deceleration, backing, turning and other data of the navigated object. The current driving environment data of the navigated object may include: the grade of the road on which the vehicle runs, the traffic flow of the road, the congestion condition of the road, and the weather conditions such as rain, snow, wet and slippery ground, foggy days and the like.
Based on the above, optionally, when the terminal device 202 selects a target broadcast text from a plurality of broadcast texts corresponding to the broadcast content, the current driving state data and/or the current driving environment data of the navigated object may be acquired; and determining a target broadcast text matched with the current driving state data and/or the current driving environment data from the multiple broadcast texts to obtain broadcast contents.
For example, when the navigation event is an electronic eye navigation event, if the current vehicle speed of the navigated object exceeds the upper limit of the road speed limit, the text 12 may be selected as the target broadcast text. For another example, when the navigation event is an intersection navigation event, if the weather data at the current time indicates rainfall, the text 22 may be selected as the target broadcast text, which is not described again.
Optionally, after the target broadcast text is determined as in the above embodiments, the corresponding broadcast duration may be calculated from the target broadcast text and the broadcast speed, so as to ensure that the user still has enough time to handle the navigation event after the broadcast is completed.
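A sketch of the text selection and the broadcast-duration estimate (Python; the trigger predicates, the fallback rule and the characters-per-second rate are assumptions for illustration):
```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class BroadcastText:
    text: str
    trigger: Callable[[Dict], bool]   # predicate over the current driving state/environment data

def select_target_text(candidates: List[BroadcastText], condition: Dict) -> str:
    """Pick the first broadcast text whose trigger condition matches the actual driving condition."""
    for candidate in candidates:
        if candidate.trigger(condition):
            return candidate.text
    return candidates[-1].text         # fall back to the last (default) wording

def broadcast_duration_s(text: str, chars_per_second: float = 4.0) -> float:
    """Estimate the broadcast duration from the text length and the speech rate of the
    broadcast mode chosen by the user (4 characters per second is an illustrative rate)."""
    return len(text) / chars_per_second

# Example for an electronic-eye event: the speeding text is used only when the vehicle is speeding.
texts = [
    BroadcastText("Speed camera ahead; you are speeding, please slow down.",
                  trigger=lambda c: c.get("speed_kmh", 0) > c.get("speed_limit_kmh", 120)),
    BroadcastText("Speed camera ahead, please take note.",
                  trigger=lambda c: True),
]
chosen = select_target_text(texts, {"speed_kmh": 95, "speed_limit_kmh": 80})
t_broadcast = broadcast_duration_s(chosen)
```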
In some exemplary embodiments, the broadcast timing may be implemented as at least one of a broadcast time, a broadcast position, and a broadcast distance. Accordingly, the terminal device 202 determines the broadcast timing of the first navigation event from the time difference ΔT between the driving duration T1 and the broadcast advance duration T2 in at least one of the following ways:
Embodiment A1: the time difference ΔT is added to the real-time driving time of the navigated object to obtain the broadcast time of the first navigation event. That is, assuming the real-time travel time of the navigated object is T0, the broadcast time of the first navigation event is T0 + ΔT.
Embodiment B1: the distance the navigated object will travel within the time difference is predicted from ΔT and the real-time travel speed of the navigated object, and this distance is added to the real-time position of the navigated object to obtain the broadcast position of the first navigation event.
Assuming the real-time travel speed of the navigated object is V, the distance traveled within the time difference ΔT is ΔS = V × ΔT. With the real-time position of the navigated object being D1, the broadcast position of the first navigation event is D1 + ΔS.
Embodiment C1: the distance traveled by the navigated object within the time difference is predicted from ΔT and the real-time travel speed of the navigated object and taken as the distance corresponding to the broadcast timing of the first navigation event. As described above, the broadcast timing of the first navigation event then corresponds to the distance ΔS.
Based on the above, optionally, when it is determined that the driving condition of the navigated object satisfies at least one of these broadcast timings, the terminal device 202 may broadcast the broadcast content corresponding to the first navigation event by voice. Specifically, at least one of the following may be implemented:
Embodiment A2: when the real-time driving time T0' of the navigated object reaches the broadcast time corresponding to the broadcast timing, the broadcast content corresponding to the first navigation event is broadcast by voice. That is, when T0' = T0 + ΔT, the broadcast content corresponding to the first navigation event is broadcast by voice.
Embodiment B2: when it is determined, based on the real-time position D1' of the navigated object, that the navigated object has reached the broadcast position corresponding to the broadcast timing, the broadcast content corresponding to the first navigation event is broadcast by voice. That is, when D1' = D1 + ΔS, the broadcast content corresponding to the first navigation event is broadcast by voice.
Embodiment C2: when it is determined, based on the real-time position D1' of the navigated object, that the distance from the navigated object to the first navigation event has decreased to the distance corresponding to the broadcast timing, the broadcast content corresponding to the first navigation event is broadcast by voice. That is, when D2 - D1' = ΔS, the broadcast content corresponding to the first navigation event is broadcast by voice.
The foregoing embodiments describe different implementation forms of the response time, and it should be understood that when the implementation forms of the response time are different, the calculated broadcast time is also different. As will be exemplified below.
In some embodiments, the response duration comprises: the shortest response duration Tmin and the longest response duration Tmax.
Accordingly, when the sum of the response time length and the broadcast time length is taken as the broadcast advance time length, the terminal device 202 may take the sum of the shortest response time length Tmin and the broadcast time length T2 as the shortest broadcast advance time length. The terminal device 202 may use the sum of the longest response time Tmax and the broadcast time T2 as the longest broadcast advance time.
Based on the above, correspondingly, when the terminal device 202 determines the broadcast timing of the first navigation event, the latest broadcast timing of the first navigation event may be determined according to the time difference between the driving duration and the shortest broadcast advance duration. That is, ΔT = T1 - (Tmin + T2) is calculated, and embodiment A1, embodiment B1, and embodiment C1 described above are executed with this ΔT.
Meanwhile, the terminal device 202 may determine the earliest broadcast timing of the first navigation event according to the time difference between the driving duration and the longest broadcast advance duration. That is, ΔT = T1 - (Tmax + T2) is calculated, and embodiment A1, embodiment B1, and embodiment C1 described above are executed with this ΔT.
Optionally, after the earliest broadcast time and the latest broadcast time are obtained through calculation, the terminal device 202 may use the latest broadcast time as the broadcast time of the first navigation event; or taking the earliest broadcasting time as the broadcasting time of the first navigation event; or the earliest broadcasting time and the latest broadcasting time are both used as the broadcasting time of the first navigation event, so that the first navigation event is broadcasted for multiple times.
In other embodiments, the response time duration comprises: a shortest response time period Tmin, a longest response time period Tmax, and a safety response time period Tsafe.
Correspondingly, when the terminal device 202 takes the sum of the response duration and the broadcast duration as the broadcast advance duration, this may include:
taking the sum of the shortest response duration and the broadcast duration as the shortest broadcast advance duration; taking the sum of the longest response duration and the broadcast duration as the longest broadcast advance duration; and taking the sum of the safety response duration and the broadcast duration as the safety broadcast advance duration.
Accordingly, when determining the broadcast timing of the first navigation event, the terminal device 202 may determine the latest broadcast timing of the first navigation event from the time difference between the driving duration T1 and the shortest broadcast advance duration, that is, calculate ΔT = T1 - (Tmin + T2), and execute embodiment A1, embodiment B1, and embodiment C1 described above with the calculated ΔT.
Meanwhile, the terminal device 202 may determine the earliest broadcast timing of the first navigation event from the time difference between the driving duration T1 and the longest broadcast advance duration, that is, calculate ΔT = T1 - (Tmax + T2), and execute embodiment A1, embodiment B1, and embodiment C1 with the calculated ΔT.
Meanwhile, the terminal device 202 may determine the safety broadcast timing of the first navigation event from the time difference between the driving duration T1 and the safety broadcast advance duration, that is, calculate ΔT = T1 - (Tsafe + T2), and execute embodiment A1, embodiment B1, and embodiment C1 with the calculated ΔT.
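Combining the three response durations with the broadcast duration, the three candidate timings could be computed as follows (a sketch; all names are illustrative):
```python
def candidate_broadcast_times(t1: float, t_min: float, t_max: float, t_safe: float,
                              t_broadcast: float, now: float):
    """Return (earliest, safe, latest) broadcast times for one navigation event.

    t1: travel duration to the event position; t_min / t_max / t_safe: shortest,
    longest and safety response durations; t_broadcast: broadcast duration T2.
    """
    latest = now + (t1 - (t_min + t_broadcast))    # ΔT with the shortest broadcast advance duration
    earliest = now + (t1 - (t_max + t_broadcast))  # ΔT with the longest broadcast advance duration
    safe = now + (t1 - (t_safe + t_broadcast))     # ΔT with the safety broadcast advance duration
    return earliest, safe, latest

# The safe value matches the expression t0 + (t1 - t2 - t5) discussed with fig. 4 below,
# where t2 is the safety response duration and t5 the broadcast duration.
```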
In some embodiments, for the first navigation event, if the safety broadcast timing of the first navigation event does not conflict with the broadcast timings of other navigation events received by the navigation object, the terminal device 202 determines the safety broadcast timing corresponding to the first navigation event as the broadcast timing of the first navigation event. Namely, when the first navigation event is not conflicted with other navigation events, the first navigation event is broadcasted according to the safe broadcasting opportunity.
The following further describes an optional way of determining the voice broadcast time with reference to fig. 4. In fig. 4, t0 is the current time, t1 is the driving duration required for the vehicle to reach the event, t2 is the safety response duration, t3 is the longest response duration, t4 is the shortest response duration, and t5 is the broadcast duration. The broadcast time corresponding to the safety response duration is then t0 + (t1 - t2 - t5).
In other embodiments, if the safety broadcast timing of the first navigation event conflicts with the broadcast timings of other navigation events received by the navigated object, the terminal device 202 may determine the broadcast timing of the first navigation event according to the earliest broadcast timing and/or the latest broadcast timing corresponding to the first navigation event. That is, when the first navigation event conflicts with other navigation events, another broadcast timing is selected for the first navigation event according to the earliest broadcast timing and/or the latest broadcast timing, as exemplified below.
Optionally, the terminal device 202 may determine the safety broadcast ending timing of the first navigation event according to the safety broadcast timing and the broadcast duration of the first navigation event. If the period between the safety broadcast timing and the safety broadcast ending timing of the first navigation event overlaps the period between the broadcast starting timing and the broadcast ending timing of a second navigation event received by the navigated object, it can be determined that the first navigation event and the second navigation event conflict in broadcasting. In this case, the terminal device 202 may determine any target timing between the earliest broadcast timing and the latest broadcast timing as the broadcast timing of the first navigation event, such that the period formed by the target timing and the broadcast duration does not overlap the broadcast period of the second navigation event.
Based on the above, the terminal device 202 can effectively adjust conflicting navigation events based on the earliest broadcast timing and the latest broadcast timing. For example, in some embodiments, when the broadcast timing is implemented as a broadcast time period, the terminal device 202 may determine the broadcast time period of the first navigation event according to the safety broadcast timing and the broadcast duration of the first navigation event; if the broadcast time period of the first navigation event overlaps the broadcast time period of the second navigation event, the broadcast time period of the first navigation event may be adjusted between the earliest broadcast timing and the latest broadcast timing of the first navigation event, so that the adjusted broadcast time period does not overlap the broadcast time period of the second navigation event.
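A minimal sketch of the overlap check and adjustment described above, assuming broadcast periods are half-open intervals in seconds; the function names and the strategy of trying the two boundary slots are illustrative assumptions, since the text allows any non-overlapping target timing within the window.

```python
def overlaps(start_a, end_a, start_b, end_b):
    """True if the half-open broadcast periods [start_a, end_a) and [start_b, end_b) intersect."""
    return start_a < end_b and start_b < end_a


def resolve_conflict(safe_start, duration, earliest, latest, other_start, other_end):
    """Pick a broadcast start time for the first navigation event.

    safe_start:              safety broadcast timing of the first event
    duration:                broadcast duration of the first event
    earliest / latest:       earliest / latest broadcast timings of the first event
    other_start / other_end: broadcast period of the second navigation event
    Returns a start time, or None if no start inside [earliest, latest] avoids the overlap.
    """
    if not overlaps(safe_start, safe_start + duration, other_start, other_end):
        return safe_start                      # no conflict: keep the safety timing
    before = other_start - duration            # finish exactly when the other broadcast starts
    if earliest <= before <= latest:
        return before
    after = other_end                          # start right after the other broadcast ends
    if earliest <= after <= latest:
        return after
    return None


# Safety period [44, 48) collides with another broadcast at [46, 50); allowed window is [36, 51].
print(resolve_conflict(44, 4, earliest=36, latest=51, other_start=46, other_end=50))  # 42
```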
It is also worth noting that in some alternative embodiments, the terminal device 202 may further adjust the broadcast timing of the adjacent navigation events to avoid the formation of high-density continuous broadcast.
Optionally, if the difference between the safety broadcast timing of the first navigation event and the broadcast ending timing of the second navigation event is smaller than a first threshold, the terminal device 202 may determine a first target timing between the safety broadcast timing and the latest broadcast timing as the broadcast timing of the first navigation event, where the interval between the first target timing and the broadcast ending timing of the second navigation event is greater than or equal to the first threshold. For example, when the time difference between the broadcast timing of the first navigation event and the broadcast ending timing of the previous navigation event is smaller than a set time threshold, the broadcast timing of the first navigation event may be delayed appropriately, with the delayed broadcast timing no later than the latest broadcast timing of the first navigation event.
Optionally, if the difference between the safety broadcast ending timing of the first navigation event and the broadcast timing of the second navigation event is smaller than a second threshold, the terminal device 202 may determine a second target timing between the safety broadcast timing and the earliest broadcast timing as the broadcast timing of the first navigation event, where the difference between the broadcast ending timing formed by the second target timing and the broadcast duration and the broadcast timing of the second navigation event is greater than or equal to the second threshold. For example, when the time difference between the broadcast ending timing of the first navigation event and the broadcast timing of the next navigation event is smaller than a set time threshold, the broadcast timing of the first navigation event may be advanced appropriately, with the advanced broadcast timing no earlier than the earliest broadcast timing of the first navigation event.
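The spacing rules in the two preceding paragraphs can be sketched as follows. The 3-second defaults stand in for the first and second thresholds, which the text leaves unspecified, and the function name is an assumption.

```python
def space_out(safe_start, duration, earliest, latest,
              prev_end=None, next_start=None,
              min_gap_after_prev=3.0, min_gap_before_next=3.0):
    """Shift the broadcast timing away from adjacent broadcasts.

    prev_end:   broadcast ending timing of the preceding navigation event (or None)
    next_start: broadcast timing of the following navigation event (or None)
    """
    start = safe_start
    # Too close after the previous broadcast: delay, but never past the latest timing.
    if prev_end is not None and start - prev_end < min_gap_after_prev:
        start = min(prev_end + min_gap_after_prev, latest)
    # Too close before the next broadcast: advance, but never before the earliest timing.
    if next_start is not None and next_start - (start + duration) < min_gap_before_next:
        start = max(next_start - min_gap_before_next - duration, earliest)
    return start
```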
Fig. 5 is a structural block diagram of a navigation voice broadcast system according to an application-scenario embodiment of the present application. In some application scenarios, the server described in the foregoing embodiments may be implemented as a cloud device, for example, a cloud server. The cloud side comprises a big data analysis and mining engine and a broadcast event production engine. The user side comprises a data acquisition engine, a broadcast sending engine, and a dynamic calculation engine.
The data acquisition engine of the user side is mainly used for acquiring position data, direction angle data, and the like while the user drives the vehicle, where the position data may be implemented as longitude and latitude information. The vehicle data collected by the data acquisition engine may be sent to the cloud, and the cloud may mine the vehicle's handling habits for different events based on the big data analysis and mining engine, for example, mining the response durations required for processing different navigation events as described in the foregoing embodiments. Based on the mining results, the map data, and the configured broadcast scripts, the cloud may produce different broadcast events through the broadcast event production engine and deliver them to the user side.
After the user side receives the broadcast events sent by the cloud, it may dynamically calculate the broadcast timing corresponding to each broadcast event based on the dynamic calculation engine, and perform the voice broadcast via the broadcast sending engine when the broadcast timing arrives.
In this way, the big data mining capability of the cloud and the dynamic calculation capability of the user side can be fully utilized to provide the user with timely and accurate navigation voice broadcast services.
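One way to picture the interface between the broadcast event production engine and the user side is the payload each broadcast event might carry. The field names below are illustrative assumptions and not the actual message format of the system.

```python
from dataclasses import dataclass


@dataclass
class BroadcastEvent:
    """Illustrative payload delivered from the cloud to the user side for one navigation event."""
    event_id: str
    latitude: float              # position of the navigation event
    longitude: float
    broadcast_content: str       # text to be read aloud on the user side
    safe_response_s: float       # mined safety response duration, in seconds
    shortest_response_s: float   # mined shortest response duration
    longest_response_s: float    # mined longest response duration
```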
In addition to the navigation voice broadcast system described in the foregoing embodiment, the embodiment of the present application further provides a navigation voice broadcast method, which will be exemplarily described below.
Fig. 6a is a schematic flowchart of a navigation voice broadcasting method according to an exemplary embodiment of the present application, where the method, when executed on the server side, may include the steps shown in fig. 6a:
Step 501, determining a navigation event contained in a navigation path of a navigated object.
Step 502, obtaining event information of the navigation event, wherein the event information comprises broadcast content and response time length required for processing the navigation event; and the response duration is calculated according to the historical travel track data corresponding to the navigation event.
Step 503, the navigation event and the event information are sent to the terminal device corresponding to the navigated object, so that the terminal device determines the broadcast time of the navigation event according to the event information.
In some exemplary embodiments, the method further comprises: acquiring topological data of a road network or historical driving track data of the navigation event; calculating response time required for processing the navigation event according to topological data of a road network or the historical driving track data; the response duration includes: at least one of a safe response duration, a shortest response duration, and a longest response duration.
In some exemplary embodiments, a manner of calculating a response time period required to process the navigation event based on the historical travel track data includes: calculating at least one of speed data, acceleration data and direction angle of the vehicle within a set distance range of the position of the navigation event according to the historical driving track data; determining smooth coping data corresponding to the navigation event from the historical driving track data according to at least one of the speed data, the acceleration data and the direction angle; determining a first moment when the vehicle starts to continuously decelerate within a set distance range of the position of the navigation event and a second moment when the vehicle leaves the position of the navigation event from the smooth coping data; and calculating to obtain the safety response time length required for processing the navigation event according to the first time and the second time.
In some exemplary embodiments, the method further comprises: according to at least one of the speed data, the acceleration data and the direction angle, determining emergency response data corresponding to the navigation event from the historical driving track data; determining a third moment when the vehicle starts to continuously decelerate within a set distance range of the position of the navigation event and a fourth moment when the vehicle leaves the position of the navigation event from the emergency response data; and calculating to obtain the shortest response time required for processing the navigation event according to the third time and the fourth time.
In some exemplary embodiments, one way to calculate the response time period required to process the navigation event based on the topology data of the road network may include: determining a distance between the navigation event and a last navigation event of the navigation event from topology data of the road network; calculating the average running time from the position of the last navigation event to the position of the navigation event according to the distance between the navigation event and the last navigation event; and determining the longest response time required for processing the navigation event according to the average running time and the grade of the road where the navigation event is located.
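To make the mining steps above more concrete, the sketch below estimates one response duration from a single, simplified trajectory (the time from the start of continuous deceleration near the event to the moment the vehicle passes it) and a longest response duration from road-network topology. The trace record format, the 200 m window, and the road-grade factor are assumptions for illustration; in practice the safety and shortest response durations would be aggregated over the smooth and emergency subsets of historical traces, respectively.

```python
def response_duration(trace, radius_m=200.0):
    """Estimate one response duration from a single trajectory.

    trace: list of (timestamp_s, signed_dist_m, speed_mps) samples ordered by time,
    where signed_dist_m is the along-route distance to the event (negative once passed).
    Returns the time from the start of continuous deceleration inside radius_m to the
    moment the vehicle passes the event, or None if no such pattern exists.
    """
    decel_start = None
    for (_t0, _d0, v0), (t1, d1, v1) in zip(trace, trace[1:]):
        if 0 < d1 <= radius_m and v1 < v0:
            decel_start = t1 if decel_start is None else decel_start  # deceleration begins
        elif d1 > 0 and v1 >= v0:
            decel_start = None        # deceleration interrupted before reaching the event
        if d1 <= 0:                   # vehicle has left (passed) the event position
            return (t1 - decel_start) if decel_start is not None else None
    return None


def longest_response_from_topology(dist_to_prev_event_m, avg_speed_mps, road_grade_factor=1.0):
    """Upper bound on the response duration from road-network topology: average travel time
    from the previous navigation event, scaled by an assumed road-grade adjustment factor."""
    return road_grade_factor * dist_to_prev_event_m / avg_speed_mps


trace = [(0, 250, 20), (5, 150, 20), (10, 60, 14), (13, 20, 8), (15, -5, 8)]
print(response_duration(trace))                       # 5: decelerating from t=10, passes at t=15
print(longest_response_from_topology(1000, 10, 1.2))  # 120.0 seconds
```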
In this embodiment, based on the historical driving track data of the road network, the response duration required by vehicles in the road network to process a navigation event can be mined; then, when broadcasting the navigation message, the response duration required by the navigation event is used as the basis for determining the broadcast timing, which alleviates the problem of untimely broadcasting of navigation events and improves safety during driving.
Fig. 6b is a schematic flowchart of a navigation voice broadcasting method according to another exemplary embodiment of the present application, where the method, when executed on the terminal device side, may include the steps shown in fig. 6b:
Step 601, acquiring the real-time driving speed and the real-time positioning position of the navigated object.
Step 602, determining a broadcast time of a first navigation event based on the real-time driving speed and the real-time positioning position of the navigated object and the event information of the first navigation event received by the navigated object; wherein the event information includes: broadcasting content and responding time.
Step 603, when it is determined that the driving condition of the navigated object satisfies the broadcast timing of the first navigation event, broadcasting the broadcast content by voice.
In some exemplary embodiments, when it is determined that the driving condition of the navigated object satisfies the broadcast timing of the first navigation event, one way of broadcasting the broadcast content by voice may include: broadcasting the broadcast content by voice when the real-time driving time of the navigated object is determined to be the broadcast time corresponding to the broadcast timing; or broadcasting the broadcast content by voice when, based on the real-time positioning position of the navigated object, the navigated object is determined to have reached the broadcast position corresponding to the broadcast timing; or broadcasting the broadcast content by voice when, based on the real-time positioning position of the navigated object, the distance from the navigated object to the first navigation event is determined to have reached the distance corresponding to the broadcast timing.
In some exemplary embodiments, a manner of determining a broadcast timing of a first navigation event based on a real-time driving speed, a real-time positioning location of the navigated object, and event information of the first navigation event received by the navigated object may include: determining the driving time length required by the navigated object to drive to the position of the first navigation event according to the real-time driving speed and the real-time positioning position of the navigated object; determining a broadcast advance time length of the first navigation event at least based on the response time length; and determining the broadcasting time of the first navigation event according to the time difference between the running time and the broadcasting advance time.
In some exemplary embodiments, the method further comprises: determining a broadcast speed according to the broadcast mode set by the navigated object; and calculating the broadcast duration corresponding to the first navigation event according to the broadcast speed and the broadcast content. A manner of determining the broadcast advance duration of the first navigation event at least based on the response duration then specifically comprises: taking the sum of the response duration and the broadcast duration as the broadcast advance duration.
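A minimal sketch of the broadcast-duration estimate just described, assuming the broadcast mode maps to a characters-per-second speech rate; the 4 chars/s default is an assumed value, not one given in the text.

```python
def broadcast_duration_s(broadcast_content: str, chars_per_second: float = 4.0) -> float:
    """Rough broadcast duration: length of the content divided by the speech rate
    of the selected broadcast mode."""
    return len(broadcast_content) / chars_per_second


# A 36-character prompt at 4 chars/s takes about 9 seconds to read aloud.
print(broadcast_duration_s("Sharp curve ahead, please slow down."))  # 9.0
```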
In some exemplary embodiments, the response duration includes a shortest response duration and a longest response duration, and one way of taking the sum of the response duration and the broadcast duration as the broadcast advance duration includes: taking the sum of the shortest response duration and the broadcast duration as the shortest broadcast advance duration; and taking the sum of the longest response duration and the broadcast duration as the longest broadcast advance duration. Determining the broadcast timing of the first navigation event according to the time difference between the driving duration and the broadcast advance duration then comprises: determining the latest broadcast timing of the first navigation event according to the time difference between the driving duration and the shortest broadcast advance duration; determining the earliest broadcast timing of the first navigation event according to the time difference between the driving duration and the longest response duration; and taking the latest broadcast timing as the broadcast timing of the first navigation event, and/or taking the earliest broadcast timing as the broadcast timing of the first navigation event.
In some exemplary embodiments, the response duration includes a shortest response duration, a longest response duration, and a safety response duration, where the safety response duration is shorter than the longest response duration and longer than the shortest response duration. Taking the sum of the response duration and the broadcast duration as the broadcast advance duration comprises: taking the sum of the shortest response duration and the broadcast duration as the shortest broadcast advance duration; taking the sum of the longest response duration and the broadcast duration as the longest broadcast advance duration; and taking the sum of the safety response duration and the broadcast duration as the safety broadcast advance duration. Determining the broadcast timing of the first navigation event according to the time difference between the driving duration and the broadcast advance duration comprises: determining the latest broadcast timing of the first navigation event according to the time difference between the driving duration and the shortest broadcast advance duration; determining the earliest broadcast timing of the first navigation event according to the time difference between the driving duration and the longest response duration; determining the safety broadcast timing of the first navigation event according to the time difference between the driving duration and the safety broadcast advance duration; determining the safety broadcast timing as the broadcast timing of the first navigation event if the safety broadcast timing of the first navigation event does not conflict with the broadcast timings of other navigation events received by the navigated object; and determining the broadcast timing of the first navigation event according to the earliest broadcast timing and/or the latest broadcast timing if the safety broadcast timing of the first navigation event conflicts with the broadcast timings of the other navigation events.
In some exemplary embodiments, if the safety broadcast timing of the first navigation event conflicts with the broadcast timings of the other navigation events, a manner of determining the broadcast timing of the first navigation event according to the earliest broadcast timing and/or the latest broadcast timing may include: determining the safety broadcast ending timing of the first navigation event according to the safety broadcast timing and the broadcast duration of the first navigation event; and if the period between the safety broadcast timing and the safety broadcast ending timing of the first navigation event overlaps the period between the broadcast starting timing and the broadcast ending timing of the second navigation event received by the navigated object, determining any target timing between the earliest broadcast timing and the latest broadcast timing as the broadcast timing of the first navigation event, where the period from the target timing to the new broadcast ending timing formed by the target timing and the broadcast duration does not overlap the broadcast period of the second navigation event.
In some exemplary embodiments, the method further comprises: if the difference between the safe broadcasting opportunity of the first navigation event and the broadcasting finishing opportunity of the second navigation event is smaller than a first threshold value, determining a first target opportunity between the safe broadcasting opportunity and the latest broadcasting opportunity as the broadcasting opportunity of the first navigation event; and the interval between the first target opportunity and the broadcasting finishing opportunity of the second navigation event is greater than or equal to the first threshold.
In some exemplary embodiments, the method further comprises: if the difference between the safe broadcasting finishing opportunity of the first navigation event and the broadcasting opportunity of the second navigation event is smaller than a second threshold value, determining a second target opportunity between the safe broadcasting opportunity and the earliest broadcasting opportunity as the broadcasting opportunity of the first navigation event; and the difference between the broadcast ending time formed by the second target time and the broadcast duration and the broadcast time of the second navigation event is greater than or equal to the second threshold value.
In some exemplary embodiments, a manner of determining the broadcast timing of the first navigation event according to the time difference between the driving duration and the broadcast advance duration includes: adding the time difference to the real-time driving time of the navigated object to obtain the broadcast time of the first navigation event; and/or predicting the distance the navigated object will travel within the time difference according to the time difference and the real-time driving speed of the navigated object, and adding that distance to the real-time positioning position of the navigated object to obtain the broadcast position of the first navigation event; and/or predicting the distance the navigated object will travel within the time difference according to the time difference and the real-time driving speed of the navigated object, and using that distance as the distance corresponding to the broadcast timing.
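The conversion from the time difference to the three trigger forms can be sketched as below; the function name and units are illustrative assumptions.

```python
def broadcast_triggers(now_s, speed_mps, delta_t_s):
    """Derive the trigger quantities described above from the time difference delta_t_s.

    Returns:
      broadcast_time_s:  real-time driving moment plus the time difference
      travel_in_delta_m: distance the navigated object is predicted to travel within the
                         time difference; added to the real-time positioning position it
                         gives the broadcast position, and it is also used as the distance
                         associated with the broadcast timing
    """
    broadcast_time_s = now_s + delta_t_s
    travel_in_delta_m = speed_mps * delta_t_s
    return broadcast_time_s, travel_in_delta_m


# At t = 100 s, driving at 15 m/s with delta T = 44 s: broadcast at t = 144 s,
# i.e. about 660 m ahead of the current position.
print(broadcast_triggers(now_s=100, speed_mps=15, delta_t_s=44))  # (144, 660)
```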
In this embodiment, when broadcasting the navigation message, the response duration required for processing the navigation event is used as the basis for determining the broadcast timing, and the position and speed of the vehicle are fully considered, so that the broadcast timing of the navigation event can be dynamically calculated according to the actual driving condition of the vehicle, which alleviates the problem of untimely broadcasting of navigation events and improves safety during driving.
It should be noted that the execution subjects of the steps of the methods provided in the above embodiments may be the same device, or different devices may serve as the execution subjects of the methods. For example, the execution subjects of steps 501 to 503 may be device A; for another example, the execution subjects of steps 501 and 502 may be device A, and the execution subject of step 503 may be device B; and so on.
In addition, in some of the flows described in the above embodiments and the drawings, a plurality of operations are included in a specific order, but it should be clearly understood that the operations may be executed out of the order presented herein or in parallel, and the sequence numbers of the operations, such as 501, 502, etc., are merely used for distinguishing different operations, and the sequence numbers themselves do not represent any execution order. Additionally, the flows may include more or fewer operations, and the operations may be performed sequentially or in parallel.
It should be further noted that, the descriptions of "first", "second", etc. in this document are used for distinguishing different messages, devices, modules, etc., and do not represent a sequential order, nor limit the types of "first" and "second" to be different.
Fig. 7 is a schematic structural diagram of a terminal device according to an exemplary embodiment of the present application. As shown in fig. 7, the terminal device includes: a memory 701, a processor 702, a communication component 703, and an audio component 704.
The memory 701 is configured to store a computer program, and may be configured to store various other data to support operations on the terminal device. Examples of such data include instructions for any application or method operating on the terminal device, contact data, phonebook data, messages, pictures, videos, and the like.
The processor 702, coupled to the memory 701, is configured to execute the computer program in the memory 701 to: acquire the real-time driving speed and the real-time positioning position of a navigated object; determine the broadcast timing of a first navigation event based on the real-time driving speed and the real-time positioning position of the navigated object and the event information of the first navigation event received by the navigated object, wherein the event information includes broadcast content and a response duration; and broadcast the broadcast content by voice when it is determined that the driving condition of the navigated object satisfies the broadcast timing of the first navigation event.
In some exemplary embodiments, when it is determined that the driving condition of the navigated object satisfies the broadcast timing of the first navigation event, the processor 702 is specifically configured to: broadcast the broadcast content by voice when the real-time driving time of the navigated object is determined to be the broadcast time corresponding to the broadcast timing; or broadcast the broadcast content by voice when, based on the real-time positioning position of the navigated object, the navigated object is determined to have reached the broadcast position corresponding to the broadcast timing; or broadcast the broadcast content by voice when, based on the real-time positioning position of the navigated object, the distance from the navigated object to the first navigation event is determined to have reached the distance corresponding to the broadcast timing.
In some exemplary embodiments, the processor 702, when determining the broadcast timing of the first navigation event based on the real-time traveling speed and the real-time positioning position of the navigated object and the event information of the first navigation event received by the navigated object, is specifically configured to: determining the driving time length required by the navigated object to drive to the position of the first navigation event according to the real-time driving speed and the real-time positioning position of the navigated object; determining a broadcast advance time length of the first navigation event at least based on the response time length; and determining the broadcasting time of the first navigation event according to the time difference between the running time and the broadcasting advance time.
In some exemplary embodiments, the processor 702 is further configured to: determine a broadcast speed according to the broadcast mode set by the navigated object; and calculate the broadcast duration corresponding to the first navigation event according to the broadcast speed and the broadcast content. When determining the broadcast advance duration of the first navigation event at least based on the response duration, the processor 702 is specifically configured to: take the sum of the response duration and the broadcast duration as the broadcast advance duration.
In some exemplary embodiments, the response duration includes: a shortest response duration and a longest response duration. When taking the sum of the response duration and the broadcast duration as the broadcast advance duration, the processor 702 is specifically configured to: take the sum of the shortest response duration and the broadcast duration as the shortest broadcast advance duration; and take the sum of the longest response duration and the broadcast duration as the longest broadcast advance duration. When determining the broadcast timing of the first navigation event according to the time difference between the driving duration and the broadcast advance duration, the processor 702 is specifically configured to: determine the latest broadcast timing of the first navigation event according to the time difference between the driving duration and the shortest broadcast advance duration; determine the earliest broadcast timing of the first navigation event according to the time difference between the driving duration and the longest response duration; and take the latest broadcast timing as the broadcast timing of the first navigation event, and/or take the earliest broadcast timing as the broadcast timing of the first navigation event.
In some exemplary embodiments, the response duration includes: a shortest response duration, a longest response duration, and a safety response duration, where the safety response duration is shorter than the longest response duration and longer than the shortest response duration. When taking the sum of the response duration and the broadcast duration as the broadcast advance duration, the processor 702 is specifically configured to: take the sum of the shortest response duration and the broadcast duration as the shortest broadcast advance duration; take the sum of the longest response duration and the broadcast duration as the longest broadcast advance duration; and take the sum of the safety response duration and the broadcast duration as the safety broadcast advance duration. When determining the broadcast timing of the first navigation event according to the time difference between the driving duration and the broadcast advance duration, the processor 702 is specifically configured to: determine the latest broadcast timing of the first navigation event according to the time difference between the driving duration and the shortest broadcast advance duration; determine the earliest broadcast timing of the first navigation event according to the time difference between the driving duration and the longest response duration; determine the safety broadcast timing of the first navigation event according to the time difference between the driving duration and the safety broadcast advance duration; determine the safety broadcast timing as the broadcast timing of the first navigation event if the safety broadcast timing of the first navigation event does not conflict with the broadcast timings of other navigation events received by the navigated object; and determine the broadcast timing of the first navigation event according to the earliest broadcast timing and/or the latest broadcast timing if the safety broadcast timing of the first navigation event conflicts with the broadcast timings of the other navigation events.
In some exemplary embodiments, if the safety broadcast timing of the first navigation event conflicts with the broadcast timings of the other navigation events, the processor 702 is specifically configured to, when determining the broadcast timing of the first navigation event according to the earliest broadcast timing and/or the latest broadcast timing: determine the safety broadcast ending timing of the first navigation event according to the safety broadcast timing and the broadcast duration of the first navigation event; and if the period between the safety broadcast timing and the safety broadcast ending timing of the first navigation event overlaps the period between the broadcast starting timing and the broadcast ending timing of the second navigation event received by the navigated object, determine any target timing between the earliest broadcast timing and the latest broadcast timing as the broadcast timing of the first navigation event, where the period from the target timing to the new broadcast ending timing formed by the target timing and the broadcast duration does not overlap the broadcast period of the second navigation event.
In some exemplary embodiments, the processor 702 is further configured to: if the difference between the safe broadcasting opportunity of the first navigation event and the broadcasting finishing opportunity of the second navigation event is smaller than a first threshold value, determining a first target opportunity between the safe broadcasting opportunity and the latest broadcasting opportunity as the broadcasting opportunity of the first navigation event; and the interval between the first target opportunity and the broadcasting finishing opportunity of the second navigation event is greater than or equal to the first threshold.
In some exemplary embodiments, the processor 702 is further configured to: if the difference between the safe broadcasting finishing opportunity of the first navigation event and the broadcasting opportunity of the second navigation event is smaller than a second threshold value, determining a second target opportunity between the safe broadcasting opportunity and the earliest broadcasting opportunity as the broadcasting opportunity of the first navigation event; and the difference between the broadcast ending time formed by the second target time and the broadcast duration and the broadcast time of the second navigation event is greater than or equal to the second threshold value.
In some exemplary embodiments, when determining the broadcast timing of the first navigation event according to the time difference between the driving duration and the broadcast advance duration, the processor 702 is specifically configured to: add the time difference to the real-time driving time of the navigated object to obtain the broadcast time of the first navigation event; and/or predict the distance the navigated object will travel within the time difference according to the time difference and the real-time driving speed of the navigated object, and add that distance to the real-time positioning position of the navigated object to obtain the broadcast position of the first navigation event; and/or predict the distance the navigated object will travel within the time difference according to the time difference and the real-time driving speed of the navigated object, and use that distance as the distance corresponding to the broadcast timing.
Further, as shown in fig. 7, the terminal device further includes: a display component 705, a power supply component 706, and the like. Only some of the components are schematically shown in fig. 7, which does not mean that the terminal device includes only the components shown in fig. 7.
The audio component 704 may be configured to output and/or input audio signals. For example, the audio component includes a microphone (MIC) configured to receive an external audio signal when the device in which the audio component is located is in an operation mode, such as a call mode, a recording mode, or a voice recognition mode. The received audio signal may further be stored in the memory or transmitted via the communication component. In some embodiments, the audio component further comprises a speaker for outputting audio signals, including the voice broadcast message of a navigation event.
The display component 705 includes a screen, which may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, it may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touches, slides, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation.
In this embodiment, when broadcasting the navigation message, the response duration required for processing the navigation event is used as the basis for determining the broadcast timing, and the position and speed of the vehicle are fully considered, which facilitates dynamically calculating the broadcast timing of the navigation event according to the actual driving condition of the vehicle, alleviates the problem of untimely broadcasting of navigation events, and improves safety during driving.
Accordingly, the present application further provides a computer-readable storage medium storing a computer program, where the computer program can implement the steps that can be executed by the terminal device in the foregoing method embodiments when executed.
Fig. 8 is a schematic structural diagram of a server according to an exemplary embodiment of the present application. As shown in fig. 8, the server includes: a memory 801, a processor 802, and a communication component 803.
The memory 801 is configured to store a computer program, and may be configured to store various other data to support operations on the server. Examples of such data include instructions for any application or method operating on the server, contact data, phonebook data, messages, pictures, videos, and so forth.
A processor 802, coupled to the memory 801, for executing computer programs in the memory 801 for: determining a navigation event contained in a navigation path of a navigated object; acquiring event information of the navigation event, wherein the event information comprises broadcast contents and response time required for processing the navigation event; the response duration is obtained by calculation according to historical driving track data corresponding to the navigation event; and sending the navigation event and the event information to terminal equipment corresponding to the navigated object so that the terminal equipment determines the broadcasting time of the navigation event according to the event information.
Further optionally, the processor 802 is further configured to: acquiring topological data of a road network or historical driving track data of the navigation event; calculating response time required for processing the navigation event according to topological data of a road network or the historical driving track data; the response duration includes: at least one of a safe response duration, a shortest response duration, and a longest response duration.
Further optionally, when the processor 802 calculates the response time required for processing the navigation event according to the historical driving track data, it is specifically configured to: calculating at least one of speed data, acceleration data and direction angle of the vehicle within a set distance range of the position of the navigation event according to the historical driving track data; determining smooth coping data corresponding to the navigation event from the historical driving track data according to at least one of the speed data, the acceleration data and the direction angle; determining a first moment when the vehicle starts to continuously decelerate within a set distance range of the position of the navigation event and a second moment when the vehicle leaves the position of the navigation event from the smooth coping data; and calculating to obtain the safety response time length required for processing the navigation event according to the first time and the second time.
Further optionally, the processor 802 is further configured to: according to at least one of the speed data, the acceleration data and the direction angle, determining emergency response data corresponding to the navigation event from the historical driving track data; determining a third moment when the vehicle starts to continuously decelerate within a set distance range of the position of the navigation event and a fourth moment when the vehicle leaves the position of the navigation event from the emergency response data; and calculating to obtain the shortest response time required for processing the navigation event according to the third time and the fourth time.
Further optionally, when the processor 802 calculates the response time required for processing the navigation event according to the topology data of the road network, it is specifically configured to: determining a distance between the navigation event and a last navigation event of the navigation event from topology data of the road network; calculating the average running time from the position of the last navigation event to the position of the navigation event according to the distance between the navigation event and the last navigation event; and determining the longest response time required for processing the navigation event according to the average running time and the grade of the road where the navigation event is located.
Further, as shown in fig. 8, the server further includes: a power supply component 804, and the like. Only some of the components are schematically shown in fig. 8, which does not mean that the server includes only the components shown in fig. 8.
In this embodiment, based on the historical driving track data of the road network, the response duration required by vehicles in the road network to process a navigation event can be mined; then, when broadcasting the navigation message, the response duration required by the navigation event is used as the basis for determining the broadcast timing, which alleviates the problem of untimely broadcasting of navigation events and improves safety during driving.
Accordingly, the present application further provides a computer-readable storage medium storing a computer program, where the computer program can implement the steps that can be executed by the server in the foregoing method embodiments when executed.
The memories of fig. 7 and 8 may be implemented by any type or combination of volatile or non-volatile memory devices, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk.
The communication components of fig. 7 and 8 described above are configured to facilitate communication between the device in which the communication component is located and other devices in a wired or wireless manner. The device in which the communication component is located may access a wireless network based on a communication standard, such as WiFi, 2G, 3G, 4G, or 5G, or a combination thereof. In an exemplary embodiment, the communication component receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component may be implemented based on Near Field Communication (NFC) technology, Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
The power supply components of fig. 7 and 8 described above provide power to the various components of the device in which the power supply components are located. The power components may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the device in which the power component is located.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include forms of volatile memory in a computer readable medium, Random Access Memory (RAM) and/or non-volatile memory, such as Read Only Memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase change memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information that can be accessed by a computing device. As defined herein, computer readable media do not include transitory computer readable media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The above description is only an example of the present application and is not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (19)

1. A navigation voice broadcasting method comprises the following steps:
acquiring the real-time driving speed and the real-time positioning position of a navigated object;
determining a broadcast time of a first navigation event based on the real-time driving speed and the real-time positioning position of the navigated object and the event information of the first navigation event received by the navigated object; wherein the event information includes: broadcasting contents and responding time;
and when the driving condition of the navigated object is determined to meet the broadcasting opportunity of the first navigation event, broadcasting the broadcasting content by voice.
2. The method according to claim 1, wherein, when it is determined that the driving condition of the navigated object meets the broadcasting opportunity of the first navigation event, broadcasting the broadcasting content by voice comprises:
when the real-time driving time of the navigated object is determined to be the broadcasting time corresponding to the broadcasting opportunity, broadcasting the broadcast content by voice; or,
when the navigated object is determined to reach a broadcast position corresponding to the broadcast opportunity based on the real-time positioning position of the navigated object, broadcasting the broadcast content by voice; or,
based on the real-time positioning position of the navigated object, when the distance from the navigated object to the first navigation event reaches the distance corresponding to the broadcast opportunity, broadcasting the broadcast content by voice.
3. The method according to claim 1, wherein determining the broadcasting timing of the first navigation event based on the real-time driving speed, the real-time positioning position of the navigated object and the event information of the first navigation event received by the navigated object comprises:
determining the driving time length required by the navigated object to drive to the position of the first navigation event according to the real-time driving speed and the real-time positioning position of the navigated object;
determining a broadcast advance time length of the first navigation event at least based on the response time length;
and determining the broadcasting time of the first navigation event according to the time difference between the running time and the broadcasting advance time.
4. The method of claim 3, the method further comprising:
determining a broadcast speed according to a broadcast mode set by the navigated object;
calculating broadcast duration corresponding to the first navigation event according to the broadcast speed and the broadcast content;
determining the broadcast advance time of the first navigation event at least based on the response time, specifically comprising:
and taking the sum of the response time length and the broadcast time length as the broadcast advance time length.
5. The method of claim 4, wherein the response duration comprises: a shortest response time and a longest response time; and taking the sum of the response time and the broadcast time as the broadcast advance time comprises:
taking the sum of the shortest response time and the broadcast time as the shortest broadcast advance time;
taking the sum of the longest response time and the broadcast time as the longest broadcast advance time;
determining the broadcasting opportunity of the first navigation event according to the time difference between the running time and the broadcasting advance time, and the method comprises the following steps:
determining the latest broadcasting time of the first navigation event according to the time difference between the running time and the shortest broadcasting advance time;
determining the earliest broadcast time of the first navigation event according to the time difference between the running time and the longest response time;
and taking the latest broadcasting opportunity as the broadcasting opportunity of the first navigation event, and/or taking the earliest broadcasting opportunity as the broadcasting opportunity of the first navigation event.
6. The method of claim 4, wherein the response duration comprises: a shortest response duration, a longest response duration, and a safety response duration, the safety response duration being less than the longest response duration and greater than the shortest response duration; and taking the sum of the response duration and the broadcast duration as the broadcast advance duration comprises:
taking the sum of the shortest response time and the broadcast time as the shortest broadcast advance time;
taking the sum of the longest response time and the broadcast time as the longest broadcast advance time;
taking the sum of the safety response time length and the broadcasting time length as a safety broadcasting advance time length;
determining the broadcasting opportunity of the first navigation event according to the time difference between the running time and the broadcasting advance time, and the method comprises the following steps:
determining the latest broadcasting time of the first navigation event according to the time difference between the running time and the shortest broadcasting advance time;
determining the earliest broadcast time of the first navigation event according to the time difference between the running time and the longest response time;
determining the safety broadcasting time of the first navigation event according to the time difference between the running time and the safety broadcasting advance time;
if the safe broadcasting time of the first navigation event does not conflict with the broadcasting time of other navigation events received by the navigated object, determining the safe broadcasting time as the broadcasting time of the first navigation event;
and if the safe broadcasting time of the first navigation event conflicts with the broadcasting times of the other navigation events, determining the broadcasting time of the first navigation event according to the earliest broadcasting time and/or the latest broadcasting time.
7. The method according to claim 6, wherein if the safety broadcasting timing of the first navigation event conflicts with the broadcasting timings of the other navigation events, determining the broadcasting timing of the first navigation event according to the earliest broadcasting timing and/or the latest broadcasting timing, includes:
determining the safe broadcasting finishing time of the first navigation event according to the safe broadcasting time and the broadcasting duration of the first navigation event;
if the period between the safe broadcasting time and the safe broadcasting ending time of the first navigation event overlaps the period between the broadcasting starting time and the broadcasting ending time of the second navigation event received by the navigated object, determining any target time between the earliest broadcasting time and the latest broadcasting time as the broadcasting time of the first navigation event; wherein the period from the target time to the new broadcasting ending time formed by the target time and the broadcasting duration does not overlap the broadcasting period of the second navigation event.
8. The method of claim 7, further comprising:
if the difference between the safe broadcasting opportunity of the first navigation event and the broadcasting finishing opportunity of the second navigation event is smaller than a first threshold value, determining a first target opportunity between the safe broadcasting opportunity and the latest broadcasting opportunity as the broadcasting opportunity of the first navigation event; and the interval between the first target opportunity and the broadcasting finishing opportunity of the second navigation event is greater than or equal to the first threshold.
9. The method of claim 7, further comprising:
if the difference between the safe broadcasting finishing opportunity of the first navigation event and the broadcasting opportunity of the second navigation event is smaller than a second threshold value, determining a second target opportunity between the safe broadcasting opportunity and the earliest broadcasting opportunity as the broadcasting opportunity of the first navigation event; and the difference between the broadcast ending time formed by the second target time and the broadcast duration and the broadcast time of the second navigation event is greater than or equal to the second threshold value.
10. The method according to any one of claims 3 to 9, wherein determining the broadcast timing of the first navigation event according to the time difference between the driving time length and the broadcast advance time length comprises:
adding the time difference to the real-time driving time of the navigated object to obtain the broadcast time of the first navigation event; and/or,
predicting the travel distance of the navigated object within the time difference according to the time difference and the real-time travel speed of the navigated object, and adding the distance to the real-time positioning position of the navigated object to obtain the broadcast position of the first navigation event; and/or,
predicting the distance traveled by the navigated object within the time difference according to the time difference and the real-time traveling speed of the navigated object, and using the distance traveled by the navigated object within the time difference as the distance corresponding to the broadcasting opportunity.
11. A navigation voice broadcasting method, applicable to a server, comprising:
determining a navigation event contained in a navigation path of a navigated object;
acquiring event information of the navigation event, wherein the event information comprises broadcast content and a response duration required for handling the navigation event, the response duration being calculated from historical driving track data corresponding to the navigation event;
and sending the navigation event and the event information to a terminal device corresponding to the navigated object, so that the terminal device determines the broadcasting timing of the navigation event according to the event information.
12. The method of claim 11, further comprising:
acquiring topology data of a road network or historical driving track data of the navigation event;
calculating the response duration required for handling the navigation event according to the topology data of the road network or the historical driving track data, wherein the response duration comprises at least one of a safe response duration, a shortest response duration, and a longest response duration.
13. The method of claim 12, wherein calculating the response duration required for handling the navigation event based on the historical driving track data comprises:
calculating, according to the historical driving track data, at least one of speed data, acceleration data, and a direction angle of the vehicle within a set distance range of the position of the navigation event;
determining smooth-response data corresponding to the navigation event from the historical driving track data according to at least one of the speed data, the acceleration data, and the direction angle;
determining, from the smooth-response data, a first moment at which the vehicle starts to decelerate continuously within the set distance range of the position of the navigation event and a second moment at which the vehicle leaves the position of the navigation event;
and calculating the safe response duration required for handling the navigation event according to the first moment and the second moment.
14. The method of claim 13, further comprising:
determining, from the historical driving track data, emergency-response data corresponding to the navigation event according to at least one of the speed data, the acceleration data, and the direction angle;
determining, from the emergency-response data, a third moment at which the vehicle starts to decelerate continuously within the set distance range of the position of the navigation event and a fourth moment at which the vehicle leaves the position of the navigation event;
and calculating the shortest response duration required for handling the navigation event according to the third moment and the fourth moment.
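Claims 13 and 14 mine historical tracks near the event position for two behaviour classes and measure how long drivers needed to handle the event. The sketch below is a simplified reading of those steps: the data layout, the hard-braking threshold used to split smooth-response from emergency-response tracks, the use of the first negative-acceleration sample in place of a true "continuous deceleration" test, and the averaging across tracks are all assumptions.

```python
# Simplified sketch of claims 13-14 (the track layout, the -3 m/s^2 hard-brake
# threshold, and averaging across tracks are assumptions, not from the patent).

def response_duration(track):
    """track: time-ordered list of (t_seconds, accel_mps2, passed_event_bool).

    Span from the first decelerating sample (stand-in for the 'starts to
    decelerate continuously' moment) to the moment the vehicle leaves the
    event position, or None if either moment is missing.
    """
    decel_start = next((t for t, a, _ in track if a < 0), None)
    leave_time = next((t for t, _, passed in track if passed), None)
    if decel_start is None or leave_time is None or leave_time <= decel_start:
        return None
    return leave_time - decel_start

def safe_and_shortest(tracks, hard_brake=-3.0):
    """Split tracks into smooth-response and emergency-response groups by peak
    deceleration, then average each group's response duration."""
    smooth, emergency = [], []
    for track in tracks:
        peak_decel = min(a for _, a, _ in track)
        (emergency if peak_decel < hard_brake else smooth).append(track)

    def mean_duration(group):
        durations = [d for d in (response_duration(t) for t in group) if d is not None]
        return sum(durations) / len(durations) if durations else None

    return mean_duration(smooth), mean_duration(emergency)  # (safe, shortest)
```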
15. The method of any one of claims 12-14, wherein calculating the response duration required for handling the navigation event based on the topology data of the road network comprises:
determining, from the topology data of the road network, the distance between the navigation event and the preceding navigation event;
calculating the average driving duration from the position of the preceding navigation event to the position of the navigation event according to that distance;
and determining the longest response duration required for handling the navigation event according to the average driving duration and the grade of the road on which the navigation event is located.
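Claim 15 derives an upper bound from the road network alone: the distance from the preceding navigation event, an average driving duration over that distance, and an adjustment for the road grade. The per-grade speed table and stretch factors in the sketch below are invented placeholders; the claim only states that the average driving duration and the road grade are used.

```python
# Rough sketch of claim 15 (the average-speed table and road-grade factors are
# invented placeholders; the patent only states that the road grade is used).

AVG_SPEED_MPS = {"highway": 25.0, "arterial": 12.0, "local": 8.0}   # assumed
GRADE_FACTOR = {"highway": 1.5, "arterial": 1.2, "local": 1.0}      # assumed

def longest_response_duration(distance_m, road_grade):
    """Average driving duration from the preceding event to this event,
    stretched by a per-grade factor to give an upper bound on the response."""
    avg_duration_s = distance_m / AVG_SPEED_MPS[road_grade]
    return avg_duration_s * GRADE_FACTOR[road_grade]

# Example: two events 600 m apart on an arterial road -> 50 s * 1.2 = 60 s.
print(longest_response_duration(600.0, "arterial"))
```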
16. A server, comprising: a memory, a processor, and a communication component;
the memory is configured to store one or more computer instructions;
the processor is configured to execute the one or more computer instructions to perform the steps of the method of any one of claims 11-15.
17. A terminal device, comprising: a memory, a processor, a communication component, and a display component;
the memory is configured to store one or more computer instructions;
the processor is configured to execute the one or more computer instructions to perform the steps of the method of any one of claims 2-10.
18. A computer-readable storage medium storing a computer program, wherein the computer program, when executed, performs the steps of the method of any one of claims 2-10 or the steps of the method of any one of claims 11-15.
19. A navigation voice broadcasting system, comprising:
a server and a terminal device of a navigated object;
wherein the server is configured to: determine a navigation event contained in a navigation path of the navigated object; acquire event information of the navigation event; and send the navigation event and the event information to the terminal device;
the terminal device is configured to: acquire the real-time driving speed and the real-time position of the navigated object; determine the broadcasting timing of the navigation event based on the real-time driving speed and the real-time position of the navigated object and the received event information of the navigation event, wherein the event information comprises the broadcast content and the response duration; and, when it determines that the driving state of the navigated object meets the broadcasting timing of the navigation event, broadcast the broadcast content by voice.
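Taken together, claim 19 describes a server that packages navigation events with broadcast content and response durations, and a terminal that keeps comparing its driving state against the trigger it derives from them. The sketch below illustrates the terminal-side check; the class, its fields, and the trigger rule (remaining driving time no greater than response duration plus speech duration) are assumptions, and the voice broadcast is reduced to a print call.

```python
# Illustrative end-to-end check for claim 19 (class name, fields, and the
# trigger rule are assumptions; TTS output is reduced to a print call).

from dataclasses import dataclass

@dataclass
class NavigationEvent:
    position_m: float        # distance of the event along the route
    content: str             # broadcast content
    response_s: float        # response duration needed to handle the event
    speech_s: float          # time needed to speak the content

def should_broadcast(event, position_m, speed_mps):
    """Terminal-side check: start speaking once the remaining driving time
    no longer exceeds the advance time (response + speech duration)."""
    remaining_m = event.position_m - position_m
    if remaining_m <= 0:
        return False                     # event already passed
    if speed_mps <= 0:
        return False                     # wait until the vehicle is moving
    remaining_s = remaining_m / speed_mps
    return remaining_s <= event.response_s + event.speech_s

# Example: 200 m to the event at 15 m/s leaves ~13.3 s, which is within the
# 15 s advance window, so the content is broadcast now.
event = NavigationEvent(position_m=800.0, content="Sharp turn ahead",
                        response_s=12.0, speech_s=3.0)
if should_broadcast(event, position_m=600.0, speed_mps=15.0):
    print(event.content)                 # stand-in for the voice broadcast
```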
CN202010969645.1A 2020-09-15 2020-09-15 Navigation voice broadcasting method, equipment, system and storage medium Pending CN114184197A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010969645.1A CN114184197A (en) 2020-09-15 2020-09-15 Navigation voice broadcasting method, equipment, system and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010969645.1A CN114184197A (en) 2020-09-15 2020-09-15 Navigation voice broadcasting method, equipment, system and storage medium

Publications (1)

Publication Number Publication Date
CN114184197A (en) 2022-03-15

Family

ID=80601259

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010969645.1A Pending CN114184197A (en) 2020-09-15 2020-09-15 Navigation voice broadcasting method, equipment, system and storage medium

Country Status (1)

Country Link
CN (1) CN114184197A (en)

Patent Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW200625210A (en) * 2005-01-07 2006-07-16 Mitac Int Corp Voice navigation apparatus and method
CN101046384A (en) * 2007-04-27 2007-10-03 江苏新科数字技术有限公司 Phonetic prompt method of navigation instrument
CN101339044A (en) * 2008-08-12 2009-01-07 凯立德欣技术(深圳)有限公司 Navigation system and navigation system voice prompt method
CN101419077A (en) * 2008-11-19 2009-04-29 凯立德欣技术(深圳)有限公司 Speech sound broadcasting method and speed sound broadcasting device using the method and navigation system
CN101876551A (en) * 2010-04-09 2010-11-03 深圳市凯立德计算机系统技术有限公司 Navigation method and navigation device of multiple voice broadcasting modes
CN102384751A (en) * 2010-09-01 2012-03-21 北京四维图新科技股份有限公司 Method for realizing voice navigation of navigation terminal and navigation terminal
CN103776460A (en) * 2014-01-27 2014-05-07 上海安吉星信息服务有限公司 Voice broadcasting method of navigation system
CN106289288A (en) * 2015-05-29 2017-01-04 深圳市腾讯计算机系统有限公司 A kind of navigation information broadcasting method and device
CN106156303A (en) * 2016-06-30 2016-11-23 百度在线网络技术(北京)有限公司 Report processing method and processing device
CN107525517A (en) * 2016-10-09 2017-12-29 腾讯科技(深圳)有限公司 voice broadcast method and device
CN107919025A (en) * 2017-11-14 2018-04-17 江西爱驰亿维实业有限公司 Vehicle-mounted voice road conditions broadcasting system, method, equipment and storage medium
CN110602642A (en) * 2018-06-13 2019-12-20 北京嘀嘀无限科技发展有限公司 Navigation broadcasting method and system based on cloud service
CN109029479A (en) * 2018-06-25 2018-12-18 北京小米移动软件有限公司 Navigation reminders method and device, electronic equipment, computer readable storage medium
CN108775905A (en) * 2018-07-02 2018-11-09 华南理工大学 A kind of electronic navigation voice broadcast method adapting to road equipment
CN108966135A (en) * 2018-07-27 2018-12-07 武汉理工大学 A kind of underground intercommunication secondary navigation system based on iBeacon technology
CN111220172A (en) * 2018-11-23 2020-06-02 北京嘀嘀无限科技发展有限公司 Navigation voice broadcasting method and system
CN110579219A (en) * 2019-09-09 2019-12-17 腾讯大地通途(北京)科技有限公司 Track data processing method and device, storage medium and computer equipment

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114973740A (en) * 2022-06-06 2022-08-30 北京百度网讯科技有限公司 Method and device for determining voice broadcast time and electronic equipment
CN114973740B (en) * 2022-06-06 2023-09-12 北京百度网讯科技有限公司 Method and device for determining voice broadcasting time and electronic equipment

Similar Documents

Publication Publication Date Title
EP3528078B1 (en) Vehicle platoon system control for intersections
US10071745B2 (en) Automated drive assisting system, automated drive assisting method, and computer program
US10315664B2 (en) Automatic driving assistance system, automatic driving assistance method, and computer program
JP6553917B2 (en) Automatic driving support system, automatic driving support method and computer program
JP6474307B2 (en) Automatic driving support system, automatic driving support method, and computer program
JP6252235B2 (en) Automatic driving support system, automatic driving support method, and computer program
US20210049903A1 (en) Method and apparatus for perception-sharing between vehicles
US20210276594A1 (en) Vehicle trajectory prediction near or at traffic signal
TW201800291A (en) Message pushing method, apparatus and device
US20200231178A1 (en) Vehicle control system, vehicle control method, and program
JP2017033403A (en) Driving support apparatus
JP2017156954A (en) Automated driving system
CN114283619A (en) Vehicle obstacle avoidance system, platform framework, method and vehicle based on V2X
JP5772730B2 (en) Driver assistance device
JP2016207063A (en) Automatic driving support system, automatic driving support method, and computer program
CN114184197A (en) Navigation voice broadcasting method, equipment, system and storage medium
JP6880586B2 (en) Information provision method and information provision device
JP2019053394A (en) Automatic driving support device and computer program
CN112907945B (en) Road state determination and navigation route planning method and equipment
CN113963535B (en) Driving decision determination method and device and electronic equipment storage medium
CN115631644A (en) Method and device for controlling vehicle passing, electronic equipment and computer storage medium
EP3889944A1 (en) A vehicle, fleet management and traffic light interaction architecture design via v2x
JP7501039B2 (en) Driving assistance device and computer program
JP2007271550A (en) Route guide system and route guide method
JP6674430B2 (en) Driving support device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination