JP5681611B2 - Navigation system, navigation apparatus, method, and server - Google Patents

Navigation system, navigation apparatus, method, and server

Info

Publication number
JP5681611B2
Authority
JP
Japan
Prior art keywords
priority
voice
information
navigation device
rule
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2011245970A
Other languages
Japanese (ja)
Other versions
JP2013101083A (en)
Inventor
俊一郎 古畑
恒夫 祖父江
英樹 髙野
Original Assignee
株式会社日立製作所
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社日立製作所
Priority to JP2011245970A
Publication of JP2013101083A
Application granted
Publication of JP5681611B2
Application status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in preceding groups G01C1/00-G01C19/00
    • G01C21/26: Navigation specially adapted for navigation in a road network
    • G01C21/34: Route searching; Route guidance
    • G01C21/36: Input/output arrangements for on-board computers
    • G01C21/3626: Details of the output of route guidance instructions
    • G01C21/3629: Guidance using speech or audio output, e.g. text-to-speech
    • G01C21/3697: Output of additional, non-guidance related information, e.g. low fuel level, fuel efficient driving, gear change, speeding, dangerous curve ahead, slippery road, school zone, speed traps, driving behaviour feedback, advertising, virtual billboards or road signs
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00: Traffic control systems for road vehicles
    • G08G1/09: Arrangements for giving variable traffic instructions
    • G08G1/0962: Arrangements having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0967: Systems involving transmission of highway information, e.g. weather, speed limits
    • G08G1/096708: Systems where the received information might be used to generate an automatic action on the vehicle control
    • G08G1/096716: Systems where the received information does not generate an automatic action on the vehicle control
    • G08G1/096733: Systems where a selection of the information might take place
    • G08G1/096741: Systems where the source of the transmitted information selects which information to transmit to each vehicle
    • G08G1/096766: Systems characterised by the origin of the information transmission
    • G08G1/096775: Systems where the origin of the information is a central station

Description

  The present invention relates to navigation using voice guidance.

  Conventionally, navigation devices with a voice route guidance function are known. The voice route guidance function outputs guidance information, such as the name of an intersection, the distance to it, and the direction to turn, when the vehicle approaches an intersection where it must turn right or left. Navigation devices with a voice danger point warning function are also known. This function records points where accidents have occurred in the past or where users have reported feeling in danger (hereinafter, danger points), and outputs danger point information by voice, such as the distance to the danger point and the nature of the danger, when the vehicle approaches one.

  As described above, an increasing number of navigation devices perform voice route guidance and/or danger point warnings by outputting guidance information and/or danger point information (hereinafter, voice messages). Such voice messages must be output before the vehicle reaches the intersection or danger point concerned. However, when multiple intersections and danger points lie within a certain range, the voice messages may have to be output in overlap, that is, output of one voice message would have to start before output of another has finished. In that case, the vehicle may reach, or even pass through, an intersection or danger point before the corresponding voice message has finished.

  In order to solve the above problems, the following navigation apparatuses have been disclosed.

  For example, a navigation device is disclosed in which a priority is set in advance for each voice message, and when the voice messages to be output overlap, they are output in descending order of priority (Patent Document 1). As another example, a navigation device is disclosed in which priorities are predetermined for each category of voice message, such as traffic information and route guidance information; the device determines, from the time at which each voice message would be output and the time at which the vehicle will reach the intersection or danger point, whether multiple voice messages would overlap, and, if so, outputs the voice message with the higher priority (Patent Document 2).

Japanese Patent Laid-Open No. 2002-236029 (Patent Document 1)
Japanese Patent No. 4682658 (Patent Document 2)

  In the conventional navigation devices described above, the order in which overlapping voice messages are output follows the priority of each voice message, and that priority is a predetermined fixed value. Consequently, the output order is not necessarily suited to the vehicle's current situation.

  Accordingly, an object of the present invention is to provide a navigation technique that can output voice messages in an order suited to the vehicle's situation.

  A navigation system comprises a navigation device that provides guidance by outputting voice messages and a server that communicates with the navigation device. The system includes: a detection unit that detects the position and speed of the navigation device; a voice weight storage unit that stores voice weight correspondence information associating a weight with each of a plurality of voice messages; a rule storage unit that stores priority rule information representing a rule for determining the priority of each of the voice messages; a priority determination unit that determines the priority of each voice message based on the priority rule information; and a voice output unit that outputs the voice messages from the navigation device in an order according to the determined priorities. The rule represented by the priority rule information determines the priority of each voice message based on the detected position and speed of the navigation device and the weight of each voice message represented by the voice weight correspondence information. The priority rule information may be, for example, a calculation formula, a correspondence table, or any other format.
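
  As a minimal sketch of this arrangement (not the patented implementation; the specific rule and all names below are illustrative assumptions, using the calculation-formula format mentioned above):

```python
# A minimal sketch of priority-ordered voice output; not the patented
# implementation. The rule below (weight x speed / distance) is one possible
# calculation formula; all names are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class VoiceMessage:
    text: str
    weight: float      # from the voice weight correspondence information
    distance_m: float  # distance from the vehicle to the relevant point

def priority_rule(msg: VoiceMessage, speed_mps: float) -> float:
    """Nearer points, faster travel, and heavier weights raise the priority."""
    return msg.weight * speed_mps / max(msg.distance_m, 1.0)

def output_order(messages, speed_mps):
    """Sort messages by descending priority for output."""
    return sorted(messages, key=lambda m: priority_rule(m, speed_mps), reverse=True)

msgs = [
    VoiceMessage("500 m ahead, turn right", weight=60, distance_m=500),
    VoiceMessage("Danger point ahead", weight=80, distance_m=300),
    VoiceMessage("Delivery stop approaching", weight=20, distance_m=400),
]
for m in output_order(msgs, speed_mps=12.0):
    print(m.text)  # danger point first, then guidance, then business
```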

  With this configuration, voice messages can be output in an order suited to the vehicle's situation.

FIG. 1 is a diagram illustrating the hardware configuration of the navigation system according to the embodiment.
FIG. 2 is a functional block diagram of the navigation system according to the embodiment.
FIG. 3 shows an example of the danger point table.
FIG. 4 shows an example of the guidance priority table.
FIG. 5 shows an example of the driving history table.
FIG. 6 shows the calculation flow of the evaluation points in the driving history table.
FIG. 7 shows an example of the priority calculation formula table.
FIG. 8 shows a sequence diagram of the navigation system.
FIG. 9 shows the flow of voice message output order determination.

  Hereinafter, an embodiment will be described with reference to the drawings. The same reference numerals denote the same elements across the figures.

  FIG. 1 is a diagram illustrating a hardware configuration of the navigation system according to the embodiment.

  In the following description, various types of information may be described using the expression "xxx table"; however, such information may be expressed in a data structure other than a table. To indicate independence from the data structure, an "xxx table" may also be called "xxx information".

  The navigation system according to the present embodiment includes a commercial car navigation device used by a delivery company or other business operator, and a portal server, such as one at a district distribution center, that receives information from the car navigation device.

  The navigation system includes a navigation apparatus 101 capable of voice guidance (hereinafter simply referred to as the navigation device 101) and a center portal server 102 (hereinafter, the server 102) connected to the navigation device 101 via a wireless communication network 103. The wireless communication network 103 may be, for example, the Internet.

  The navigation device 101 includes a CPU 104; a memory 105 serving as a temporary storage area; an input device 106 such as a touch panel and switches; a display output device 107 that controls the display screen; an audio output device 108 that controls the sound output from the speaker; a positioning device 109 having a GPS signal receiver, an inertial sensor, and the like; an external storage device 110 having an auxiliary storage device such as a hard disk drive or flash memory; an external communication interface (hereinafter, external communication I/F) 111 for connecting to a network such as the Internet; a driving condition acquisition device 112 including sensors that detect the driving operations of the user (driver); and a bus 113 that interconnects the devices 104 to 112. The external storage device 110 may be located outside the navigation device 101.

  The server 102 includes a CPU 114; a memory 115 serving as a temporary storage area; an external storage device 116 having an auxiliary storage device such as a hard disk drive or flash memory; an external communication interface (hereinafter, external communication I/F) 117 for connecting to a network such as the Internet; and a bus 118 that interconnects the devices 114 to 117. The external storage device 116 may be located outside the server 102.

  FIG. 2 is a functional block diagram of the navigation system according to the present embodiment.

  The navigation device 101 includes an external communication unit 201, a position/vehicle speed detection unit 202, a guidance route creation unit 203, a guidance route storage DB 204, a map DB 205, a screen display unit 206, an operation input unit 207, a driving operation acquisition/analysis unit 208, a driving feature DB 209, a danger point DB 210, a driving history DB 211, a guidance priority DB 212, a priority calculation formula DB 213, a priority calculation unit 214, a voice guidance creation unit 215, and a voice output unit 216.

  The guidance route storage DB 204, map DB 205, driving feature DB 209, danger point DB 210, driving history DB 211, guidance priority DB 212, and priority calculation formula DB 213 are each constructed in the external storage device 110. The external communication unit 201, position/vehicle speed detection unit 202, guidance route creation unit 203, screen display unit 206, operation input unit 207, driving operation acquisition/analysis unit 208, priority calculation unit 214, voice guidance creation unit 215, and voice output unit 216 are realized by the CPU 104 executing a predetermined program, but may instead (or in addition) be realized by hardware circuits.

  The external communication unit 201 exchanges data with the server 102 via the external communication I/F 111. Specifically, for example, the external communication unit 201 transmits the data of the driving feature DB 209 to the server 102, and receives the data that the server 102 transmits from its danger point DB 217, guidance priority DB 218, priority calculation formula DB 219, and driving history DB 220.

  The position/vehicle speed detection unit 202 detects the position and speed of the vehicle (that is, the position and speed of the navigation device mounted on the vehicle). The position of the vehicle is detected from the latitude and longitude measured by the positioning device 109. The speed of the vehicle is calculated from the vehicle's position at a plurality of times. The position/vehicle speed detection unit 202 transmits the detected position and speed to the guidance route creation unit 203.

  The guidance route creation unit 203 creates a guidance route based on the data transmitted from the position/vehicle speed detection unit 202 and the user's request transmitted from the operation input unit 207, described later. Specifically, for example, the guidance route creation unit 203 creates a guidance route from the current vehicle position received from the position/vehicle speed detection unit 202 and the destination received from the operation input unit 207, and stores information representing the created guidance route in the memory 105. The guidance route creation unit 203 then reads the guidance route from the memory 105 and the danger point information stored in the danger point DB 210, attaches the danger point information to the guidance route, and thereby creates a guidance route with danger point information. The created guidance route with danger point information is stored in the guidance route storage DB 204. Note that the guidance route creation unit 203 may re-search for a detour route when, for example, a traffic jam occurs on the guidance route.

  Map information is stored in the map DB 205.

  The screen display unit 206 reads the guidance route with danger point information stored in the guidance route storage DB 204 and the map information in the map DB 205, and outputs them together to the display output device 107. It also receives user requests transmitted from the operation input unit 207 and reflects them on the display output device 107.

  The operation input unit 207 analyzes the various requests that the user inputs via the input device 106, and transmits the analyzed requests to the screen display unit 206 and/or the guidance route creation unit 203. The operation input unit 207 also transmits danger point information that the user inputs via the input device 106 to the server 102.

  The driving operation acquisition/analysis unit 208 acquires the characteristics of the driving operations of each user (here, each driver who drives the vehicle). The characteristics of a driver's operations are acquired, for example, between a point a predetermined distance before an intersection (hereinafter, the feature acquisition start point) and the intersection itself. The acquired characteristics include the driver's steering wheel operation, brake operation, accelerator operation, and the like, and are captured by the positioning device 109, the driving condition acquisition device 112, and so on. The steering wheel operation characteristic indicates whether the driver turns the wheel abruptly or gently; for example, it records at which position between the feature acquisition start point and the intersection the driver begins to turn the wheel and through what angle. The brake operation characteristic indicates whether the driver brakes abruptly or gently; for example, it records at which position the driver begins to brake and how hard the brake is applied. The accelerator operation characteristic indicates whether the driver presses the accelerator hard all at once or gradually; for example, it records at which position the driver begins to press the accelerator and how far it is pressed. Note that the driving operation characteristics may be acquired only when the vehicle passes through a plurality of predetermined roads, or for all intersections. Furthermore, the characteristics need not be acquired only between the feature acquisition start point and an intersection; they may be obtained by any method, for example based on the vehicle speed when passing through a danger point or a guidance point. The driving operation acquisition/analysis unit 208 stores the acquired characteristics in the driving feature DB 209 as a log (hereinafter, the feature log).

  The driving operation acquisition/analysis unit 208 also analyzes the acquired driving operation characteristics. Specifically, it compares each driver's feature log in the driving feature DB 209 with the driving operation reference values transmitted from the server 102, converts the result into a score, and calculates an evaluation point for the driver. It then creates a driving history table 500 from the calculated evaluation points and stores the table in the driving history DB 211. The driving history table 500 will be described in detail later.

  The feature log is stored in the driving feature DB 209. The feature log is transmitted to the driving history DB 220 of the server 102 by, for example, the driving operation acquisition/analysis unit 208. This transmission may be performed, for example, each time driving ends, or at predetermined intervals.

  The danger point table 300 transmitted from the server 102 is stored in the danger point DB 210. The danger point table 300 in the danger point DB 210 may be synchronized at predetermined intervals with the danger point table 300 in the danger point DB 217 of the server 102, described later.

  The driving history table 500 transmitted from the driving operation acquisition/analysis unit 208 is stored in the driving history DB 211. The driving history table 500 in the driving history DB 211 may be synchronized at predetermined intervals with the driving history table 500 of the corresponding driver in the driving history DB 220 of the server 102. In addition to the driving history table 500, the driving history DB 211 may store, for each driver, a history of routes driven in the past.

  The guidance priority table 400 transmitted from the server 102 is stored in the guidance priority DB 212. The guidance priority table 400 in the guidance priority DB 212 may be synchronized at predetermined intervals with the guidance priority table 400 in the guidance priority DB 218 of the server 102, described later.

  The priority calculation formula table 700 transmitted from the server 102 is stored in the priority calculation formula DB 213. The priority calculation formula table 700 in the priority calculation formula DB 213 may be synchronized at predetermined intervals with the priority calculation formula table 700 in the priority calculation formula DB 219 of the server 102, described later.

  The priority calculation unit 214 calculates priorities based on the vehicle's position and speed. A priority is a numerical value used to determine the order in which voice messages are output: voice messages are output in descending order of priority, and a voice message with a high priority may be output repeatedly. For example, the priority calculation unit 214 calculates the priority of each guidance message (voice message) using the calculation formulas in the priority calculation formula DB 213, based on information from the position/vehicle speed detection unit 202, the danger point DB 210, the guidance route storage DB 204, the map DB 205, the driving history DB 211, and the guidance priority DB 212. Details will be described later. The priority calculation unit 214 transmits the calculated priorities to the voice guidance creation unit 215.

  The voice guidance creation unit 215 sorts the voice messages in order of the priorities transmitted from the priority calculation unit 214, converts the sorted voice messages into audio files, and transmits the files to the voice output unit 216.

  The voice output unit 216 causes the audio output device 108 to play the voice message files transmitted from the voice guidance creation unit 215.

  The server 102 includes a danger point DB 217, a guidance priority DB 218, a priority calculation formula DB 219, a driving history DB 220, and an external communication unit 221.

  Each of the danger point DB 217, guidance priority DB 218, priority calculation formula DB 219, and driving history DB 220 is constructed in the external storage device 116. The function of the external communication unit 221 is realized by the CPU 114 executing a predetermined program, but may instead (or in addition) be realized by a hardware circuit.

  A danger point table 300 is stored in the danger point DB 217. The danger point table 300 will be described in detail later.

A guidance priority table 400 is stored in the guidance priority DB 218. The guidance priority table 400 will be described in detail later.

A priority calculation formula table 700 is stored in the priority calculation formula DB 219. The priority calculation formula table 700 will be described in detail later.

  A driving history table 500 is stored in the driving history DB 220. The driving history table 500 will be described in detail later. In addition to the driving history table 500, the driving history DB 220 may store, for each driver, a history of routes driven in the past.

  The external communication unit 221 transmits and receives data to and from the navigation device 101 via the external communication I/F 117. Specifically, the external communication unit 221 receives the data of the driving feature DB 209 from the navigation device 101, and transmits the data of the danger point DB 217, guidance priority DB 218, priority calculation formula DB 219, and driving history DB 220 to the navigation device 101.

  FIG. 3 is an example of the danger point table 300 recorded in the danger point DB 217 of the server 102 and the danger point DB 210 of the navigation device 101.

  The danger point table 300 is created by, for example, the server 102 or a management device (not shown) connected to the server 102, based on danger point information collected by the navigation devices 101. In this case, the table may be created from the danger point information registered by individual users on the plurality of navigation devices 101 that exchange information with the server 102. In addition to, or instead of, this, the table may be created from danger point information collected by the server 102 itself, such as accident occurrence information published by the police. The danger point DB 217 and the danger point DB 210 may be synchronized at predetermined intervals.

  The danger point table 300 has columns of coordinates 301, danger contents 302, and warning sound 303 for each danger point.

  The coordinate 301 is information indicating the position of the danger point in latitude and longitude. Note that the coordinates 301 may be GPS coordinates in addition to latitude and longitude.

  The danger content 302 is information indicating what kind of danger has occurred at the danger point in the past or occurs there at present. The danger content 302 may also be an event predicted to occur at the danger point.

  The warning sound 303 is information (for example, a character string composed of numbers and/or characters) representing the message output from the navigation device 101 when the navigation device 101 approaches a danger point. The warning sound 303 may be set by the server 102 when the danger point table 300 is created (or modified), or may be manually set (updated) by the user according to the danger content 302.

  According to the illustrated example, for the danger point located at the coordinates of 35.621 degrees north latitude and 139.717 degrees east longitude, the registered danger content concerns a thick branch, and the corresponding warning sound is output when the vehicle approaches that point.

  FIG. 4 shows an example of the guidance priority table 400 recorded in the guidance priority DB 218 of the server 102 and the guidance priority DB 212 of the navigation device 101.

  The guidance priority table 400 is created by the server 102 or a management device (not shown) connected to the server 102. For example, the guidance priority DB 218 and the guidance priority DB 212 may be synchronized every predetermined period.

  The guidance priority table 400 stores contents 401, output conditions 402, attributes 403, and weights 404 for each voice message.

  The content 401 stores information representing a message output by voice from the audio output device 108. The content 401 is output when the corresponding output condition 402 is satisfied. The content 401 may be set by the server 102 when the guidance priority table 400 is created (or modified), or may be set (updated) manually by the user in the guidance priority table 400 stored in the guidance priority DB 212 of the navigation device 101. The contents 401 include danger point information output when approaching a danger point, route guidance information output when approaching a guidance point, and business information output when approaching a point for which notification based on business or other work information is required. The danger point information may be the same information as the warning sound 303 of the danger point table 300.

  The output condition 402 is a condition for outputting the voice message, and is set for each content 401. When voice messages to be output overlap, their output order is determined by the voice message output order determination process described later (see FIG. 9).

  The attribute 403 indicates the attribute of the voice message (content 401). The attributes include the danger point information, route guidance information, and business information described above.

  The weight 404 is a value indicating the importance for each attribute 403. The weight 404 may be set by the server 102 when creating (or correcting) the guidance priority table 400, or may be manually set (corrected) by the user on the navigation device 101 side.

  In the illustrated example, the weight of the route guidance information is set to "60"; the danger point information is weighted more heavily at "80"; and the business information has the lightest weight, "20". Also, the voice message "500 m ahead, turn right" is output 500 m before the guidance point.
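
  For illustration only, the rows of the guidance priority table 400 could be modeled as follows; the field names are assumptions, and the weights are taken from the example above:

```python
# Hypothetical encoding of guidance priority table 400 rows (field names assumed;
# weights from the illustrated example: danger 80, route guidance 60, business 20).
guidance_priority_table = [
    {"content": "500 m ahead, turn right",
     "output_condition": "500 m before the guidance point",
     "attribute": "route_guidance", "weight": 60},
    {"content": "Danger point ahead",
     "output_condition": "a predetermined distance before the danger point",
     "attribute": "danger_point", "weight": 80},
    {"content": "Delivery stop approaching",
     "output_condition": "a predetermined distance before the business point",
     "attribute": "business", "weight": 20},
]
```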

  FIG. 5 is an example of the driving history table 500 recorded in the driving history DB 220 of the server 102 and the driving history DB 211 of the navigation device 101.

  The driving history table 500 is created by the driving operation acquisition/analysis unit 208 of each navigation device 101, based on the feature log in the driving feature DB 209 and the driving operation reference values transmitted from the server 102. The feature log records the driver's operations separately for the steering wheel, the brake, and the accelerator (not shown). The steering wheel feature log records the position at which the driver begins to turn the wheel and the angle through which it is turned between the feature acquisition start point and the intersection. The brake feature log records the position at which the driver begins to brake and how hard the brake is applied over the same interval. The accelerator feature log records the position at which the driver begins to press the accelerator and how far it is pressed over the same interval. The driving operation reference values, in turn, are calculated by a management device (not shown) of the server 102. They may be taken from the driving operation characteristics of a predetermined driver who drives well (or drives in a standard manner), or they may be calculated from the feature logs of multiple drivers collected from multiple navigation devices 101, for example as an average (a maximum or other statistic may also be used). The driving history table 500 is created by comparing each feature log with the reference values and calculating the evaluation points 502 of each driver.

  The driving history table 500 has a driver ID 501 and an evaluation point 502 for each driver.

  The driver ID 501 is a driver identification number registered in the server 102 and/or the navigation device 101.

  The evaluation point 502 is the score obtained by comparing each driver's feature log with the driving operation reference values. In the illustrated example, the evaluation points 502 include a steering wheel operation evaluation point 503, a brake operation evaluation point 504, and an accelerator operation evaluation point 505. Each of the evaluation points 503 to 505 is obtained by scoring the driver's feature log, for example with the driving operation reference value set to 100 points.

  FIG. 6 shows the calculation flow of the evaluation points 502 of the driving history table 500.

  The evaluation points 502 are calculated by the driving operation acquisition/analysis unit 208.

  In step S601, the driving operation acquisition/analysis unit 208 determines whether or not the vehicle is traveling. If the vehicle is traveling, the process proceeds to step S602; otherwise, the process proceeds to step S604.

  In step S602, the driving operation acquisition/analysis unit 208 acquires the characteristics of the driver's driving operations, that is, the steering wheel operation, brake operation, and accelerator operation characteristics, each acquired between the feature acquisition start point and the intersection.

  In step S603, the driving operation acquisition/analysis unit 208 stores the characteristics acquired in step S602 in the driving feature DB 209 as the feature log. While the vehicle is traveling, the feature log accumulates in the driving feature DB 209 through repetition of steps S601 to S603.

  On the other hand, when the vehicle is not traveling in step S601 (S601: No), in step S604 the driving operation acquisition/analysis unit 208 acquires the driver's feature log from the driving feature DB 209.

In step S605, the driving operation acquisition/analysis unit 208 compares the driver's feature log acquired in step S604 with the driving operation reference values. Specifically, for example, for the interval between the feature acquisition start point and the intersection, the feature log and the reference values are compared for the following types of operations:
(Steering wheel operation) the position at which the driver begins to turn the wheel, the angle through which the wheel is turned, etc.
(Brake operation) the position at which the driver begins to brake, the amount of brake depression, etc.
(Accelerator operation) the position at which the driver begins to press the accelerator, the amount of accelerator depression, etc.
This yields, for each type of operation, the difference between the driver's feature log and the driving operation reference value.

  In step S606, the driving operation acquisition/analysis unit 208 calculates an evaluation point from the difference, calculated in step S605, between the driver's feature log and the driving operation reference values. The evaluation point is calculated by a driving evaluation algorithm: the driving operation reference values are expressed as scores (for example, every reference value is 100 points), the difference between the driver's feature log and the reference values is likewise converted into a score, and the evaluation point is obtained by subtracting the scored difference from the scored reference value.
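
  A minimal sketch of this subtraction-based scoring, assuming each reference value is scored as 100 points and the deviation is scaled into the same range (the actual scaling is not specified in the text):

```python
# Sketch of the driving evaluation algorithm; the scaling of the deviation
# into points is an assumption.
def evaluation_point(feature_value: float, reference_value: float,
                     reference_score: float = 100.0) -> float:
    # Convert the driver's deviation from the reference into a score ...
    scored_difference = abs(feature_value - reference_value) / reference_value * reference_score
    # ... and subtract it from the scored reference value, flooring at zero.
    return max(reference_score - scored_difference, 0.0)

# Example: a brake depression of 0.8 against a reference of 1.0 scores 80 points.
print(evaluation_point(0.8, 1.0))  # 80.0
```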

  In step S607, the driving operation acquisition/analysis unit 208 registers the evaluation points calculated in step S606 in the driving history table 500. That is, the driving operation acquisition/analysis unit 208 registers the steering wheel operation evaluation point 503, the brake operation evaluation point 504, and the accelerator operation evaluation point 505 corresponding to the driver ID 501 of the driver. The driving operation acquisition/analysis unit 208 then stores the driving history table 500 with the registered evaluation points 503 to 505 in the driving history DB 211.

  FIG. 7 is an example of the priority calculation formula table 700 recorded in the priority calculation formula DB 219 of the server 102 and the priority calculation formula DB 213 of the navigation device 101.

  In the priority calculation formula table 700, a priority calculation formula 702 is associated with each attribute 701.

  Similar to the attribute 403 of the guidance priority table 400, the attribute 701 stores the attribute information of the voice message; that is, the attribute 701 includes the danger point information, route guidance information, and business information described above.

  The priority calculation formula 702 stores, for each attribute 701, a formula for calculating the priority. The priority is calculated by the priority calculation unit 214 when the vehicle traveling on the guidance route approaches a danger point, a guidance point, or a point based on business information (hereinafter, these points are collectively referred to as warning points). Details will be described later. The priority calculation formula 702 for each attribute 701 is described below.

  When the attribute is route guidance information, the priority is calculated from the vehicle speed, the distance to the guidance point, the weight, and the evaluation points. Specifically, the priority is obtained by multiplying the weight by the reciprocal of the distance to the guidance point, the speed, the reciprocal of the steering wheel operation evaluation point, and the reciprocal of the brake operation evaluation point. Accordingly, the shorter the distance from the vehicle (navigation device 101) to the guidance point and the higher the vehicle speed, the higher the priority; the larger the weight for route guidance information, the higher the priority; and the lower the driver's steering wheel and brake operation evaluation points, the higher the priority.

  When the attribute is danger point information, the priority is calculated from the vehicle speed, the distance to the danger point, the weight, the evaluation point, and the number of danger point reports. Specifically, the priority is the sum of two terms: the weight multiplied by the reciprocal of the distance to the danger point and by the vehicle speed, and the weight multiplied by the reciprocal of the brake operation evaluation point and by the number of danger point reports. Accordingly, the shorter the distance to the danger point and the higher the vehicle speed, the higher the priority; the larger the weight for danger point information, the higher the priority; and the lower the driver's brake operation evaluation point and the greater the number of danger point reports, the higher the priority. The number of danger point reports is the number of times the danger point has been reported to the server 102, specifically the number of times the danger content 302 of the danger point table 300 has been reported (not shown).

  When the attribute is business information, the priority is calculated from the vehicle speed, the distance to the business point, the weight, and the priorities of the other attributes. Specifically, the priority is obtained by multiplying the weight by the reciprocal of the distance to the business point and by the vehicle speed, and then subtracting the priorities of the other attributes. Accordingly, the shorter the distance to the business point and the higher the vehicle speed, the higher the priority, and the larger the weight for business information, the higher the priority; however, the priority remains lower than when the attribute is route guidance or danger point information.
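
  Expressed as code, the three formulas read roughly as follows (a sketch only; variable names, units, and scaling are assumptions, since the text describes the formulas qualitatively):

```python
# Sketches of the attribute-specific priority formulas described above.
def priority_route_guidance(weight, distance, speed, handle_score, brake_score):
    # weight x (1/distance) x speed x (1/steering evaluation) x (1/brake evaluation)
    return weight * (1.0 / distance) * speed * (1.0 / handle_score) * (1.0 / brake_score)

def priority_danger_point(weight, distance, speed, brake_score, report_count):
    # weight x (1/distance) x speed + weight x (1/brake evaluation) x report count
    return weight * (1.0 / distance) * speed + weight * (1.0 / brake_score) * report_count

def priority_business(weight, distance, speed, other_priorities):
    # weight x (1/distance) x speed, minus the priorities of the other attributes,
    # which keeps business information below route guidance and danger points
    return weight * (1.0 / distance) * speed - sum(other_priorities)
```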

  The above priority calculation formulas are only examples, and the formulas are not limited to those described. For example, the formula for each attribute may additionally multiply by the reciprocal of the accelerator operation evaluation point, or may be defined in any other way. However, it is desirable to set the formulas so that the priority rises as the vehicle approaches the warning point, and rises when the user's driving operation evaluation points are low. The priority may also take into account information acquired by other vehicles. Which attribute takes precedence can be adjusted by the user through the weights.

  The order in which voice messages are output is determined based on the priority calculated by the priority calculation formula.

  FIG. 8 shows a sequence diagram of the navigation system.

  In step S801, a driver or a user (hereinafter referred to as a driver in this description) performs an activation operation of the navigation device 101 from the operation input unit 207. In step S802, the navigation device 101 is activated.

  In step S804, the screen display unit 206 of the navigation device 101 displays a destination input instruction on the display output device 107 to the driver. The driver who receives the instruction inputs the destination from the operation input unit 207 (step S805).

  On the other hand, in step S803, the external communication unit 201 of the navigation device 101 transmits an activation notification to the server 102.

The external communication unit 221 of the server 102 receives the activation notification and performs the following processing:
(*) It reads the danger point table 300 from the danger point DB 217 (step S806) and transmits the danger point table 300 to the navigation device 101 (step S807).
(*) It reads the guidance priority table 400 from the guidance priority DB 218 (step S808) and transmits the guidance priority table 400 to the navigation device 101 (step S809).
(*) It reads the priority calculation formula table 700 from the priority calculation formula DB 219 (step S810) and transmits the priority calculation formula table 700 to the navigation device 101 (step S811).
(*) It reads the driving history table 500 from the driving history DB 220 (step S812) and transmits the driving history table 500 to the navigation device 101 (step S813).

  In step S814, the guidance route creation unit 203 of the navigation device 101 creates a guidance route from the vehicle's current position and the destination input by the driver, attaches the danger point information stored in the danger point DB 210 to the guidance route, and stores the resulting guidance route with danger point information in the guidance route storage DB 204.

  In step S815, the screen display unit 206 of the navigation device 101 displays the map information in the map DB 205 on the display output device 107 together with the guide route with dangerous point information in the guide route storage DB 204.

  When the position/vehicle speed detection unit 202 detects the movement of the vehicle (step S816), the navigation device 101 starts guidance along the guidance route (step S817).

  Steps S818 to S823 are repeated until the vehicle arrives at the destination (that is, until step S824). Hereinafter, steps S818 to S823 will be described.

  While the vehicle travels along the guidance route, the position/vehicle speed detection unit 202 detects when the vehicle approaches the warning point (danger point, guidance point, or point based on business information) closest to the current vehicle position (step S818). The position/vehicle speed detection unit 202 also detects one or more further warning points near the detected one. Instead of the warning point itself, the position/vehicle speed detection unit 202 may detect the point at which output of the corresponding voice message (danger point information, route guidance information, or business information) begins, that is, the output condition 402 shown in FIG. 4.

  The position/vehicle speed detection unit 202 then detects the current speed of the vehicle (step S819), and calculates the distance to each of the detected warning points (step S820).

  In step S821, the priority calculation unit 214 of the navigation device 101 performs voice message output order determination processing for determining the order of voice message output. This process will be described later.

  In step S823, the voice output unit 216 of the navigation device 101 outputs the voice messages in the order determined by the voice message output order determination process.

  When arriving at the destination (step S824), the driver performs an end operation from the operation input unit 207 (step S825), and the route guidance ends (step S826).

  FIG. 9 shows a flow of voice message output order determination.

  In step S901, the priority calculation unit 214 acquires a plurality of warning points, including the warning point closest to the vehicle, and calculates a priority for each. Specifically, step S901 is performed, for example, when the vehicle approaches (for example, comes within a predetermined distance of) a point satisfying the voice message output condition of the warning point closest to the vehicle (a point satisfying the output condition 402 of the guidance priority table 400). At this time, the priority calculation unit 214 acquires the warning point closest to the vehicle together with a plurality (for example, two) of the next closest warning points. Referring to the priority calculation formula table 700, the priority calculation unit 214 calculates a priority for each of the three acquired warning points using the priority calculation formula corresponding to each attribute (see FIG. 7).

  In steps S902 and S903, the priority calculation unit 214 determines whether the voice messages would overlap if the three were output in sequence. Specifically, in step S902 the priority calculation unit 214 first refers to the guidance priority table 400 and calculates the output duration of each of the three voice messages.

  In step S903, the priority calculation unit 214 determines, based on the vehicle speed, the distances to the three warning points, and the output durations of the three voice messages, whether the voice messages would overlap if output in sequence for the respective warning points (that is, whether output of one voice message would start before output of another has finished). If the outputs would overlap, the process proceeds to step S905; otherwise, the process proceeds to step S904.

  In step S904, the priority calculation unit 214 determines whether a voice message with a high priority can be output repeatedly. Specifically, it determines, based on the vehicle speed, the distances to the three warning points, and the output durations of the three voice messages, whether a high-priority voice message can be repeated. If it can, the process proceeds to step S906; otherwise, the process proceeds to step S905.

  In step S905, the priority calculation unit 214 determines the output order based on the result of step S903 or step S904 so that the voice messages are output in descending order of priority. At this time, the priority calculation unit 214 may cancel output of voice messages in ascending order of priority, depending on their output durations.

  In step S906, the priority calculation unit 214 sets the output so that the voice message with the highest priority is output repeatedly and the voice messages are output in descending order of priority.
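
  The decision flow of steps S901 to S906 can be condensed into the following sketch; the data layout and the timing model are assumptions, since the text defines the flow only at the level of FIG. 9:

```python
# Condensed sketch of the output order determination (S903-S906); names assumed.
# Each message carries its calculated priority, spoken duration, and the
# distance from the vehicle to its warning point.
def decide_output_plan(messages, speed_mps):
    ordered = sorted(messages, key=lambda m: m["priority"], reverse=True)

    # S903: if played back to back, does any message finish after its point is reached?
    elapsed, overlap = 0.0, False
    for m in sorted(messages, key=lambda m: m["distance_m"]):
        deadline = m["distance_m"] / speed_mps  # seconds until the warning point
        if elapsed + m["duration_s"] > deadline:
            overlap = True
        elapsed += m["duration_s"]

    if not overlap:
        # S904/S906: with slack left before the nearest point, the highest-priority
        # message may be repeated.
        total = sum(m["duration_s"] for m in messages)
        slack = min(m["distance_m"] for m in messages) / speed_mps - total
        return ordered, slack >= ordered[0]["duration_s"]

    # S905: overlap; output in descending priority (low-priority messages may be dropped).
    return ordered, False
```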

  In step S907, the voice guidance creation unit 215 refers to the guidance priority table 400 and converts the voice messages (contents 401), arranged in the determined output order, into audio files.

  In the above-described embodiment, the navigation system includes a commercial car navigation device used by a delivery company or other business operator and a portal server, such as one at a district distribution center, that receives information from the car navigation device; however, the system is not limited to this. The navigation device in the navigation system may be a car navigation device for home use, or a navigation device implemented with a PND (Portable Navigation Device), a mobile phone, a smartphone, or the like.

  In the above embodiment, the navigation device 101 has the driving operation acquisition/analysis unit 208, but the present invention is not limited to this. For example, the server 102 may have the driving operation acquisition/analysis unit. In that case, the feature log stored in the driving feature DB 209 is transmitted to the server 102 at predetermined intervals, where the driving operation acquisition/analysis unit of the server 102 compares it with the driving operation reference values, calculates the evaluation points, and creates the driving history table 500.

  In the above embodiment, the guidance route creation unit 203 and the guidance route storage DB 204 are constructed in the navigation device 101, but the configuration is not limited to this. For example, the guidance route creation unit and/or the guidance route storage DB may be constructed in the server.

  Although an embodiment has been described above, the present invention is of course not limited to this embodiment and can be modified in various ways without departing from its gist.

101: Navigation device, 102: Server, 103: Wireless communication network

Claims (15)

  1. A navigation system comprising a navigation device that provides guidance by outputting a voice message, and a server that communicates with the navigation device,
    A detection unit for detecting the position and speed of the navigation device;
    A voice weight storage unit that stores voice weight correspondence information in which a weight is associated with each of a plurality of voice messages;
    A rule storage unit for storing priority rule information which is information representing a rule for determining the priority of each of the plurality of voice messages;
    A priority determining unit that determines the priority of each of the plurality of voice messages based on the priority rule information;
    A voice output unit that outputs the plurality of voice messages from the navigation device in an order according to the determined priority;
    The rule represented by the priority rule information is a rule for determining the priority of each voice message based on the detected position and speed of the navigation device and the weight of each voice message represented by the voice weight correspondence information.
    Navigation system.
  2. The navigation system according to claim 1,
    The server
    A message weight transmitter for transmitting the voice weight correspondence information to the navigation device;
    A rule transmission unit that transmits the priority rule information to the navigation device;
    The navigation device includes the voice weight storage unit and the rule storage unit,
    The voice weight storage unit stores voice weight correspondence information transmitted from the server,
    The rule storage unit stores priority rule information transmitted from the server.
    Navigation system.
  3. The navigation system according to claim 1 or 2,
    The navigation device includes the priority determination unit and the voice output unit.
  4. The navigation system according to any one of claims 1 to 3,
    The voice weight correspondence information associates a weight with each type of voice message, and
    the types of the voice messages include route guidance information, danger point information, and/or business information.
    Navigation system.
  5. The navigation system according to any one of claims 1 to 4,
    The server
    A reference driving operation information transmitting unit that transmits reference driving operation information, which is information on characteristics of the driving operation serving as a reference, to the navigation device,
    The navigation device
    A feature acquisition unit that acquires a feature of a driver's driving operation for driving a vehicle having the navigation device;
    An evaluation score calculation unit that calculates the evaluation score of the driver from the reference driving operation information and the characteristics of the driving operation of the driver;
    The rule represented by the priority rule information is a rule for determining the priority of each voice message based also on the evaluation points of the driver.
    Navigation system.
  6. The navigation system according to claim 5, wherein
    The feature acquisition unit
    acquires, as the characteristics of the driver's driving operations, the driver's steering wheel operation characteristics, brake operation characteristics, and accelerator operation characteristics, and
    the evaluation point calculation unit calculates an evaluation point for the steering wheel operation, an evaluation point for the brake operation, and an evaluation point for the accelerator operation based on those characteristics.
    Navigation system.
  7. The navigation system according to claim 6, wherein
    The navigation device further comprises a feature transmission unit that transmits the characteristics of the driver's driving operations to the server, and
    the server calculates the reference driving operation information based on the characteristics of the driving operations of the drivers acquired from the plurality of navigation devices connected to the server.
    Navigation system.
  8. The navigation system according to any one of claims 4 to 7,
    The rule represented by the priority rule information is a rule defined for each type of the voice message.
    Navigation system.
  9. A navigation device capable of providing voice guidance by outputting a voice message,
    A detection unit for detecting the position and speed of the navigation device;
    A voice weight storage unit that stores voice weight correspondence information in which a weight is associated with each of the plurality of voice messages;
    A rule storage unit that stores priority rule information that is information representing a rule for determining the priority of each of the plurality of voice messages;
    A priority determining unit that determines the priority of each of the plurality of voice messages based on the priority rule information;
    A voice output unit that outputs the plurality of voice messages in an order according to the determined priority;
    The rule represented by the priority rule information is a rule for determining the priority of each voice message based on the detected position and speed of the navigation device and the weight of each voice message represented by the voice weight correspondence information.
    Navigation device.
  10. The navigation device according to claim 9,
    The voice weight correspondence information associates a weight with each type of voice message,
    The navigation device in which the types of the voice message include route guidance information, danger point information, and/or business information.
  11. The navigation device according to claim 9 or 10, wherein
    A holding unit that holds reference driving operation information that is information on characteristics of the driving operation serving as a reference;
    A feature acquisition unit that acquires a feature of a driver's driving operation for driving a vehicle having the navigation device;
    An evaluation score calculation unit that calculates the evaluation score of the driver from the reference driving operation information and the characteristics of the driving operation of the driver;
    The navigation device in which the rule represented by the priority rule information is a rule for determining the priority of each voice message based on the evaluation score of the driver.
  12. The navigation device according to claim 11,
    The feature acquisition unit acquires, as the driver's driving operation characteristics, the driver's steering operation characteristics, brake operation characteristics, and accelerator operation characteristics,
    The navigation device in which the evaluation score calculation unit calculates an evaluation score for the steering operation, an evaluation score for the brake operation, and an evaluation score for the accelerator operation based on the characteristics of the driver's driving operation.
  13. The navigation device according to any one of claims 10 to 12,
    The navigation device, wherein the rule represented by the priority rule information is a rule for each type of the voice message.
  14. A voice message output method performed by a navigation device capable of voice guidance,
    The detection unit of the navigation device detects the position and speed of the navigation device;
    The priority determination unit of the navigation device determines the priority of each of a plurality of voice messages based on priority rule information, which is a rule for determining the priority of each voice message from the detected position and speed of the navigation device and the weight corresponding to each of the voice messages;
    A method in which the voice output unit of the navigation device outputs the voice messages in an order based on the determined priority.
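
An end-to-end sketch of the claimed method: detect position and speed, determine a priority for each pending voice message, then output the messages in descending priority order. The tuple layout of pending messages and the priority() helper from the sketch above are assumptions; print() stands in for the voice output unit.

    import heapq

    def output_voice_messages(pending, device_pos, speed_mps):
        """pending: list of (message_weight, target_pos, text) tuples."""
        heap = []
        for weight, target_pos, text in pending:
            p = priority(weight, device_pos, target_pos, speed_mps)
            heapq.heappush(heap, (-p, text))  # negate: heapq pops the smallest first
        while heap:
            _, text = heapq.heappop(heap)
            print(text)                       # stand-in for the voice output unit
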
  15. A server that communicates with one or more navigation devices capable of providing voice guidance by outputting voice messages,
    A detection unit for detecting the position and speed of the navigation device;
    A voice weight storage unit that stores voice weight correspondence information in which a weight is associated with each of a plurality of voice messages;
    A rule storage unit that stores priority rule information that is information representing a rule for determining the priority of each of the plurality of voice messages;
    A priority determining unit that determines the priority of each of the plurality of voice messages based on the priority rule information;
    A notification unit that notifies the navigation device of the priority determined by the priority determination unit;
    The server in which the rule represented by the priority rule information is a rule for determining the priority of each voice message based on the detected position and speed of the navigation device and the weight of each voice message represented by the voice weight correspondence information.
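
A sketch of the notification unit in the server claim: the server sends the determined priorities to a navigation device. The JSON payload shape and the injected send() transport are assumptions for illustration; the claim does not specify a wire format.

    import json

    def notify_priorities(send, device_id: str, priorities: dict) -> None:
        """send: any callable that delivers bytes to the device (transport unspecified)."""
        payload = json.dumps({
            "device_id": device_id,
            "priorities": priorities,  # e.g. {"msg-001": 2.7, "msg-002": 0.4}
        }).encode("utf-8")
        send(payload)
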
JP2011245970A 2011-11-09 2011-11-09 Navigation system, navigation apparatus, method, and server Active JP5681611B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2011245970A JP5681611B2 (en) 2011-11-09 2011-11-09 Navigation system, navigation apparatus, method, and server

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2011245970A JP5681611B2 (en) 2011-11-09 2011-11-09 Navigation system, navigation apparatus, method, and server
US13/670,565 US8855915B2 (en) 2011-11-09 2012-11-07 Navigation system, navigation apparatus, method and server
EP12192024.3A EP2592388B1 (en) 2011-11-09 2012-11-09 Navigation system, navigation apparatus, method and server

Publications (2)

Publication Number Publication Date
JP2013101083A (en) 2013-05-23
JP5681611B2 (en) 2015-03-11

Family

ID=47664047

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2011245970A Active JP5681611B2 (en) 2011-11-09 2011-11-09 Navigation system, navigation apparatus, method, and server

Country Status (3)

Country Link
US (1) US8855915B2 (en)
EP (1) EP2592388B1 (en)
JP (1) JP5681611B2 (en)

Families Citing this family (59)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8677377B2 (en) 2005-09-08 2014-03-18 Apple Inc. Method and apparatus for building an intelligent automated assistant
US9330720B2 (en) 2008-01-03 2016-05-03 Apple Inc. Methods and apparatus for altering audio output signals
US8996376B2 (en) 2008-04-05 2015-03-31 Apple Inc. Intelligent text-to-speech conversion
US20100030549A1 (en) 2008-07-31 2010-02-04 Lee Michael M Mobile device having human language translation capability with positional feedback
US8682667B2 (en) 2010-02-25 2014-03-25 Apple Inc. User profiling for selecting user specific voice input processing information
US9262612B2 (en) 2011-03-21 2016-02-16 Apple Inc. Device access using voice authentication
JP5662273B2 (en) * 2011-07-28 2015-01-28 アルパイン株式会社 Interrupt control device and interrupt control method
US9721563B2 (en) 2012-06-08 2017-08-01 Apple Inc. Name recognition system
US9576574B2 (en) * 2012-09-10 2017-02-21 Apple Inc. Context-sensitive handling of interruptions by intelligent digital assistant
US9547647B2 (en) 2012-09-19 2017-01-17 Apple Inc. Voice-based media searching
JP6018881B2 (en) * 2012-11-07 2016-11-02 株式会社日立製作所 Navigation device and navigation method
EP2741169B1 (en) * 2012-12-06 2017-03-08 Harman Becker Automotive Systems GmbH Control of an External Drivebox
US10445758B1 (en) 2013-03-15 2019-10-15 Allstate Insurance Company Providing rewards based on driving behaviors detected by a mobile computing device
WO2014197334A2 (en) 2013-06-07 2014-12-11 Apple Inc. System and method for user-specified pronunciation of words for speech synthesis and recognition
JP6231786B2 (en) * 2013-06-20 2017-11-15 矢崎エナジーシステム株式会社 Analysis device and operation display system
US9733095B2 (en) * 2013-10-07 2017-08-15 Telenav, Inc. Navigation system with guidance delivery mechanism and method of operation thereof
KR20150096897A (en) * 2014-02-17 2015-08-26 삼성전자주식회사 Apparatus and method forecasting vehicle flow
US9633004B2 (en) 2014-05-30 2017-04-25 Apple Inc. Better resolution when referencing to concepts
CN106471570B (en) 2014-05-30 2019-10-01 苹果公司 Order single language input method more
US9430463B2 (en) 2014-05-30 2016-08-30 Apple Inc. Exemplar-based natural language processing
US9818400B2 (en) 2014-09-11 2017-11-14 Apple Inc. Method and apparatus for discovering trending terms in speech requests
JP2016066199A (en) * 2014-09-24 2016-04-28 矢崎エナジーシステム株式会社 Warning device
US10074360B2 (en) 2014-09-30 2018-09-11 Apple Inc. Providing an indication of the suitability of speech recognition
US10127911B2 (en) 2014-09-30 2018-11-13 Apple Inc. Speaker identification and unsupervised speaker adaptation techniques
US9668121B2 (en) 2014-09-30 2017-05-30 Apple Inc. Social reminders
US9803988B2 (en) * 2014-12-30 2017-10-31 Excalibur Ip, Llc Method and/or system for walking route recommendation
US9721566B2 (en) 2015-03-08 2017-08-01 Apple Inc. Competing devices responding to voice triggers
US9886953B2 (en) 2015-03-08 2018-02-06 Apple Inc. Virtual assistant activation
US9578173B2 (en) 2015-06-05 2017-02-21 Apple Inc. Virtual assistant aided communication with 3rd party service in a communication session
US10275029B2 (en) * 2015-06-22 2019-04-30 Accenture Global Solutions Limited Directional and awareness guidance device
US10049668B2 (en) 2015-12-02 2018-08-14 Apple Inc. Applying neural network language models to weighted finite state transducers for automatic speech recognition
DE102015226530B4 (en) * 2015-12-22 2017-12-21 Continental Automotive Gmbh Method and system for controlling an audio signal output for a vehicle
US10446143B2 (en) 2016-03-14 2019-10-15 Apple Inc. Identification of voice inputs providing credentials
US9934775B2 (en) 2016-05-26 2018-04-03 Apple Inc. Unit-selection text-to-speech synthesis based on predicted concatenation parameters
US9972304B2 (en) 2016-06-03 2018-05-15 Apple Inc. Privacy preserving distributed evaluation framework for embedded personalized systems
US10249300B2 (en) 2016-06-06 2019-04-02 Apple Inc. Intelligent list reading
US10049663B2 (en) 2016-06-08 2018-08-14 Apple, Inc. Intelligent automated assistant for media exploration
DK179588B1 (en) 2016-06-09 2019-02-22 Apple Inc. Intelligent automated assistant in a home environment
US10067938B2 (en) 2016-06-10 2018-09-04 Apple Inc. Multilingual word prediction
US10509862B2 (en) 2016-06-10 2019-12-17 Apple Inc. Dynamic phrase expansion of language input
US10192552B2 (en) 2016-06-10 2019-01-29 Apple Inc. Digital assistant providing whispered speech
US10490187B2 (en) 2016-06-10 2019-11-26 Apple Inc. Digital assistant providing automated status report
DK201670540A1 (en) 2016-06-11 2018-01-08 Apple Inc Application integration with a digital assistant
DK179415B1 (en) 2016-06-11 2018-06-14 Apple Inc Intelligent device arbitration and control
DK179343B1 (en) 2016-06-11 2018-05-14 Apple Inc Intelligent task discovery
DK179049B1 (en) 2016-06-11 2017-09-18 Apple Inc Data driven natural language event detection and classification
US10474753B2 (en) 2016-09-07 2019-11-12 Apple Inc. Language identification using recurrent neural networks
US10043516B2 (en) 2016-09-23 2018-08-07 Apple Inc. Intelligent automated assistant
US10417266B2 (en) 2017-05-09 2019-09-17 Apple Inc. Context-aware ranking of intelligent response suggestions
DK201770383A1 (en) 2017-05-09 2018-12-14 Apple Inc. User interface for correcting recognition errors
US10395654B2 (en) 2017-05-11 2019-08-27 Apple Inc. Text normalization based on a data-driven learning network
DK179496B1 (en) 2017-05-12 2019-01-15 Apple Inc. User-specific acoustic models
DK201770432A1 (en) 2017-05-15 2018-12-21 Apple Inc. Hierarchical belief states for digital assistants
US10303715B2 (en) 2017-05-16 2019-05-28 Apple Inc. Intelligent automated assistant for media exploration
US10311144B2 (en) 2017-05-16 2019-06-04 Apple Inc. Emoji word sense disambiguation
US10403278B2 (en) 2017-05-16 2019-09-03 Apple Inc. Methods and systems for phonetic matching in digital assistant services
US10445429B2 (en) 2017-09-21 2019-10-15 Apple Inc. Natural language understanding using vocabularies with compressed serialized tries
US10403283B1 (en) 2018-06-01 2019-09-03 Apple Inc. Voice interaction at a primary device to access call functionality of a companion device
US20190371316A1 (en) 2018-06-03 2019-12-05 Apple Inc. Accelerated task performance

Family Cites Families (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2867589B2 * 1990-04-18 1999-03-08 Sumitomo Electric Industries, Ltd. Audio guide device
JP3269108B2 * 1991-03-22 2002-03-25 Omron Corporation Message display control apparatus and display control method, and output apparatus and output method
JPH08194889A (en) * 1995-01-20 1996-07-30 Mitsubishi Motors Corp Controller corresponding to road state ahead of automobile
US6208932B1 (en) * 1996-09-30 2001-03-27 Mazda Motor Corporation Navigation apparatus
JP3703050B2 * 1996-09-30 2005-10-05 Mazda Motor Corporation Navigation device
WO1998034210A1 (en) * 1997-02-04 1998-08-06 Mannesmann Ag Method for transmitting traffic information and devices for implementing said method
US6438561B1 (en) * 1998-11-19 2002-08-20 Navigation Technologies Corp. Method and system for using real-time traffic broadcasts with navigation systems
JP2002236029A (en) 2001-02-09 2002-08-23 Alpine Electronics Inc Vocal guidance device
KR100454922B1 (en) * 2001-10-31 2004-11-15 삼성전자주식회사 Navigation system for providing a real-time traffic information and method thereof
JP4110866B2 * 2002-07-17 2008-07-02 Hitachi, Ltd. Roadside equipment and message priority control device
US7251558B1 (en) * 2003-09-23 2007-07-31 Navteq North America, Llc Method and system for developing traffic messages
JP2006184257A (en) * 2004-12-28 2006-07-13 Nissan Motor Co Ltd System and method for providing information
KR100696802B1 (en) * 2005-02-16 2007-03-19 엘지전자 주식회사 Navigation guidance apparatus for Digital Multimedia Broadcasting and traffic information service method using its
JP4682658B2 2005-03-23 2011-05-11 Nissan Motor Co., Ltd. Voice guidance device and voice guidance method
KR100711866B1 (en) * 2005-05-18 2007-04-25 엘지전자 주식회사 Method and apparatus for providing prediction information on traffic and using the information
US7804860B2 (en) * 2005-10-05 2010-09-28 Lg Electronics Inc. Method of processing traffic information and digital broadcast system
CN101636759A * 2007-03-23 2010-01-27 Pioneer Corporation Information provision system, information management server, information management method, information management program, and storage medium
JP4441917B2 (en) * 2007-06-06 2010-03-31 アイシン・エィ・ダブリュ株式会社 Navigation device and program
US9978272B2 (en) * 2009-11-25 2018-05-22 Ridetones, Inc Vehicle to vehicle chatting and communication system
US8660788B2 (en) * 2009-12-09 2014-02-25 Telenav, Inc. Navigation system with audio and method of operation thereof

Also Published As

Publication number Publication date
EP2592388A3 (en) 2015-06-03
US20130116919A1 (en) 2013-05-09
EP2592388A2 (en) 2013-05-15
EP2592388B1 (en) 2019-09-04
US8855915B2 (en) 2014-10-07
JP2013101083A (en) 2013-05-23

Similar Documents

Publication Publication Date Title
US9086297B2 (en) Navigation system having maneuver attempt training mechanism and method of operation thereof
US10451427B1 (en) Using train telematics data to reduce accident risk
DE102012208988B4 (en) Fast collision detection technology for connected autonomous and manually controlled vehicles
JP6039159B2 (en) Method for operating navigation device, navigation device and program
US20090112458A1 (en) Navigation system and method for navigating route to destination
US8299915B1 (en) Transfer assistance for a passenger on a public conveyance
CN104468140A Methods, systems and apparatus for sharing information among a group of vehicles
US9691281B2 (en) Navigation system with image assisted navigation mechanism and method of operation thereof
US10126743B2 (en) Vehicle navigation route search system, method, and program
US8467962B2 (en) Navigation system and lane information display method
US10346372B2 (en) Point of interest database maintenance system
EP2783357B1 (en) User-assisted identification of location conditions
US20110022305A1 (en) Car navigation apparatus, a portable information terminal and a car navigation system
US9815476B2 (en) Method and apparatus for providing road surface friction data for a response action
CN102466487B Driving route guidance system, vehicle-mounted voice device, and route guidance method using the same
CN102538811A (en) Systems and methods for planning vehicle routes based on safety factors
JP2012247854A (en) Driving-evaluation system, driving-evaluating program, and driving-evaluation method
JP5263312B2 (en) Traffic jam judging device and vehicle control device
US9506763B2 (en) Method and apparatus for providing aggregated notifications for travel segments
KR20050051894A (en) Telematics system using image data and method for guiding path using the same
WO2012167069A1 (en) Combined radar and gps localization system
JP2017154725A (en) Information processing system, information processing method, and program
US10527450B2 Apparatus and method transitioning between driving states during navigation for highly automated vehicle
US8855915B2 (en) Navigation system, navigation apparatus, method and server
US20150379873A1 (en) Directional parking availability visualization system

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20131001

A711 Notification of change in applicant

Free format text: JAPANESE INTERMEDIATE CODE: A712

Effective date: 20140214

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20140319

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20140401

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20140514

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20150106

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20150109

R150 Certificate of patent or registration of utility model

Ref document number: 5681611

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150