WO2021152713A1 - In-vehicle device, vehicle-to-vehicle communication system, and program - Google Patents

In-vehicle device, vehicle-to-vehicle communication system, and program

Info

Publication number
WO2021152713A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
message data
information
voice
unit
Prior art date
Application number
PCT/JP2020/003070
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
智子 山口
喜義 久保
和則 青柳
能弘 荻田
Original Assignee
株式会社 東芝
東芝エネルギーシステムズ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社 東芝 and 東芝エネルギーシステムズ株式会社
Priority application: PCT/JP2020/003070 (WO2021152713A1)
Priority application: JP2021573679A (JP7279206B2)
Publication of WO2021152713A1

Classifications

    • G - Physics
    • G08 - Signalling
    • G08G - Traffic control systems
    • G08G 1/00 - Traffic control systems for road vehicles
    • G08G 1/09 - Arrangements for giving variable traffic instructions

Definitions

  • Embodiments of the present invention relate to in-vehicle devices, vehicle-to-vehicle communication systems, and programs.
  • Vehicle-to-vehicle communication has been studied for many years.
  • For example, the use of the 700 MHz band has been proposed.
  • Nevertheless, it has not achieved sufficient widespread use even to this day.
  • The main reasons are as follows. (1) There is a chicken-and-egg dilemma: users cannot enjoy the benefits unless adoption has already progressed to some extent, so initial adoption stalls. In other words, there is a "wall of widespread use" until the number of installed units exceeds a certain threshold. (2) The vehicle replacement cycle has lengthened, so it takes time for new technologies to penetrate the fleet.
  • the in-vehicle device includes a receiving unit and a voice synthesis unit.
  • the receiving unit receives from the other vehicle a command code corresponding to the intention of the driver of the other vehicle and message data including at least the position information of the other vehicle.
  • When it is determined, based on the position information included in the message data and the position information of the own vehicle, that the destination of the message data is the own vehicle, the voice synthesis unit synthesizes a voice message from the command code included in the message data.
  • FIG. 1 is a functional block diagram showing an example of an in-vehicle system according to an embodiment.
  • FIG. 2 is a diagram showing an example of a usage environment of the in-vehicle device 2.
  • FIG. 3 is a functional block diagram showing an example of the in-vehicle device 2.
  • FIG. 4 is a diagram showing an example of a message data format used in vehicle-to-vehicle communication according to the embodiment.
  • FIG. 5 is a flowchart for explaining the function of the traveling information calculation unit 24i.
  • FIG. 6 is a flowchart for explaining the function of the receiving unit 24g.
  • FIG. 7 is a flowchart for explaining the function of the transfer unit 24d.
  • FIG. 8 is a flowchart for explaining the function of the retweet unit 24c.
  • FIG. 9 is a flowchart for explaining the function of the transmission unit 24b.
  • FIG. 10 is a flowchart for explaining the function of the relative position calculation unit 24f.
  • FIG. 11 is a diagram showing an example of correspondence between the communication partner code and the relative positional relationship.
  • FIG. 12 is a flowchart for explaining the function of the voice recognition unit 24a.
  • FIG. 13 is a diagram showing an example of a voice command format.
  • FIG. 14 is a diagram showing an example of a keyword and a coded sequence for designating a conversation partner.
  • FIG. 15 is a diagram showing an example of a voice command and a coded sequence.
  • FIG. 16 is a conceptual diagram showing a voice synthesized and output by the voice synthesis unit 24e.
  • FIG. 17 is a flowchart for explaining the function of the voice synthesis unit 24e.
  • FIG. 18 is a diagram showing an example of the correspondence between the voice phrase generation character string and the command code.
  • FIG. 19 is a diagram showing an example of voice quality data defined for each vehicle type.
  • FIG. 20 is a diagram showing an example of voice quality data defined for each color of the vehicle.
  • FIG. 21 is a diagram showing an example of voice quality data defined for the driving history.
  • FIG. 22 is a diagram showing an example of keywords expressing the relative positions of the own vehicle and the opponent vehicle.
  • FIG. 23 is a diagram showing an example of voice message data for dialects in Hiroshima Prefecture.
  • FIG. 24 is a diagram showing an example of voice message data for dialects of Osaka Prefecture.
  • FIG. 1 is a functional block diagram showing an example of an in-vehicle system according to an embodiment.
  • The vehicle-to-vehicle communication system may include various elements, such as hardware (an in-vehicle device) and a program that implements the functions executed by that hardware.
  • the in-vehicle system is formed with the in-vehicle device 2 mounted on the vehicle as the core.
  • the in-vehicle device 2 can be mounted on a vehicle equipped with a car navigation system 3.
  • a GPS (Global Positioning System) antenna 1 for the car navigation system 3 and a microphone 5 for acquiring voice data of the driver 6 are connected to the in-vehicle device 2.
  • a plurality of vehicle speakers 4, 7, 8 and 9 are connected to the car navigation system 3.
  • Speakers 4, 7, 8 and 9 are installed on the left front (front L), right front (front R), left rear (rear L), and right rear (rear R) of the driver 6, respectively.
  • Speakers 4, 7, 8 and 9 allow the audio equipment to localize the sound image at any position in the vehicle interior space.
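The localization described above can be sketched as constant-power amplitude panning across the four speakers. This is only an illustrative sketch, not the patent's implementation; the speaker names and the gain law are assumptions.

```python
import math

def quad_pan_gains(azimuth_deg: float) -> dict:
    """Constant-power panning gains for four speakers (front L/R, rear L/R).

    azimuth_deg: direction of the sound image as seen by the driver,
    0 = straight ahead, 90 = right, 180 = behind, 270 = left.
    """
    az = math.radians(azimuth_deg)
    x = (1.0 + math.sin(az)) / 2.0   # 0 = full left, 1 = full right
    y = (1.0 + math.cos(az)) / 2.0   # 0 = full rear, 1 = full front
    return {
        "front_L": math.sqrt((1.0 - x) * y),
        "front_R": math.sqrt(x * y),
        "rear_L":  math.sqrt((1.0 - x) * (1.0 - y)),
        "rear_R":  math.sqrt(x * (1.0 - y)),
    }
```

The square-root law keeps the sum of squared gains equal to 1 for any azimuth, so the perceived loudness stays constant as the image moves around the cabin.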
  • the in-vehicle device 2 is a so-called embedded computer provided with a CPU (Central Processing Unit) 24 and a memory 25.
  • the in-vehicle device 2 further includes a GPS module 21, a built-in antenna 22, and a wireless module 23.
  • the GPS module 21 takes in the signal from the GPS antenna 1, branches internally, and outputs the signal to the car navigation system 3.
  • the built-in antenna 22 is used for the wireless module 23 to communicate with an in-vehicle device of another vehicle.
  • FIG. 2 is a diagram showing an example of a usage environment of the in-vehicle device 2.
  • The in-vehicle device 2 is attached to, for example, the dashboard as a simple "bolt-on" installation. Preferably, it is mounted below the rear-view mirror, in a position that does not obstruct the driver's field of vision.
  • The vehicle-to-vehicle communication system of the embodiment is formed by a plurality of vehicles equipped with the in-vehicle device 2 communicating with one another. In the embodiment, it is assumed that the in-vehicle devices 2 of the vehicles exchange message data in a predetermined format with each other.
  • The plurality of in-vehicle devices 2 can exchange message data with each other using, for example, existing specified low-power wireless communication infrastructure.
  • a typical example of this type of infrastructure is the smart meter communication infrastructure in Japan.
  • smart meter communication infrastructure in Japan uses the 900 MHz band to mediate the transfer of data between devices.
  • The driver of the vehicle commands the in-vehicle device 2 by voice, for example, "Tell me the average vehicle speed!" (voice command). The in-vehicle device 2 then responds automatically, for example, "The average vehicle speed is ○○ km/h in the left lane and ○○ km/h in the right lane" (automatic response). Based on this, the driver can decide whether or not to change lanes, for example, "If the difference is only a minute, let's stay in this lane."
  • FIG. 3 is a functional block diagram showing an example of the in-vehicle device 2.
  • The in-vehicle device 2 includes a voice recognition unit 24a, a transmission unit 24b, a retweet unit 24c, a transfer unit 24d, a voice synthesis unit 24e, a relative position calculation unit 24f, a reception unit 24g, a speaker connection unit 24h, a traveling information calculation unit 24i, and a GPS receiving unit 24j.
  • These functional blocks are realized as processing functions of the CPU 24.
  • the voice synthesis unit 24e is connected to the speakers 4, 7, 8 and 9 via the speaker connection unit 24h.
  • The in-vehicle device 2 includes a keyword storage unit 25a, a dialect data storage unit 25b, a vehicle type voice quality definition data storage unit 25c, a driver attribute voice quality definition data storage unit 25d, a command definition data storage unit 25e, an own vehicle traveling information storage unit 25f, an other vehicle traveling information storage unit 25g, an own vehicle setting information storage unit 25h, and a driver setting information storage unit 25i. These can be understood as storage areas provided in the memory 25.
  • the memory 25 also stores a program including instructions for realizing each function of the CPU 24.
  • the receiving unit 24g receives the message data transmitted from another vehicle.
  • the message data includes at least a command code corresponding to the intention of the driver of the other vehicle and the position information of the other vehicle.
  • FIG. 4 is a diagram showing an example of a message data format used in vehicle-to-vehicle communication according to the embodiment.
  • the command code is described in, for example, the tenth field of the message data.
  • the position information of the other vehicle is described in each of a plurality of (for example, (1) to (N)) other vehicle travel history information.
  • the other vehicle traveling history information is data having a structure in which a plurality of historical data (1) to (m) of the other vehicle are associated with the identification information (for example, serial number) of the other vehicle.
  • Each piece of historical data associates a time stamp with position information represented by latitude/longitude, a moving direction, and a moving speed. This information is referred to as traveling information.
  • the traveling information is extracted from the message data by the receiving unit 24g and stored in the other vehicle traveling information storage unit 25g.
  • the message data also includes fields such as a retweet flag, caller identification information, caller latitude / longitude, call time stamp, communication partner code, and the like.
  • the outgoing vehicle type in the 8th field indicates the attribute of the vehicle that sent the message data.
  • Vehicle attributes include, for example, vehicle type, color, model year, and type.
  • Vehicle types include heavy trucks, medium trucks, ..., ordinary cars, and light cars. Colors are classified into, for example, red, blue, and so on. These pieces of information are stored in advance in the own vehicle setting information storage unit 25h, for example at the time of initial setting of the in-vehicle device 2.
  • the driver attribute of the calling vehicle in the ninth field indicates the driver attribute (driver attribute) of the vehicle that transmitted the message data.
  • the driver attribute is information indicating the characteristics of the driver, such as male, female, age, veteran, and beginner. These pieces of information are stored in advance in the driver setting information storage unit 25i, for example, at the time of initial setting of the in-vehicle device 2.
  • the configuration of the message data shown in FIG. 4 is an example, and does not necessarily indicate the specific content of the communication message in actual communication, and the correspondence between the numbers and each field is only an example.
  • In actual communication, data compression and similar processing are applied as appropriate for efficiency.
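The message format of FIG. 4 can be modeled in memory roughly as follows. This is a hypothetical sketch: field names are paraphrased from the description above, and the actual field order, bit widths, and compression are not specified here.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class TravelRecord:
    """One history entry: a time stamp tied to position and motion."""
    timestamp: float      # seconds since epoch
    latitude: float
    longitude: float
    heading_deg: float    # moving direction, 0 = north
    speed_kmh: float      # moving speed

@dataclass
class VehicleHistory:
    """Travel history (1)..(m) keyed by the vehicle's identification info."""
    vehicle_id: str       # e.g. serial number
    records: List[TravelRecord] = field(default_factory=list)

@dataclass
class MessageData:
    """Hypothetical in-memory model of the FIG. 4 message fields."""
    retweet_flag: int     # 0: 0th hop, 1: 1st hop, 2: retweet completed
    caller_id: str        # caller identification information
    caller_lat: float
    caller_lon: float
    call_timestamp: float
    partner_code: int     # communication partner code (FIG. 11)
    vehicle_type: str     # outgoing vehicle type (8th field)
    driver_attr: str      # driver attribute of the calling vehicle (9th field)
    command_code: int     # driver's intention (10th field)
    own_history: Optional[VehicleHistory] = None
    other_histories: List[VehicleHistory] = field(default_factory=list)
```

On the wire these fields would be packed and compressed; the dataclass view is only meant to make the nesting (message → per-vehicle history → timed records) explicit.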
  • The voice synthesis unit 24e determines whether the destination of the message data is the own vehicle, based on the position information of the transmitting vehicle included in the received message data and the position information of the own vehicle acquired from the GPS receiving unit 24j. When it determines that the destination is the own vehicle, the voice synthesis unit 24e synthesizes a voice message from the command code of the received message data. The synthesized voice message is output aloud from the speakers 4, 7, 8 and 9.
  • the transmission unit 24b transmits the message data to another vehicle.
  • the message data includes at least a command code corresponding to the intention of the driver of the own vehicle (caller) and the position information (latitude / longitude) of the own vehicle.
  • the message data also includes the identification information of the calling vehicle and the time stamp at the time of sending.
  • the transmitting unit 24b and the receiving unit 24g use the smart meter communication board 100 to exchange message data between the own vehicle and another vehicle.
  • Each electric power company in Japan is promoting the introduction of smart meters, and installation in almost all households in Japan is expected to be completed between 2020 and 2023.
  • The majority of these meters communicate via specified low-power radio in the 920 MHz band, and coverage is expected to reach 30 to 50 million households. A highly reliable and inexpensive board can therefore be used as the smart meter communication board 100.
  • The 920 MHz band specified low-power radio has the following features. (1) No radio station license is required up to an output of 20 mW. (2) The frequency band corresponds to the so-called platinum band of mobile phones and has relatively good propagation around buildings. (3) Since the radio waves do not travel very far, it is suitable for relatively short-range communication such as vehicle-to-vehicle communication. These features make the smart meter communication board 100 well suited to vehicle-to-vehicle communication as well. By exploiting such features and using a highly reliable, inexpensive communication function, the "wall of widespread use" is expected to be overcome.
  • the voice recognition unit 24a recognizes the voice emitted by the driver of the own vehicle.
  • The recognition result is sent to the transmission unit 24b.
  • The transmission unit 24b stores a command code based on the voice recognition result in message data and transmits it to other vehicles.
  • the traveling information calculation unit 24i acquires the position information of the own vehicle from the GPS receiving unit 24j, and calculates the traveling information of the own vehicle by associating the position information, the moving direction, and the moving speed of the own vehicle with a time stamp.
  • the own vehicle travel information is stored in the own vehicle travel information storage unit 25f, and is stored in message data by the transmission unit 24b and transmitted.
  • The transfer unit 24d forwards the message data received from another vehicle to other vehicles by one additional hop (HOP).
  • the retweet unit 24c further transfers the message data received from the other vehicle to another vehicle according to the intention of the driver of the own vehicle.
  • The relative position calculation unit 24f calculates the position, relative to the own vehicle, of the communication partner (transmission source) that transmitted the message data, based on the position information included in the message data received from the other vehicle and the position information of the own vehicle.
  • the voice synthesis unit 24e determines whether or not the destination of the message data transmitted from the transmission source is the own vehicle based on the relative position.
  • FIG. 5 is a flowchart for explaining the function of the traveling information calculation unit 24i.
  • the traveling information calculation unit 24i periodically acquires the position information of the own vehicle from the GPS receiving unit 24j (step S91) and gives a time stamp (step S92).
  • the traveling information calculation unit 24i reads the previous value of the own vehicle position information from the own vehicle traveling information storage unit 25f (step S93), and compares it with the position information acquired this time. Then, the speed and the moving direction of the own vehicle are calculated based on the change of the position information with respect to time (step S94).
  • The traveling information calculation unit 24i adds a time stamp to the obtained result and cyclically stores it in the own vehicle traveling information storage unit 25f as [own vehicle traveling information] (step S95). That is, by erasing the oldest entry and saving the newest, a fixed number of entries is retained, ordered from newest to oldest. The above procedure is repeated in the processing loop (LOOP).
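Steps S91 to S95 can be sketched as follows: speed and heading derived from two consecutive timed GPS fixes, and a ring buffer for the cyclic storage. The equirectangular approximation and the buffer depth `m` are assumptions for illustration, not values from the patent.

```python
import math
from collections import deque

EARTH_R = 6_371_000.0  # mean Earth radius, metres

def speed_and_heading(lat1, lon1, t1, lat2, lon2, t2):
    """Speed (km/h) and heading (deg, 0 = north) from two timed GPS fixes.

    Uses a flat-earth (equirectangular) approximation, which is adequate
    for fixes taken a few seconds apart (step S94).
    """
    phi = math.radians((lat1 + lat2) / 2.0)
    dx = math.radians(lon2 - lon1) * math.cos(phi) * EARTH_R  # east, metres
    dy = math.radians(lat2 - lat1) * EARTH_R                  # north, metres
    dist = math.hypot(dx, dy)
    dt = t2 - t1
    speed_kmh = (dist / dt) * 3.6 if dt > 0 else 0.0
    heading = math.degrees(math.atan2(dx, dy)) % 360.0
    return speed_kmh, heading

# Cyclic storage (step S95): the deque silently drops the oldest entry
# once m entries are held, so only the newest m records survive.
m = 8  # illustrative buffer depth
own_travel_log = deque(maxlen=m)
```

Each loop iteration would append `(timestamp, lat, lon, heading, speed)` to `own_travel_log`; the `maxlen` bound implements the "erase oldest, save newest" behavior directly.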
  • the history information of [own vehicle driving information] saved cyclically is inserted into the message data shown in FIG. 4, and information is exchanged between vehicles.
  • the traveling information of the own vehicle is received by another vehicle, and is cyclically recorded as [other vehicle traveling information] in the receiving vehicle, and becomes the history information of [other vehicle traveling information].
  • Since every message received from another vehicle carries the [own vehicle traveling information] of the transmitting vehicle, [other vehicle traveling information] from a variety of vehicles can be collected and stored.
  • A collection of such traveling information can serve as information indicating nearby traffic conditions, from which value-added information can be extracted and provided. Moreover, this information can be collected and provided in an autonomous, decentralized manner; a major advantage is that no overall management system, such as a traffic control center, is required.
  • FIG. 6 is a flowchart for explaining the function of the receiving unit 24g.
  • the receiving unit 24g acquires message data exchanged by vehicle-to-vehicle communication, stores necessary information, and executes necessary actions.
  • the receiving unit 24g waits for the arrival of received data (message data) from another vehicle (step S1).
  • From the received message data, the receiving unit 24g extracts various information such as [own vehicle traveling information], [other vehicle traveling information], [communication partner code], [command code], and [retweet flag].
  • If extraction of the [own vehicle traveling information] succeeds, that is, if the message data includes [own vehicle traveling information] (step S2: YES), the receiving unit 24g records its contents in the other vehicle traveling information storage unit 25g (step S3). Likewise, if the message data includes [other vehicle traveling information] (step S4: YES), the receiving unit 24g records its contents in the other vehicle traveling information storage unit 25g (step S5). That is, from the own vehicle's point of view, both the [own vehicle traveling information of the other vehicle] and the [other vehicle traveling information of the other vehicle] are other vehicle traveling information.
  • The receiving unit 24g determines whether the [retweet flag] of the message data holds the value (0) indicating the 0th hop (step S6); if YES, it passes the received message data to the transfer unit 24d so that it is forwarded by exactly one hop (step S7). The value indicating the first hop is written into the [retweet flag] of the forwarded message data.
  • If NO in step S6, the receiving unit 24g determines whether the command described in the received message data is a retweet target (step S8). If NO, the receiving unit 24g sends the message data to the relative position calculation unit 24f to calculate the relative position of the transmitting vehicle (step S11). If YES in step S8, the receiving unit 24g determines whether the value (2) indicating (retweet completed) is written in the [retweet flag] (step S9). If NO, the receiving unit 24g sends the message data to the retweet unit 24c, which forwards (retweets) it by exactly one hop (step S10). If YES in step S9, the receiving unit 24g sends the message data to the relative position calculation unit 24f to calculate the relative position of the transmitting vehicle (step S11). The above procedure is repeated in the processing loop (LOOP).
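The branching of FIG. 6 (steps S6 to S11) can be condensed into a small dispatcher. The flag values follow the description above; the function and return-value names are assumptions made for this sketch.

```python
def dispatch_received(retweet_flag: int, command_is_retweet_target: bool) -> str:
    """Decide what to do with received message data (FIG. 6, steps S6-S11).

    Returns one of: "forward_one_hop", "retweet", "relative_position".
    Flag values: 0 = 0th hop, 1 = 1st hop, 2 = retweet completed.
    """
    if retweet_flag == 0:                  # step S6: fresh message
        return "forward_one_hop"           # step S7 (flag becomes 1)
    if command_is_retweet_target:          # step S8
        if retweet_flag != 2:              # step S9: not yet retweeted
            return "retweet"               # step S10
    return "relative_position"             # step S11
```

The dispatcher makes the invariant visible: every message eventually reaches the relative position calculation unit, and forwarding happens at most once per message per vehicle.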
  • FIG. 7 is a flowchart for explaining the function of the transfer unit 24d.
  • The transfer unit 24d waits for the arrival of message data (step S21); when message data arrives (YES), it sets the [retweet flag] of the received message data to the value (1) indicating the first hop. The transfer unit 24d then sends the message data to the transmission unit 24b (step S23), which transmits it to other vehicles for exactly one hop. The above procedure is repeated in the processing loop (LOOP).
  • the following effects can be obtained by limiting the transfer to only one hop. That is, the following problems that occur when a large number of transfers of two or more hops are repeated can be avoided.
  • (A) When transfers are repeated many times, an infinite transfer loop can occur. Under such circumstances the amount of forwarded data grows explosively, a so-called broadcast storm occurs, and practical communication becomes impossible.
  • (B) Even when no infinite loop occurs, the number of communication nodes (vehicles) participating in the transfer can grow exponentially, even in cases where only a small fraction of those nodes actually needs the transferred information.
  • By limiting the number of transfer hops to exactly one, the system can be implemented extremely easily and simply. A simple, inexpensive in-vehicle device that can simply be "bolted on" becomes easy to realize, which helps to overcome the "wall of widespread use".
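The effect of the one-hop limit can be illustrated with a toy flood simulation. This is not the patent's protocol, only a sketch under a simplifying assumption (a fully connected radio zone where every node rebroadcasts everything it hears): with the hop flag, total transmissions stay bounded by the number of nodes, while unrestricted relaying grows geometrically.

```python
def flood(n_nodes: int, one_hop_limit: bool, max_rounds: int = 10) -> int:
    """Count transmissions when every node rebroadcasts what it hears.

    Toy model: one original sender, all nodes in radio range of each other.
    With the one-hop limit, a relayed message (flag >= 1) is never relayed
    again, so the flood dies out after a single round of relaying.
    """
    transmissions = 1                 # original send, flag = 0
    pending = [(0, 0)]                # (sender, flag) messages in the air
    for _ in range(max_rounds):
        next_pending = []
        for sender, flag in pending:
            for node in range(n_nodes):
                if node == sender:
                    continue
                if one_hop_limit and flag >= 1:
                    continue          # already relayed once: drop
                transmissions += 1
                next_pending.append((node, flag + 1))
        pending = next_pending
        if not pending:
            break
    return transmissions
```

With 5 nodes the limited flood settles at 5 transmissions (one send plus one relay per neighbor), whereas the unlimited variant multiplies by roughly the node count every round, the broadcast-storm behavior described in (A).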
  • FIG. 8 is a flowchart for explaining the function of the retweet unit 24c.
  • When message data is received (step S31: YES), the retweet unit 24c temporarily stores it in a buffer memory (not shown) or the like (step S32), sends it to the voice synthesis unit 24e, and then executes the procedure of step S34. If there is no received data (step S31: NO), the process proceeds directly to step S34.
  • In step S34, the retweet unit 24c acquires the recognition result of the voice recognition unit 24a and determines whether the result is "retweet". This step decides whether to forward (that is, retweet) the message data to other vehicles according to the intention of the driver of the own vehicle.
  • If YES in step S34, the retweet unit 24c sends the message data buffered in step S32 to the transmission unit 24b, which transmits it to an unspecified number of other vehicles (vehicles within the radio zone) (step S35). That is, when the driver expresses the intention to retweet, the retweet unit 24c passes the message data received from the other vehicle to the transmission unit 24b, which transmits it to other vehicles. The above procedure is repeated in the processing loop (LOOP).
  • By adopting the retweet culture of SNS (Social Networking Services), the system can be expected to be a strong draw for young people in particular, which also helps to overcome the "wall of widespread use".
  • FIG. 9 is a flowchart for explaining the function of the transmission unit 24b.
  • When the transmission unit 24b receives a data transmission request from the transfer unit 24d, the voice recognition unit 24a, or the retweet unit 24c, it creates message data and transmits it to other vehicles.
  • the processing procedure of the transmission unit 24b includes three determination blocks (steps S41, S43, S48) in the loop.
  • In step S41, the transmission unit 24b waits for a transmission request from the transfer unit 24d; in step S43, for a request from the voice recognition unit 24a; and in step S48, for a request from the retweet unit 24c. When a request arrives, processing jumps to the handling corresponding to that step.
  • If there is a transmission request from the transfer unit 24d in step S41 (YES), the transmission unit 24b requests the smart meter communication board 100 to transmit the data (step S42). If there is a transmission request from the voice recognition unit 24a in step S43 (YES), the transmission unit 24b reads the own vehicle traveling information from the own vehicle traveling information storage unit 25f (step S44), reads the other vehicle traveling information from the other vehicle traveling information storage unit 25g (step S45), and creates the message data to be transmitted (step S46). The transmission unit 24b then requests the smart meter communication board 100 to transmit this message data (step S47).
  • If there is a transmission request from the retweet unit 24c in step S48 (YES), the transmission unit 24b creates message data in which the retweet flag is set (that is, set to 1) (step S49), and requests the smart meter communication board 100 to transmit this message data (step S50).
  • FIG. 10 is a flowchart for explaining the function of the relative position calculation unit 24f.
  • the relative position calculation unit 24f waits for the arrival of message data from another vehicle (step S61).
  • The relative position calculation unit 24f reads the [own vehicle traveling information] shown in FIG. 4 from the received message data and acquires the position, speed, and direction of the transmitting vehicle (step S62).
  • the relative position calculation unit 24f acquires the position, speed, and direction of the own vehicle from the own vehicle traveling information storage unit 25f (step S63).
  • Using this information, the relative position calculation unit 24f calculates the relative positional relationship, relative speed, and relative moving direction between the own vehicle and the communication partner vehicle that transmitted the message data (step S64).
  • The relative position calculation unit 24f then determines whether the caller's designated position described in the communication partner code (FIG. 4: 5th field) of the received message data matches the calculated relative positional relationship (step S65). If they match (YES), the relative position calculation unit 24f determines that this message data was transmitted for the own vehicle, and sends the message data and the various information extracted from it to the voice synthesis unit 24e (step S66). The above procedure is repeated in the processing loop (LOOP).
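The core of steps S62 to S65 is turning two GPS positions and the own heading into a coarse sector such as "front" or "back left". The sketch below makes assumptions not in the patent (flat-earth bearing, eight 45-degree sectors, English sector names); it only illustrates the kind of quantization that the communication partner code would be matched against.

```python
import math

def relative_sector(own_lat, own_lon, own_heading_deg,
                    src_lat, src_lon) -> str:
    """Coarse relative position of the sender as seen from the own vehicle.

    Bearing is computed on a flat-earth approximation (fine at V2V ranges),
    rotated into the own vehicle's frame, and quantized to 8 sectors.
    """
    phi = math.radians((own_lat + src_lat) / 2.0)
    dx = (src_lon - own_lon) * math.cos(phi)    # east component
    dy = src_lat - own_lat                      # north component
    bearing = math.degrees(math.atan2(dx, dy)) % 360.0
    rel = (bearing - own_heading_deg) % 360.0   # 0 = dead ahead
    sectors = ["front", "front right", "right", "back right",
               "back", "back left", "left", "front left"]
    return sectors[int(((rel + 22.5) % 360.0) // 45.0)]
```

Matching against the communication partner code (step S65) would then reduce to comparing this sector with the sector the sender addressed, e.g. a message addressed to "all in front" matches receivers whose computed sector for the sender is "back".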
  • FIG. 11 is a diagram showing an example of correspondence between the communication partner code and the relative positional relationship.
  • The positional relationship between the sender and the recipient of the message data is distinguished in detail, for example, [all], [all in front], [all in back], ..., [back left], [back right].
  • the communication partner code is associated with each position in advance.
  • Based on the information sent from the relative position calculation unit 24f, the voice synthesis unit 24e expresses in voice the relative position centered on the own vehicle and the command content of the message data, and conveys them to the driver of the own vehicle.
  • FIG. 12 is a flowchart for explaining the function of the voice recognition unit 24a.
  • the voice recognition unit 24a acquires voice data emitted by the driver of the own vehicle from the microphone 5 (FIG. 1) and performs voice recognition.
  • For speech recognition, existing techniques such as those shown in Non-Patent Documents 1 and 2 can be applied.
  • the voice recognition unit 24a determines whether or not the voice acquired from the microphone 5 has a keyword for designating a communication partner (step S81).
  • The format of the voice command is fixed in advance.
  • FIG. 13 is a diagram showing an example of a voice command format.
  • the voice command includes two fields, "keyword to specify the conversation partner" and "command".
  • FIG. 14 is a diagram showing an example of a keyword and a coded sequence for designating a conversation partner.
  • Keywords such as "all", "everyone ahead", "everyone behind", and so on are associated with the codes "0", "1", "2", ..., that identify each keyword.
  • This information is stored in advance in the keyword storage unit 25a (FIG. 3).
  • FIG. 15 is a diagram showing an example of voice commands and their coded sequences. Commands are information in which common codes such as "1", "2", "11", and "12" are assigned to each of multiple categories such as "thank you", "reply", "prompt", and "direction". This information is stored in advance in the command definition data storage unit 25e (FIG. 3).
  • If so, the voice recognition unit 24a reads the code identifying the communication partner (communication partner code) corresponding to the keyword from the keyword storage unit 25a (step S82). Next, the voice recognition unit 24a determines whether the voice following the communication partner code is a command (step S83); if YES, it reads the code identifying the command (command code) from the command definition data storage unit 25e.
  • the voice recognition unit 24a notifies the transmission unit 24b of the read communication partner code and the command code, and requests transmission of a data message including both codes (step S85).
  • The communication partner code and the command code are stored in the corresponding fields of the message data format. The above procedure is repeated in the processing loop (LOOP).
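The fixed two-field command format (FIG. 13) makes the recognition step essentially a table lookup. The tables below are hypothetical stand-ins for the keyword storage unit 25a and the command definition data storage unit 25e; the actual vocabularies and codes are defined by FIGS. 14 and 15.

```python
# Hypothetical tables modeled on FIGS. 14 and 15 (illustrative entries only).
PARTNER_CODES = {"all": 0, "everyone ahead": 1, "everyone behind": 2}
COMMAND_CODES = {"thank you": 1, "after you": 2, "watch out": 11}

def parse_voice_command(text: str):
    """Split a recognized utterance into (partner_code, command_code).

    The voice command format is fixed in advance: a partner keyword
    followed by a command phrase (FIG. 13). Returns None if either
    part is not in the predefined vocabulary.
    """
    for keyword, pcode in PARTNER_CODES.items():
        if text.startswith(keyword):
            rest = text[len(keyword):].strip(" ,")
            ccode = COMMAND_CODES.get(rest)
            if ccode is not None:
                return pcode, ccode
    return None
```

Because only vocabulary hits produce codes, free-form speech that matches no table entry simply results in no transmission, which keeps the over-the-air payload restricted to the predefined code space.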
  • the voice synthesis unit 24e has three major functions. That is, it has a function of transmitting the meaning of the received command to the driver of the own vehicle, a function of transmitting the position of the other party from which the message is sent to the driver of the own vehicle, and a function of synthesizing anthropomorphic voice.
  • the voice synthesized and output by the voice synthesis unit 24e mainly includes two parts, as shown in FIG. 16, for example.
  • FIG. 16 is a conceptual diagram showing a voice synthesized and output by the voice synthesis unit 24e.
  • The voice message output aloud in the vehicle has a [keyword part] designating the reception partner and a [phrase part] expressing the meaning of the voice command.
  • FIG. 17 is a flowchart for explaining the function of the voice synthesis unit 24e.
  • The voice synthesis unit 24e determines whether message data has been received from another vehicle (step S71); if there is message data (YES), it extracts the command code included in the received message data (step S72).
  • The voice synthesis unit 24e acquires, from the command definition data storage unit 25e, the character string for generating the voice phrase corresponding to the extracted command code (step S73). If, for example, the command code "1" is extracted in step S72, the message means "thank you", and a corresponding character string such as "thank you" is acquired.
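The receive-side lookup of steps S71 to S73 can be sketched as a simple table lookup; the code-to-phrase table below is an invented stand-in for the contents of FIG. 18, not the patent's actual data.

```python
# Minimal sketch of steps S72-S73 on the receiving side: extract the
# command code from received message data and look up the character
# string used to generate the voice phrase.

CODE_TO_PHRASE = {   # invented stand-in for the table of FIG. 18
    "1": "thank you",
    "2": "after you",
    "11": "please go",
}

def phrase_for_message(message_data):
    command_code = message_data["command_code"]          # step S72
    # Only predefined strings can ever be synthesized, which is what
    # blocks slanderous or inciting free-form messages.
    return CODE_TO_PHRASE.get(command_code, "")          # step S73

print(phrase_for_message({"command_code": "1"}))  # → thank you
```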
  • FIG. 18 is a diagram showing an example of the correspondence between voice-phrase generation character strings and command codes. This information is stored in advance in the command definition data storage unit 25e. As shown in FIG. 18, a character string with a predetermined pattern is assigned to each command code. In other words, whatever voice the transmitting vehicle emits, only a specific pattern of voice is generated in the receiving vehicle. This prevents slanderous messages and exchanges that incite other drivers. The above realizes the function of conveying the meaning of the received command to the driver of the own vehicle. Returning to FIG. 17, the voice synthesis unit 24e then reads out the transmitting vehicle type from the received message data (step S74).
  • The voice synthesis unit 24e reads out the voice quality definition data corresponding to the transmitting vehicle type from the per-vehicle-type voice quality definition data storage unit 25c (step S75).
  • FIG. 19 is a diagram showing an example of voice quality data defined for each vehicle type. For a large truck, for example, information such as [pitch] = (low), [huskiness] = (large), and [harmonics] = (few) is defined in advance.
  • FIG. 20 is a diagram showing an example of voice quality data defined for each body color of the vehicle. For reddish colors, for example, information evoking a youthful image, such as [pitch] = (high), [huskiness] = (none), and [harmonics] = (few), is defined in advance.
  • A driver forms a rough image of another driver from the other vehicle's model and color. For example, seeing a large black truck, one might imagine a large male driver with a deep voice; seeing a pink compact car, one might imagine a petite female driver with a thin, high voice.
  • The vehicle type and exterior color of the own vehicle are registered in the in-vehicle device 2, and this information is transmitted as the transmitting-vehicle attribute information in vehicle-to-vehicle communication.
  • The receiving side can support identification of the transmitting vehicle by synthesizing voice whose impression matches that vehicle type and exterior color.
  • the voice synthesis unit 24e reads out the driver attribute from the ninth field (FIG. 4) of the received message data (step S76).
  • The voice synthesis unit 24e reads out the voice quality definition data corresponding to the driver attribute from the per-driver-attribute voice quality definition data storage unit 25d (step S77).
  • FIG. 21 is a diagram showing an example of voice quality data defined for driving history. For a veteran driver, for example, calm-sounding information such as [pitch] = (medium), [huskiness] = (none), and [harmonics] = (few) is defined in advance.
  • The voice synthesis unit 24e synthesizes voice from the generation character string according to the combination of the read voice quality definition data (step S78), thereby, so to speak, "personifying" the transmitting vehicle.
  • the above procedure is repeated in the processing loop (LOOP).
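The layering of voice-quality definition data across steps S74 to S78 might be sketched as follows. The parameter names, values, and merge rule (later tables override earlier ones) are all assumptions for illustration, since the patent leaves the exact combination method open; compare FIGS. 19 to 21 for the kind of data involved.

```python
# Illustrative sketch of steps S74-S78: layer voice-quality definition
# data for vehicle type, body color, and driver attribute into a single
# parameter set for the synthesizer. All entries are invented examples.

VOICE_BY_VEHICLE_TYPE = {"large truck": {"pitch": "low", "huskiness": "large"}}
VOICE_BY_COLOR = {"red": {"pitch": "high", "huskiness": "none"}}
VOICE_BY_DRIVER = {"veteran": {"pitch": "medium", "calmness": "high"}}

def combined_voice_quality(vehicle_type, color, driver_attr):
    """Merge the three definition tables; here later tables simply
    override earlier ones, one possible combination rule."""
    params = {}
    for table, key in ((VOICE_BY_VEHICLE_TYPE, vehicle_type),
                       (VOICE_BY_COLOR, color),
                       (VOICE_BY_DRIVER, driver_attr)):
        params.update(table.get(key, {}))
    return params

print(combined_voice_quality("large truck", "red", "veteran"))
# → {'pitch': 'medium', 'huskiness': 'none', 'calmness': 'high'}
```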
  • The voice synthesis unit 24e also synthesizes and outputs the keyword (FIG. 14) specifying the reception partner, based on the relative position information of the own vehicle and the partner vehicle sent from the relative position calculation unit 24f. This makes it possible to accurately inform the driver of the own vehicle of the relative position of the partner vehicle.
  • FIG. 22 is a diagram showing an example of a keyword expressing the relative position between the own vehicle and the opponent vehicle.
  • For example, a keyword that specifies the other party, such as "the car in front", is used.
  • The keywords are not limited to the terms shown in FIG. 22; any term may be used as long as, when heard as voice, it can be associated with the positional relationship between the own vehicle and the partner vehicle.
  • The voice synthesis unit 24e also forms a three-dimensional sound field in the cabin of the own vehicle. That is, based on the relative positional relationship between the communication partner and the own vehicle, the voice synthesis unit 24e synthesizes a three-dimensional sound image at the position where the driver would perceive the communication partner. This helps the driver identify the communication partner.
  • The sound image position may change with time, and a Doppler effect may be applied according to the movement of the communication partner. That is, the position and pitch of the sound image may be changed to reflect the communication partner's relative speed and direction of movement, making it easier for the driver to identify the communication partner.
  • For example, a message from a vehicle traveling in the oncoming lane can be synthesized as a sound image moving from front right to rear right with a Doppler effect, allowing the driver to easily recognize it as a message from an oncoming vehicle.
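The sound-image placement and Doppler effect described above can be illustrated with a simple calculation. The azimuth convention (0 degrees = dead ahead, positive to the right) and the classical Doppler formula are our assumptions for the sketch, not the patent's specification.

```python
import math

# Hedged sketch: place the synthesized voice at an azimuth matching the
# partner's relative position, and shift its pitch by a simple Doppler
# factor f' = f * c / (c - v_approach).

SPEED_OF_SOUND = 343.0  # m/s, assumed value for the sketch

def sound_image(rel_x, rel_y, approach_speed, base_pitch_hz):
    """rel_x: metres to the right of the own vehicle, rel_y: metres ahead.
    approach_speed: closing speed in m/s (positive = approaching)."""
    azimuth_deg = math.degrees(math.atan2(rel_x, rel_y))  # 0 deg = dead ahead
    pitch_hz = base_pitch_hz * SPEED_OF_SOUND / (SPEED_OF_SOUND - approach_speed)
    return azimuth_deg, pitch_hz

# Oncoming vehicle ahead-right, closing at 20 m/s: the voice comes from
# the right front with raised pitch; once it passes and recedes, the
# closing speed goes negative and the pitch drops below the base pitch.
az, f = sound_image(10.0, 30.0, 20.0, 220.0)
print(round(az, 1), round(f, 1))
```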
  • Anthropomorphization can be realized, for example, by implementing in the voice synthesis unit 24e a function for parameterizing and adjusting the voice quality used for speech synthesis.
  • Anthropomorphization can support the driver in identifying the communication partner from another aspect.
  • Anthropomorphization in the embodiment means giving the voice a virtual personality by controlling the voice quality used for voice synthesis.
  • As described in Non-Patent Document 1, with recent advances in speech synthesis technology, it has become possible to synthesize expressive voice by selecting speakers of a wide range of ages and genders to create speech data, combining emotions such as joy, anger, sadness, and pleasure, and adjusting the intonation and speed of the voice.
  • As described in Non-Patent Document 2, it is also possible to synthesize voice based on the voice data of a specific person. By repeatedly communicating with anthropomorphized voice, the driver becomes able to identify the communication partner from voice quality alone and to recognize the partner as "the usual person".
  • Driver attributes usable for anthropomorphization include, for example: (1) driving history, (2) gender and age, and (3) prefecture of origin. <Personification according to driving history>
  • A driver adjusts his or her awareness according to the driving experience of other drivers; for example, when a beginner or an elderly person is driving nearby, the driver keeps a greater inter-vehicle distance and takes care to change lanes smoothly. Therefore, the driving history of the own vehicle's driver is registered when the in-vehicle device 2 is initially set, and this driving history is exchanged between vehicles as driver attribute information.
  • In the vehicle that has received the message data, identification of the transmitting vehicle can be supported by synthesizing voice in a tone whose impression matches the driving history of the communication partner's driver.
  • For example, the driving history of a transmitting vehicle can be expressed by synthesizing a calm, composed voice for a veteran driver and a timid, flustered voice for a beginner.
  • The definition data in FIG. 21 reflects such attributes. For example, it is conceivable to classify the driving level into beginner, intermediate, and advanced according to mileage per month, and to use a voice quality according to the driving level.
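The suggested mileage-based classification might look like the following sketch; the thresholds are invented examples, since the patent only proposes the beginner / intermediate / advanced division without fixing concrete values.

```python
# Sketch of the classification suggested in the text: divide the driving
# level by mileage per month. Threshold values are assumptions.

def driving_level(monthly_mileage_km):
    if monthly_mileage_km < 100:
        return "beginner"
    if monthly_mileage_km < 500:
        return "intermediate"
    return "advanced"

# The resulting level would then select a voice-quality entry as in FIG. 21.
print(driving_level(50), driving_level(300), driving_level(1200))
# → beginner intermediate advanced
```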
  • <Personification according to gender and age> When the in-vehicle device 2 is initially set, the gender and age of the own vehicle's driver are registered, and this information is transmitted as the driver attribute information in vehicle-to-vehicle communication.
  • In the vehicle that receives this information, driving suited to the other driver can be supported by synthesizing voice in a tone whose impression matches the gender and age of the communication partner's driver.
  • For example, the gender and generation of the transmitting vehicle's driver can be expressed by synthesizing a thin, high voice for a young female driver and a calm, dandy voice for an elderly man.
  • <Personification according to prefecture of origin> When the in-vehicle device 2 is initially set, the registered prefecture of the own vehicle is recorded, and this information is transmitted as the driver attribute information in vehicle-to-vehicle communication.
  • In the vehicle that receives the message data, driving suited to the other driver can be supported by synthesizing voice in a tone that uses a (well-known) dialect corresponding to the registered prefecture of the communication partner.
  • Dialect switching can be performed easily by preparing, for each prefecture, voice-synthesis message data corresponding to each command.
  • FIG. 23 is a diagram showing an example of voice message data for the Hiroshima dialect. The characteristics of the Hiroshima dialect are emphasized relative to the standard-language voice-phrase generation character strings shown in FIG. 18.
  • FIG. 24 is a diagram showing an example of voice message data for the Osaka dialect; the expressions clearly show the characteristics of the Kansai dialect. The region whose dialect should be applied can also be inferred to some extent from the vehicle's license plate.
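The per-prefecture dialect switching can be sketched as a table lookup with a standard-language fallback; all table entries below are invented stand-ins for the data of FIGS. 23 and 24, not the patent's actual phrase strings.

```python
# Sketch of dialect switching: one phrase table per prefecture, selected
# by the communication partner's registered prefecture. Entries invented.

STANDARD = {"1": "thank you"}
DIALECT_TABLES = {
    "Hiroshima": {"1": "arigato no (Hiroshima style)"},
    "Osaka": {"1": "ookini"},
}

def phrase(command_code, prefecture):
    """Fall back to the standard-language table when no dialect table
    exists for the given prefecture, or when a table lacks the code."""
    table = DIALECT_TABLES.get(prefecture, STANDARD)
    return table.get(command_code, STANDARD.get(command_code, ""))

print(phrase("1", "Osaka"))   # → ookini
print(phrase("1", "Aomori"))  # → thank you
```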
  • <Effects> As described above, according to the embodiment, exchanging message data by vehicle-to-vehicle communication can promote the spread of the vehicle-to-vehicle communication system from various aspects. Moreover, no center device is required, and information sharing between vehicles can be promoted by autonomous decentralized processing among a plurality of vehicles.
  • Since both the one-hop transfer and the retweet function are implemented in software as an application layer, the device scale of the in-vehicle device 2 can be reduced, and it can be realized as a so-called bolt-on (simple add-on) type in-vehicle device.
  • the communication function can be further simplified, and the size and weight can be further reduced.
  • In addition, information such as each vehicle's position, moving speed, and moving direction is exchanged as [own vehicle running information] and [other vehicle running information], so an environment in which traffic information can be shared locally can be formed.
  • The in-vehicle device 2 can also be applied as a means for providing average vehicle speed information for each lane, the number of vehicles in the convoy ahead of and behind the own vehicle, witness-vehicle data at the time of a traffic accident, or bird's-eye-view data including the movements of surrounding vehicles at the time of an accident. That is, the in-vehicle device 2 can be used to complement a drive recorder.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)
  • Circuit For Audible Band Transducer (AREA)
PCT/JP2020/003070 2020-01-29 2020-01-29 In-vehicle device, vehicle-to-vehicle communication system, and program WO2021152713A1 (ja)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2020/003070 WO2021152713A1 (ja) 2020-01-29 2020-01-29 In-vehicle device, vehicle-to-vehicle communication system, and program
JP2021573679A JP7279206B2 (ja) 2020-01-29 2020-01-29 In-vehicle device, vehicle-to-vehicle communication system, and program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2020/003070 WO2021152713A1 (ja) 2020-01-29 2020-01-29 In-vehicle device, vehicle-to-vehicle communication system, and program

Publications (1)

Publication Number Publication Date
WO2021152713A1 true WO2021152713A1 (ja) 2021-08-05

Family

ID=77078056

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/003070 WO2021152713A1 (ja) 2020-01-29 2020-01-29 In-vehicle device, vehicle-to-vehicle communication system, and program

Country Status (2)

Country Link
JP (1) JP7279206B2 (ja)
WO (1) WO2021152713A1 (ja)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002183889A (ja) * 2000-10-03 2002-06-28 Honda Motor Co Ltd 車車間通信装置
JP2010272083A (ja) * 2009-05-25 2010-12-02 Denso Corp 車載通信装置および通信システム
JP2015194828A (ja) * 2014-03-31 2015-11-05 パナソニックIpマネジメント株式会社 運転支援装置
JP2017068741A (ja) * 2015-10-01 2017-04-06 パナソニックIpマネジメント株式会社 車載端末装置、歩行者端末装置、歩車間通信システム、ならびに歩車間通信方法
JP2017182776A (ja) * 2016-03-29 2017-10-05 株式会社デンソー 車両周辺監視装置及びコンピュータプログラム
JP2019040305A (ja) * 2017-08-23 2019-03-14 株式会社デンソー 収集システム及びセンタ

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6031294B2 (ja) * 2012-08-07 2016-11-24 本田技研工業株式会社 車両用通信装置
JP2014134897A (ja) 2013-01-08 2014-07-24 Denso Corp 車両用意思疎通装置、及びそれを用いた車両用コミュニケーションシステム
JP6143647B2 (ja) * 2013-11-06 2017-06-07 本田技研工業株式会社 情報発信源から移動体に対する情報を表示する方法、プログラム、及び電子機器


Also Published As

Publication number Publication date
JPWO2021152713A1 (ja)
JP7279206B2 (ja) 2023-05-22

Similar Documents

Publication Publication Date Title
JP6814364B2 (ja) Information providing device and vehicle
US10875525B2 Ability enhancement
US20070162550A1 Vehicle-to-vehicle instant messaging with locative addressing
CN207328264U (zh) Road pedestrian alert system usable for driverless and driver-operated vehicles
CN105910610B (zh) Method and apparatus for dynamic location-reporting rate determination
US20130131918A1 System and method for an information and entertainment system of a motor vehicle
CN104067326A (zh) User-assisted identification of location conditions
CN101113906A (zh) Route matching method for use with a vehicle navigation system
JP2002536648A (ja) Method and device for obtaining relevant traffic information and for dynamic route optimization
CN104781865A (zh) Method for providing route section information by means of at least one motor vehicle
CN105682046A (zh) Data-attribute-based interest packet forwarding method in a vehicular named data network
CN102325151A (zh) Mobile vehicle-mounted terminal and platform management service system
TW202508313A (zh) Early-warning geofence management system and method for emergency vehicles
CN108009169B (zh) Data processing method, apparatus, and device
Dimitrakopoulos Current technologies in vehicular communication
CN109808737A (zh) Method and device for adapting navigation to a railway level crossing
KR101776750B1 (ko) Computer-readable recording medium implementing a server, apparatus, and method for providing a driver's call-availability information
CN109767770A (zh) In-vehicle voice navigation and voice chat system
CN112134928B (zh) Communication system for distributed in-vehicle video scheduling and connected-vehicle social networking
US11710405B2 Method for determining a communications scenario and associated terminal
WO2013136499A1 (ja) Mobile communication device, management device, mobile communication method, mobile communication program, and recording medium
WO2021152713A1 (ja) In-vehicle device, vehicle-to-vehicle communication system, and program
CN114093200A (zh) System and method for bidirectional multimodal early warning between road pedestrians and vehicles
CN118800050A (zh) Generalized vehicle-road cooperation method and system based on roadside sensing and visual guidance
JP2012048715A (ja) Internet telematics service providing system and method for providing personalized content on the web

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20917225

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021573679

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20917225

Country of ref document: EP

Kind code of ref document: A1