WO2024079942A1 - Information processing apparatus, control method, program, and storage medium - Google Patents


Info

Publication number
WO2024079942A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
driver
workload
display
information processing
Application number
PCT/JP2023/022911
Other languages
French (fr)
Japanese (ja)
Inventor
高志 飯澤
廣人 根岸
昂生 加峯
Original Assignee
パイオニア株式会社
Application filed by パイオニア株式会社
Publication of WO2024079942A1 publication Critical patent/WO2024079942A1/en

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/08Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M11/00Telephonic communication systems specially adapted for combination with other electrical systems

Definitions

  • the present invention relates to a system that mediates communication between a driver of a mobile object and a user outside the mobile object.
  • Patent Document 1 discloses a conversation providing system that enables a call between the vehicle driver and a conversation partner by communicating with an in-vehicle system via a communication network.
  • one of the objectives of the present invention is to provide an information processing device that can effectively mediate communication between a driver of a moving body and a user outside the moving body.
  • the invention described in claim 1 is an information processing device used in a system that mediates communication between a driver of a moving body and a user outside the moving body, comprising: a calculation means for calculating an index value relating to the degree of driving load on the driver of the moving body; a detection means for detecting a cause of an increase in the driving load when the index value indicates that the driving load is higher than a predetermined standard; and a display control means for displaying information relating to the cause on a terminal device of the user.
  • the invention described in claim 10 is a control method executed by an information processing device used in a system that mediates communication between a driver of a moving body and a user present outside the moving body, the control method comprising: a calculation step of calculating an index value relating to the degree of driving load on the driver of the moving body; a detection step of detecting a cause of an increase in the driving load when the index value indicates that the driving load is higher than a predetermined standard; and a display control step of displaying information about the cause on a terminal device of the user.
  • the invention described in claim 11 is a program executed by a computer of an information processing device used in a system that mediates communication between a driver of a moving body and a user outside the moving body, the program causing the computer to function as: a calculation means for calculating an index value relating to the degree of driving load on the driver of the moving body; a detection means for detecting a cause of an increase in the driving load when the index value indicates that the driving load is higher than a predetermined standard; and a display control means for displaying information about the cause on a terminal device of the user.
  • FIG. 1 illustrates a configuration example of a driving call system according to an embodiment.
  • FIG. 2 shows an example of a schematic configuration of an in-vehicle device.
  • FIG. 3 shows an example of a schematic configuration of an external terminal.
  • FIG. 4 shows a first example of a display on the external terminal when the target vehicle is traveling.
  • FIG. 5 shows a second example of a display on the external terminal when the target vehicle is traveling.
  • FIG. 6 shows a third example of a display on the external terminal when the target vehicle is traveling.
  • FIG. 7 is an example of a flowchart showing a procedure of a process executed by the in-vehicle device.
  • FIG. 8 shows a fourth display example of the external terminal according to a modified example.
  • FIG. 9 shows a configuration example of a driving call system according to a modified example.
  • FIG. 10 illustrates an example of a schematic configuration of a server device.
  • an information processing device used in a system that mediates communication between a driver of a mobile body and a user outside the mobile body includes a calculation means for calculating an index value relating to the degree of driving load on the driver of the mobile body, a detection means for detecting a cause of an increase in the driving load when the index value indicates that the driving load is higher than a predetermined standard, and a display control means for displaying information relating to the cause on the user's terminal device.
  • the information processing device described above can conveniently allow a user outside the vehicle to recognize the causes of the driver's increased driving load. This allows the user to communicate with the driver according to the causes of the increased driving load.
  • the calculation means calculates the index value based on the current state of at least one of the driving behavior of the moving body, the driver, or the driving environment.
  • the calculation means calculates a score that evaluates the current state of at least one of the driving behavior of the moving body, the driver, or the driving environment for each element related to the driving load, and calculates the index value based on the score. This allows the information processing device to accurately calculate the index value of the driving load.
  • the detection means determines the factor that is the cause based on the score. This aspect enables the information processing device to accurately detect the cause of the increased driving load.
  • the display control means causes the terminal device to display information related to the cause and information indicating the magnitude of the index value.
  • the information processing device can conveniently present the degree of driving load, together with the cause of the increased driving load, to a user outside the mobile body.
  • the display control means causes the terminal device to display a line representing the trajectory of the driver's line of sight on an image of the outside of the moving body captured from the moving body. This aspect allows the user of the terminal device to easily grasp the direction to which the driver is not paying much attention, and, by communicating with the driver, to assist the driver in checking for safety in that direction.
  • the display control means may change the display mode of the line depending on the index value.
  • the display control means may display the line when the index value indicates that the driving load is higher than a predetermined standard, and may hide the line when the index value indicates that the driving load is equal to or lower than the predetermined standard.
  • the display control means highlights in the captured image an object to which the driver should direct his or her gaze. This aspect enables the user of the terminal device to identify an object to which the driver is not directing his or her gaze and to communicate with the driver to encourage the driver to direct his or her gaze to the object.
  • a control method is executed by an information processing device used in a system that mediates communication between a driver of a mobile body and a user outside the mobile body, the control method comprising a calculation step of calculating an index value relating to the degree of driving load on the driver of the mobile body, a detection step of detecting a cause of increased driving load when the index value indicates that the driving load is higher than a predetermined standard, and a display control step of displaying information relating to the cause on the user's terminal device.
  • the information processing device can allow a user outside the mobile body to appropriately recognize the cause of increased driving load on the driver.
  • a program executed by a computer of an information processing device used in a system that mediates communication between a driver of a mobile body and a user outside the mobile body causes the computer to function as a calculation means for calculating an index value relating to the degree of driving load on the driver of the mobile body, a detection means for detecting a cause of increased driving load when the index value indicates that the driving load is higher than a predetermined standard, and a display control means for displaying information relating to the cause on the user's terminal device.
  • the computer of the information processing device can conveniently allow a user outside the mobile body to recognize the cause of increased driving load on the driver.
  • the program is stored in a storage medium.
  • (System Configuration) FIG. 1 shows a configuration example of a driving call system according to the first embodiment.
  • the driving call system has an in-vehicle device 1 installed in a vehicle and an external terminal 2 used by a person outside the vehicle.
  • the in-vehicle device 1 and the external terminal 2 communicate with each other to mediate a voice call between the driver of the vehicle and the user of the external terminal 2, and the user of the external terminal 2 talks to the driver of the vehicle while checking various information obtained from the vehicle on the external terminal 2.
  • the on-board device 1 moves with the vehicle and performs processes to realize a call between the driver of the vehicle and the user of the external terminal 2.
  • the vehicle equipped with the on-board device 1 is also referred to as the "target vehicle”.
  • the on-board device 1 performs data communication with the external terminal 2 via a communication network 3 such as the Internet or a dedicated communication network.
  • Data exchanged between the on-board device 1 and the external terminal 2 includes voice data generated during a call between the driver of the target vehicle and the user of the external terminal 2, and display instruction data for displaying information about the target vehicle or the driver.
  • the on-board device 1 transmits display instruction data including information about the driver's driving load (workload) (also referred to as "workload-related information”) to the external terminal 2.
  • the vehicle-mounted device 1 may be a navigation device that is installed in the target vehicle and provides route guidance to a set destination, or may be a mobile terminal such as a smartphone.
  • the vehicle-mounted device 1 may also be incorporated in the target vehicle.
  • the vehicle-mounted device 1 is an example of an "information processing device.”
  • the target vehicle is an example of a "mobile body.”
  • the external terminal 2 is a terminal operated by a person outside the target vehicle, and performs data communication with the in-vehicle device 1 via the communication network 3.
  • the external terminal 2 is, for example, a mobile terminal such as a smartphone.
  • the external terminal 2 establishes communication with the in-vehicle device 1 while the target vehicle is being driven, and exchanges voice data with the in-vehicle device 1 for a call between the driver and the user of the external terminal 2.
  • the external terminal 2 also receives display instruction data including workload-related information from the in-vehicle device 1 during the call, and displays the driver's workload state, etc. based on the display instruction data. In this way, the external terminal 2 allows the user of the external terminal 2 to recognize the driver's workload state, and preferably provides information that can be used to determine a convenient timing for talking to the driver.
  • the external terminal 2 is an example of a "terminal device".
  • the communication between the in-vehicle device 1 and the external terminal 2 may be relayed by a server device (not shown). Even when the in-vehicle device 1 and the external terminal 2 establish direct communication, they may exchange with the server device the information (e.g., communication address information) necessary for establishing that communication. In these cases, the server device performs the processes necessary for the in-vehicle device 1 and the external terminal 2 to perform data communication (including authentication processes for the in-vehicle device 1 and the external terminal 2).
  • the vehicle-mounted device 1 mainly has a communication unit 11, a storage unit 12, an input unit 13, a control unit 14, a sensor group 15, a display unit 16, and a sound output unit 17.
  • the elements in the vehicle-mounted device 1 are connected to each other via a bus line 10.
  • the communication unit 11 performs data communication with other terminals based on the control of the control unit 14. For example, the communication unit 11 may receive map data for updating the map DB (Database) 4 from a map management server (not shown).
  • the storage unit 12 is composed of various types of memory such as RAM (Random Access Memory), ROM (Read Only Memory), and non-volatile memory (including hard disk drives, flash memories, etc.).
  • the storage unit 12 stores programs for the in-vehicle device 1 to execute predetermined processes.
  • the above-mentioned programs may include application programs for making calls, and application programs for causing the external terminal 2 to display maps and images captured in the target vehicle.
  • the storage unit 12 is also used as a working memory for the control unit 14.
  • the programs executed by the in-vehicle device 1 may be stored in a storage medium other than the storage unit 12.
  • the storage unit 12 also stores a map DB (database) 4. Various data necessary for route guidance are recorded in the map DB 4.
  • the map DB 4 also includes data necessary for map display based on a specified position such as the current position of the target vehicle.
  • the map DB 4 is a database that includes, for example, road data that represents a road network by a combination of nodes and links, and facility data that indicates facilities that are candidates for a destination, a stop-off point, or a landmark.
  • the map DB 4 may be updated based on information received by the communication unit 11 from a map management server under the control of the control unit 14.
  • the input unit 13 is a button, a touch panel, a remote controller, a voice input device, etc. that the user can operate.
  • the display unit 16 is a display or the like that displays information based on the control of the control unit 14.
  • the sound output unit 17 is a speaker or the like that outputs sound based on the control of the control unit 14.
  • the sensor group 15 includes various sensors that sense the state of the target vehicle or the environment outside the vehicle.
  • the sensor group 15 includes an exterior camera 51, a driver camera 52, a vehicle behavior detector 53, and a biosensor 54.
  • the exterior camera 51 is one or more cameras that capture images outside the target vehicle, such as the area in front of the target vehicle, and generates images captured at predetermined time intervals (also called “exterior image”).
  • the driver camera 52 is a camera that is installed so that the driver's face is included in the capture range, and generates images captured at predetermined time intervals (also called “driver image”).
  • the vehicle behavior detector 53 generates detection signals indicating the behavior of the target vehicle, such as the current position, vehicle speed, acceleration, steering angle, etc.
  • the vehicle behavior detector 53 includes, for example, a GNSS (Global Navigation Satellite System) receiver, a gyro sensor, an IMU (Inertial Measurement Unit), a vehicle speed sensor, an acceleration sensor, a steering angle sensor, etc.
  • the biosensor 54 is one or more sensors that generate biosignals that indicate the driver's biological phenomena, such as heart rate and amount of sweat.
  • the sensor group 15 may include various external sensors (including cameras, lidars, radars, ultrasonic sensors, infrared sensors, sonar, etc.) and internal sensors in addition to the exterior camera 51, driver camera 52, vehicle behavior detector 53, and biosensor 54.
  • the control unit 14 includes a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), etc., and controls the entire vehicle-mounted device 1.
  • the control unit 14 functions as a "calculation means,” a “detection means,” a “display control means,” and a computer that executes programs, etc.
  • the processing executed by the control unit 14 is not limited to being realized by software programs, but may be realized by any combination of hardware, firmware, and software.
  • the processing executed by the control unit 14 may also be realized by using a user-programmable integrated circuit, such as an FPGA (Field-Programmable Gate Array) or a microcomputer.
  • the program executed by the control unit 14 in this embodiment may be realized by using this integrated circuit.
  • the control unit 14 may receive map information from a map management server (not shown) via the communication unit 11.
  • the input unit 13, the display unit 16, and the sound output unit 17 may be provided in the target vehicle as an external device of the vehicle-mounted unit 1, and may supply the generated signal to the vehicle-mounted unit 1.
  • at least some of the sensors in the sensor group 15 may be sensors mounted on the target vehicle.
  • the vehicle-mounted unit 1 may acquire information output by a sensor mounted on the target vehicle from the target vehicle based on a communication protocol such as CAN (Controller Area Network).
  • FIG. 3 shows an example of the schematic configuration of the external terminal 2.
  • the external terminal 2 mainly has a communication unit 21, a memory unit 22, an input unit 23, a control unit 24, a sensor group 25, a display unit 26, and a sound output unit 27.
  • the elements in the external terminal 2 are connected to each other via a bus line 20.
  • the communication unit 21 performs data communication with other terminals under the control of the control unit 24.
  • the storage unit 22 is composed of various types of memory such as RAM, ROM, and non-volatile memory.
  • the storage unit 22 stores programs for the external terminal 2 to execute predetermined processes.
  • the above-mentioned programs may include application programs for calling the driver, displaying information related to the driving of the target vehicle of the in-vehicle unit 1 (including displaying maps and various captured images), and displaying information related to the driver's workload when communication with the in-vehicle unit 1 is established.
  • the storage unit 22 is also used as a working memory for the control unit 24.
  • the programs executed by the external terminal 2 may be stored in a storage medium other than the storage unit 22.
  • the input unit 23 is a button, a touch panel, a remote controller, a voice input device, etc. that the user can operate.
  • the display unit 26 is a display or the like that displays information based on the control of the control unit 24.
  • the sound output unit 27 is a speaker or the like that outputs sound based on the control of the control unit 24.
  • the sensor group 25 includes an internal sensor that senses the state of the external terminal 2, and an external sensor that senses the state of the external world of the external terminal 2.
  • the control unit 24 includes a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), etc., and controls the entire external terminal 2. Note that the configuration of the external terminal 2 shown in FIG. 3 is an example, and various modifications may be made to the configuration shown in FIG. 3.
  • the in-vehicle device 1 identifies the cause of the increased workload (also called a "workload increase cause") and displays the identified workload increase cause on the external terminal 2. This allows the in-vehicle device 1 to make the user of the external terminal 2 aware of the workload increase cause, and the user of the external terminal 2 to take appropriate action according to the workload increase cause.
  • the vehicle-mounted device 1 calculates a workload value based on a score (also called an "element-specific score") that evaluates the current state for each element related to the workload (also called “workload-related element”).
  • the workload-related element is an element that represents the state related to the target vehicle, the driver, or the driving environment.
  • Examples of workload-related elements include elements related to the environment of the road (driving road) on which the target vehicle is traveling (also called “road environment elements"), elements related to driving behavior (driving operations) that affect the workload (also called “driving behavior elements”), and elements related to the driver's physical condition (also called “driver physical condition elements”).
  • the workload-related element is an element that is a candidate for the cause of an increase in workload.
  • the workload-related element may be an element obtained by further subdividing at least one of the road environment elements, driving behavior elements, and driver physical condition elements.
  • the vehicle-mounted device 1 calculates a score (also called an "element-specific score") that evaluates the current state of the target vehicle, driver, or driving environment for each workload-related element.
  • the vehicle-mounted device 1 determines the total value, average value, or other representative value of the calculated element-specific scores as the workload value.
  • the workload-related elements for which the element-specific score is calculated may be all of the road environment elements, driving behavior elements, and driver physical condition elements, or any one of these, or any two of them.
  • the element-specific score may preferably be weighted according to the degree of influence of each workload-related element on the workload. In this case, for example, the weighting coefficient by which the element-specific score of an element with a high degree of influence on the workload is multiplied is set to a larger value, and information indicating these weighting coefficients is stored in advance in the storage unit 12 or the like.
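  • As a rough illustration of the aggregation just described, the Python sketch below combines hypothetical element-specific scores into a workload value using a weighted average; the element names, weight values, and the choice of a weighted average are assumptions for illustration only, not values taken from the embodiment.

```python
# Hypothetical sketch: combining element-specific scores into a workload value.
# The element names, weights, and use of a weighted average are illustrative
# assumptions; the embodiment only states that a total, an average, or another
# representative value (optionally weighted) may be used.

ELEMENT_WEIGHTS = {              # larger weight = larger assumed influence on the workload
    "road_environment": 1.5,
    "driving_behavior": 1.2,
    "driver_condition": 1.0,
}

def workload_value(element_scores: dict[str, float]) -> float:
    """Weighted average of element-specific scores (assumed range 0-100 each)."""
    total_weight = sum(ELEMENT_WEIGHTS[name] for name in element_scores)
    weighted_sum = sum(score * ELEMENT_WEIGHTS[name]
                       for name, score in element_scores.items())
    return weighted_sum / total_weight

# Example: a congested, narrow road while merging, with a well-rested driver.
scores = {"road_environment": 80.0, "driving_behavior": 60.0, "driver_condition": 20.0}
print(round(workload_value(scores), 1))   # -> 57.3
```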
  • for example, the vehicle-mounted device 1 calculates the element-specific score of the road environment element by taking into account at least one of the congestion degree, width, speed limit, and visibility of the road being traveled.
  • the vehicle-mounted device 1 refers to the road information in the map DB 4 or the road traffic information acquired by the communication unit 11 to identify the degree of each of the congestion, width, speed limit, visibility, etc. of the road being traveled, and converts each identified degree into an element-specific score by referring to a predetermined formula or table.
  • the predetermined formula or table is stored in the storage unit 12 or the like in advance.
  • the road environment element may be subdivided into multiple workload-related elements.
  • the vehicle-mounted device 1 may regard each of the congestion degree, width, speed limit, and visibility of the road to be traveled as a workload-related element, and calculate the element-specific score for each of these.
  • the vehicle-mounted device 1 determines whether the current driving behavior corresponds to a driving behavior such as a right turn, a left turn, merging, or a temporary stop, and calculates the element-specific scores according to the determined driving behavior. In this case, for example, the vehicle-mounted device 1 determines the element-specific scores of the driving behavior elements by referring to a table or the like that associates the corresponding driving behavior with the element-specific scores of the driving behavior elements.
  • the vehicle-mounted device 1 may determine the current driving behavior based on information obtained by the route guidance process (for example, the current position of the target vehicle and the route to the set destination), or may determine the current driving behavior based on a signal indicating the state of the turn signal or the steering state obtained from the target vehicle.
  • the vehicle-mounted device 1 may also determine the element-specific scores of the driving behavior elements based on the presence or absence of auto-cruise driving (or the presence or absence of other automatic driving functions).
  • the driving behavior elements may be subdivided into multiple workload-related elements.
  • the vehicle-mounted device 1 determines the element-specific score for each of these workload-related elements by referring to a predetermined table or the like, depending on the presence or absence of the corresponding driving action.
  • the vehicle-mounted device 1 calculates the element-specific score by taking into account at least one of the driver's continuous driving time, drowsiness level (alertness level), and other bioindicators that affect driving. In this case, the vehicle-mounted device 1, for example, identifies the drowsiness level or other bioindicators that affect driving based on the signal output by the biosensor 54. The vehicle-mounted device 1 then refers to a predetermined formula or table and converts the identified continuous driving time and one or more bioindicators into an element-specific score of the driver's physical condition element. Note that the driver's physical condition element may be subdivided into multiple workload-related elements. In this case, for example, the vehicle-mounted device 1 may regard each of the drowsiness level and the continuous driving time as workload-related elements and calculate the element-specific scores for these, respectively.
  • the vehicle-mounted device 1 can determine the workload value by taking into account various factors that affect the workload.
  • the vehicle-mounted device 1 executes a process for detecting the cause of an increase in workload when the calculated workload value becomes greater than a predetermined threshold value (also called the "workload threshold value").
  • the workload threshold value is, for example, determined in advance and stored in the storage unit 12.
  • the workload threshold value is an example of a "predetermined standard.”
  • the vehicle-mounted device 1 determines the proportion of the workload value that the element-specific score of each workload-related element accounts for (also called the "workload value occupancy proportion"). Next, the vehicle-mounted device 1 adds up the workload value occupancy proportions in order from the workload-related element with the highest workload value occupancy proportion until the total workload value occupancy proportion reaches a predetermined proportion. Then, when the total workload value occupancy proportion reaches a predetermined proportion, the vehicle-mounted device 1 detects the workload-related element with the added workload value occupancy proportion as the cause of the workload increase.
  • the above-mentioned predetermined proportion is, for example, determined in advance and stored in the memory unit 12.
  • for example, the vehicle-mounted device 1 first determines whether the workload value occupancy proportion of the workload-related element with the highest occupancy proportion is 30% or more. If it is 30% or more, the vehicle-mounted device 1 detects that workload-related element as a cause of the workload increase. If it is not 30% or more, the vehicle-mounted device 1 determines whether the total of the occupancy proportions of the workload-related elements with the highest and second-highest occupancy proportions is 30% or more.
  • if that total is 30% or more, the vehicle-mounted device 1 detects those workload-related elements as causes of the workload increase. If it is not 30% or more, the vehicle-mounted device 1 determines whether the total of the occupancy proportions of the workload-related elements with the highest to third-highest occupancy proportions is 30% or more.
  • the vehicle-mounted device 1 repeats this process until the total of the added occupancy proportions reaches 30% or more, and at that point detects the workload-related elements whose occupancy proportions were added as the causes of the workload increase.
  • the vehicle-mounted device 1 can appropriately detect one or more workload-related factors that cause an increase in the workload.
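  • The cumulative-proportion procedure described above can be sketched as follows; the 30% value is taken from the example in the text, while the function name, element names, and score values are assumptions for illustration.

```python
# Sketch of the workload-increase cause detection described above: each element's
# share of the total score is accumulated from the largest share downward until
# the running total reaches the predetermined proportion (30% in the example), and
# the accumulated elements are reported as the causes of the workload increase.

def detect_increase_causes(element_scores: dict[str, float],
                           target_proportion: float = 0.30) -> list[str]:
    total = sum(element_scores.values())
    if total <= 0:
        return []
    shares = sorted(((score / total, name) for name, score in element_scores.items()),
                    reverse=True)
    causes, cumulative = [], 0.0
    for share, name in shares:
        causes.append(name)
        cumulative += share
        if cumulative >= target_proportion:
            break
    return causes

# Example with hypothetical per-element scores: drowsiness alone already accounts
# for 35% of the total, so it is detected as the cause of the workload increase.
print(detect_increase_causes({"drowsiness": 35.0, "visibility": 30.0,
                              "road_width": 20.0, "speed_limit": 15.0}))
```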
  • one or more of the following factors are identified as causes of increased workload: the environment of the road during driving (degree of congestion, width, speed limit, visibility), driving behavior, and the driver's physical condition (continuous driving time, degree of drowsiness based on biometric information, etc.).
  • (Display Examples) FIG. 4 shows a first display example of the external terminal 2 when the target vehicle of the in-vehicle device 1 is traveling.
  • the external terminal 2 has established communication with the in-vehicle device 1, and displays the display screen shown in Fig. 4 on the display unit 26 based on the display instruction data received from the in-vehicle device 1.
  • the display screen shown in FIG. 4 has an outside-vehicle image display area 60, a workload-related information display area 61, and a map display area 62.
  • the external terminal 2 displays the latest external image (video) generated by the external camera 51 on the external image display area 60.
  • the vehicle-mounted device 1 transmits display instruction data including the latest external image generated by the external camera 51 to the external terminal 2.
  • the external terminal 2 displays a display window 71 and an indicator 72 in the workload-related information display area 61 based on the workload-related information included in the display instruction data sent by the vehicle-mounted device 1.
  • the display window 71 displays the level of the workload value calculated by the in-vehicle unit 1 and the cause of the workload increase detected by the in-vehicle unit 1.
  • the display window 71 displays "High workload", which indicates the highest level.
  • the in-vehicle unit 1 detects the cause of the workload increase (here, turning right at an intersection), and the external terminal 2 displays "Turn right at intersection” in the display window 71.
  • Indicator 72 also indicates the degree of workload; the greater the workload value calculated by the in-vehicle device 1, the further to the right the gauge extends.
  • in this example, the gauge of indicator 72 extends nearly to the right end.
  • the color of the gauge may change depending on the workload value. For example, the higher the workload value, the more noticeable the color of the gauge may be.
  • the "High Workload" display in display window 71 and indicator 72 are examples of "information indicating the magnitude of the index value.”
  • the external terminal 2 also displays a map of the vicinity of the current position of the target vehicle on the map display area 62.
  • the in-vehicle device 1 generates display instruction data for the external terminal 2 to display the above-mentioned map based on the state of the target vehicle estimated based on the vehicle behavior detector 53, the map DB 4, and information related to route guidance to the destination, and transmits the display instruction data to the external terminal 2.
  • a current position mark 73 indicating the current position of the target vehicle and a route line 74 indicating the guided route along which the in-vehicle device 1 will guide the driver of the target vehicle are superimposed on the map on the map display area 62.
  • the in-vehicle device 1 can conveniently make the user of the external terminal 2 aware of the fact that the workload is high and the cause of this (here, turning right at an intersection). This allows the user of the external terminal 2 to, for example, determine that the driver of the in-vehicle device 1 should concentrate on driving, and take measures such as refraining from talking to the driver about matters other than driving (for example, requests to go shopping, etc.) until the driver has completed the right turn at the intersection.
  • FIG. 5 shows a second display example of the external terminal 2 when the target vehicle of the in-vehicle device 1 is traveling.
  • the external terminal 2 has established communication with the in-vehicle device 1, and displays the display screen shown in FIG. 5 on the display unit 26 based on the display instruction data received from the in-vehicle device 1.
  • the external terminal 2 has an external-vehicle captured image display area 60, a workload-related information display area 61, and a map display area 62 on the display screen.
  • the external terminal 2 displays the latest external image (video) generated by the external camera 51 in the external image display area 60 based on the display command data received from the vehicle-mounted device 1.
  • the external terminal 2 also displays a map of the area around the current position of the target vehicle in the map display area 62 based on the display command data received from the vehicle-mounted device 1.
  • the external terminal 2 also displays a display window 71A and an indicator 72A indicating the degree of workload on the workload-related information display area 61 based on the workload-related information included in the display instruction data received from the vehicle-mounted device 1.
  • in this example, since the workload value calculated by the vehicle-mounted device 1 is equal to or lower than the workload threshold, the vehicle-mounted device 1 does not detect a cause of workload increase. Therefore, the external terminal 2 displays "Low workload" in the display window 71A, indicating that the workload value is at the lowest level, and does not display any cause of workload increase.
  • the workload-related information includes the fact that the vehicle is in auto-cruise driving.
  • the external terminal 2 displays the fact that the vehicle is in auto-cruise driving in the workload-related information display area 61.
  • the external terminal 2 also displays an indicator 72A including a gauge whose length corresponds to the level of the workload value in the workload-related information display area 61.
  • the in-vehicle device 1 conveniently lets the user of the external terminal 2 know that the workload is low. This allows, for example, the user of the external terminal 2 to determine that the driver of the in-vehicle device 1 is relatively available, and to choose the right timing to talk to the driver about necessary matters.
  • FIG. 6 shows a third display example of the external terminal 2 when the target vehicle of the in-vehicle device 1 is traveling.
  • the external terminal 2 has established communication with the in-vehicle device 1, and displays the display screen shown in FIG. 6 on the display unit 26 based on the display instruction data received from the in-vehicle device 1.
  • the external terminal 2 has an external-vehicle captured image display area 60, a workload-related information display area 61, and a map display area 62 on the display screen.
  • the external terminal 2 displays the latest external image (video) generated by the external camera 51 on the external image display area 60 based on the display instruction data received from the vehicle-mounted device 1, in the same manner as in the first and second display examples.
  • the external terminal 2 also displays a map of the area around the current position of the target vehicle on the map display area 62 based on the display instruction data received from the vehicle-mounted device 1, in the same manner as in the first and second display examples.
  • based on the workload-related information received from the vehicle-mounted device 1, the external terminal 2 displays, in the workload-related information display area 61, a display window 71B showing the workload level and the cause of the workload increase, and an indicator 72B visually showing the workload level.
  • the cause of the workload increase is displayed in the display window 71B.
  • the vehicle-mounted device 1 detects the drowsiness level of the driver's physical condition element and the visibility of the road environment element as causes of the workload increase, and as a result, "drowsiness + low visibility" is displayed in the display window 71B.
  • the external terminal 2 also displays an indicator 72B including a gauge whose length corresponds to the level of the determined workload value, in the workload-related information display area 61.
  • the in-vehicle device 1 can conveniently make the user of the external terminal 2 aware of the high workload and its cause. This allows the user of the external terminal 2 to take action, such as proactively speaking to the driver of the in-vehicle device 1 to wake him up.
  • based on an operation by the user of the external terminal 2, the external terminal 2 may display the driver captured image generated by the driver camera 52 instead of the image captured outside the vehicle.
  • the external terminal 2 may provide only one of the external image display area 60 and the map display area 62 on the display screen.
  • the external terminal 2 may provide only the workload-related information display area 61 on the display screen.
  • (Processing Flow) FIG. 7 is an example of a flowchart showing the procedure of processing executed by the vehicle-mounted device 1.
  • the vehicle-mounted device 1 executes the processing of the flowchart shown in Fig. 7 when communication with the external terminal 2 is established and a call is started.
  • the vehicle-mounted device 1 calculates the score of each workload-related element related to the workload of the driver of the target vehicle (i.e., element-specific score) (step S101). In this case, the vehicle-mounted device 1 calculates element-specific scores for road environment elements, driving behavior elements, driver physical condition elements, or elements that are further subdivided from these, based on the map DB 4 and data output by the sensor group 15.
  • the vehicle-mounted device 1 calculates a workload value based on the scores of each workload-related element (element-specific scores) (step S102).
  • the vehicle-mounted device 1 may calculate a workload value by adding up or averaging (including using a weighting factor) the element-specific scores, or may calculate the workload value by substituting each element-specific score into a formula for calculating the workload value.
  • the vehicle-mounted device 1 determines whether the workload value is greater than the workload threshold (step S103). If the workload value is greater than the workload threshold (step S103; Yes), the vehicle-mounted device 1 detects the cause of the workload increase (step S104). The vehicle-mounted device 1 then supplies display instruction data including the workload value and workload-related information related to the detected cause of the workload increase to the external terminal 2 (step S105). As a result, the external terminal 2 performs display related to the workload value and the cause of the workload increase based on the display instruction data.
  • the external terminal 2 may perform display based on at least one of the image taken outside the vehicle, the image of the driver, and the map display information.
  • on the other hand, if the workload value is not greater than the workload threshold (step S103; No), the in-vehicle device 1 supplies display instruction data including workload-related information on the workload value to the external terminal 2 (step S106).
  • the external terminal 2 executes display related to the workload value based on the display instruction data.
  • the display instruction data may include at least one of an image taken outside the vehicle, an image of the driver, and map display information, and the external terminal 2 may perform display based on that information.
  • next, the vehicle-mounted unit 1 determines whether the call has ended (step S107). If it determines that the call has ended (step S107; Yes), it ends the processing of the flowchart. On the other hand, if it determines that the call has not ended (step S107; No), it returns the processing to step S101.
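  • A minimal, non-authoritative sketch of the control flow of FIG. 7 is given below. It reuses the workload_value and detect_increase_causes sketches from earlier in this description; the threshold value and all helper callables are assumptions, since the embodiment does not specify them.

```python
# Hypothetical sketch of the processing flow of FIG. 7 (steps S101-S107).
# Sensor reading and display-instruction transmission are stubbed out via callables;
# only the control flow mirrors the flowchart, and the threshold value is assumed.

from typing import Callable

WORKLOAD_THRESHOLD = 60.0   # assumed concrete value of the workload threshold

def run_driving_call(read_scores: Callable[[], dict[str, float]],
                     send_display: Callable[[float, list[str]], None],
                     call_active: Callable[[], bool]) -> None:
    while call_active():                              # step S107: loop until the call ends
        scores = read_scores()                        # step S101: element-specific scores
        workload = workload_value(scores)             # step S102: workload value
        if workload > WORKLOAD_THRESHOLD:             # step S103: Yes
            causes = detect_increase_causes(scores)   # step S104: causes of the increase
        else:                                         # step S103: No
            causes = []
        send_display(workload, causes)                # step S105 / step S106

# Minimal demo with stubbed I/O (a single iteration of the call).
ticks = iter([True, False])
run_driving_call(
    read_scores=lambda: {"road_environment": 90.0, "driving_behavior": 70.0,
                         "driver_condition": 30.0},
    send_display=lambda w, c: print(f"workload={w:.1f}, causes={c}"),
    call_active=lambda: next(ticks),
)
```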
  • when the vehicle-mounted device 1 displays the outside-of-vehicle captured image on the external terminal 2, it may superimpose a line representing the recognized trajectory of the driver's gaze (a gaze tracking line) on the outside-of-vehicle captured image.
  • FIG. 8 shows a fourth display example of the external terminal 2 relating to the first modification.
  • the external terminal 2 has established communication with the in-vehicle device 1, and displays the display screen shown in FIG. 8 on the display unit 26 based on the display instruction data received from the in-vehicle device 1.
  • the external terminal 2 provides an external-vehicle captured image display area 60, a workload-related information display area 61, and a map display area 62 on the display screen, similar to the first to third display examples, based on the display instruction data received from the in-vehicle device 1.
  • in the outside-vehicle image display area 60, gaze tracking lines 66, which indicate the trajectory of the gaze direction of the driver of the target vehicle within the most recent specified time, are superimposed on the latest outside-vehicle image at positions corresponding to that gaze direction.
  • the vehicle-mounted device 1 detects the gaze direction on the outside-vehicle image taken at the same time as the driver's image based on any gaze detection technology, and generates gaze direction data indicating the detected gaze direction.
  • the vehicle-mounted device 1 superimposes the gaze tracking lines 66, which indicate the driver's gaze direction within the most recent specified time indicated by the gaze direction data, on the outside-vehicle image included in the display instruction data to be transmitted to the external terminal 2.
  • the above-mentioned specified time is stored in advance in the storage unit 12, for example.
  • the external terminal 2 displays the outside-vehicle image with the gaze tracking lines 66 superimposed on it in the outside-vehicle image display area 60.
  • the vehicle-mounted device 1 may, for example, prestore face-gaze correspondence data that associates each position on the image captured outside the vehicle with a face image model when the driver is looking at that position, and generate gaze direction data based on the face-gaze correspondence data.
  • the vehicle-mounted device 1 identifies the face image model that most closely matches the driver's face image extracted from the driver's captured image, and generates gaze direction data that indicates the position on the image captured outside the vehicle that corresponds to the identified face image model.
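  • One way to read the face-gaze correspondence matching described above is as a nearest-model lookup, sketched below; the feature representation, the distance metric, and the table entries are assumptions, since the embodiment does not specify how the driver's face image is compared with the stored face image models.

```python
# Sketch: nearest-neighbour lookup in hypothetical face-gaze correspondence data.
# Each entry pairs a face "model" (reduced here to a plain feature vector) with the
# position on the outside-vehicle image that the driver is looking at. Actual face
# feature extraction is out of scope and not shown.

import math

# (face feature vector, (x, y) position on the exterior image) -- illustrative entries
FACE_GAZE_TABLE = [
    ((0.10, 0.85), (120, 240)),    # e.g. looking toward the left edge of the image
    ((0.55, 0.80), (640, 230)),    # roughly straight ahead
    ((0.90, 0.88), (1150, 250)),   # toward the right edge
]

def gaze_position(face_features: tuple[float, float]) -> tuple[int, int]:
    """Return the exterior-image position paired with the best-matching face model."""
    _, position = min(FACE_GAZE_TABLE,
                      key=lambda entry: math.dist(entry[0], face_features))
    return position

print(gaze_position((0.52, 0.79)))   # -> (640, 230): closest to the "straight ahead" model
```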
  • the user of the external terminal 2 can easily grasp the direction in which the driver is not paying much attention (to the right in the example of FIG. 8), and the user of the external terminal 2 can assist the driver in checking for safety in the direction in which the driver is not paying much attention by talking to the driver.
  • the driver can feel reassured as if a passenger is watching over him or her, and can feel a sense of unity as if the user of the external terminal 2 is actually riding with the driver.
  • the vehicle-mounted device 1 may change the display mode (display color, display on/off, etc.) of the gaze tracking line according to the workload value. For example, the vehicle-mounted device 1 superimposes the gaze tracking line on the outside-of-vehicle captured image when the workload value is equal to or greater than a predetermined threshold, and hides the gaze tracking line when the workload value is less than the threshold.
  • the above-mentioned predetermined threshold may be the same as the workload threshold, or may be different. In this way, the vehicle-mounted device 1 can display the gaze tracking line only when there is a high probability that the driver needs the assistance of the user of the external terminal 2 in checking for safety.
  • the vehicle-mounted device 1 may display the gaze tracking line in a color according to the workload value. In this case, the vehicle-mounted device 1 displays the gaze tracking line in a color that is more noticeable the higher the workload value, for example.
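  • The on/off and color behaviour just described might look like the following sketch; the threshold and the specific color steps are assumptions, since the embodiment only states that the line is hidden below a threshold and drawn in a more noticeable color as the workload value rises.

```python
# Sketch: choose whether and how to draw the gaze tracking line from the workload value.
# The threshold and the color ramp are illustrative assumptions.

from typing import Optional

GAZE_LINE_THRESHOLD = 60.0   # assumed; may equal the workload threshold or differ from it

def gaze_line_style(workload: float) -> Optional[dict]:
    """Return drawing attributes for the gaze tracking line, or None to hide it."""
    if workload < GAZE_LINE_THRESHOLD:
        return None                                   # hide the line at low workload
    if workload < 80.0:
        return {"color": "yellow", "width_px": 3}     # moderate emphasis
    return {"color": "red", "width_px": 5}            # most noticeable at high workload

print(gaze_line_style(45.0), gaze_line_style(70.0), gaze_line_style(90.0))
```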
  • the vehicle-mounted device 1 may highlight, in the outside-vehicle captured image, an object to which the driver should direct his or her gaze (also called a "gaze object"), and display the image on the external terminal 2.
  • the gaze object includes, for example, a traffic light, a road sign, an obstacle, a pedestrian, etc.
  • the vehicle-mounted device 1 uses any object recognition technology (including those using deep learning models such as instance segmentation) to extract objects corresponding to a predetermined type as the gaze object from the outside-vehicle captured image.
  • the vehicle-mounted device 1 then processes the outside-vehicle captured image so as to highlight the area of the extracted object by edging or the like and to superimpose the gaze tracking line, and transmits the processed image to the external terminal 2 together with the display instruction data.
  • the external terminal 2 that has received the display instruction data displays the outside-vehicle captured image with the gaze tracking line superimposed and the gaze object highlighted. This allows the user of the external terminal 2 to recognize the presence of an object that the driver is not looking at and to advise the driver to pay attention to it.
  • At least a part of the processing executed by the vehicle-mounted device 1 may be executed by a server device that performs data communication with the vehicle-mounted device 1 and the external terminal 2.
  • FIG. 9 shows an example of the configuration of a driving call system according to the second modification.
  • the driving call system has an in-vehicle device 1A, an external terminal 2, and a server device 5.
  • the in-vehicle device 1A and the server device 5, and the external terminal 2 and the server device 5 each perform data communication via the communication network 3.
  • the vehicle-mounted device 1A has the same configuration as the vehicle-mounted device 1 described in the first embodiment above (see FIG. 2). Note that if the server device 5 performs processing based on the map DB 4, the vehicle-mounted device 1A does not need to have the map DB 4. The vehicle-mounted device 1A transmits to the server device 5 an upload signal that includes information output by the sensor group 15 and information input via the input unit 13.
  • the server device 5 relays data necessary for voice calls between the in-vehicle device 1A and the external terminal 2. During a voice call, the server device 5 generates display instruction data based on an upload signal or the like received from the in-vehicle device 1A, and transmits the generated display instruction data to the external terminal 2. Specifically, during a voice call, the server device 5 executes the process of the flowchart shown in FIG. 7 based on an upload signal or the like received from the in-vehicle device 1A.
  • FIG. 10 shows an example of the schematic configuration of the server device 5.
  • the server device 5 mainly has a communication unit 41, a storage unit 42, and a control unit 44.
  • the elements in the server device 5 are connected to each other via a bus line 40.
  • the communication unit 41 performs data communication with external devices such as the in-vehicle unit 1A and the external terminal 2 under the control of the control unit 44.
  • the storage unit 42 is composed of various types of memory such as RAM, ROM, and non-volatile memory (including a hard disk drive, flash memory, etc.).
  • the storage unit 42 stores programs for the server device 5 to execute predetermined processes.
  • the storage unit 42 also includes a map DB 4.
  • the control unit 44 includes a CPU, GPU, etc., and controls the entire server device 5.
  • the control unit 44 also executes the programs stored in the storage unit 42 to perform processes required to display workload-related information on the external terminal 2.
  • the driving call system allows the user of the external terminal 2 to easily recognize the driver's workload state, and enables the user to choose a convenient time to talk to the driver.
  • the server device 5 is an example of an "information processing device.”
  • the in-vehicle device 1 or the server device 5 is an information processing device used in a system that mediates communication between a driver of a target vehicle, which is a moving body, and a user of an external terminal 2 located outside the target vehicle, and has a calculation means, a detection means, and a display control means.
  • the calculation means calculates a workload value that is an index value related to the degree of driving load (workload) of the driver of the target vehicle.
  • the detection means detects the cause of the increased driving load when the workload value indicates that the driving load is higher than a predetermined standard.
  • the display control means causes the external terminal 2 to display information related to the cause of the increased driving load. This enables the user of the external terminal 2 to choose a convenient timing for communicating with the driver.
  • Non-transitory computer readable media include various types of tangible storage media.
  • Examples of non-transitory computer readable media include magnetic storage media (e.g., flexible disks, magnetic tapes, hard disk drives), magneto-optical storage media (e.g., magneto-optical disks), CD-ROM (Read Only Memory), CD-R, CD-R/W, semiconductor memory (e.g., mask ROM, PROM (Programmable ROM), EPROM (Erasable PROM), flash ROM, RAM (Random Access Memory)).

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Mathematical Physics (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Signal Processing (AREA)
  • Computer And Data Communications (AREA)

Abstract

An in-vehicle device 1 or a server device 5 is an information processing apparatus used in a system that mediates communication between a driver of a target vehicle, which is a mobile body, and a user of an external terminal 2 present outside the target vehicle, and comprises a calculation means, a detection means, and a display control means. The calculation means calculates a workload value, which is an index value relating to the degree of driving load (workload) of the driver of the target vehicle. The detection means detects a cause of the increased driving load when the workload value indicates that the driving load is higher than a predetermined standard. The display control means displays, on the external terminal 2, information relating to the cause of the increased driving load.

Description

Information processing device, control method, program, and storage medium
The present invention relates to a system that mediates communication between a driver of a mobile object and a user outside the mobile object.
Technologies that enable conversations between a vehicle driver and a person outside the vehicle have been known for some time. For example, Patent Document 1 discloses a conversation providing system that enables a call between the vehicle driver and a conversation partner by communicating with an in-vehicle system via a communication network.
JP 2017-138277 A
When the driver is talking to someone outside the vehicle, it is difficult for the person outside the vehicle to grasp the driver's psychological state, so the conversation cannot be interrupted even when the driver's driving load is high, which can further increase the driver's driving load.
In consideration of the above-mentioned problems, one of the objectives of the present invention is to provide an information processing device that can effectively mediate communication between a driver of a moving body and a user outside the moving body.
The invention described in claim 1 is an information processing device used in a system that mediates communication between a driver of a moving body and a user outside the moving body, comprising: a calculation means for calculating an index value relating to the degree of driving load on the driver of the moving body; a detection means for detecting a cause of an increase in the driving load when the index value indicates that the driving load is higher than a predetermined standard; and a display control means for displaying information relating to the cause on a terminal device of the user.
The invention described in claim 10 is
a control method executed by an information processing device used in a system that mediates communication between a driver of a moving body and a user present outside the moving body, the control method characterized by comprising:
a calculation step of calculating an index value relating to the degree of the driving load of the driver of the moving body;
a detection step of detecting a cause of an increase in the driving load when the index value indicates that the driving load is higher than a predetermined standard; and
a display control step of displaying information relating to the cause on a terminal device of the user.
The invention described in claim 11 is
a program executed by a computer of an information processing device used in a system that mediates communication between a driver of a moving body and a user present outside the moving body, the program causing the computer to function as:
a calculation means for calculating an index value relating to the degree of the driving load of the driver of the moving body;
a detection means for detecting a cause of an increase in the driving load when the index value indicates that the driving load is higher than a predetermined standard; and
a display control means for displaying information relating to the cause on a terminal device of the user.
FIG. 1 shows a configuration example of a driving call system according to an embodiment.
FIG. 2 shows an example of a schematic configuration of the in-vehicle device.
FIG. 3 shows an example of a schematic configuration of the external terminal.
FIG. 4 shows a first display example of the external terminal while the target vehicle is traveling.
FIG. 5 shows a second display example of the external terminal while the target vehicle is traveling.
FIG. 6 shows a third display example of the external terminal while the target vehicle is traveling.
FIG. 7 is an example of a flowchart showing the procedure of a process executed by the in-vehicle device.
FIG. 8 shows a fourth display example of the external terminal according to a modification.
FIG. 9 shows a configuration example of a driving call system according to a modification.
FIG. 10 shows an example of a schematic configuration of a server device.
In one preferred embodiment of the present invention, an information processing device used in a system that mediates communication between the driver of a mobile body and a user present outside the mobile body includes: a calculation means for calculating an index value relating to the degree of the driving load of the driver of the mobile body; a detection means for detecting a cause of an increase in the driving load when the index value indicates that the driving load is higher than a predetermined standard; and a display control means for displaying information relating to the cause on the user's terminal device.
The above information processing device can suitably make a user present outside the mobile body aware of the cause that is increasing the driver's driving load. This allows the user to communicate with the driver in a manner appropriate to that cause.
In one aspect of the above information processing device, the calculation means calculates the index value based on the current state of at least one of the driving action of the mobile body, the driver, and the driving environment. In a preferred example, the detection means calculates scores that evaluate, for each element related to the driving load, the current state of at least one of the driving action of the mobile body, the driver, and the driving environment, and calculates the index value based on the scores. This allows the information processing device to accurately calculate the index value of the driving load.
In another aspect of the above information processing device, the detection means determines, based on the scores, the element that constitutes the cause. This aspect enables the information processing device to accurately detect the cause of the increased driving load.
In another aspect of the above information processing device, the display control means causes the terminal device to display the information relating to the cause together with information indicating the magnitude of the index value. With this aspect, the information processing device can suitably present the degree of the driving load, together with its cause, to the user present outside the mobile body.
In another aspect of the above information processing device, the display control means causes the terminal device to display a line representing the trajectory of the driver's gaze direction, superimposed on an image captured from the mobile body of the outside of the mobile body. This aspect allows the user of the terminal device to suitably grasp the directions to which the driver is paying little attention, so that the user can, through communication with the driver, assist with safety checks in those directions. In a preferred example, the display control means may change the display mode of the line according to the index value. In another preferred example, the display control means may display the line when the index value indicates that the driving load is higher than a predetermined standard, and hide the line when the index value indicates that the driving load is equal to or lower than the predetermined standard.
In another aspect of the above information processing device, the display control means highlights, in the captured image, an object to which the driver should direct his or her gaze. This aspect enables the user of the terminal device to identify an object the driver is not looking at and, for example, to communicate with the driver so as to prompt the driver to look at that object.
In another preferred embodiment of the present invention, a control method executed by an information processing device used in a system that mediates communication between the driver of a mobile body and a user present outside the mobile body includes: a calculation step of calculating an index value relating to the degree of the driving load of the driver of the mobile body; a detection step of detecting a cause of an increase in the driving load when the index value indicates that the driving load is higher than a predetermined standard; and a display control step of displaying information relating to the cause on the user's terminal device. By executing this control method, the information processing device can suitably make the user present outside the mobile body aware of the cause that is increasing the driver's driving load.
In still another embodiment of the present invention, a program executed by a computer of an information processing device used in a system that mediates communication between the driver of a mobile body and a user present outside the mobile body causes the computer to function as: a calculation means for calculating an index value relating to the degree of the driving load of the driver of the mobile body; a detection means for detecting a cause of an increase in the driving load when the index value indicates that the driving load is higher than a predetermined standard; and a display control means for displaying information relating to the cause on the user's terminal device. By executing this program, the computer of the information processing device can suitably make the user present outside the mobile body aware of the cause that is increasing the driver's driving load. Preferably, the program is stored in a storage medium.
A preferred embodiment of the present invention will be described below with reference to the drawings.
(1) System Configuration
FIG. 1 shows a configuration example of the driving call system according to the first embodiment. The driving call system includes an in-vehicle device 1 provided in a vehicle and an external terminal 2 used by a person present outside the vehicle. In the driving call system, the in-vehicle device 1 and the external terminal 2 communicate with each other to mediate a voice call between the driver of the vehicle and the user of the external terminal 2, and the user of the external terminal 2 talks with the driver while checking, on the external terminal 2, various information obtained from the vehicle.
The in-vehicle device 1 moves together with the vehicle and performs processing for realizing a call between the driver of that vehicle and the user of the external terminal 2. Hereinafter, the vehicle in which the in-vehicle device 1 is mounted is also referred to as the "target vehicle". The in-vehicle device 1 performs data communication with the external terminal 2 via a communication network 3 such as the Internet or a dedicated communication network. The data exchanged between the in-vehicle device 1 and the external terminal 2 includes voice data generated during a call between the driver of the target vehicle and the user of the external terminal 2, and display instruction data for displaying information about the target vehicle or the driver. In this embodiment, the in-vehicle device 1 transmits to the external terminal 2 display instruction data that includes information relating to the driver's driving load (workload) (also referred to as "workload-related information").
The in-vehicle device 1 may be a navigation device that is installed in the target vehicle and provides route guidance to a set destination, or may be a mobile terminal such as a smartphone. The in-vehicle device 1 may also be built into the target vehicle. The in-vehicle device 1 is an example of the "information processing device", and the target vehicle is an example of the "mobile body".
The external terminal 2 is a terminal operated by a person present outside the target vehicle, and performs data communication with the in-vehicle device 1 via the communication network 3. The external terminal 2 is, for example, a mobile terminal such as a smartphone. For example, the external terminal 2 establishes communication with the in-vehicle device 1 while the target vehicle is being driven, and exchanges with the in-vehicle device 1 the voice data for a call between the driver and the user of the external terminal 2. During the call, the external terminal 2 also receives display instruction data including workload-related information from the in-vehicle device 1 and, based on that data, displays the state of the driver's workload and related information. In this way, the external terminal 2 lets its user recognize the state of the driver's workload and suitably provides information that helps the user judge, for example, a convenient time to talk to the driver. The external terminal 2 is an example of the "terminal device".
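As a concrete illustration only, the display instruction data described above could be organized as a small structured payload combining the workload-related information with optional image and map data. The field names below are hypothetical assumptions, not taken from the publication.

```python
# Hypothetical sketch of a display instruction payload sent from the
# in-vehicle device 1 to the external terminal 2. Field names are
# illustrative assumptions, not part of the publication.
import json

display_instruction = {
    "workload_related_info": {
        "workload_value": 82,          # index value calculated by the in-vehicle device
        "workload_level": "high",      # discretized level shown in the display window
        "increase_causes": ["right_turn_at_intersection"],  # detected causes, if any
    },
    "exterior_image_id": "frame_000123",   # reference to the latest exterior image
    "map_display_info": {
        "current_position": {"lat": 35.0, "lon": 135.0},
        "route_points": [],            # guidance route polyline, if route guidance is active
    },
}

# The terminal would parse a payload like this and update its display areas accordingly.
payload_bytes = json.dumps(display_instruction).encode("utf-8")
```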
Communication between the in-vehicle device 1 and the external terminal 2 may be relayed by a server device (not shown). Even when the in-vehicle device 1 and the external terminal 2 establish direct communication, the in-vehicle device 1 and/or the external terminal 2 may exchange with the server device the information needed to establish that communication (for example, communication address information). In these cases, the server device performs the processing necessary for data communication between the in-vehicle device 1 and the external terminal 2 (including authentication of the in-vehicle device 1 and the external terminal 2).
(2) Device Configuration
FIG. 2 shows an example of the schematic configuration of the in-vehicle device 1. The in-vehicle device 1 mainly includes a communication unit 11, a storage unit 12, an input unit 13, a control unit 14, a sensor group 15, a display unit 16, and a sound output unit 17. The elements in the in-vehicle device 1 are connected to one another via a bus line 10.
The communication unit 11 performs data communication with other terminals under the control of the control unit 14. The communication unit 11 may, for example, receive map data for updating a map DB (DataBase) 4 from a map management server (not shown).
The storage unit 12 is composed of various types of memory such as RAM (Random Access Memory), ROM (Read Only Memory), and non-volatile memory (including hard disk drives and flash memory). The storage unit 12 stores programs for the in-vehicle device 1 to execute predetermined processes. These programs may include an application program for making calls and an application program for causing the external terminal 2 to display maps and images captured in the target vehicle. The storage unit 12 is also used as a working memory for the control unit 14. The programs executed by the in-vehicle device 1 may be stored in a storage medium other than the storage unit 12.
The storage unit 12 also stores the map DB 4, in which various data necessary for route guidance are recorded. The map DB 4 contains the data necessary for displaying a map based on a specified position such as the current position of the target vehicle. The map DB 4 is a database that includes, for example, road data representing a road network as a combination of nodes and links, and facility data indicating facilities that are candidates for a destination, a stop-off point, or a landmark. The map DB 4 may be updated, under the control of the control unit 14, based on information that the communication unit 11 receives from the map management server.
The input unit 13 is a button, a touch panel, a remote controller, a voice input device, or the like operated by the user. The display unit 16 is a display or the like that performs display under the control of the control unit 14. The sound output unit 17 is a speaker or the like that outputs sound under the control of the control unit 14.
The sensor group 15 includes various sensors that sense the state of the target vehicle or the environment outside the vehicle. The sensor group 15 includes an exterior camera 51, a driver camera 52, a vehicle behavior detector 53, and a biosensor 54.
The exterior camera 51 is one or more cameras that capture the outside of the target vehicle, such as the area in front of it, and generates images captured at predetermined time intervals (also referred to as "exterior images"). The driver camera 52 is a camera installed so that the driver's face is within its capture range, and generates images captured at predetermined time intervals (also referred to as "driver images").
The vehicle behavior detector 53 generates detection signals indicating the behavior of the target vehicle, such as the current position, vehicle speed, acceleration, and steering angle. The vehicle behavior detector 53 includes, for example, a GNSS (Global Navigation Satellite System) receiver, a gyro sensor, an IMU (Inertial Measurement Unit), a vehicle speed sensor, an acceleration sensor, and a steering angle sensor.
The biosensor 54 is one or more sensors that generate biosignals indicating the driver's biological phenomena, such as heart rate and amount of perspiration.
In addition to the exterior camera 51, the driver camera 52, the vehicle behavior detector 53, and the biosensor 54, the sensor group 15 may include various external sensors (including cameras, lidars, radars, ultrasonic sensors, infrared sensors, and sonars) and internal sensors.
The control unit 14 includes a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), and the like, and controls the entire in-vehicle device 1. The control unit 14 functions as the "calculation means", the "detection means", the "display control means", and a computer that executes programs.
The processing executed by the control unit 14 is not limited to being realized by software programs; it may be realized by any combination of hardware, firmware, and software. The processing executed by the control unit 14 may also be realized using a user-programmable integrated circuit such as an FPGA (Field-Programmable Gate Array) or a microcomputer. In that case, the integrated circuit may be used to realize the program that the control unit 14 executes in this embodiment.
The configuration of the in-vehicle device 1 shown in FIG. 2 is an example, and various modifications may be made to it. For example, instead of the storage unit 12 storing the map DB 4, the control unit 14 may receive map information from a map management server (not shown) via the communication unit 11. In another example, at least one of the input unit 13, the display unit 16, and the sound output unit 17 may be provided in the target vehicle as a device external to the in-vehicle device 1 and may supply generated signals to the in-vehicle device 1. At least some of the sensors in the sensor group 15 may be sensors mounted on the target vehicle. In that case, the in-vehicle device 1 may acquire the information output by those sensors from the target vehicle based on a communication protocol such as CAN (Controller Area Network).
FIG. 3 shows an example of the schematic configuration of the external terminal 2. The external terminal 2 mainly includes a communication unit 21, a storage unit 22, an input unit 23, a control unit 24, a sensor group 25, a display unit 26, and a sound output unit 27. The elements in the external terminal 2 are connected to one another via a bus line 20.
The communication unit 21 performs data communication with other terminals under the control of the control unit 24. The storage unit 22 is composed of various types of memory such as RAM, ROM, and non-volatile memory. The storage unit 22 stores programs for the external terminal 2 to execute predetermined processes. These programs may include an application program for, while communication with the in-vehicle device 1 is established, conducting a call with the driver, performing display relating to the driving of the target vehicle of the in-vehicle device 1 (including map display and display of various captured images), and performing display relating to the driver's workload. The storage unit 22 is also used as a working memory for the control unit 24. The programs executed by the external terminal 2 may be stored in a storage medium other than the storage unit 22.
The input unit 23 is a button, a touch panel, a remote controller, a voice input device, or the like operated by the user. The display unit 26 is a display or the like that performs display under the control of the control unit 24. The sound output unit 27 is a speaker or the like that outputs sound under the control of the control unit 24. The sensor group 25 includes internal sensors that sense the state of the external terminal 2 and external sensors that sense the state of the environment around the external terminal 2.
The control unit 24 includes a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), and the like, and controls the entire external terminal 2. The configuration of the external terminal 2 shown in FIG. 3 is an example, and various modifications may be made to it.
(3) Display of Workload-Related Information
Next, the display of workload-related information regarding the driving load (workload) of the driver of the target vehicle will be described. In outline, when the driver's workload is at or above a predetermined level, the in-vehicle device 1 identifies the cause that is raising the workload (also referred to as the "workload increase cause") and causes the external terminal 2 to display the identified workload increase cause. This allows the in-vehicle device 1 to make the user of the external terminal 2 aware of the workload increase cause, and the user of the external terminal 2 can then take appropriate action according to that cause.
(3-1) Calculation of the Workload Value
First, a method for calculating the workload value, a value (score) indicating the degree of the workload, will be described. In the following, for convenience of explanation, a higher workload value indicates a higher degree of driver workload.
The in-vehicle device 1 calculates the workload value based on scores (also referred to as "element-specific scores") that evaluate the current state of each element related to the workload (also referred to as "workload-related elements"). A workload-related element is an element representing a state of the target vehicle, the driver, or the driving environment. Examples of workload-related elements include elements relating to the environment of the road on which the target vehicle is traveling (also referred to as "road environment elements"), elements relating to driving actions (driving operations) that affect the workload (also referred to as "driving action elements"), and elements relating to the driver's physical condition (also referred to as "driver condition elements"). In other words, a workload-related element is a candidate for a workload increase cause. A workload-related element may also be a further subdivision of at least one of the road environment elements, the driving action elements, and the driver condition elements.
Accordingly, for example, while communication for a call is established with the external terminal 2, the in-vehicle device 1 calculates, for each workload-related element, an element-specific score that evaluates the current state of the target vehicle, the driver, or the driving environment. The in-vehicle device 1 then determines the sum, the average, or another representative value of the calculated element-specific scores as the workload value. The workload-related elements for which element-specific scores are calculated may be all of the road environment elements, the driving action elements, and the driver condition elements, or any one or any two of them. The element-specific scores may also preferably be weighted according to the degree of influence of each workload-related element on the workload. In this case, for example, a larger weighting coefficient is multiplied by the element-specific score of an element that has a greater influence on the workload, and information indicating these weighting coefficients is stored in advance in the storage unit 12 or the like. A minimal sketch of this weighted aggregation is given below.
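The following is a minimal sketch of how element-specific scores might be aggregated into a workload value using a weighted sum, assuming pre-stored weighting coefficients. The element names, weights, and score scale are illustrative assumptions, not values from the publication.

```python
# Hypothetical aggregation of element-specific scores into a workload value.
# Element names and weighting coefficients are illustrative assumptions.

# Weighting coefficients, assumed to be stored in advance (e.g., in the storage unit 12).
WEIGHTS = {
    "road_environment": 1.2,
    "driving_action": 1.5,
    "driver_condition": 1.0,
}

def compute_workload_value(element_scores: dict[str, float]) -> float:
    """Return a weighted sum of element-specific scores as the workload value."""
    return sum(WEIGHTS.get(name, 1.0) * score
               for name, score in element_scores.items())

# Example: scores evaluated for the current state (scale is illustrative).
scores = {"road_environment": 20.0, "driving_action": 35.0, "driver_condition": 10.0}
workload_value = compute_workload_value(scores)  # 1.2*20 + 1.5*35 + 1.0*10 = 86.5
print(workload_value)
```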
When calculating the element-specific score of a road environment element, the in-vehicle device 1 calculates the score taking into account, for example, at least one of the congestion level, the road width, the speed limit, and the visibility of the road being traveled. In this case, the in-vehicle device 1, for example, refers to the road information in the map DB 4 or to road traffic information acquired via the communication unit 11, identifies the degree of congestion, road width, speed limit, visibility, and the like of the road being traveled, and converts each identified degree into an element-specific score by referring to a predetermined formula or table. The predetermined formula or table is stored in advance in the storage unit 12 or the like. The road environment element may also be subdivided into multiple workload-related elements. In that case, for example, the in-vehicle device 1 may treat each of the congestion level, the road width, the speed limit, and the visibility of the road being traveled as a workload-related element and calculate an element-specific score for each of them.
When calculating the element-specific score of a driving action element, the in-vehicle device 1 determines, for example, which driving action the current driving action corresponds to, such as a right turn, a left turn, merging, or a temporary stop, and calculates an element-specific score according to the determined driving action. In this case, the in-vehicle device 1 determines the element-specific score of the driving action element by referring, for example, to a table that associates each driving action with an element-specific score. Here, the in-vehicle device 1 may determine the current driving action from information obtained through the route guidance process (for example, the current position of the target vehicle and the route to the set destination), or from signals obtained from the target vehicle that indicate the state of the turn signals or the steering. The in-vehicle device 1 may also determine the element-specific score of the driving action element based on whether auto-cruise driving (or another automated driving function) is active. The driving action element may be subdivided into multiple workload-related elements. For example, the presence or absence of a right turn, a left turn, merging, a temporary stop, and auto-cruise driving may each be set as a workload-related element, and the in-vehicle device 1 determines the element-specific score of each of these workload-related elements according to the presence or absence of the corresponding driving action by referring to a predetermined table or the like.
When calculating the element-specific score of a driver condition element, the in-vehicle device 1 calculates the score taking into account at least one of the driver's continuous driving time, drowsiness level (alertness level), and other biometric indicators that affect driving. In this case, the in-vehicle device 1 identifies, for example, the drowsiness level or other biometric indicators that affect driving based on the signals output by the biosensor 54. The in-vehicle device 1 then refers to a predetermined formula or table and converts the identified continuous driving time and one or more biometric indicators into the element-specific score of the driver condition element. The driver condition element may be subdivided into multiple workload-related elements. In that case, for example, the in-vehicle device 1 may treat each of the drowsiness level and the continuous driving time as a workload-related element and calculate an element-specific score for each of them. A table-lookup sketch of this kind of per-element conversion is given below.
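As a rough illustration of the table-based conversion described above, the sketch below maps an observed driving action and a driver's drowsiness level to element-specific scores using simple lookup tables. The tables, score values, and thresholds are assumptions chosen for illustration only.

```python
# Hypothetical table-based scoring for a driving action element and a driver
# condition element. Tables, score values, and thresholds are illustrative.

DRIVING_ACTION_SCORES = {
    "right_turn": 40.0,
    "left_turn": 25.0,
    "merge": 35.0,
    "temporary_stop": 15.0,
    "auto_cruise": 5.0,
    "none": 10.0,
}

def driving_action_score(action: str) -> float:
    """Look up the element-specific score for the current driving action."""
    return DRIVING_ACTION_SCORES.get(action, DRIVING_ACTION_SCORES["none"])

def driver_condition_score(drowsiness: float, continuous_minutes: float) -> float:
    """Convert drowsiness level (0..1) and continuous driving time into a score."""
    drowsiness_part = 30.0 if drowsiness > 0.6 else 10.0 if drowsiness > 0.3 else 0.0
    fatigue_part = 20.0 if continuous_minutes > 120 else 5.0 if continuous_minutes > 60 else 0.0
    return drowsiness_part + fatigue_part

print(driving_action_score("right_turn"))   # 40.0
print(driver_condition_score(0.7, 90))      # 30.0 + 5.0 = 35.0
```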
In this way, the in-vehicle device 1 can determine the workload value while taking into account the various elements that affect the workload.
(3-2) Detection of the Workload Increase Cause
Next, a method for detecting the workload increase cause will be described.
When the calculated workload value exceeds a predetermined threshold (also referred to as the "workload threshold"), the in-vehicle device 1 executes a process for detecting the workload increase cause. The workload threshold is, for example, determined in advance and stored in the storage unit 12. The workload threshold is an example of the "predetermined standard".
When the workload value exceeds the threshold, the in-vehicle device 1 identifies the proportion of the workload value accounted for by the element-specific score of each workload-related element (also referred to as the "workload value occupancy ratio"). Next, the in-vehicle device 1 adds up the workload value occupancy ratios, starting from the workload-related element with the highest ratio, until the total reaches a predetermined proportion. When the total of the workload value occupancy ratios reaches the predetermined proportion, the in-vehicle device 1 detects the workload-related elements whose ratios were added up as the workload increase cause. The predetermined proportion is, for example, determined in advance and stored in the storage unit 12.
For example, suppose the predetermined proportion is 30%. In this case, the in-vehicle device 1 first determines whether the workload value occupancy ratio of the workload-related element with the highest ratio is 30% or more. If it is, the in-vehicle device 1 detects that workload-related element as the workload increase cause. If it is not, the in-vehicle device 1 determines whether the combined workload value occupancy ratios of the elements ranked first and second reach 30% or more, and if so, detects those workload-related elements as the workload increase cause. If the combined ratio of the first and second elements does not reach 30%, the in-vehicle device 1 determines whether the combined ratios of the elements ranked first through third reach 30% or more. The in-vehicle device 1 repeats this process until the combined workload value occupancy ratios reach 30% or more, and at that point detects the workload-related elements whose ratios were added up as the workload increase cause. A short sketch of this procedure follows.
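The following is a minimal sketch of the cumulative-ratio procedure described above, assuming element-specific scores are already available. The scores, element names, and the 30% figure below are illustrative only.

```python
# Hypothetical detection of workload increase causes: accumulate the elements
# with the highest share of the workload value until their combined share
# reaches a predetermined proportion (30% in this illustration).

def detect_increase_causes(element_scores: dict[str, float],
                           required_share: float = 0.30) -> list[str]:
    """Return the elements whose combined share of the total first reaches required_share."""
    total = sum(element_scores.values())
    if total <= 0:
        return []
    ranked = sorted(element_scores.items(), key=lambda kv: kv[1], reverse=True)
    causes, accumulated = [], 0.0
    for name, score in ranked:
        causes.append(name)
        accumulated += score / total
        if accumulated >= required_share:
            break
    return causes

# Example with illustrative scores: the top element alone falls short of 30%,
# so the top two elements together are detected as the cause.
scores = {"drowsiness": 11.0, "low_visibility": 10.0, "road_width": 10.0, "speed_limit": 9.0}
print(detect_increase_causes(scores))  # ['drowsiness', 'low_visibility'] (0.275, then 0.525)
```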
In this way, the in-vehicle device 1 can suitably detect the one or more workload-related elements that are causing the workload increase. That is, one or more of the following elements is identified as the cause of the increased workload: the environment of the road being traveled (congestion level, road width, speed limit, visibility), the driving action, and the driver's physical condition (such as continuous driving time and drowsiness level based on biometric information).
(3-3) Display Examples
FIG. 4 shows a first display example of the external terminal 2 while the target vehicle of the in-vehicle device 1 is traveling. The external terminal 2 has established communication with the in-vehicle device 1 and displays the screen shown in FIG. 4 on the display unit 26 based on the display instruction data received from the in-vehicle device 1.
The display screen shown in FIG. 4 has an exterior image display area 60, a workload-related information display area 61, and a map display area 62.
In the exterior image display area 60, the external terminal 2 displays the latest exterior image (moving image) generated by the exterior camera 51. In this case, the in-vehicle device 1 transmits to the external terminal 2 display instruction data that includes the latest exterior image generated by the exterior camera 51.
Based on the workload-related information included in the display instruction data transmitted by the in-vehicle device 1, the external terminal 2 displays a display window 71 and an indicator 72 in the workload-related information display area 61.
The display window 71 shows the level of the workload value calculated by the in-vehicle device 1 and the workload increase cause detected by the in-vehicle device 1. Here, because the workload value calculated by the in-vehicle device 1 is at the highest level (for example, the highest of five levels), the display window 71 shows "High workload", which indicates the highest level. Also, because the workload value exceeds the workload threshold, the in-vehicle device 1 detects the workload increase cause (here, a right turn at an intersection), and the external terminal 2 displays "Right turn at intersection" in the display window 71.
The indicator 72 represents the degree of the workload; the larger the workload value calculated by the in-vehicle device 1, the further its gauge extends to the right. Here, the workload value is near its maximum, so the gauge of the indicator 72 extends to near the right end. The color of the gauge may change according to the workload value; for example, the higher the workload value, the more conspicuous the color in which the gauge is displayed. The "High workload" indication in the display window 71 and the indicator 72 are examples of "information indicating the magnitude of the index value". A sketch of one way to derive this level and gauge from the workload value follows.
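As one possible way to derive the displayed level and gauge length from the workload value, the sketch below divides the value range into five levels. The value range, level labels, and thresholds are illustrative assumptions, not part of the publication.

```python
# Hypothetical mapping from a workload value to a five-level label and a gauge
# length for the indicator. The value range (0..100) and labels are assumptions.

LEVEL_LABELS = ["Low workload", "Slightly low workload", "Moderate workload",
                "Slightly high workload", "High workload"]

def workload_level(value: float, max_value: float = 100.0) -> str:
    """Map a workload value onto one of five display levels."""
    ratio = max(0.0, min(value / max_value, 1.0))
    index = min(int(ratio * 5), 4)  # clamp to indices 0..4
    return LEVEL_LABELS[index]

def gauge_fraction(value: float, max_value: float = 100.0) -> float:
    """Fraction of the indicator gauge to fill (0.0 = empty, 1.0 = full)."""
    return max(0.0, min(value / max_value, 1.0))

print(workload_level(95.0))   # 'High workload'
print(gauge_fraction(95.0))   # 0.95
```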
The external terminal 2 also displays, in the map display area 62, a map of the vicinity of the current position of the target vehicle. The in-vehicle device 1 generates display instruction data that allows the external terminal 2 to display this map, based on the state of the target vehicle estimated from the vehicle behavior detector 53, the map DB 4, and information about the route guidance to the destination, and transmits the display instruction data to the external terminal 2. Here, a current position mark 73 indicating the current position of the target vehicle and a route line 74 indicating the guidance route along which the in-vehicle device 1 guides the driver of the target vehicle are superimposed on the map in the map display area 62.
Thus, in the first display example, when the workload is high, the in-vehicle device 1 can suitably make the user of the external terminal 2 aware that the workload is high and of its cause (here, that the driver is making a right turn at an intersection). As a result, the user of the external terminal 2 can, for example, judge that the driver should concentrate on driving and refrain from talking about matters unrelated to driving (such as a shopping request) until the right turn at the intersection is completed.
FIG. 5 shows a second display example of the external terminal 2 while the target vehicle of the in-vehicle device 1 is traveling. The external terminal 2 has established communication with the in-vehicle device 1 and displays the screen shown in FIG. 5 on the display unit 26 based on the display instruction data received from the in-vehicle device 1. The display screen has an exterior image display area 60, a workload-related information display area 61, and a map display area 62.
In the exterior image display area 60, the external terminal 2 displays the latest exterior image (moving image) generated by the exterior camera 51, based on the display instruction data received from the in-vehicle device 1. In the map display area 62, the external terminal 2 displays a map of the vicinity of the current position of the target vehicle, based on the display instruction data received from the in-vehicle device 1.
Based on the workload-related information included in the display instruction data received from the in-vehicle device 1, the external terminal 2 displays, in the workload-related information display area 61, a display window 71A and an indicator 72A indicating the degree of the workload. Here, because the workload value calculated by the in-vehicle device 1 is equal to or lower than the workload threshold, the in-vehicle device 1 does not detect a workload increase cause. The external terminal 2 therefore displays "Low workload" in the display window 71A, indicating that the workload value is at its lowest level, and does not display a workload increase cause. In this example, the in-vehicle device 1 has detected from the vehicle that the current driving mode of the target vehicle is auto-cruise driving, in which the driver's workload is low, and therefore includes the fact that the vehicle is in auto-cruise driving in the workload-related information. Based on the workload-related information received from the in-vehicle device 1, the external terminal 2 indicates in the workload-related information display area 61 that the vehicle is in auto-cruise driving. The external terminal 2 also displays, in the workload-related information display area 61, an indicator 72A whose gauge length corresponds to the level of the workload value.
Thus, in the second display example, when the workload is low, the in-vehicle device 1 suitably makes the user of the external terminal 2 aware that the workload is low. As a result, the user of the external terminal 2 can, for example, judge that the driver of the in-vehicle device 1 has relative leeway and choose that timing to talk to the driver about necessary matters.
FIG. 6 shows a third display example of the external terminal 2 while the target vehicle of the in-vehicle device 1 is traveling. The external terminal 2 has established communication with the in-vehicle device 1 and displays the screen shown in FIG. 6 on the display unit 26 based on the display instruction data received from the in-vehicle device 1. The display screen has an exterior image display area 60, a workload-related information display area 61, and a map display area 62.
In the exterior image display area 60, as in the first and second display examples, the external terminal 2 displays the latest exterior image (moving image) generated by the exterior camera 51, based on the display instruction data received from the in-vehicle device 1. In the map display area 62, as in the first and second display examples, the external terminal 2 displays a map of the vicinity of the current position of the target vehicle, based on the display instruction data received from the in-vehicle device 1.
Based on the workload-related information received from the in-vehicle device 1, the external terminal 2 displays, in the workload-related information display area 61, a display window 71B showing the degree of the workload and the workload increase cause, and an indicator 72B visually indicating the degree of the workload. Here, because the workload value calculated by the in-vehicle device 1 exceeds the workload threshold, the workload increase cause is displayed in the display window 71B. Specifically, the in-vehicle device 1 has detected the drowsiness level of the driver condition element and the visibility of the road environment element as workload increase causes, and as a result "Drowsiness + low visibility" is displayed in the display window 71B. The display window 71B also shows "Slightly high workload", indicating the workload level corresponding to the workload value calculated by the in-vehicle device 1. The external terminal 2 further displays, in the workload-related information display area 61, an indicator 72B whose gauge length corresponds to the level of the determined workload value.
Thus, in the third display example, when the workload is high because the driver is becoming drowsy, the in-vehicle device 1 can suitably make the user of the external terminal 2 aware that the workload is high and of its cause. As a result, the user of the external terminal 2 can, for example, actively talk to the driver of the in-vehicle device 1 so as to keep the driver alert.
The display layouts in the first to third display examples are only examples, and various modifications may be applied. For example, the external terminal 2 may display the driver image instead of the exterior image based on an operation performed on the external terminal 2 by its user. In another example, the external terminal 2 may provide only one of the exterior image display area 60 and the map display area 62 on the display screen. In yet another example, the external terminal 2 may provide only the workload-related information display area 61 on the display screen.
(4) Processing Flow
FIG. 7 is an example of a flowchart showing the procedure of the process executed by the in-vehicle device 1. The in-vehicle device 1 executes the process of the flowchart shown in FIG. 7 when communication with the external terminal 2 has been established and a call has started.
 まず、車載機1は、対象車両の運転者のワークロードに関連する各ワークロード関連要素のスコア(即ち要素別スコア)を算出する(ステップS101)。この場合、車載機1は、地図DB4及びセンサ群15が出力するデータ等に基づき、道路環境要素、運転動作要素、運転者体調要素、又はこれらを細分化した要素ごとの要素別スコアを算出する。 First, the vehicle-mounted device 1 calculates the score of each workload-related element related to the workload of the driver of the target vehicle (i.e., element-specific score) (step S101). In this case, the vehicle-mounted device 1 calculates element-specific scores for road environment elements, driving behavior elements, driver physical condition elements, or elements that are further subdivided from these, based on the map DB 4 and data output by the sensor group 15.
 次に、車載機1は、各ワークロード関連要素のスコア(要素別スコア)に基づき、ワークロード値を算出する(ステップS102)。この場合、車載機1は、要素別スコアを合算又は平均(重み係数を用いる場合を含む)したワークロード値を算出してもよく、ワークロード値を算出する式に各要素別スコアを代入することで、ワークロード値を算出してもよい。 Next, the vehicle-mounted device 1 calculates a workload value based on the scores of each workload-related element (element-specific scores) (step S102). In this case, the vehicle-mounted device 1 may calculate a workload value by adding up or averaging (including using a weighting factor) the element-specific scores, or may calculate the workload value by substituting each element-specific score into a formula for calculating the workload value.
 次に、車載機1は、ワークロード値がワークロード閾値より大きい値であるか否か判定する(ステップS103)。そして、ワークロード値がワークロード閾値より大きい値である場合(ステップS103;Yes)、車載機1は、ワークロード上昇原因を検出する(ステップS104)。そして、車載機1は、ワークロード値及び検出したワークロード上昇原因に関するワークロード関連情報を含む表示指示データを車外端末2に供給する(ステップS105)。これにより、車外端末2は、表示指示データに基づいて、ワークロード値及びワークロード上昇原因に関する表示を実行する。なお、表示指示データに、車外撮影画像、運転者画像、現在位置周辺の地図を表示するための地図表示情報の少なくともいずれかが含まれている場合には、車外端末2は、車外撮影画像、運転者画像、又は地図表示情報の少なくともいずれかに基づく表示を行ってもよい。 Next, the vehicle-mounted device 1 determines whether the workload value is greater than the workload threshold (step S103). If the workload value is greater than the workload threshold (step S103; Yes), the vehicle-mounted device 1 detects the cause of the workload increase (step S104). The vehicle-mounted device 1 then supplies display instruction data including the workload value and workload-related information related to the detected cause of the workload increase to the external terminal 2 (step S105). As a result, the external terminal 2 performs display related to the workload value and the cause of the workload increase based on the display instruction data. Note that if the display instruction data includes at least one of an image taken outside the vehicle, an image of the driver, and map display information for displaying a map around the current position, the external terminal 2 may perform display based on at least one of the image taken outside the vehicle, the image of the driver, and the map display information.
On the other hand, if the workload value is equal to or less than the workload threshold (step S103; No), the vehicle-mounted device 1 supplies display instruction data including workload-related information on the workload value to the external terminal 2 (step S106). Based on the display instruction data, the external terminal 2 performs display relating to the workload value. Note that, if the display instruction data includes at least one of an image captured outside the vehicle, an image of the driver, and map display information, the external terminal 2 may perform display based on at least one of these.
Then, the vehicle-mounted device 1 determines whether the call has ended (step S107). If the vehicle-mounted device 1 determines that the call has ended (step S107; Yes), it ends the processing of the flowchart. On the other hand, if it determines that the call has not ended (step S107; No), it returns the processing to step S101.
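The overall loop of Fig. 7 can be summarized as the following sketch. The five callables and the threshold value are hypothetical stand-ins for the processing described above, introduced here only to show the control flow; none of the names come from the embodiment itself.

```python
import time

WORKLOAD_THRESHOLD = 70.0  # assumed value of the workload threshold

def processing_loop(calculate_element_scores, calculate_workload_value,
                    detect_workload_causes, send_display_data, call_is_active):
    """Sketch of steps S101-S107, run while the call with the external terminal 2 is active.

    All five arguments are hypothetical callables standing in for the processing
    described in the text; they are not part of the embodiment itself.
    """
    while call_is_active():                                   # step S107
        scores = calculate_element_scores()                   # step S101
        workload_value = calculate_workload_value(scores)     # step S102
        if workload_value > WORKLOAD_THRESHOLD:               # step S103
            causes = detect_workload_causes(scores)           # step S104
            send_display_data(workload_value, causes)         # step S105
        else:
            send_display_data(workload_value, None)           # step S106
        time.sleep(1.0)  # assumed update interval between cycles
```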
(5) Modifications
Next, modifications suitable for the above-described embodiment will be described. The following modifications may be applied to the above-described embodiment in combination with one another.
(Modification 1)
When the vehicle-mounted device 1 causes the external terminal 2 to display the image captured outside the vehicle, it may superimpose on that image a line representing the recognized trajectory of the driver's gaze (a gaze tracking line).
Fig. 8 shows a fourth display example of the external terminal 2 according to Modification 1. The external terminal 2 has established communication with the vehicle-mounted device 1 and displays the display screen shown in Fig. 8 on the display unit 26 based on the display instruction data received from the vehicle-mounted device 1. Here, as an example, the external terminal 2 provides, on the display screen, the outside-vehicle image display area 60, the workload-related information display area 61, and the map display area 62, as in the first to third display examples, based on the display instruction data received from the vehicle-mounted device 1.
Here, in the outside-vehicle image display area 60, a gaze tracking line 66 indicating the trajectory of the gaze direction of the driver of the target vehicle within the most recent predetermined time period is superimposed on the latest image captured outside the vehicle, in correspondence with the gaze direction in that image. In this case, the vehicle-mounted device 1 detects, based on any gaze detection technique and from the driver-captured image, the gaze direction on the image captured outside the vehicle taken at the same time, and generates gaze direction data indicating the detected gaze direction. The vehicle-mounted device 1 then superimposes the gaze tracking line 66, which indicates the driver's gaze directions within the most recent predetermined time period indicated by the gaze direction data, on the image captured outside the vehicle that is included in the display instruction data transmitted to the external terminal 2. The predetermined time period is, for example, stored in advance in the storage unit 12. Upon receiving the display instruction data, the external terminal 2 displays the image with the gaze tracking line 66 superimposed on it in the outside-vehicle image display area 60.
Note that the vehicle-mounted device 1 may, for example, store in advance face-gaze correspondence data that associates each position on the image captured outside the vehicle with a face image model of the driver looking at that position, and generate the gaze direction data based on this face-gaze correspondence data. In this case, the vehicle-mounted device 1 identifies the face image model that most closely matches the driver's face image extracted from the driver-captured image, and generates gaze direction data indicating the position on the image captured outside the vehicle that corresponds to the identified face image model.
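A minimal sketch of this matching step and of drawing the gaze tracking line is given below. It assumes the face-gaze correspondence data can be reduced to a list of (face feature vector, image position) pairs, that a face feature extractor already exists, and that OpenCV is available for drawing; the nearest-feature matching and the drawing calls are illustrative assumptions, not the embodiment's prescribed method.

```python
import numpy as np
import cv2  # OpenCV, assumed available for drawing on the captured image

def estimate_gaze_point(face_feature, face_gaze_data):
    """Return the image position paired with the face model closest to the observed face.

    face_gaze_data: list of (model_feature: np.ndarray, position: (x, y)) pairs,
    a simplified stand-in for the face-gaze correspondence data.
    """
    best_position, best_distance = None, float("inf")
    for model_feature, position in face_gaze_data:
        distance = np.linalg.norm(face_feature - model_feature)
        if distance < best_distance:
            best_distance, best_position = distance, position
    return best_position

def draw_gaze_tracking_line(image, recent_gaze_points, color=(0, 0, 255)):
    """Superimpose the gaze tracking line 66 by connecting recent gaze points in order."""
    for p0, p1 in zip(recent_gaze_points, recent_gaze_points[1:]):
        cv2.line(image, p0, p1, color, 2)
    return image
```

In this sketch, one gaze point would be estimated per driver-captured frame and appended to a buffer covering the predetermined time period, and the buffer contents would then be drawn onto the latest outside-vehicle image.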
As described above, according to this modification, the user of the external terminal 2 can easily grasp the direction to which the driver is paying little attention (the right direction in the example of Fig. 8), so the user of the external terminal 2 can, through the call with the driver, assist in confirming safety in that direction. In addition, the driver can obtain a sense of reassurance as if a passenger were watching over him or her, and a sense of unity as if the user of the external terminal 2 were actually riding along.
In a preferred example, the vehicle-mounted device 1 may change the display mode of the gaze tracking line (display color, whether it is displayed, and so on) according to the workload value. For example, the vehicle-mounted device 1 superimposes the gaze tracking line on the image captured outside the vehicle when the workload value is equal to or greater than a predetermined threshold, and hides the gaze tracking line when the workload value is less than that threshold. The predetermined threshold may be the same as or different from the workload threshold. In this way, the vehicle-mounted device 1 can display the gaze tracking line only when there is a high probability that the user of the external terminal 2 needs to assist in confirming safety. On the other hand, in normal situations where such assistance is unnecessary, the user of the external terminal 2 can simply enjoy the scenery in the image captured outside the vehicle. In another example, the vehicle-mounted device 1 may display the gaze tracking line in a color corresponding to the workload value; for instance, the higher the workload value, the more conspicuous the display color.
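The following sketch illustrates one way such a workload-dependent display policy could be expressed; the threshold value, the orange-to-red color ramp, and the function name are assumptions made here for illustration only.

```python
GAZE_LINE_THRESHOLD = 70.0  # assumed; may equal the workload threshold or differ from it

def gaze_line_style(workload_value):
    """Return (visible, BGR color) for the gaze tracking line based on the workload value."""
    if workload_value < GAZE_LINE_THRESHOLD:
        return False, None                       # hide the line in normal situations
    # Higher workload -> more conspicuous color (fade from orange toward pure red).
    excess = min((workload_value - GAZE_LINE_THRESHOLD) / (100.0 - GAZE_LINE_THRESHOLD), 1.0)
    color = (0, int(165 * (1.0 - excess)), 255)  # BGR: orange at the threshold, red at 100
    return True, color
```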
When displaying the gaze tracking line, the vehicle-mounted device 1 may also highlight, in the image captured outside the vehicle, objects to which the driver should direct his or her gaze (also referred to as "gaze objects") and cause the external terminal 2 to display them. Gaze objects include, for example, traffic lights, road signs, obstacles, and pedestrians. In this case, the vehicle-mounted device 1 uses any object recognition technique (including techniques using deep learning models such as instance segmentation) to extract, from the image captured outside the vehicle, objects of the types predetermined as gaze objects. The vehicle-mounted device 1 then processes the image so as to emphasize the areas of the extracted objects, for example by outlining them, and to superimpose the gaze tracking line, includes the processed image in the display instruction data, and transmits the data to the external terminal 2. Upon receiving the display instruction data, the external terminal 2 displays the image with the gaze tracking line superimposed and the gaze objects highlighted. This allows the user of the external terminal 2 to recognize, from the image captured outside the vehicle, the presence of a gaze object at which the driver is not looking and to advise the driver to pay attention to that object.
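As an illustration only, the sketch below outlines how detected gaze objects could be emphasized and the tracking line overlaid before the image is included in the display instruction data. The detector interface (a list of bounding boxes with class labels) and the set of gaze-object classes are assumptions; the embodiment leaves the recognition technique open.

```python
import cv2  # assumed available for the drawing operations

GAZE_OBJECT_CLASSES = {"traffic_light", "road_sign", "obstacle", "pedestrian"}  # assumed label set

def highlight_and_overlay(image, detections, recent_gaze_points):
    """Outline detected gaze objects and overlay the gaze tracking line before transmission.

    detections: list of (class_label, (x, y, w, h)) pairs produced by any object recognizer,
    e.g., an instance segmentation model reduced to bounding boxes (an assumption here).
    """
    for label, (x, y, w, h) in detections:
        if label in GAZE_OBJECT_CLASSES:
            # Emphasize the gaze object by outlining its area in the image.
            cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 255), 3)
    # Superimpose the gaze tracking line by connecting recent gaze points in order.
    for p0, p1 in zip(recent_gaze_points, recent_gaze_points[1:]):
        cv2.line(image, p0, p1, (0, 0, 255), 2)
    return image
```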
(Modification 2)
At least part of the processing executed by the vehicle-mounted device 1 may be executed by a server device that performs data communication with the vehicle-mounted device 1 and the external terminal 2.
Fig. 9 shows a configuration example of the driving call system according to Modification 2. The driving call system includes a vehicle-mounted device 1A, the external terminal 2, and a server device 5. The vehicle-mounted device 1A and the server device 5, and the external terminal 2 and the server device 5, each perform data communication via the communication network 3.
The vehicle-mounted device 1A has the same configuration as the vehicle-mounted device 1 described in the first embodiment above (see Fig. 2). Note that, if the server device 5 performs the processing based on the map DB 4, the vehicle-mounted device 1A does not need to have the map DB 4. The vehicle-mounted device 1A transmits to the server device 5 an upload signal that includes the information output by the sensor group 15, the input information entered via the input unit 13, and the like.
The server device 5 relays the data necessary for the voice call between the vehicle-mounted device 1A and the external terminal 2. During the voice call, the server device 5 also generates display instruction data based on the upload signal and the like received from the vehicle-mounted device 1A, and transmits the generated display instruction data to the external terminal 2. Specifically, during the voice call, the server device 5 executes the processing of the flowchart shown in Fig. 7 based on the upload signal and the like received from the vehicle-mounted device 1A.
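A sketch of this server-side arrangement is shown below, assuming a simple handler that receives an upload signal (sensor outputs and input information), runs the same per-cycle processing as Fig. 7, and pushes display instruction data to the external terminal 2. The class name, the injected pipeline callable, the payload layout, and the highest-score rule for picking the cause are all hypothetical; the embodiment does not specify this API.

```python
WORKLOAD_THRESHOLD = 70.0  # assumed workload threshold, as in the sketch of Fig. 7

class DrivingCallServer:
    """Illustrative server-side handler for Modification 2; not the embodiment's actual API."""

    def __init__(self, map_db, terminal_connection, workload_pipeline):
        self.map_db = map_db                  # map DB 4 held on the server side
        self.terminal = terminal_connection   # connection object assumed to expose send()
        self.pipeline = workload_pipeline     # callable: (upload_signal, map_db) -> (value, scores)

    def on_upload_signal(self, upload_signal):
        """Process one upload signal from the vehicle-mounted device 1A during a voice call."""
        workload_value, scores = self.pipeline(upload_signal, self.map_db)   # steps S101-S102
        if workload_value > WORKLOAD_THRESHOLD:                              # step S103
            cause = max(scores, key=scores.get)                              # step S104 (assumed rule)
            payload = {"workload": workload_value, "causes": [cause]}        # step S105
        else:
            payload = {"workload": workload_value, "causes": []}             # step S106
        self.terminal.send(payload)   # display instruction data pushed to the external terminal 2
```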
Fig. 10 shows an example of the schematic configuration of the server device 5. The server device 5 mainly includes a communication unit 41, a storage unit 42, and a control unit 44. The elements in the server device 5 are connected to one another via a bus line 40.
Under the control of the control unit 44, the communication unit 41 performs data communication with external devices such as the vehicle-mounted device 1A and the external terminal 2. The storage unit 42 is composed of various types of memory such as RAM, ROM, and non-volatile memory (including a hard disk drive, flash memory, and the like). The storage unit 42 stores programs for the server device 5 to execute predetermined processing, and also includes the map DB 4. The control unit 44 includes a CPU, a GPU, and the like, and controls the server device 5 as a whole. By executing the programs stored in the storage unit 42, the control unit 44 executes the processing necessary for causing the external terminal 2 to display the workload-related information.
In this way, even when the server device 5 executes the processing necessary for causing the external terminal 2 to display the workload-related information, the driving call system, as in the embodiment, allows the user of the external terminal 2 to suitably recognize the state of the driver's workload, and the user can choose a convenient time to talk to the driver. In this modification, the server device 5 is an example of the "information processing device."
As described above, the vehicle-mounted device 1 or the server device 5 is an information processing device used in a system that mediates communication between the driver of the target vehicle, which is a moving body, and a user of the external terminal 2 located outside the target vehicle, and includes a calculation means, a detection means, and a display control means. The calculation means calculates a workload value, which is an index value relating to the degree of the driving load (workload) of the driver of the target vehicle. The detection means detects the cause that is increasing the driving load when the workload value indicates that the driving load is higher than a predetermined standard. The display control means causes the external terminal 2 to display information relating to that cause. This enables the user of the external terminal 2 to choose a convenient time to communicate with the driver.
In each of the above-described embodiments, the program can be stored using various types of non-transitory computer-readable media and supplied to the control unit or the like, which is a computer. Non-transitory computer-readable media include various types of tangible storage media. Examples of non-transitory computer-readable media include magnetic storage media (for example, flexible disks, magnetic tapes, and hard disk drives), magneto-optical storage media (for example, magneto-optical disks), CD-ROM (Read Only Memory), CD-R, CD-R/W, and semiconductor memory (for example, mask ROM, PROM (Programmable ROM), EPROM (Erasable PROM), flash ROM, and RAM (Random Access Memory)).
Although the present invention has been described above with reference to the embodiments, the present invention is not limited to the above-described embodiments. Various changes that a person skilled in the art can understand can be made to the configuration and details of the present invention within the scope of the present invention. That is, the present invention naturally includes various modifications and amendments that a person skilled in the art could make in accordance with the entire disclosure, including the claims, and the technical idea. In addition, the disclosures of the above-cited patent documents and the like are incorporated herein by reference.
1, 1A  Vehicle-mounted device
2  External terminal
3  Communication network
4  Map DB
5  Server device
11, 21, 41  Communication unit
12, 22, 42  Storage unit
13  Input unit
14, 24, 44  Control unit
15, 25  Sensor group
16, 26  Display unit
17, 27  Sound output unit

Claims (12)

1. An information processing device used in a system that mediates communication between a driver of a moving body and a user present outside the moving body, the information processing device comprising:
    a calculation means for calculating an index value relating to a degree of a driving load of the driver of the moving body;
    a detection means for detecting a cause that is increasing the driving load when the index value indicates that the driving load is higher than a predetermined standard; and
    a display control means for causing a terminal device of the user to display information relating to the cause.
2. The information processing device according to claim 1, wherein the calculation means calculates the index value based on a current state relating to at least one of a driving operation of the moving body, the driver, and a travel environment.
3. The information processing device according to claim 2, wherein the calculation means calculates a score that evaluates the current state relating to at least one of the driving operation of the moving body, the driver, and the travel environment for each element related to the driving load, and calculates the index value based on the score.
4. The information processing device according to claim 3, wherein the detection means determines, based on the score, the element that constitutes the cause.
5. The information processing device according to any one of claims 1 to 4, wherein the display control means causes the terminal device to display the information relating to the cause and information representing a magnitude of the index value.
6. The information processing device according to any one of claims 1 to 4, wherein the display control means causes the terminal device to display a line representing a trajectory of a gaze direction of the driver, superimposed on a captured image obtained by photographing the outside of the moving body from the moving body.
7. The information processing device according to claim 6, wherein the display control means changes a display mode of the line according to the index value.
8. The information processing device according to claim 7, wherein the display control means displays the line when the index value indicates that the driving load is higher than a predetermined standard, and hides the line when the index value indicates that the driving load is equal to or lower than the predetermined standard.
9. The information processing device according to claim 6, wherein the display control means highlights, in the captured image, an object to which the driver should direct his or her gaze.
10. A control method executed by an information processing device used in a system that mediates communication between a driver of a moving body and a user present outside the moving body, the control method comprising:
    a calculation step of calculating an index value relating to a degree of a driving load of the driver of the moving body;
    a detection step of detecting a cause that is increasing the driving load when the index value indicates that the driving load is higher than a predetermined standard; and
    a display control step of causing a terminal device of the user to display information relating to the cause.
11. A program executed by a computer of an information processing device used in a system that mediates communication between a driver of a moving body and a user present outside the moving body, the program causing the computer to function as:
    a calculation means for calculating an index value relating to a degree of a driving load of the driver of the moving body;
    a detection means for detecting a cause that is increasing the driving load when the index value indicates that the driving load is higher than a predetermined standard; and
    a display control means for causing a terminal device of the user to display information relating to the cause.
12. A storage medium storing the program according to claim 11.
PCT/JP2023/022911 2022-10-12 2023-06-21 Information processing apparatus, control method, program, and storage medium WO2024079942A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022-163660 2022-10-12
JP2022163660 2022-10-12

Publications (1)

Publication Number Publication Date
WO2024079942A1 true WO2024079942A1 (en) 2024-04-18

Family

ID=90669254

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/022911 WO2024079942A1 (en) 2022-10-12 2023-06-21 Information processing apparatus, control method, program, and storage medium

Country Status (1)

Country Link
WO (1) WO2024079942A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011118565A (en) * 2009-12-02 2011-06-16 Denso It Laboratory Inc Workload indicator device, workload display method, and program
WO2011148455A1 (en) * 2010-05-25 2011-12-01 富士通株式会社 Video processing device, video processing method, and video processing program
JP2016213791A (en) * 2015-05-13 2016-12-15 株式会社デンソー Wakefulness maintenance system and on-vehicle unit
JP2022129626A (en) * 2021-02-25 2022-09-06 株式会社デンソーテン On-vehicle terminal device, electronic conference system, electronic conference method and program

Similar Documents

Publication Publication Date Title
JP4370869B2 (en) Map data updating method and map data updating apparatus
JP4973331B2 (en) Information provision device
US20070067100A1 (en) Merge support system
JP2016193683A (en) Vehicle control device
JP2005041433A (en) Vehicle guiding device and route judging program
JP7211707B2 (en) Agent cooperation method
JP2018173862A (en) Driving support apparatus and computer program
JP2014120111A (en) Travel support system, travel support method, and computer program
JP2018149941A (en) Concentration level determination device, concentration level determination method, and program for determining concentration level
JP2018052326A (en) Vehicle control method and vehicle control device
JP2020024551A (en) Driving consciousness estimation device
JP2014120114A (en) Travel support system, travel support method, and computer program
JP7107157B2 (en) Operation support method and operation support device
JP2017129973A (en) Driving support apparatus and driving support method
JP6136238B2 (en) Driving support system, driving support method, and computer program
US20190202291A1 (en) Vehicle operation system and non-transitory tangible computer readable storage medium
JP2018151907A (en) Device for determining degree of concentration, method for determining degree of concentration, and program for determining degree of concentration
JP2020035437A (en) Vehicle system, method to be implemented in vehicle system, and driver assistance system
US20220076670A1 (en) Virtual conversation agent for controlling multiple vehicular intelligent virtual assistants
JP2014120113A (en) Travel support system, travel support method, and computer program
JP6575451B2 (en) Driving support device and driving support program
JP7376996B2 (en) Vehicle dangerous situation determination device, vehicle dangerous situation determination method, and program
JP2019096108A (en) Driving support device
WO2024079942A1 (en) Information processing apparatus, control method, program, and storage medium
JP7163581B2 (en) Agent cooperation system and agent cooperation method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23876951

Country of ref document: EP

Kind code of ref document: A1