WO2018029904A1 - Driving tendency determination apparatus and driving tendency determination system - Google Patents


Info

Publication number
WO2018029904A1
Authority
WO
WIPO (PCT)
Prior art keywords
acceleration
driving tendency
vehicle
data
determination
Prior art date
Application number
PCT/JP2017/014736
Other languages
French (fr)
Japanese (ja)
Inventor
Hiroshi Itahara (板原 弘)
Original Assignee
Panasonic IP Management Co., Ltd. (パナソニックIPマネジメント株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Panasonic IP Management Co., Ltd.
Publication of WO2018029904A1 publication Critical patent/WO2018029904A1/en

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00: Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08: Interaction between the driver and the control system
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B19/00: Teaching not covered by other main groups of this subclass
    • G09B19/16: Control of vehicles or other craft
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B9/00: Simulators for teaching or training purposes
    • G09B9/02: Simulators for teaching or training purposes for teaching control of vehicles or other craft
    • G09B9/04: Simulators for teaching or training purposes for teaching control of vehicles or other craft for teaching control of land vehicles
    • G09B9/052: Simulators for teaching or training purposes for teaching control of vehicles or other craft for teaching control of land vehicles characterised by provision for recording or measuring trainee's performance

Definitions

  • the present disclosure relates to a driving tendency determination device and a driving tendency determination system that determine a driving tendency of a vehicle driver.
  • Patent Document 1 discloses a vehicle driving support device that determines a driving operation state of a vehicle based on an amount of change in acceleration or the like.
  • the vehicle driving support device of Patent Document 1 includes change amount calculating means that calculates a first related value related to the amount of change in the acceleration of the vehicle, jerk calculating means that calculates a second related value related to the jerk of the vehicle, acceleration calculating means that calculates a third related value related to the absolute value of the acceleration of the vehicle, and state determining means that determines the driving operation state of the vehicle based on the first to third related values.
  • when the first related value is equal to or larger than a predetermined value, the state determining means uses a first determination map and, based on the first related value and the second related value, determines the driving operation state of the vehicle according to a criterion set in advance from the ratio of the kinetic energy of a mass point at the end of the change to the amount of change in the acceleration of the vehicle, calculated using a vibration model representing the movement of the mass point. When the first related value is smaller than the predetermined value, the state determining means uses a second determination map different from the first determination map and determines the driving operation state based on the first related value and the third related value. With this configuration, the driving operation state of the vehicle can be determined accurately regardless of the amount of change in acceleration.
  • a driving tendency determination device includes an acquisition unit that acquires information indicating the acceleration and jerk of a vehicle measured when a determination target person drives the vehicle, and an arithmetic unit implementing an artificial intelligence function trained using determination images, each comprising a two-dimensional plane on which a time series of acceleration and jerk measured when a vehicle is driven by an arbitrary driver is plotted.
  • the calculation unit generates a plurality of determination images based on the acceleration and jerk information measured while the determination target person drives, and inputs the plurality of determination images to the artificial intelligence function to determine the driving tendency of the determination target person.
  • a driving tendency determination system includes a measuring device that measures vehicle acceleration, a portable terminal that receives information indicating the measured acceleration from the measuring device and transfers it to a driving tendency determination device, and the driving tendency determination device, which receives the information indicating acceleration from the portable terminal and trains the artificial intelligence function based on the received information.
  • according to the driving tendency determination device and the driving tendency determination system of the present disclosure, the driving tendency of the driver of a vehicle can be determined based on information on acceleration and jerk measured while the vehicle is driven.
  • FIG. 1 is a diagram illustrating a configuration of a driving tendency determination system according to Embodiment 1 of the present disclosure.
  • FIG. 2 is a block diagram showing the configuration of the telemeter unit in the first embodiment.
  • FIG. 3 is a block diagram showing a configuration of the portable terminal in the first embodiment.
  • FIG. 4 is a block diagram showing the configuration of the data server (driving tendency determination device) in the first embodiment.
  • FIG. 5 is a diagram for explaining the flow of driving tendency data in the first embodiment.
  • FIG. 6 is a diagram illustrating the change in vehicle speed when sudden braking is applied while the vehicle is driven down a slope.
  • FIG. 7 is a diagram illustrating the change in acceleration when sudden braking is applied while the vehicle is driven down a slope.
  • FIG. 8A is a view for explaining a J (jerk)-G (acceleration) plane.
  • FIG. 8B is a diagram for explaining an area determined as dangerous driving on the J (jerk)-G (acceleration) plane.
  • FIG. 9 is a diagram showing a convolutional neural network in the AI (artificial intelligence) function.
  • FIG. 10 is a diagram illustrating a method of generating an image (JG plane image) used for machine learning.
  • FIG. 11 is a diagram for explaining the learning procedure of the convolutional neural network.
  • FIG. 12 is a flowchart showing a process of transmitting driving tendency data from the telemeter unit to the portable terminal.
  • FIG. 13 is a flowchart showing driving tendency determination processing in the data server.
  • FIG. 14 is a diagram illustrating a configuration of the driving tendency determination system according to the second embodiment of the present disclosure.
  • FIG. 15 is a diagram for explaining the flow of driving tendency data in the second embodiment.
  • FIG. 16 is a diagram illustrating a configuration of the driving tendency determination system according to the third embodiment of the present disclosure.
  • FIG. 17 is a diagram for explaining the flow of driving tendency data in the third embodiment.
  • FIG. 1 illustrates a configuration of a driving tendency determination system according to an embodiment of the present disclosure.
  • the driving tendency determination system 100 includes a telemeter unit 10, a mobile terminal 40, and a data server 50.
  • the telemeter unit 10 is installed in the vehicle 30, detects the behavior of the vehicle 30 while the vehicle 30 is driven, generates data indicating the driving tendency (or driving operation state) of the driver of the vehicle based on that behavior (hereinafter referred to as "driving tendency data"), and transmits the data to the outside.
  • the portable terminal 40 receives the driving tendency data from the telemeter unit 10 and transmits it to the data server 50 via the network 200.
  • the data server 50 updates the database based on the received driving tendency data.
  • FIG. 2 is a diagram illustrating the configuration of the telemeter unit 10.
  • the telemeter unit 10 includes a controller 11, an acceleration sensor 15, a communication interface 18 that enables wireless communication with other electronic devices in accordance with communication standards such as WiFi and Bluetooth (registered trademark), and a memory 17 that stores data and the like.
  • the controller 11 is composed of a CPU (Central Processing Unit) and an MPU (Micro Processing Unit), and implements predetermined functions to be described later by executing a program stored in the memory 17.
  • the program executed by the controller 11 may be provided via the network 200 or may be provided by a recording medium such as a CD-ROM.
  • the acceleration sensor 15 is a sensor that detects the acceleration of the vehicle 30 in three orthogonal directions (X, Y, and Z directions).
  • the width direction of the vehicle 30 is defined as the X direction
  • the traveling direction (forward direction) of the vehicle 30 is defined as the Y direction
  • the upward direction (zenith direction) of the vehicle 30 is defined as the Z direction.
  • the memory 17 is a recording medium for storing various data, and is composed of, for example, a semiconductor storage element such as a flash memory.
  • the memory 17 stores the programs executed by the controller 11 and data.
  • as the memory 17, a recording medium such as a removable memory card or a hard disk may also be used.
  • the communication interface 18 is a module that performs wireless communication in accordance with a communication standard such as WiFi or Bluetooth (registered trademark).
  • the communication interface 18 may perform communication according to a communication standard such as LTE (Long Term Evolution) or 3G. Note that the communication interface 18 is not limited to wireless communication, and may be wired communication.
  • the portable terminal 40 can communicate with the telemeter unit 10. In addition, the mobile terminal 40 can transmit information to the data server 50 via the network 200.
  • a smartphone is assumed as an example of the mobile terminal 40, but the mobile terminal 40 may be a PDA (Personal Digital Assistant), a mobile phone, or the like.
  • FIG. 3 is a diagram illustrating the configuration of the mobile terminal 40.
  • the portable terminal 40 includes an imaging unit 42 that captures an image, a display unit 43 that displays information such as an image, and an operation unit 45. Furthermore, the portable terminal 40 includes a first communication interface 48 that communicates to connect to a network, and a second communication interface 49 that communicates with other electronic devices. Furthermore, the mobile terminal 40 includes a RAM (Random Access Memory) 46 and a data storage unit 47 that store data and the like, and a controller 41 that controls the overall operation of the mobile terminal 40.
  • the portable terminal 40 may include an acceleration sensor as will be described later.
  • the imaging unit 42 includes an image sensor such as a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor) image sensor, and captures a subject to generate image data.
  • the display unit 43 is composed of, for example, a liquid crystal display or an organic EL display.
  • the operation unit 45 includes a touch panel that is arranged on the upper surface of the display unit 43 and receives a touch operation by a user.
  • the operation unit 45 further includes operation buttons.
  • the operation buttons include buttons physically provided on the main body of the mobile terminal 40 and virtual buttons realized by the display unit 43 and the touch panel function.
  • the first communication interface 48 is a communication module for connecting to a network, and performs communication according to a communication standard such as LTE or 3G.
  • the second communication interface 49 is a communication module for wirelessly communicating with other electronic devices at a relatively short distance, and performs communication according to a communication standard such as WiFi or Bluetooth (registered trademark).
  • the second communication interface 49 may communicate with other electronic devices via a cable.
  • the second communication interface 49 may perform data communication in accordance with a standard such as USB (Universal Serial Bus).
  • the RAM 46 is a storage element that temporarily stores programs and data, and functions as a work area for the controller 41.
  • the data storage unit 47 is a recording medium that stores programs and various data, and may be configured by a recording medium such as a hard disk, a semiconductor memory, or a removable memory card.
  • the data storage unit 47 stores programs executed by the controller 41 (OS: Operating System, application program 47a) and data.
  • the controller 41 includes a CPU and an MPU, and implements a predetermined function to be described later by executing an application program 47a stored in the data storage unit 47.
  • the application program 47a may be provided via the network 200 or may be provided by a recording medium such as a CD-ROM.
  • FIG. 4 is a diagram illustrating the configuration of the data server 50.
  • the data server 50 is constituted by an information processing apparatus such as a personal computer.
  • the data server 50 includes a controller 51 that controls the overall operation thereof, a display unit 53 that performs screen display, an operation unit 55 that is operated by a user, a RAM 56 that stores data and programs, and a data storage unit 57.
  • the display unit 53 is configured by, for example, a liquid crystal display or an organic EL display.
  • the operation unit 55 includes a keyboard, a mouse, a touch panel, and the like.
  • the data server 50 further includes a device interface 58 for connecting to an external device such as a printer, and a network interface 59 for connecting to a network.
  • the device interface 58 is a communication module that performs data communication in accordance with USB, HDMI (registered trademark) (High-Definition Multimedia Interface), IEEE 1394, or the like.
  • the network interface 59 is a communication module that performs data communication in accordance with standards such as IEEE 802.11 and WiFi.
  • the controller 51 is composed of a CPU and an MPU, and realizes a predetermined function by executing a predetermined control program 57a stored in the data storage unit 57.
  • the control program executed by the controller 51 may be provided via the network 200 or may be provided on a recording medium such as a CD-ROM.
  • the RAM 56 is a storage element that temporarily stores programs and data, and functions as a work area for the controller 51.
  • the data storage unit 57 is a recording medium that stores parameters, data, and programs necessary for realizing the functions, and stores a control program executed by the controller 51 and various data.
  • the data storage unit 57 includes, for example, a hard disk (HDD: Hard Disk Drive) or a semiconductor storage device (SSD: Solid State Drive).
  • the data storage unit 57 is installed with a control program 57a for realizing functions described later.
  • the controller 51 implements various functions to be described later by executing the control program 57a.
  • the data storage unit 57 also stores a database 57b regarding the driving tendency of the driver.
  • the data storage unit 57 also functions as a work area for the controller 51.
  • the database 57b manages data indicating driving tendency for each driver (hereinafter referred to as “driving tendency data”).
  • the driving tendency data includes a driver ID for identifying the driver, a vehicle ID for identifying the vehicle, a data sampling date, information indicating acceleration, and information indicating jerk.
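The record layout described above can be sketched as a simple data structure. A minimal sketch; the class and field names below are illustrative assumptions, not identifiers from the patent:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class DrivingTendencyRecord:
    """One driving tendency record, following the fields listed above (names illustrative)."""
    driver_id: str               # identifies the driver
    vehicle_id: str              # identifies the vehicle
    sampled_on: str              # data sampling date
    acceleration_g: List[float]  # acceleration samples (in G)
    jerk: List[float]            # jerk samples derived from the acceleration

# A hypothetical record as it might be stored in database 57b.
record = DrivingTendencyRecord("driver-001", "vehicle-123", "2017-04-12",
                               [0.01, 0.02, 0.05], [0.5, 1.5, 0.0])
```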
  • FIG. 5 is a diagram for explaining the flow of driving tendency data in the driving tendency determination system 100.
  • the telemeter unit 10 calculates driving tendency data indicating the driving tendency of the driver based on the acceleration data of the vehicle 30 (S1), and transmits the driving tendency data to the portable terminal 40 via the communication interface 18 (S2).
  • the portable terminal 40 receives the driving tendency data from the telemeter unit 10 via the second communication interface 49 (S2), and transmits the received driving tendency data to the data server 50 via the first communication interface 48 (S3).
  • the data server 50 receives the driving tendency data from the portable terminal 40 (S3), and updates the database 57b based on the received driving tendency data (S4).
  • in this way, the driving tendency data detected by the telemeter unit 10 is transmitted to the data server 50, the database 57b is updated with the driving tendency data, and the driving tendency of the driver of the vehicle can be recognized.
  • next, the driving tendency data used for determining the driving tendency will be described.
  • there is a correlation between acceleration during driving and driving tendency. For example, in ordinary city driving, the maximum acceleration of a car is about 0.15 G, and if it exceeds 0.3 G, the driving can be determined to be rough.
  • FIG. 6 is a diagram for explaining the change in vehicle speed measured when sudden braking is applied while the vehicle is driven down a gentle slope.
  • in this measurement, a minivan (with two occupants) equipped with an accelerometer was driven down a gentle slope, decelerated from a speed of approximately 50 km/h, and suddenly braked when the speed reached 20 km/h.
  • the change in the acceleration of the vehicle as it decelerated to 0 km/h was measured for about 11 seconds at a sampling interval of 20 ms.
  • FIG. 7 is a diagram for explaining the change in acceleration measured by the acceleration sensor for the vehicle behavior shown in FIG. 6. Plot X shows the acceleration in the X direction (vehicle width direction), plot Y shows the acceleration in the Y direction (vehicle traveling (front-rear) direction), and plot Z shows the acceleration in the Z direction (upward direction of the vehicle).
  • when sudden braking is started at time t1, the acceleration in the Y direction (traveling direction) changes greatly.
  • because the vehicle is descending a slope, an offset of about 0.3 G always appears in the acceleration in the Y direction.
  • the magnitude of this offset varies depending on the slope angle of the slope and the weight of the vehicle.
  • when such an offset occurs in the acceleration, it is difficult to accurately determine from the acceleration alone whether the vehicle is in a "safe driving state" or a "dangerous driving state".
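The slope-induced offset can be related to the slope angle: on a grade of angle θ, gravity contributes sin θ (in units of G) along the travel axis. A minimal sketch; the function name and the derived angle are illustrative, not from the patent:

```python
import math

def slope_offset_g(slope_deg: float) -> float:
    """Longitudinal acceleration offset, in G, induced by a slope of the given angle."""
    return math.sin(math.radians(slope_deg))

# The 0.3 G offset observed above would correspond to a grade of roughly 17.5 degrees.
angle_deg = math.degrees(math.asin(0.3))
```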
  • the present inventor found that by considering the jerk (also referred to as "jolt") in addition to the acceleration, the driving tendency (that is, the driving state) can be determined accurately.
  • the jerk is a quantity indicating the rate of change of acceleration, and is obtained by differentiating the acceleration with respect to time.
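Since jerk is the time derivative of acceleration, it can be approximated from sampled acceleration by finite differences over the sampling interval (20 ms in the measurement described above). A minimal sketch; the function name is illustrative:

```python
def jerk_from_acceleration(accel_g, dt_s=0.02):
    """Approximate jerk (G/s) by forward differences at a 20 ms sampling interval."""
    return [(a1 - a0) / dt_s for a0, a1 in zip(accel_g, accel_g[1:])]

# Three acceleration samples 20 ms apart yield two jerk values.
jerk = jerk_from_acceleration([0.00, 0.01, 0.03])
```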
  • FIG. 8A shows a two-dimensional plane (hereinafter referred to as the "J-G plane") on which the acceleration and jerk at sampling points, measured in the Y-axis direction (traveling direction) over a predetermined period (for example, 11 seconds), are plotted for the vehicle behavior shown in FIG. 6.
  • on the J-G plane, the horizontal axis represents jerk (J) and the vertical axis represents acceleration (G).
  • the present inventor found that on such a J-G plane, the plotted sampling points draw a unique trajectory when the behavior of the vehicle changes suddenly. Specifically, in a safe driving situation as shown in FIGS. 6 and 7, the plots are concentrated in a range P (the region indicated by the alternate long and short dash line in FIG. 8B) where jerk (J) and acceleration (G) are relatively small. When sudden braking then creates a dangerous driving situation, the plotted trajectory Q extends counterclockwise far beyond the range P on the J-G plane.
  • an image in which the sampling points are plotted on the J-G plane in this way thus contains information corresponding to the driving tendency of the driver, such as the trajectory Q that appears in response to sudden braking.
  • focusing on this point, the inventor performs machine learning on images of the J-G plane (hereinafter referred to as "JG plane images") using a convolutional neural network.
  • the controller 51 of the data server 50 has an AI (Artificial Intelligence) function.
  • the controller 51 of the data server 50 uses a convolutional neural network (CNN) model in the AI function.
  • the convolutional neural network is trained on images (JG plane images) generated from long-term driving tendency data of drivers.
  • Data of an image (JG plane image) used for machine learning for a convolutional neural network is obtained as follows.
  • FIG. 10 is a diagram illustrating a method for generating the images (JG plane images) used for machine learning. It is assumed that long-term (for example, 30 hours) acceleration and jerk data (driving tendency data) 300 for a driver is measured by the telemeter unit 10. From this long-term driving tendency data 300, the acceleration and jerk data measured during a predetermined period (for example, 11 seconds) are plotted on the JG plane to create a JG plane image 310. As shown in FIG. 8B, the predetermined period (for example, 11 seconds) is set so that information on the behavior of the vehicle from sudden braking to vehicle stop is included in one JG plane image.
  • a plurality of JG plane images 310 are generated from the driving tendency data 300 measured over a long period (for example, 30 hours) (see FIG. 10). That is, for a single driver, a plurality of JG plane images (JG plane image group) are generated from the driving tendency data measured over a long period of time. Then, the JG plane image group for a large number of drivers is machine-learned by a convolutional neural network.
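The image-generation step above can be sketched as follows: split the long recording into fixed-length windows and rasterize each window's (jerk, acceleration) samples onto a small grid. The grid size, axis ranges, and function names are illustrative assumptions, not values from the patent:

```python
def windows(samples, length):
    """Split a long recording into consecutive fixed-length windows
    (e.g. 11 s at a 20 ms interval -> 550 samples per window)."""
    return [samples[i:i + length] for i in range(0, len(samples) - length + 1, length)]

def jg_plane_image(jerk, accel_g, size=32, j_range=10.0, g_range=1.0):
    """Rasterize (jerk, acceleration) samples onto a size x size binary grid,
    with jerk on the horizontal axis and acceleration on the vertical axis."""
    img = [[0] * size for _ in range(size)]
    for j, g in zip(jerk, accel_g):
        col = min(size - 1, max(0, int((j + j_range) / (2 * j_range) * size)))
        row = min(size - 1, max(0, int((g + g_range) / (2 * g_range) * size)))
        img[row][col] = 1
    return img

# Two samples: one near the origin (gentle driving), one with large jerk and deceleration.
image = jg_plane_image([0.0, 5.0], [0.0, 0.5])
```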
  • FIG. 11 is a diagram for explaining the learning procedure of the convolutional neural network.
  • convolutional neural network learning is performed by executing three steps: (1) unsupervised deep learning (clustering), (2) supervised deep learning (weighting the clustered groups), and (3) inference. Each step will be described.
  • in unsupervised deep learning, JG plane image groups for a large number of drivers are input to the convolutional neural network.
  • the convolutional neural network performs clustering and automatically constructs a feature group (clustering) from the JG plane image group. For example, as shown in FIG. 9, clusters (feature groups) C1, C2, and C3 are constructed.
  • in supervised deep learning, history information indicating that a given driver has caused an accident (accident occurrence date and time) is given as a teacher signal, and clusters that respond strongly to such drivers are weighted as "accident reserve groups", that is, groups of drivers with a high probability of causing an accident.
  • the cluster C1 is classified as an “accident reserve group” that is a group of drivers who have a high probability of accident occurrence.
  • the cluster C2 is classified as a “slightly dangerous driving group” that is a group of drivers having a slightly higher probability of accident occurrence.
  • Cluster C3 is classified as a “safe driving group” which is a group of drivers who perform safe driving.
  • in this way, the convolutional neural network is trained using a sufficient number of JG plane images.
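The feature-extraction stage of such a convolutional network can be illustrated with a single valid 2D convolution over a JG plane grid. This is a pure-Python sketch for illustration only: a real CNN would learn its kernels and stack many convolution, pooling, and fully connected layers, and the diagonal kernel below is an invented example:

```python
def conv2d(img, kernel):
    """One valid 2D convolution producing a single feature map,
    as in the first layer of a convolutional neural network."""
    kh, kw = len(kernel), len(kernel[0])
    rows, cols = len(img) - kh + 1, len(img[0]) - kw + 1
    return [[sum(img[r + i][c + j] * kernel[i][j]
                 for i in range(kh) for j in range(kw))
             for c in range(cols)] for r in range(rows)]

# A tiny diagonal kernel applied to a 3x3 grid with one active cell.
feature_map = conv2d([[0, 0, 0], [0, 1, 0], [0, 0, 0]], [[1, 0], [0, -1]])
```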
  • FIG. 12 is a flowchart showing a driving tendency data transmission process from the telemeter unit 10 to the portable terminal 40.
  • the driving tendency data transmission processing in the telemeter unit 10 will be described.
  • the process shown in FIG. 12 is mainly executed by the controller 11 of the telemeter unit 10.
  • the acceleration sensor 15 detects (measures) the acceleration (G) of the vehicle 30 at a predetermined sampling interval (for example, 20 msec), and information indicating the detected (measured) acceleration is accumulated in the memory 17.
  • the controller 11 of the telemeter unit 10 reads and acquires information on the acceleration (G) measured within a predetermined period (for example, one month) from the memory 17 (S11).
  • the controller 11 calculates the jerk (J) at each sampling point from the read Y-direction accelerations (G) (S12), and generates driving tendency data from the acceleration and jerk information of each sampling point.
  • the controller 11 transmits the driving tendency data to the portable terminal 40 via the communication interface 18 (S13).
  • the driving tendency data obtained based on the acceleration data stored in the memory 17 of the telemeter unit 10, that is, the driving tendency data measured in a certain period is transmitted to the portable terminal 40.
  • the portable terminal 40 receives the driving tendency data from the telemeter unit 10 via the second communication interface 49, and transmits the received driving tendency data via the first communication interface 48 to the data server 50 connected to the network 200.
  • the data server 50 receives the driving tendency data via the network interface 59 and accumulates the received driving tendency data in the database 57b (see FIGS. 4 and 5).
  • the user inputs information (ID number, name, etc.) specifying the target person who wants to determine the driving tendency through the operation unit 55 of the data server 50.
  • when the controller 51 of the data server 50 receives the information specifying the target person, it accesses the database 57b to acquire long-term driving tendency data (for example, 30 hours' worth) for the target person, and creates a JG plane image group based on the acquired driving tendency data (S21).
  • the controller 51 inputs the JG plane image group generated for the subject to the convolutional neural network, and determines the driving tendency for the subject (S22).
  • for each input JG plane image of the subject, the convolutional neural network outputs a classification result as the determination result of the subject's driving tendency.
  • the controller 51 outputs information indicating the driving tendency determination result obtained from the convolutional neural network to the display unit 53 (S23).
  • each JG plane image is classified into one of the clusters (for example, C1 to C3), and information indicating that cluster is output.
  • the ratio of each output cluster to the total number of input JG plane images may be obtained and displayed as a determination result on the display unit.
  • the results output for each image of the JG plane image group input to the convolutional neural network are counted for each cluster.
  • the ratio (%) of the number of results classified into cluster C3 to the total number of JG plane images input for determination is obtained as the "safe driving rate" (a rate indicating that the probability of causing an accident is extremely low).
  • similarly, the ratio (%) of the number of results classified into cluster C2 to the total number of JG plane images is defined as the "somewhat dangerous driving rate" (a rate indicating that the probability of causing an accident is somewhat high), and the ratio (%) of the number of results classified into cluster C1 is defined as the "dangerous driving rate" (a rate indicating that the probability of causing an accident is very high). These ratios may then be displayed on the display unit 53 as the driving tendency determination result; for example, "safe driving rate: 80%, somewhat dangerous driving rate: 15%, dangerous driving rate: 5%" may be displayed.
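The tallying described above reduces to counting the per-cluster classification results and converting the counts to percentages. A minimal sketch using the cluster labels C1 to C3 from the example; the function name is illustrative:

```python
from collections import Counter

def driving_rates(cluster_labels):
    """Share (%) of JG plane images classified into each cluster."""
    counts = Counter(cluster_labels)
    total = len(cluster_labels)
    return {cluster: 100.0 * n / total for cluster, n in counts.items()}

# 20 images classified as in the example text:
# 16 safe (C3), 3 somewhat dangerous (C2), 1 dangerous (C1).
rates = driving_rates(["C3"] * 16 + ["C2"] * 3 + ["C1"] * 1)
```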
  • the determination result of the driving tendency may be stored in the data storage unit 57 or may be transmitted to another device via the network interface 59 or the device interface 58.
  • the data server 50 of the present embodiment can determine the driving tendency of the subject from the subject's JG plane image group.
  • the data server 50 includes an acquisition unit (the network interface 59, the data storage unit 57, and the like) that acquires information indicating the acceleration and jerk of the vehicle measured when the determination target person drives the vehicle, and a controller 51 (calculation unit) implementing an artificial intelligence function trained using determination images, each comprising a two-dimensional plane on which a time series of acceleration and jerk measured when a vehicle is driven by an arbitrary driver is plotted.
  • the controller 51 generates a plurality of JG plane images (an example of a determination image) based on acceleration and jerk information measured when the determination target person drives the vehicle, and uses the plurality of JG plane images as an artificial intelligence function. To determine the driving tendency of the person to be determined.
  • the driving tendency determination system 100 includes the telemeter unit 10 (an example of a measurement device) that measures vehicle acceleration, the mobile terminal 40 that receives information indicating the measured acceleration from the telemeter unit 10 and transfers it, and the data server 50 that receives the information indicating acceleration from the mobile terminal 40 and trains the artificial intelligence function based on the received information.
  • The data server 50 inputs J-G plane images, in which the acceleration and jerk of the vehicle measured for a determination target person are plotted on a two-dimensional plane, into a convolutional neural network that has been trained in advance, and thereby obtains the determination result of the driving tendency of the determination target person.
  • A plurality of J-G plane images, that is, driving tendency data measured over a long period of time, is used as the input when training the artificial intelligence function. For this reason, the driving tendency of a driver can be determined.
  • API: Application Program Interface.
  • The driving tendency of a driver who is likely to cause an accident can be automatically classified into clusters, and the association between clusters and accidents can be automated.
  • Driving-oriented deep learning, accident probability inference, and the like can be implemented on a common platform, including the final goal of inference (accident prediction).
  • FIG. 14 is a diagram illustrating a configuration of the driving tendency determination system 100b according to the second embodiment of the present disclosure.
  • In Embodiment 1, the driving tendency determination system 100 was composed of the telemeter unit 10, the mobile terminal 40, and the data server 50.
  • the driving tendency determination system 100b of the present embodiment includes a telemeter unit 10 mounted on a vehicle and a data server 50. That is, the telemeter unit 10 transmits data directly to the data server 50 without using the mobile terminal 40.
  • the telemeter unit 10 can be connected to the network 200 via the communication interface 18 in order to communicate with the data server 50.
  • FIG. 15 is a diagram for explaining the flow of driving tendency data in the second embodiment.
  • the telemeter unit 10 calculates driving tendency data (S41), and transmits the calculated driving tendency data to the data server 50 (S42).
  • the data server 50 updates the database 57b based on the received data (S43).
  • the telemeter unit 10 may transmit driving tendency data to the data server 50 every predetermined period (for example, one month). Alternatively, the telemeter unit 10 may transmit data when there is an instruction from the user or a predetermined operation of the vehicle (engine start or the like).
  • In Embodiment 3, a configuration in which the mobile terminal 40 provides the above-described functions of the telemeter unit 10 will be described. That is, in Embodiment 1 the telemeter unit 10 detects the acceleration of the vehicle 30 and calculates the driving tendency data, but these functions may instead be executed by the mobile terminal 40.
  • FIG. 16 is a diagram illustrating a configuration of the driving tendency determination system 100c according to the third embodiment of the present disclosure.
  • The driving tendency determination system 100c according to the present embodiment includes the mobile terminal 40 and the data server 50.
  • the mobile terminal 40 of this embodiment includes an acceleration sensor that can detect acceleration in three orthogonal directions (XYZ directions).
  • the portable terminal 40 is disposed in the vehicle 30 such that the traveling direction (forward direction) of the vehicle 30 is the Y direction and the upward direction of the vehicle is the Z direction.
  • The portable terminal 40, arranged in the vehicle 30 as described above, periodically measures and records the acceleration of the vehicle 30 while the vehicle 30 is being driven.
  • FIG. 17 is a diagram for explaining the flow of driving tendency data in the third embodiment.
  • the mobile terminal 40 calculates driving tendency data from the measured acceleration data according to the above-described method (S51), and transmits the calculated driving tendency data to the data server 50 (S52).
  • the data server 50 updates the database 57b based on the received driving tendency data (S53).
  • The portable terminal 40 may transmit the driving tendency data to the data server 50 at predetermined intervals (for example, once a month), or may transmit the data when a predetermined operation (such as a transmission instruction) is performed by the user.
  • Embodiments 1 to 3 have been described as examples of the technology disclosed in the present application.
  • However, the technology of the present disclosure is not limited to these embodiments, and can also be applied to embodiments in which changes, replacements, additions, omissions, and the like are made as appropriate.
  • the telemeter unit 10 has been described as an example of a driving tendency detection device.
  • the driving tendency detection device is not limited to the telemeter unit 10.
  • the portable terminal 40 and the data server 50 can also be configured as a driving tendency detection device.
  • In the above embodiments, the driving tendency data is generated in the telemeter unit 10, but it may instead be generated in a device other than the telemeter unit 10, that is, in the portable terminal 40 or the data server 50.
  • the telemeter unit 10 transmits a time series of detected acceleration data to the portable terminal 40.
  • the portable terminal 40 receives the time series of acceleration data via the second communication interface 49 (an example of an acquisition unit) and stores it in the data storage unit 47 for work.
  • In this case, the portable terminal 40 calculates the jerk (J) based on the received acceleration data, and transmits data indicating the acceleration (G) and jerk (J) to the data server 50 as driving tendency data (see steps S12 and S13 in FIG. 12).
  • the mobile terminal 40 can operate as a driving tendency detection device.
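A minimal sketch of how the portable terminal might derive jerk from received acceleration samples and pair the values as driving tendency data (the processing of steps S12 and S13). The 20 ms sampling interval matches the measurement conditions described later in the document; the function and field names are illustrative assumptions, not from the patent.

```python
# Hypothetical sketch of steps S12-S13: compute finite-difference jerk
# from acceleration samples and pair each (G, J) value. dt = 0.02 s is
# the 20 ms sampling interval stated in the measurement example.
def to_driving_tendency_data(accel, dt=0.02):
    """Pair each acceleration sample with its finite-difference jerk."""
    return [
        {"G": g1, "J": (g1 - g0) / dt}
        for g0, g1 in zip(accel, accel[1:])
    ]

samples = [0.00, 0.02, 0.10, 0.30]  # made-up acceleration values (G)
for rec in to_driving_tendency_data(samples):
    print(rec)
```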
  • the telemeter unit 10 transmits the acceleration data to the portable terminal 40.
  • the portable terminal 40 transfers the received acceleration data to the data server 50.
  • the data server 50 receives a time series of acceleration data via the network interface 59 (an example of an acquisition unit) and stores it in the data storage unit 57 for work.
  • the data server 50 calculates jerk (J) based on the received acceleration data, generates driving tendency data from the data indicating the acceleration and jerk, and updates the database 57b with the generated driving tendency data.
  • the data server 50 can operate as a driving tendency detection device.
  • In the above embodiments, each of the controllers 11, 41, and 51 includes a CPU or MPU, and an example has been described in which predetermined functions are realized by executing a predetermined program (software).
  • the functions of the controllers 11, 41, and 51 may be realized by cooperation of hardware and software, or may be realized only by a hardware circuit.
  • For example, the controllers 11, 41, and 51 can be configured with a CPU, MPU, DSP (Digital Signal Processor), FPGA (Field-Programmable Gate Array), ASIC (Application Specific Integrated Circuit), or the like.
  • the present disclosure can be applied to an apparatus that determines the driving tendency of a vehicle driver.

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Theoretical Computer Science (AREA)
  • Educational Technology (AREA)
  • Physics & Mathematics (AREA)
  • Educational Administration (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Traffic Control Systems (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)

Abstract

This driving tendency determination apparatus is provided with: an acquisition unit that acquires information indicating the jerk and the acceleration of a vehicle measured while the vehicle is driven by a determination subject; and an arithmetic unit that is provided with an artificial intelligence function realized through learning by using a determination image including a two-dimensional plane in which the time sequence of acceleration and jerk measured while a vehicle is driven by an arbitrarily defined driver is plotted. The arithmetic unit generates a plurality of determination images on the basis of the information indicating the acceleration and the jerk measured while the vehicle is driven by the determination subject, and determines the driving tendency of the determination subject by inputting the plurality of determination images to the artificial intelligence function.

Description

Driving tendency determination device and driving tendency determination system
The present disclosure relates to a driving tendency determination device and a driving tendency determination system that determine the driving tendency of a driver of a vehicle.
Patent Document 1 discloses a vehicle driving support device that determines the driving operation state of a vehicle based on the amount of change in acceleration and the like.
The vehicle driving support device of Patent Document 1 includes change amount calculation means for calculating a first related value related to the amount of change in acceleration of the vehicle, jerk calculation means for calculating a second related value related to the jerk of the vehicle, acceleration calculation means for calculating a third related value related to the absolute value of the acceleration of the vehicle, and state determination means for determining the driving operation state of the vehicle based on the first to third related values.
When the first related value calculated by the change amount calculation means is equal to or greater than a predetermined value, the state determination means determines the driving operation state of the vehicle using a first determination map, based on the first related value and the second related value, according to a criterion set in advance based on the ratio of the kinetic energy of a mass point at the end of the change to the amount of change in the acceleration of the vehicle, calculated using a vibration model representing the movement of the mass point in the vehicle cabin. When the first related value is smaller than the predetermined value, the state determination means determines the driving operation state of the vehicle based on the first related value and the third related value, using a second determination map different from the first determination map. This configuration makes it possible to accurately determine the driving operation state of the vehicle regardless of the magnitude of the amount of change in acceleration.
JP 2012-198345 A
In a first aspect of the present disclosure, a driving tendency determination device is provided. The driving tendency determination device includes an acquisition unit that acquires information indicating the acceleration and jerk of a vehicle measured while a determination target person drives the vehicle, and a calculation unit that implements an artificial intelligence function trained using determination images, each including a two-dimensional plane on which a time series of acceleration and jerk measured while an arbitrary driver drives a vehicle is plotted. The calculation unit generates a plurality of determination images based on the acceleration and jerk information measured while the determination target person drives the vehicle, and determines the driving tendency of the determination target person by inputting the plurality of determination images into the artificial intelligence function.
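The generation of a determination image can be pictured as rasterizing the (jerk, acceleration) time series onto a fixed-size two-dimensional grid. The following is a hedged sketch: the grid size and the value ranges for G and J are assumptions for illustration, since the patent does not specify these parameters.

```python
# Hypothetical sketch of building a J-G plane "image": each (G, J) sample
# marks one cell of a size x size grid. Grid size and axis ranges are
# assumed values, not from the patent.
def jg_plane_image(g_series, j_series, size=32,
                   g_range=(-1.0, 1.0), j_range=(-10.0, 10.0)):
    """Rasterize a time series of (acceleration, jerk) points onto a grid;
    cells hit by at least one point are set to 1."""
    img = [[0] * size for _ in range(size)]
    for g, j in zip(g_series, j_series):
        col = int((j - j_range[0]) / (j_range[1] - j_range[0]) * (size - 1))
        row = int((g - g_range[0]) / (g_range[1] - g_range[0]) * (size - 1))
        if 0 <= row < size and 0 <= col < size:
            img[row][col] = 1
    return img

img = jg_plane_image([0.0, 0.5, -0.5], [0.0, 2.0, -3.0])
print(sum(map(sum, img)))  # number of marked cells
```

A batch of such grids, one per measurement window, would then be the input to the artificial intelligence function.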
In a second aspect of the present disclosure, a driving tendency determination system is provided. The driving tendency determination system includes a measurement device that measures the acceleration of a vehicle, a portable terminal that receives the measured acceleration information from the measurement device and transfers it to the driving tendency determination device, and the above driving tendency determination device, which receives the acceleration information from the portable terminal and trains the artificial intelligence function based on the received information.
According to the driving tendency determination device and the driving tendency determination system of the present disclosure, the driving tendency of the driver of a vehicle can be determined based on the acceleration and jerk information measured while the vehicle is driven.
FIG. 1 is a diagram illustrating the configuration of the driving tendency determination system according to Embodiment 1 of the present disclosure.
FIG. 2 is a block diagram illustrating the configuration of the telemeter unit in Embodiment 1.
FIG. 3 is a block diagram illustrating the configuration of the portable terminal in Embodiment 1.
FIG. 4 is a block diagram illustrating the configuration of the data server (driving tendency determination device) in Embodiment 1.
FIG. 5 is a diagram for explaining the flow of driving tendency data in Embodiment 1.
FIG. 6 is a diagram explaining changes in vehicle speed when sudden braking is applied while driving down a gradient.
FIG. 7 is a diagram explaining changes in acceleration when sudden braking is applied while driving down a gradient.
FIG. 8A is a diagram for explaining the J (jerk)-G (acceleration) plane.
FIG. 8B is a diagram explaining the region determined as dangerous driving on the J (jerk)-G (acceleration) plane.
FIG. 9 is a diagram illustrating a convolutional neural network in the AI (artificial intelligence) function.
FIG. 10 is a diagram explaining a method of generating images (J-G plane images) used for machine learning.
FIG. 11 is a diagram explaining the training procedure of the convolutional neural network.
FIG. 12 is a flowchart showing the process of transmitting driving tendency data from the telemeter unit to the portable terminal.
FIG. 13 is a flowchart showing the driving tendency determination process in the data server.
FIG. 14 is a diagram illustrating the configuration of the driving tendency determination system according to Embodiment 2 of the present disclosure.
FIG. 15 is a diagram for explaining the flow of driving tendency data in Embodiment 2.
FIG. 16 is a diagram illustrating the configuration of the driving tendency determination system according to Embodiment 3 of the present disclosure.
FIG. 17 is a diagram for explaining the flow of driving tendency data in Embodiment 3.
Hereinafter, embodiments will be described in detail with reference to the drawings as appropriate. However, more detailed description than necessary may be omitted. For example, detailed descriptions of already well-known matters and redundant descriptions of substantially identical configurations may be omitted. This is to avoid making the following description unnecessarily redundant and to facilitate understanding by those skilled in the art.
Note that the inventor provides the accompanying drawings and the following description so that those skilled in the art can fully understand the present disclosure, and does not intend them to limit the subject matter described in the claims.
(Embodiment 1)
[1-1. Configuration]
[1-1-1. System configuration]
Embodiments of the driving tendency determination system according to the present invention will be described below with reference to the accompanying drawings. The driving tendency determination system described in the following embodiments is a system that detects the driving tendency of a vehicle driver. FIG. 1 shows the configuration of the driving tendency determination system according to an embodiment of the present disclosure.
As shown in FIG. 1, the driving tendency determination system 100 includes the telemeter unit 10, the mobile terminal 40, and the data server 50. The telemeter unit 10 is arranged in the vehicle 30, detects the behavior of the vehicle 30 while the vehicle 30 is driven, obtains from that behavior data indicating the driving tendency (or driving operation state) of the driver of the vehicle (hereinafter referred to as "driving tendency data"), and transmits the data to the outside. The mobile terminal 40 receives the driving tendency data from the telemeter unit 10 and transmits it to the data server 50 via the network 200. The data server 50 updates a database based on the received driving tendency data.
Hereinafter, specific configurations of the telemeter unit 10, the mobile terminal 40, and the data server 50 will be described.
[1-1-2. Telemeter unit]
FIG. 2 is a diagram illustrating the configuration of the telemeter unit 10. The telemeter unit 10 includes a controller 11, an acceleration sensor 15, a communication interface 18 that enables wireless communication with other electronic devices in accordance with communication standards such as WiFi and Bluetooth (registered trademark), and a memory 17 that stores data and the like.
The controller 11 includes a CPU (Central Processing Unit) or MPU (Micro Processing Unit), and realizes predetermined functions described later by executing a program stored in the memory 17. The program executed by the controller 11 may be provided via the network 200 or may be provided on a recording medium such as a CD-ROM.
The acceleration sensor 15 is a sensor that detects the acceleration of the vehicle 30 in three orthogonal directions (X, Y, and Z directions). Here, the width direction of the vehicle 30 is defined as the X direction, the traveling direction (forward direction) of the vehicle 30 as the Y direction, and the upward direction (zenith direction) of the vehicle 30 as the Z direction.
The memory 17 is a recording medium that stores various data, and is composed of, for example, a semiconductor storage element such as a flash memory. The memory 17 stores the programs executed by the controller 11 and data. A recording medium such as a removable memory card, or a hard disk, may be used instead of the memory 17.
The communication interface 18 is a module that performs wireless communication in accordance with communication standards such as WiFi and Bluetooth (registered trademark). The communication interface 18 may also perform communication in accordance with communication standards such as LTE (Long Term Evolution) or 3G. Note that the communication interface 18 is not limited to wireless communication and may perform wired communication.
[1-1-3. Portable terminal]
The portable terminal 40 can communicate with the telemeter unit 10. The portable terminal 40 can also transmit information to the data server 50 via the network 200. In the present embodiment, a smartphone is assumed as an example of the portable terminal 40, but the portable terminal 40 may be a PDA (Personal Digital Assistant), a mobile phone, or the like.
FIG. 3 is a diagram illustrating the configuration of the portable terminal 40. The portable terminal 40 includes an imaging unit 42 that captures images, a display unit 43 that displays information such as images, and an operation unit 45. The portable terminal 40 further includes a first communication interface 48 for communicating with a network and a second communication interface 49 for communicating with other electronic devices. The portable terminal 40 further includes a RAM (Random Access Memory) 46 and a data storage unit 47 that store data and the like, and a controller 41 that controls the overall operation of the portable terminal 40. The portable terminal 40 may also include an acceleration sensor, as described later.
The imaging unit 42 includes an image sensor such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) image sensor, and captures an image of a subject to generate image data.
The display unit 43 is composed of, for example, a liquid crystal display or an organic EL display. The operation unit 45 includes a touch panel that is arranged over the upper surface of the display unit 43 and receives touch operations by the user. The operation unit 45 further includes operation buttons. The operation buttons include buttons physically provided on the main body of the portable terminal 40 and virtual buttons realized by the display unit 43 and the touch panel function.
The first communication interface 48 is a communication module for connecting to a network, and performs communication in accordance with communication standards such as LTE or 3G. The second communication interface 49 is a communication module for wireless communication with other electronic devices at relatively short range, and performs communication in accordance with communication standards such as WiFi or Bluetooth (registered trademark). The second communication interface 49 may also communicate with other electronic devices via a cable; for example, it may perform data communication in conformity with a standard such as USB (Universal Serial Bus).
The RAM 46 is a storage element that temporarily stores programs and data, and functions as a work area for the controller 41. The data storage unit 47 is a recording medium that stores programs and various data, and may be composed of a recording medium such as a hard disk, a semiconductor memory, or a removable memory card. The data storage unit 47 stores programs executed by the controller 41 (an OS (Operating System) and an application program 47a) and data.
The controller 41 includes a CPU or MPU, and realizes predetermined functions described later by executing the application program 47a stored in the data storage unit 47. The application program 47a may be provided via the network 200 or may be provided on a recording medium such as a CD-ROM.
[1-1-4. Data server]
FIG. 4 is a diagram illustrating the configuration of the data server 50. The data server 50 is composed of an information processing apparatus such as a personal computer. The data server 50 includes a controller 51 that controls its overall operation, a display unit 53 that displays screens, an operation unit 55 operated by the user, and a RAM 56 and a data storage unit 57 that store data and programs. The display unit 53 is composed of, for example, a liquid crystal display or an organic EL display. The operation unit 55 includes a keyboard, a mouse, a touch panel, and the like.
The data server 50 further includes a device interface 58 for connecting to external devices such as printers, and a network interface 59 for connecting to a network. The device interface 58 is a communication module that performs communication of data and the like in conformity with USB, HDMI (registered trademark) (High-Definition Multimedia Interface), IEEE 1394, or the like. The network interface 59 is a communication module that performs data communication in conformity with standards such as IEEE 802.11 and WiFi.
The controller 51 includes a CPU or MPU, and realizes predetermined functions by executing a predetermined control program 57a stored in the data storage unit 57. The control program executed by the controller 51 may be provided via the network 200 or may be provided on a recording medium such as a CD-ROM.
The RAM 56 is a storage element that temporarily stores programs and data, and functions as a work area for the controller 51. The data storage unit 57 is a recording medium that stores the parameters, data, and programs necessary for realizing the functions, and stores the control program executed by the controller 51 and various data. The data storage unit 57 is composed of, for example, a hard disk drive (HDD) or a solid state drive (SSD). The control program 57a for realizing the functions described later is installed in the data storage unit 57, and the controller 51 realizes those various functions by executing this control program 57a. The data storage unit 57 also stores a database 57b relating to the driving tendency of drivers, and also functions as a work area for the controller 51.
The database 57b manages data indicating the driving tendency of each driver (hereinafter referred to as "driving tendency data"). The driving tendency data includes a driver ID for identifying the driver, a vehicle ID for identifying the vehicle, the sampling date and time of the data, information indicating acceleration, and information indicating jerk.
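For illustration only, the driving tendency data fields listed above could be represented as a simple record type. The field names and value formats here are assumptions, not taken from the patent.

```python
# Hypothetical record layout for one driving tendency data entry in
# database 57b. All names, ID formats, and units are illustrative.
from dataclasses import dataclass, asdict

@dataclass
class DrivingTendencyRecord:
    driver_id: str         # identifies the driver
    vehicle_id: str        # identifies the vehicle
    sampled_at: str        # sampling date/time, e.g. ISO 8601 text
    acceleration_g: float  # acceleration (G)
    jerk_j: float          # jerk (change in G per second)

rec = DrivingTendencyRecord("driver-001", "vehicle-001",
                            "2017-04-13T10:00:00", 0.12, 0.8)
print(asdict(rec))
```

Such a record could be serialized (for example, as JSON) when the terminal transmits it to the server.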
[1-2. Operation]
The operation of the driving tendency determination system 100 configured as described above will be described below.
FIG. 5 is a diagram for explaining the flow of driving tendency data in the driving tendency determination system 100. In the driving tendency determination system, the telemeter unit 10 calculates driving tendency data indicating the driving tendency of the driver based on the acceleration data of the vehicle 30 (S1), and transmits it to the portable terminal 40 via the communication interface 18 (S2). The portable terminal 40 receives the driving tendency data from the telemeter unit 10 via the second communication interface 49 (S2), and transmits the received driving tendency data to the data server 50 via the first communication interface 48 (S3). The data server 50 receives the driving tendency data from the portable terminal 40 (S3), and updates the database 57b based on the received driving tendency data (S4). In this way, the driving tendency data detected by the telemeter unit 10 is transmitted to the data server 50, and the database 57b is updated with that data. By referring to the database 57b, the driving tendency of the driver of the vehicle can be recognized.
 [1-2-1. Driving tendency data]
 The driving tendency data used for the determination of driving tendency will be described.
 Generally, it is known that there is a correlation between acceleration during vehicle driving and driving tendency. For example, in ordinary city driving, gentle acceleration in a car is at most about 0.15 G, and acceleration exceeding 0.3 G can be judged to indicate "rough driving".
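As a minimal sketch, the rule of thumb above can be expressed as a simple threshold check. The 0.15 G and 0.3 G limits come from the text; the label names and the intermediate "ordinary" band are our own illustration.

```python
# Classify a peak acceleration (in G) using the thresholds quoted above.
def acceleration_label(peak_g):
    if peak_g <= 0.15:
        return "gentle"    # calm acceleration in ordinary city driving
    if peak_g <= 0.30:
        return "ordinary"  # between the two quoted limits (assumed band)
    return "rough"         # exceeds 0.3 G -> "rough driving"

print(acceleration_label(0.12), acceleration_label(0.35))  # gentle rough
```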
 FIG. 6 is a diagram explaining the change in vehicle speed measured when sudden braking is applied while driving down a gentle slope. In this example, while a minivan (carrying two occupants) was driven down a gentle slope, the change in the acceleration of the vehicle was measured with an acceleration sensor as the vehicle decelerated normally from a speed of about 50 km/h and then, upon reaching 20 km/h, was braked suddenly to a stop at 0 km/h. The measurement was performed for about 11 seconds at a sampling interval of 20 ms.
 In FIG. 6, the period until the vehicle reaches 20 km/h and the sudden braking begins corresponds to "safe driving", and the period from the start of the sudden braking until the vehicle stops corresponds to "dangerous driving".
 FIG. 7 is a diagram explaining the change in acceleration measured by the acceleration sensor for the vehicle behavior shown in FIG. 6. Plot X shows the acceleration in the X direction (the width direction of the vehicle), plot Y shows the acceleration in the Y direction (the traveling (front-rear) direction of the vehicle), and plot Z shows the acceleration in the Z direction (the upward direction of the vehicle). As shown in FIG. 7, when the sudden braking starts at time t1, the acceleration in the Y direction (traveling direction) changes greatly. At this time, because the vehicle is gently descending the slope, an offset of 0.3 is always present in the Y-direction acceleration. The magnitude of this offset varies with the slope angle and the weight of the vehicle. Because such an offset arises in the acceleration, it is difficult to determine accurately from the acceleration alone whether the vehicle is in a "safe driving state" or a "dangerous driving state".
 Through intensive research, the present inventor found that the driving tendency (that is, the driving state) can be detected with high accuracy by determining the driving tendency in consideration of jerk in addition to acceleration. Jerk is a quantity indicating the rate of change of acceleration and is obtained by differentiating the acceleration with respect to time.
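Because jerk is the time derivative of acceleration, it can be approximated from the sampled acceleration by a finite difference. The sketch below is an illustration under the 20 ms sampling interval mentioned in the embodiment, not the embodiment's actual code.

```python
# Approximate jerk (in G/s) as the discrete derivative of acceleration samples (in G).
def jerk_series(accel_g, dt_s=0.020):
    return [(a1 - a0) / dt_s for a0, a1 in zip(accel_g, accel_g[1:])]

# Illustrative Y-direction samples around the onset of hard braking.
accel = [0.00, -0.05, -0.20, -0.45, -0.45]
print(jerk_series(accel))  # four jerk values for five acceleration samples
```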
 FIG. 8A is a diagram in which the acceleration and jerk in the Y-axis direction (traveling direction) of the sampling points measured over a predetermined period (for example, 11 seconds) for the vehicle behavior shown in FIG. 6 are plotted on a two-dimensional plane (hereinafter referred to as the "J-G plane"). In the J-G plane shown in FIG. 8A, the horizontal axis indicates jerk (J) and the vertical axis indicates acceleration (G).
 The present inventor discovered that, on such a J-G plane, the plot of sampling points draws a characteristic trajectory when the behavior of the vehicle changes abruptly. Specifically, in a situation where safe driving is performed as shown in FIGS. 6 and 7, the plotted points concentrate within a range P (the region indicated by the one-dot chain line) on the J-G plane where both jerk (J) and acceleration (G) are relatively small, as shown in FIG. 8B. Thereafter, when sudden braking occurs and the situation becomes a dangerous driving situation, the plotted points trace a counterclockwise trajectory Q that extends far beyond the range P on the J-G plane.
 An image in which sampling points are plotted on the J-G plane in this way thus contains information corresponding to the driving tendency of the driver, such as the trajectory Q that appears in response to sudden braking. Focusing on this point, the inventor considered that, by training a convolutional neural network on images of the J-G plane (hereinafter referred to as "J-G plane images") through machine learning, the driving tendency of a driver could be classified based on J-G plane images. For this purpose, the controller 51 of the data server 50 implements an AI (Artificial Intelligence) function. The AI function of the controller 51 will be described below.
 [1-2-2. Convolutional neural network]
 As shown in FIG. 9, the controller 51 of the data server 50 of the present embodiment uses a convolutional neural network (CNN) model in its AI function. The convolutional neural network is trained on images (J-G plane images) generated from long-term driving tendency data of drivers. The image data (J-G plane images) used for this machine learning is obtained as follows.
 FIG. 10 is a diagram explaining a method for generating the images (J-G plane images) used for machine learning. Assume that long-term (for example, 30 hours of) acceleration and jerk data (driving tendency data) 300 for a certain driver has been measured by the telemeter unit 10. From this long-term driving tendency data 300, for each predetermined period (for example, 11 seconds), the acceleration and jerk data measured during that period are plotted on the J-G plane to create a J-G plane image 310. The predetermined period is set so that information on the behavior of the vehicle from sudden braking to vehicle stop, as shown in FIG. 8B, fits within a single J-G plane (for example, 11 seconds). Accordingly, a plurality of J-G plane images 310 are generated from the driving tendency data 300 measured over a long period (for example, 30 hours) (see FIG. 10). That is, for one driver, a plurality of J-G plane images (a J-G plane image group) are generated from the driving tendency data measured over a long period. J-G plane image groups for a large number of drivers are then used to train the convolutional neural network through machine learning.
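The image-generation step above can be sketched as follows: split the long (jerk, acceleration) series into fixed windows (11 s at a 20 ms sampling interval gives 550 samples per window) and count the samples of each window into the cells of a small grid, one grid per J-G plane image. The grid size and the axis ranges below are illustrative assumptions, not values from the embodiment.

```python
def split_into_windows(points, window_len=550):
    # One window per J-G plane image: 11 s of samples at 20 ms intervals.
    return [points[i:i + window_len]
            for i in range(0, len(points) - window_len + 1, window_len)]

def rasterize(points, size=32, j_max=10.0, g_max=1.0):
    # Accumulate (jerk, acceleration) samples into a size x size grid;
    # the axis ranges [-j_max, j_max] and [-g_max, g_max] are assumed values.
    image = [[0] * size for _ in range(size)]
    for j, g in points:
        col = min(size - 1, max(0, int((j + j_max) / (2 * j_max) * size)))
        row = min(size - 1, max(0, int((g + g_max) / (2 * g_max) * size)))
        image[row][col] += 1
    return image

long_series = [(0.0, 0.0)] * 1100  # placeholder for 22 s of measured (J, G) data
images = [rasterize(w) for w in split_into_windows(long_series)]
print(len(images))  # two windows -> two J-G plane images
```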
 FIG. 11 is a diagram explaining the learning procedure of the convolutional neural network. Training of the convolutional neural network is performed by executing three steps: (1) unsupervised deep learning (classification), (2) supervised deep learning (weighting of clustered groups), and (3) inference. Each step is described below.
 (1) Unsupervised deep learning (classification)
 J-G plane image groups for a large number of drivers are input to the convolutional neural network. The convolutional neural network performs clustering, automatically constructing feature groups from the J-G plane image groups. For example, as shown in FIG. 9, clusters (feature groups) C1, C2, and C3 are constructed.
 (2) Supervised deep learning (weighting of clustered groups)
 The data of drivers who actually caused accidents is input to the convolutional neural network as a teacher signal. The convolutional neural network classifies the clusters that respond strongly as an "accident reserve group", that is, a group of drivers with a high probability of causing an accident. In other words, by giving as a teacher signal history information indicating that a given driver caused an accident (the date and time of the accident, etc.), the cluster group corresponding to that driving tendency is weighted as the accident reserve group. For example, cluster C1 is classified as the "accident reserve group", a group of drivers with a high probability of causing an accident. Cluster C2 is classified as a "somewhat dangerous driving group", a group of drivers with a somewhat high probability of causing an accident. Cluster C3 is classified as the "safe driving group", a group of drivers who drive safely.
 (3) Inference
 After the learning in (1) and (2) above is completed, an arbitrary J-G plane image group is input to the convolutional neural network, the network is made to infer a determination result based on that J-G plane image group, and the correctness of the inference result is fed back as a teacher signal. In this way, while the convolutional neural network performs inference on J-G plane image groups, inference and correction by teacher signals are repeated to raise the inference accuracy of the convolutional neural network. The correction is performed, for example, as follows: for the cluster C1 classified as the accident reserve group, when an inference result is wrong, the weight of a sub-cluster C1-1 within cluster C1 is lowered, and when an inference result is correct, the weight of a sub-cluster within the cluster (for example, C1-2) is raised. For example, if the inference result for a certain subject classified that subject into the "accident reserve group" but the subject has not actually caused an accident, the fact that the inference was wrong is fed back as a teacher signal. The sub-clusters C1-1 and C1-2 are thereby updated.
 In this way, image data is input repeatedly, and the deep learning proceeds while the network performs inference and is taught, as appropriate, which inferences were correct and which were errors; the cluster groups are subdivided by Similarity Matching so that the process converges toward extracting the accident reserve group with higher accuracy. When the correct answer rate of the inference results reaches or exceeds a predetermined probability, the inference step ends and the training of the convolutional neural network is completed.
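A minimal sketch of the sub-cluster weight feedback described above is given below. The update rule, the step size, and the clamping to [0, 1] are our own illustration of the idea, not the embodiment's algorithm.

```python
# Lower a sub-cluster's weight when an inference attributed to it was wrong,
# raise it when the inference was right (illustrative update rule).
def feed_back(weights, subcluster, was_correct, step=0.1):
    delta = step if was_correct else -step
    weights[subcluster] = min(1.0, max(0.0, weights[subcluster] + delta))

weights = {"C1-1": 0.5, "C1-2": 0.5}
feed_back(weights, "C1-1", was_correct=False)  # wrong inference -> weight down
feed_back(weights, "C1-2", was_correct=True)   # correct inference -> weight up
print(weights)
```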
 The convolutional neural network is trained in this manner using a sufficient number of J-G plane images.
 [1-2-3. Data transmission processing from the telemeter unit]
 FIG. 12 is a flowchart showing the process of transmitting driving tendency data from the telemeter unit 10 to the portable terminal 40. The driving tendency data transmission process in the telemeter unit 10 will be described below with reference to FIG. 12. The process shown in FIG. 12 is executed mainly by the controller 11 of the telemeter unit 10.
 While the vehicle is being driven, in the telemeter unit 10, the acceleration sensor 15 detects (measures) the acceleration (G) of the vehicle 30 at a predetermined sampling interval (for example, 20 msec), and information indicating the detected (measured) acceleration is accumulated in the memory 17.
 When the process shown in FIG. 12 starts, the controller 11 of the telemeter unit 10 reads and acquires, from the memory 17, the information on the acceleration (G) measured within a predetermined period (for example, one month) (S11).
 The controller 11 calculates the jerk (J) at each sampling point from the read Y-direction accelerations (G) (S12), and generates driving tendency data from the acceleration and jerk information of each sampling point.
 Thereafter, the controller 11 transmits the driving tendency data to the portable terminal 40 via the communication interface 18 (S13). At this time, the driving tendency data obtained from the acceleration data accumulated in the memory 17 of the telemeter unit 10, that is, the driving tendency data measured over a certain period, is transmitted to the portable terminal 40.
 The portable terminal 40 receives the driving tendency data from the telemeter unit 10 via the second communication interface 49, and transmits the received driving tendency data via the first communication interface 48 to the data server 50 connected to the network 200. The data server 50 receives the driving tendency data via the network interface 59 and accumulates the received driving tendency data in the database 57b (see FIGS. 4 and 5).
 [1-2-4. Determination of driving tendency]
 The driving tendency determination process in the data server 50 will be described with reference to the flowchart of FIG. 13.
 The user inputs, via the operation unit 55 of the data server 50, information specifying the subject whose driving tendency is to be determined (ID number, name, etc.). On receiving the information specifying the subject, the controller 51 of the data server 50 accesses the database 57b, acquires long-term (for example, 30 hours of) driving tendency data for that subject, and creates a J-G plane image group based on the acquired driving tendency data (S21).
 The controller 51 inputs the J-G plane image group generated for the subject to the convolutional neural network and determines the driving tendency of the subject (S22). The convolutional neural network outputs, for each J-G plane image, the result of classifying the subject's J-G plane image as the determination result of that subject's driving tendency. The controller 51 outputs the information indicating the driving tendency determination result obtained from the convolutional neural network to the display unit 53 (S23).
 For example, when a plurality of J-G plane images are input to the convolutional neural network for driving tendency determination, each J-G plane image is classified into one of the clusters (for example, C1 to C3), and information indicating the assigned cluster is output. In such a case, the ratio of each output cluster to the total number of input J-G plane images may be obtained and displayed on the display unit as the determination result. Specifically, the results output for each image of the J-G plane image group input to the convolutional neural network are counted for each cluster. The ratio (%) of the number of results classified into cluster C3 to the total number of J-G plane images input for determination is obtained as the "safe driving rate" (a rate indicating that the probability of causing an accident is extremely low). Similarly, the ratio (%) of the number of results classified into cluster C2 to the total number of J-G plane images is taken as the "somewhat dangerous driving rate" (a rate indicating that the probability of causing an accident is somewhat high), and the ratio (%) of the number of results classified into cluster C1 is taken as the "dangerous driving rate" (a rate indicating that the probability of causing an accident is very high).
 The ratio for each of these items may then be displayed on the display unit 53 as the determination result of the driving tendency. For example, "safe driving rate: 80%, somewhat dangerous driving rate: 15%, dangerous driving rate: 5%" may be displayed.
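The counting described above reduces to a per-cluster ratio over the per-image classification results. A minimal sketch follows, using the cluster names C1 to C3 from the text; the exact wording of the displayed labels is assumed.

```python
from collections import Counter

# Map per-image cluster outputs to the displayed percentages.
def driving_tendency_rates(cluster_labels):
    counts = Counter(cluster_labels)
    total = len(cluster_labels)
    return {
        "safe driving rate": 100.0 * counts["C3"] / total,
        "somewhat dangerous driving rate": 100.0 * counts["C2"] / total,
        "dangerous driving rate": 100.0 * counts["C1"] / total,
    }

labels = ["C3"] * 80 + ["C2"] * 15 + ["C1"] * 5  # one label per J-G plane image
print(driving_tendency_rates(labels))
```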
 Note that the driving tendency determination result may be stored in the data storage unit 57, or may be transmitted to another device via the network interface 59 or the device interface 58.
 As described above, the data server 50 of the present embodiment can determine the driving tendency of a subject from the subject's J-G plane image group.
 [1-3. Effects, etc.]
 As described above, the data server 50 of the present embodiment (an example of a driving tendency determination apparatus) includes an acquisition unit (the network interface 59, the data storage unit 57, etc.) that acquires information indicating the acceleration and jerk of a vehicle measured while a determination subject drives the vehicle, and a controller 51 (arithmetic unit) implementing an artificial intelligence function trained using determination images each including a two-dimensional plane on which a time series of acceleration and jerk measured during vehicle driving by an arbitrary driver is plotted. The controller 51 generates a plurality of J-G plane images (an example of the determination images) based on the acceleration and jerk information measured while the determination subject drives the vehicle, and determines the driving tendency of the determination subject by inputting the plurality of J-G plane images to the artificial intelligence function.
 The driving tendency determination system 100 includes the telemeter unit 10 (an example of a measurement device) that measures the acceleration of a vehicle, the portable terminal 40 that receives and transfers the information indicating the acceleration measured by the telemeter unit 10, and the data server 50 that receives the information indicating the acceleration from the portable terminal 40 and trains the artificial intelligence function based on the received information.
 As described above, the data server 50 of the present embodiment can obtain a determination result of the driving tendency of a determination subject by inputting J-G plane images, in which the acceleration and jerk of the vehicle measured for the subject are plotted on a two-dimensional plane, to the convolutional neural network trained in advance. In particular, in the present embodiment, a plurality of J-G plane images, that is, driving tendency data measured over a long period, are used as the information input when training the artificial intelligence function. Because such data more reliably captures the driving tendencies of the driver, accurate determination becomes possible.
 Furthermore, the various methods, knowledge, and APIs (Application Program Interfaces) related to deep learning on image big data become applicable to deep learning of driving tendencies, inference of accident occurrence probability, and the like. Without requiring the development of special algorithms, the driving tendencies of accident-prone drivers can be automatically clustered, the association between clusters and accidents can be automated, and everything up to the final goal of inference (accident prediction) can be carried out on a common platform.
 (Embodiment 2)
 FIG. 14 is a diagram showing the configuration of a driving tendency determination system 100b according to the second embodiment of the present disclosure. In the first embodiment, the driving tendency determination system 100 is composed of the telemeter unit 10, the portable terminal 40, and the data server 50. In contrast, the driving tendency determination system 100b of the present embodiment is composed of the telemeter unit 10 mounted on a vehicle and the data server 50. That is, the telemeter unit 10 transmits data directly to the data server 50 without going through the portable terminal 40. To communicate with the data server 50, the telemeter unit 10 can connect to the network 200 via the communication interface 18.
 FIG. 15 is a diagram for explaining the flow of driving tendency data in the second embodiment. As shown in the figure, the telemeter unit 10 calculates driving tendency data (S41) and transmits the calculated driving tendency data to the data server 50 (S42). The data server 50 updates the database 57b based on the received data (S43).
 The telemeter unit 10 may transmit the driving tendency data to the data server 50 at predetermined intervals (for example, every month). Alternatively, the telemeter unit 10 may transmit the data when there is an instruction from the user or a predetermined operation of the vehicle (engine start, etc.).
 (Embodiment 3)
 In the present embodiment, a configuration in which the portable terminal 40 has the above functions of the telemeter unit 10 will be described. That is, in the first embodiment the telemeter unit 10 detects the acceleration of the vehicle 30 and calculates the driving tendency data, but the portable terminal 40 may execute these functions instead.
 FIG. 16 is a diagram showing the configuration of a driving tendency determination system 100c according to the third embodiment of the present disclosure. The driving tendency determination system 100c of the present embodiment is composed of the portable terminal 40 and the data server 50. The portable terminal 40 of the present embodiment includes an acceleration sensor capable of detecting acceleration in three orthogonal directions (the X, Y, and Z directions). The portable terminal 40 is placed in the vehicle 30 such that the traveling (forward) direction of the vehicle 30 is the Y direction and the upward direction of the vehicle is the Z direction. Placed in the vehicle 30 in this manner, the portable terminal 40 periodically measures and records the acceleration of the vehicle 30 while the vehicle 30 is being driven.
 FIG. 17 is a diagram for explaining the flow of driving tendency data in the third embodiment. As shown in the figure, the portable terminal 40 calculates driving tendency data from the measured acceleration data according to the method described above (S51) and transmits the calculated driving tendency data to the data server 50 (S52). The data server 50 updates the database 57b based on the received driving tendency data (S53).
 The portable terminal 40 may transmit the driving tendency data to the data server 50 at predetermined intervals (for example, every month), or may transmit the data when a predetermined operation (a transmission instruction, etc.) is performed by the user.
 (Other embodiments)
 As described above, Embodiments 1 to 3 have been described as examples of the technology disclosed in the present application. However, the technology of the present disclosure is not limited to these embodiments and can also be applied to embodiments in which changes, replacements, additions, omissions, and the like are made as appropriate. It is also possible to combine the components described in Embodiments 1 to 3 above to form new embodiments. Other embodiments are therefore exemplified below.
 In Embodiments 1 to 3, the telemeter unit 10 has been described as an example of a driving tendency detection apparatus. The driving tendency detection apparatus is not limited to the telemeter unit 10; the portable terminal 40 or the data server 50 can also be configured as a driving tendency detection apparatus.
 For example, in the first embodiment, as shown in FIG. 5, the driving tendency data is generated in the telemeter unit 10, but it may instead be generated in a device other than the telemeter unit 10, that is, in the portable terminal 40 or the data server 50.
 More specifically, when the driving tendency data is calculated in the portable terminal 40, the telemeter unit 10 transmits the time series of the detected acceleration data to the portable terminal 40. The portable terminal 40 receives the time series of acceleration data via the second communication interface 49 (an example of an acquisition unit) and stores it in the data storage unit 47 as working data. The portable terminal 40 calculates the jerk (J) based on the received acceleration data and transmits data indicating the acceleration (G) and jerk (J) to the data server 50 as driving tendency data (see steps S12 and S13 in FIG. 12). With such a configuration, the portable terminal 40 can operate as a driving tendency detection apparatus.
 Similarly, when the driving tendency data is calculated in the data server 50, the telemeter unit 10 transmits the acceleration data to the portable terminal 40, and the portable terminal 40 transfers the received acceleration data to the data server 50. The data server 50 receives the time series of acceleration data via the network interface 59 (an example of an acquisition unit) and stores it in the data storage unit 57 as working data. The data server 50 calculates the jerk (J) based on the received acceleration data, generates driving tendency data from the data indicating the acceleration and jerk, and updates the database 57b with the generated driving tendency data. With such a configuration, the data server 50 can operate as a driving tendency detection apparatus.
 In the above embodiments, examples were described in which the controllers 11, 41, and 51 include a CPU or MPU and realize predetermined functions by executing a predetermined program (software). The functions of the controllers 11, 41, and 51 may thus be realized by cooperation of hardware and software, or may be realized only by hardware circuits. For example, the controllers 11, 41, and 51 can be realized by a CPU, an MPU, a DSP (Digital Signal Processor), an FPGA (Field-Programmable Gate Array), an ASIC (Application Specific Integrated Circuit), an ASSP (Application Specific Standard Product), or the like.
 As described above, the embodiments have been presented as examples of the technology in the present disclosure, and the accompanying drawings and detailed description have been provided for that purpose.
 Accordingly, the components described in the accompanying drawings and the detailed description may include not only components essential for solving the problem but also components that are not essential for solving the problem and are included merely to illustrate the above technology. Therefore, the mere fact that such non-essential components appear in the accompanying drawings or the detailed description should not be taken as an indication that they are essential.
 Further, since the above-described embodiments are intended to illustrate the technology in the present disclosure, various modifications, replacements, additions, omissions, and the like can be made within the scope of the claims or their equivalents.
 The present disclosure is applicable to an apparatus that determines the driving tendency of a vehicle driver.
 10 Telemeter unit
 11 Controller of telemeter unit
 15 Acceleration sensor
 17 Memory
 18 Communication interface
 30 Vehicle
 40 Portable terminal
 41 Controller of portable terminal
 47a Application program
 48 First communication interface
 49 Second communication interface
 50 Data server
 51 Controller of data server
 57a Control program
 59 Network interface
100 Driving tendency determination system
200 Network

Claims (5)

  1.  A driving tendency determination apparatus comprising:
      an acquisition unit that acquires information indicating an acceleration and a jerk of a vehicle measured while a determination target person drives the vehicle; and
      an arithmetic unit implementing an artificial intelligence function trained using determination images, each including a two-dimensional plane on which a time series of acceleration and jerk measured while an arbitrary driver drives a vehicle is plotted,
      wherein the arithmetic unit
      generates a plurality of determination images based on the information indicating the acceleration and the jerk measured while the determination target person drives the vehicle, and
      determines a driving tendency of the determination target person by inputting the plurality of determination images into the artificial intelligence function.
  2.  The driving tendency determination apparatus according to claim 1, wherein, using the information indicating the acceleration of the vehicle measured within a first period and the information indicating the corresponding jerk, the arithmetic unit generates the plurality of determination images by plotting, for each second period shorter than the first period, a time series of pairs of the acceleration and the jerk of the vehicle measured within the second period on a two-dimensional plane.
  3.  The driving tendency determination apparatus according to claim 2, wherein
      the acceleration is measured every predetermined sampling period, and
      the second period is set to a period that includes plots of a plurality of pairs of the acceleration and the jerk.
  4.  The driving tendency determination apparatus according to claim 1, wherein the artificial intelligence function includes a convolutional neural network model.
  5.  A driving tendency determination system comprising:
      a measuring device that measures the acceleration of the vehicle;
      a portable terminal that receives the information indicating the measured acceleration from the measuring device and transfers the information; and
      the driving tendency determination apparatus according to any one of claims 1 to 4, which receives the information indicating the acceleration from the portable terminal and trains the artificial intelligence function based on the received information.
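The windowing described in claims 2 and 3 can be sketched in code: a longer measurement period is divided into shorter windows, and the pairs of acceleration and jerk in each window are plotted on a two-dimensional plane, here rasterized into a small occupancy grid standing in for one determination image. The Python sketch below is illustrative only; the window length, grid resolution, and value ranges are assumptions, not parameters taken from the claims.

```python
def judgment_images(accel, jerk, window, bins=8,
                    a_range=(-5.0, 5.0), j_range=(-10.0, 10.0)):
    """Split the (acceleration, jerk) time series into fixed-length
    windows and rasterize each window as a bins x bins occupancy grid,
    one grid per determination image."""
    images = []
    pairs = list(zip(accel, jerk))
    for start in range(0, len(pairs) - window + 1, window):
        grid = [[0] * bins for _ in range(bins)]
        for a, j in pairs[start:start + window]:
            # Map each (a, j) sample to a cell of the two-dimensional plane.
            x = min(bins - 1, max(0, int((a - a_range[0]) / (a_range[1] - a_range[0]) * bins)))
            y = min(bins - 1, max(0, int((j - j_range[0]) / (j_range[1] - j_range[0]) * bins)))
            grid[y][x] += 1
        images.append(grid)
    return images

# Hypothetical series: 6 samples -> two determination images of 3 samples each.
accel = [0.0, 1.0, 2.0, -1.0, -2.0, 0.5]
jerk = [0.0, 2.0, 2.0, -6.0, -2.0, 5.0]
imgs = judgment_images(accel, jerk, window=3)
```

Each resulting grid could then be fed to an image classifier such as the convolutional neural network model named in claim 4.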
PCT/JP2017/014736 2016-08-10 2017-04-11 Driving tendency determination apparatus and driving tendency determination system WO2018029904A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016157772A JP2018024340A (en) 2016-08-10 2016-08-10 Driving tendency determination apparatus and driving tendency determination system
JP2016-157772 2016-08-10

Publications (1)

Publication Number Publication Date
WO2018029904A1 true WO2018029904A1 (en) 2018-02-15

Family

ID=61161916

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/014736 WO2018029904A1 (en) 2016-08-10 2017-04-11 Driving tendency determination apparatus and driving tendency determination system

Country Status (2)

Country Link
JP (1) JP2018024340A (en)
WO (1) WO2018029904A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110348338A (en) * 2019-06-26 2019-10-18 深圳市微纳集成电路与系统应用研究院 Driver assistance based on deep learning drives rearview mirror and the system comprising it
JP7491653B2 2019-12-02 2024-05-28 International Business Machines Corporation Deep Contour Correlation Prediction

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7380019B2 (en) 2019-09-27 2023-11-15 オムロン株式会社 Data generation system, learning device, data generation device, data generation method, and data generation program

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001256598A (en) * 2000-03-08 2001-09-21 Honda Motor Co Ltd System for notifying dangerous place
JP2007326465A (en) * 2006-06-07 2007-12-20 Toyota Motor Corp Driving intention estimating device
JP2008152136A (en) * 2006-12-19 2008-07-03 Myuutekku:Kk Driving support device and method thereof
JP2009530166A * 2006-03-22 2009-08-27 GM Global Technology Operations, Inc. Driving style sensitive vehicle subsystem control method and apparatus
JP2012081897A (en) * 2010-10-13 2012-04-26 Toyota Motor Corp Drive assist system

Also Published As

Publication number Publication date
JP2018024340A (en) 2018-02-15

Similar Documents

Publication Publication Date Title
US10414408B1 (en) Real-time driver monitoring and feedback reporting system
US11475770B2 (en) Electronic device, warning message providing method therefor, and non-transitory computer-readable recording medium
CA3028630C (en) Systems and methods for identifying risky driving behavior
US10894321B1 (en) Mobile robot
US11017318B2 (en) Information processing system, information processing method, program, and vehicle for generating a first driver model and generating a second driver model using the first driver model
EP3560171A1 (en) Systems and methods for identifying risky driving behavior
WO2018029904A1 (en) Driving tendency determination apparatus and driving tendency determination system
CN111225603B (en) Electronic device and method for providing stress index corresponding to user activity
US11029743B2 (en) Information processing device and information processing method
KR102517228B1 (en) Electronic device for controlling predefined function based on response time of external electronic device on user input and method thereof
KR20130054327A (en) Methods and apparatuses for controlling invocation of a sensor
EP3736191A1 (en) Electronic device and method for vehicle driving assistance
JP2019200453A (en) Control system, learning data generation apparatus, learning apparatus and determination apparatus
CN109177909B (en) Vehicle rollover pre-judging method, device and equipment
EP3725217A1 (en) Electronic device and method for measuring heart rate
JP2012150613A (en) Work content measuring device and work management device
JP7299560B2 (en) Learning data generation method, training method, prediction model, computer program
KR102532230B1 (en) Electronic device and control method thereof
US20230026609A1 (en) Driving diagnostic information management apparatus and driving diagnostic information management method
US20220281485A1 (en) Control apparatus, system, vehicle, and control method
WO2019073845A1 (en) Vehicle, determination method, and determination program
WO2017081851A1 (en) Driving improvement detection device and driving improvement detection system
JP6167426B1 (en) Driving analysis device and driving analysis system
CN112596620B (en) Collision detection using smart phone sensor data
CN112298184B (en) Driving switching method, device, equipment and storage medium based on artificial intelligence

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17838980

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17838980

Country of ref document: EP

Kind code of ref document: A1