WO2015083420A1 - Instructor device, learning support system, learning support program, recording medium, and learning support method - Google Patents

Instructor device, learning support system, learning support program, recording medium, and learning support method

Info

Publication number
WO2015083420A1
Authority
WO
WIPO (PCT)
Prior art keywords
evaluation
data
vehicle
instructor
unit
Application number
PCT/JP2014/075527
Other languages
English (en)
Japanese (ja)
Inventor
Keisuke Morishima (森島 圭祐)
Original Assignee
Yamaha Motor Co., Ltd. (ヤマハ発動機株式会社)
Application filed by Yamaha Motor Co., Ltd.
Priority to JP2015551414A (JP6247704B2)
Publication of WO2015083420A1


Classifications

    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 9/00: Simulators for teaching or training purposes
    • G09B 9/02: Simulators for teaching or training purposes for teaching control of vehicles or other craft
    • G09B 9/04: Simulators for teaching or training purposes for teaching control of vehicles or other craft for teaching control of land vehicles
    • G09B 9/042: Simulators for teaching or training purposes for teaching control of land vehicles, providing simulation in a real vehicle
    • G09B 9/052: Simulators for teaching or training purposes for teaching control of land vehicles, characterised by provision for recording or measuring trainee's performance

Definitions

  • The present invention relates to an instructor device, a learning support system, a learning support program, a recording medium, and a learning support method.
  • Patent Document 1 discloses a teaching device for automobiles.
  • The device includes a video camera, a transmission antenna, a reception antenna, and a receiving television.
  • The video camera and the transmission antenna are attached to the automobile.
  • The video camera captures images and obtains a video signal.
  • The transmission antenna transmits the video signal.
  • The reception antenna and the receiving television are installed in a monitoring room.
  • The reception antenna receives the video signal.
  • The receiving television displays video based on the video signal.
  • A student drives the automobile.
  • The instructor monitors the video displayed on the receiving television and grasps the student's driving operation.
  • A conventional device having such a configuration has the following problems.
  • The receiving television displays video in real time.
  • The image changes from moment to moment. The instructor may therefore overlook part of the video, and once a scene has been missed, even for a moment, it cannot be viewed again. Even if an operation that should be corrected appears in the video, the instructor cannot instruct the student properly.
  • The instructor evaluates the driving skill of the student based on the video.
  • The subjectivity of the instructor is therefore included in the evaluation result. That is, the evaluation result lacks objectivity.
  • The present invention has been made in view of such circumstances, and its object is to provide an instructor device, a learning support system, a learning support program, a recording medium, and a learning support method that can appropriately support lessons related to driving a vehicle.
  • To achieve this object, the present invention has the following configuration. That is, the present invention is an instructor device for supporting lessons related to driving a vehicle, comprising: a receiving unit that receives position data relating to the position of the vehicle at each time; a display control unit that, based on the position data and an evaluation result relating to the driving skill of the vehicle during an evaluation period, identifies an evaluation position, which is a position where the vehicle traveled during the evaluation period, and generates result display data in which the evaluation result and the evaluation position are associated with each other; and a display unit that displays the result display data.
  • When the display unit displays the result display data, the instructor device presents the evaluation result together with the evaluation position at which the evaluation result was obtained.
  • The user of the instructor device (for example, an instructor) can therefore confirm evaluation results that should not be overlooked, together with their evaluation positions, without constantly monitoring the display unit, and can appropriately conduct the training related to driving the vehicle. In this way, the instructor device can appropriately support lessons related to driving the vehicle.
  • Preferably, the display control unit changes the mode in which the evaluation position is displayed according to the evaluation result. Since the display mode of the evaluation position then implies the evaluation result, the evaluation position and the evaluation result can be suitably associated with each other.
  • It is also preferable that the result display data indicates the evaluation position by a line and indicates the evaluation result at that position by at least one of the thickness, type, and color of the line. According to this, both the evaluation position and the evaluation result can be indicated by a single line.
  • Preferably, the instructor device includes a map storage unit that stores a route map, the display control unit generates the result display data using the route map, and the result display data includes the route map associated with the evaluation position.
  • According to this, the display unit can suitably show the relative positional relationship between the course and the evaluation position.
  • Preferably, the display control unit further identifies a travel locus along which the vehicle has traveled based on the position data, and generates the result display data including the travel locus.
  • According to this, the display unit can also present the travel locus.
  • Preferably, the instructor device includes a monitoring condition storage unit that stores monitoring condition information, and the display control unit extracts the evaluation positions and evaluation results that match the monitoring condition information and generates the result display data based on the extracted evaluation positions and evaluation results.
  • According to this, the display unit does not display evaluation positions and evaluation results that were not extracted. The user can therefore quickly confirm the evaluation positions and evaluation results that match the monitoring condition information.
  • Preferably, the receiving unit receives measurement data relating to the state of the vehicle at each time, and the display control unit further extracts the measurement data in the evaluation period from the measurement data as evaluation period measurement data and generates detailed display data including the evaluation period measurement data. According to this, when the display unit displays the detailed display data, the instructor device can present the state of the vehicle at the evaluation position in detail.
  • Preferably, the receiving unit receives image data captured by an imaging device provided in the vehicle, and the display control unit further extracts the image data in the evaluation period from the image data as evaluation period image data and generates the detailed display data including the evaluation period image data. According to this, the display unit can show the state of the driving operation during the evaluation period.
  • Preferably, the instructor device includes a message information input unit that receives message information regarding a message to the driver of the vehicle, and a transmission unit that transmits the message information. According to this, the instructor device can deliver message information to the driver of the vehicle.
  • Preferably, the receiving unit receives measurement data relating to the state of the vehicle at each time, and the instructor device includes an evaluation period determination unit that determines the evaluation period based on at least one of the position data and the measurement data, and an evaluation unit that acquires the evaluation result in the evaluation period based on the measurement data and the evaluation period. According to this, the instructor device can acquire an objective and quantitative evaluation result. The user can objectively and quantitatively grasp the driving skill of the driver by confirming the evaluation result displayed on the display unit.
  • The present invention is also a learning support system for supporting lessons related to driving a vehicle, comprising a vehicle device provided in the vehicle and an instructor device that communicates wirelessly with the vehicle device. The vehicle device includes a position detection unit that detects the position of the vehicle and a vehicle state measurement unit that measures the state of the vehicle. One of the vehicle device and the instructor device includes an evaluation period determination unit that determines an evaluation period based on at least one of position data corresponding to the detection result of the position detection unit and measurement data corresponding to the measurement result of the vehicle state measurement unit, and an evaluation unit that acquires, based on at least the measurement data and the evaluation period, an evaluation result relating to the driving skill of the vehicle in the evaluation period. The instructor device includes a display control unit that, based on the position data and the evaluation period, identifies an evaluation position, which is a position where the vehicle traveled during the evaluation period, and generates result display data in which the evaluation result and the evaluation position are associated with each other, and a display unit that displays the result display data.
  • In this learning support system, the position detection unit suitably detects the position of the vehicle, and the vehicle state measurement unit suitably measures the state of the vehicle.
  • Since the instructor device communicates with the vehicle device wirelessly, it can be installed at a position away from the vehicle. Since either the vehicle device or the instructor device includes the evaluation period determination unit and the evaluation unit, an objective and quantitative evaluation result can be acquired.
  • The instructor device includes a display control unit and a display unit, and the display unit displays the result display data.
  • The instructor device thereby presents the evaluation result together with the evaluation position at which the evaluation result was obtained.
  • The user of the instructor device can check evaluation results that should not be overlooked, together with their evaluation positions, without constantly monitoring the display unit, and can therefore appropriately conduct the training related to driving the vehicle.
  • The learning support system can thus appropriately support lessons related to driving the vehicle.
  • Preferably, the vehicle device further includes a time detection unit that detects time, and a data generation unit that generates the position data and the measurement data based on time information acquired by the time detection unit, position information acquired by the position detection unit, and measurement values acquired by the vehicle state measurement unit. The position data includes the position information and time information associated with the position information, and the measurement data includes the measurement values and time information associated with the measurement values. According to this, the position information and the measurement values can easily be synchronized in time.
  • The present invention is also a learning support program executed in an instructor device for supporting lessons related to driving a vehicle, the learning support program causing a computer included in the instructor device to execute: a process of receiving position data relating to the position of the vehicle at each time; and a process of identifying, based on the position data and an evaluation result regarding the driving skill of the vehicle during an evaluation period, an evaluation position that is a position where the vehicle traveled during the evaluation period, and generating result display data in which the evaluation result and the evaluation position are associated with each other.
  • According to this program, the computer included in the instructor device can suitably generate the result display data. When this result display data is displayed, the instructor device presents the evaluation result together with the evaluation position at which the evaluation result was obtained.
  • The present invention is also a computer-readable recording medium on which the above-described learning support program is recorded.
  • According to this recording medium, the above-described learning support program can be appropriately installed in the instructor device.
  • The present invention is also a learning support method for supporting lessons related to driving a vehicle, the learning support method comprising: a step of receiving position data regarding the position of the vehicle at each time; a step of identifying, based on the position data and an evaluation result regarding the driving skill of the vehicle in an evaluation period, an evaluation position that is a position where the vehicle traveled in the evaluation period, and generating result display data in which the evaluation result and the evaluation position are associated with each other; and a step of displaying the result display data.
  • According to this method, the evaluation result and the evaluation position at which the evaluation result was obtained can be suitably presented.
  • Preferably, the instructor device includes a switching command input unit that receives a switching command relating to display switching, and the display control unit displays either the result display data or the detailed display data on the display unit based on the switching command.
  • According to this, the display on the display unit can be suitably switched between the result display data and the detailed display data.
  • Preferably, the instructor device includes a notification control unit that controls the message information input unit and the transmission unit.
  • According to this, message information can be suitably received and transmitted.
  • Preferably, the learning support program further causes the computer included in the instructor device to execute a process of displaying the result display data.
  • According to this, the evaluation position and the evaluation result can be suitably presented to the user of the instructor device.
  • The user can confirm the evaluation result without overlooking it, and can confirm, together with the evaluation result, the evaluation position at which it was obtained. Thus, the present invention can appropriately support lessons related to driving a vehicle.
  • FIG. 1 is a schematic configuration diagram illustrating a learning support system according to Embodiment 1.
  • FIG. 2 is a side view showing a schematic configuration of a motorcycle. FIG. 3 illustrates the hardware configuration of the vehicle device. FIG. 4 illustrates the hardware configuration of the instructor device. FIG. 5 is a block diagram illustrating the functional configuration of the learning support system. FIGS. 6 and 7 are diagrams schematically showing display examples of the display. FIG. 8 is a block diagram illustrating the functional configuration of the learning support system. FIG. 9 is a detailed functional block diagram of the evaluation period determination unit and the evaluation unit. FIG. 10 is a flowchart showing the operation of transmitting data. FIG. 11 is a flowchart showing the operation of displaying display data. FIG. 12 is a flowchart showing the operation of notifying message information.
  • Further figures are block diagrams illustrating the functional configuration of a learning support system according to Embodiment 2 and of a learning support system according to a modified embodiment.
  • Further figures show a side view of the schematic configuration of a motorcycle according to a modified embodiment, the hardware configuration of the vehicle device, and additional display examples of the display.
  • Embodiment 1 of the present invention will be described below with reference to the drawings.
  • FIG. 1 is a schematic configuration diagram illustrating a learning support system according to the present embodiment.
  • The learning support system 1 supports lessons related to driving a vehicle.
  • The learning support system 1 is operated, for example, at a driving school (facility).
  • The vehicles are, for example, motorcycles 3a, 3b, and 3c.
  • The drivers Ma, Mb, and Mc of the motorcycles 3a, 3b, and 3c are students.
  • The learning support system 1 includes vehicle devices 5a, 5b, and 5c and an instructor device 7.
  • The vehicle devices 5a, 5b, and 5c are provided in the motorcycles 3a, 3b, and 3c, respectively.
  • The instructor device 7 is not physically connected to any of the motorcycles 3a, 3b, and 3c, and is provided separately from them.
  • The user of the instructor device 7 is, for example, an instructor who teaches the students Ma, Mb, and Mc.
  • When the motorcycles 3a, 3b, and 3c are not particularly distinguished, they are simply referred to as the "motorcycle 3".
  • The vehicle devices 5a, 5b, and 5c are similarly referred to as the "vehicle device 5".
  • The students Ma, Mb, and Mc are likewise referred to as the "student M".
  • The vehicle device 5 and the instructor device 7 are information devices, each having a communication function.
  • The vehicle device 5 is, for example, a smartphone, a tablet terminal, or a mobile phone.
  • The instructor device 7 is, for example, a tablet terminal, a notebook PC (Personal Computer), a smartphone, or a mobile phone.
  • The vehicle device 5 and the instructor device 7 communicate with each other wirelessly.
  • The communication method is selected as appropriate.
  • The communication method is, for example, a telecommunications network, a wireless LAN, WiFi (registered trademark), or Bluetooth (registered trademark).
  • The vehicle device 5 and the instructor device 7 may also communicate via a web server. Specifically, one of the vehicle device 5 and the instructor device 7 uploads information to the web server, and the other downloads the information from the web server.
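  • A minimal sketch of such a relay is shown below; the server URL, the payload fields, and the use of the Python requests library are illustrative assumptions and are not part of the disclosure.
```python
import requests

BASE_URL = "https://example.com/lesson-support"  # hypothetical relay server


def upload(vehicle_id: str, payload: dict) -> None:
    """Vehicle device 5 side: upload data keyed by the vehicle identification information."""
    requests.post(f"{BASE_URL}/vehicles/{vehicle_id}/data", json=payload, timeout=5)


def download(vehicle_id: str) -> dict:
    """Instructor device 7 side: download the latest data for one vehicle."""
    response = requests.get(f"{BASE_URL}/vehicles/{vehicle_id}/data", timeout=5)
    response.raise_for_status()
    return response.json()
```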
  • FIG. 2 is a side view showing a schematic configuration of the motorcycle 3.
  • The motorcycle 3 includes a main frame 11.
  • A head pipe 12 is provided at the upper front end of the main frame 11.
  • A steering shaft 13 is inserted through the head pipe 12.
  • An upper bracket (not shown) is fixed to the upper end portion of the steering shaft 13, and a lower bracket 14 is fixed to the lower end portion.
  • A pair of left and right extendable front forks 15 is supported by these brackets.
  • A handle 16 is connected to the upper bracket.
  • A throttle operating part and a brake lever (not shown) are arranged on the right part of the handle 16, and a clutch lever (not shown) is arranged on the left part of the handle 16.
  • The vehicle device 5 is supported at the center of the handle 16.
  • Rotating the handle 16 causes the front forks 15 to swing around the steering shaft 13.
  • The upper part of each front fork 15 is an outer tube 15a, and the lower part is an inner tube 15b.
  • The outer tube 15a is connected to both brackets.
  • The outer tube 15a slidably supports the inner tube 15b.
  • A front wheel 17 is rotatably attached to the lower end portion of the inner tube 15b. The vibration of the front wheel 17 is absorbed by the expansion and contraction of the front fork 15.
  • A brake 18 is attached between the inner tube 15b and the front wheel 17, and the rotation of the front wheel 17 is braked by operating the brake lever.
  • A front fender 19 is supported on the inner tube 15b so as to move up and down together with the front wheel 17.
  • The upper part of the main frame 11 holds a fuel tank 20 and a seat 21 side by side.
  • An engine 22 and a transmission 23 are held by the main frame 11 at a position below the fuel tank 20.
  • The transmission 23 includes a drive shaft 24 that outputs the power generated by the engine 22.
  • A drive sprocket 25 is connected to the drive shaft 24.
  • A swing arm 26 is swingably supported at the lower rear portion of the main frame 11.
  • A driven sprocket 27 and a rear wheel 28 are rotatably supported at the rear end of the swing arm 26.
  • A chain 29 is suspended between the drive sprocket 25 and the driven sprocket 27.
  • The power generated by the engine 22 is transmitted to the rear wheel 28 via the transmission 23, the drive shaft 24, the drive sprocket 25, the chain 29, and the driven sprocket 27.
  • FIG. 3 illustrates a hardware configuration of the vehicle device 5.
  • The vehicle device 5 includes a CPU (Central Processing Unit) 31, a storage unit 32, a GPS (Global Positioning System) receiving unit 33, a vehicle attitude angle sensor 34, a camera 35, a communication unit 36, a display 37, and a speaker 38.
  • The CPU 31 reads and executes the programs stored in the storage unit 32. As a result, the CPU 31 controls the operation of the units 32 to 38 and performs various processes.
  • The CPU 31 is, for example, a microprocessor.
  • The storage unit 32 stores various programs and various data in advance, before operation of the learning support system 1.
  • One item of the various data is vehicle identification information for identifying the motorcycle 3 on which the vehicle device 5 is mounted.
  • The storage unit 32 also stores various data during use of the learning support system 1 and is used as a work area for processing by the CPU 31.
  • The storage unit 32 is, for example, a ROM (Read Only Memory), a flash memory, or a RAM (Random Access Memory).
  • The GPS receiving unit 33 detects the time and the position of the motorcycle 3. Thereby, the GPS receiving unit 33 obtains position information and time information.
  • The GPS receiving unit 33 is an example of the position detection unit in the present invention, and is also an example of the time detection unit.
  • The vehicle attitude angle sensor 34 measures the attitude angle of the motorcycle 3.
  • The attitude angle is, for example, a vehicle angle or a vehicle angular velocity.
  • The vehicle angle is, for example, a yaw angle, a roll angle, or a pitch angle.
  • The vehicle angular velocity is, for example, a yaw rate, a roll rate, or a pitch rate.
  • The yaw angle, roll angle, and pitch angle are the rotation angles of the motorcycle 3 in the yaw, roll, and pitch directions, respectively.
  • The yaw rate, roll rate, and pitch rate are the rotational angular velocities in the yaw, roll, and pitch directions, respectively.
  • The vehicle attitude angle sensor 34 is, for example, a gyro sensor.
  • The vehicle attitude angle sensor 34 is an example of the vehicle state measurement unit in the present invention.
  • The camera 35 captures images. Thereby, the camera 35 obtains image information.
  • The orientation and arrangement of the camera 35 are selected as appropriate. For example, it is preferable to install the camera 35 so that the road surface in front of the motorcycle 3 appears in a part of its field of view.
  • The camera 35 is realized by, for example, a CMOS (Complementary Metal Oxide Semiconductor) sensor or a CCD (Charge Coupled Device) sensor.
  • The camera 35 is an example of the imaging device in the present invention.
  • The communication unit 36 transmits information to and receives information from the instructor device 7.
  • The display 37 displays images.
  • The speaker 38 outputs sound such as voice.
  • FIG. 4 illustrates a hardware configuration of the instructor device 7.
  • The instructor device 7 includes a CPU 41, a storage unit 42, a communication unit 43, a display 44, a touch panel 45, a microphone 46, and a fixed key 47.
  • The CPU 41 reads and executes the programs stored in the storage unit 42, thereby controlling the operation of the other units and performing various processes.
  • The CPU 41 is an example of the "computer included in the instructor device" in the present invention.
  • The storage unit 42 stores various programs and various data in advance.
  • One of the various programs is the learning support program.
  • The storage unit 42 also stores various data during use of the learning support system 1 and is used as a work area for processing by the CPU 41.
  • The storage unit 42 is, for example, a ROM, a flash memory, a RAM, or a hard disk.
  • The communication unit 43 transmits information to and receives information from the communication unit 36.
  • The display 44 displays images.
  • The touch panel 45 is translucent and is provided on the surface of the display 44.
  • The touch panel 45 detects the position at which the instructor touches the display 44.
  • The instructor's voice or the like is input to the microphone 46.
  • The fixed key 47 is an input device operated by the instructor. FIGS. 6 and 7, described later, illustrate the display 44, the touch panel 45, the microphone 46, and the fixed key 47.
  • FIG. 5 is a block diagram illustrating a functional configuration of the learning support system 1.
  • FIG. 5 shows, in particular, the functional configuration related to the display of the instructor device 7.
  • The vehicle device 5 includes a data generation unit 71.
  • The data generation unit 71 is a function realized by the CPU 31 executing a program.
  • The data generation unit 71 generates position data relating to the position of the motorcycle 3 at each time based on the detection result of the GPS receiving unit 33.
  • The data generation unit 71 generates measurement data relating to the state of the motorcycle 3 at each time based on the measurement result of the vehicle attitude angle sensor 34 and the detection result of the GPS receiving unit 33.
  • The data generation unit 71 generates image data based on the imaging result of the camera 35 and the detection result of the GPS receiving unit 33.
  • The position data includes position information and time information associated with the position information.
  • The time information associated with the position information indicates the time at which the position information was detected.
  • The measurement data includes measurement values and time information associated with the measurement values.
  • The time information associated with a measurement value indicates the time at which the measurement value was measured.
  • The image data includes image information and time information associated with the image information.
  • The time information associated with the image information indicates the time at which the image information was captured.
  • Each of the position data, the measurement data, and the image data includes the vehicle identification information.
  • The position data, measurement data, and image data are mutually associated by the common vehicle identification information.
  • The communication unit 36 transmits the position data, the measurement data, and the image data.
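  • As a rough illustration, the three kinds of data described above could be structured as in the following sketch; the field names and record layout are assumptions for illustration, since the patent specifies only that each record carries time information and vehicle identification information.
```python
from dataclasses import dataclass


@dataclass
class PositionRecord:
    vehicle_id: str    # vehicle identification information
    time: float        # time information from the GPS receiving unit 33
    latitude: float    # position information
    longitude: float


@dataclass
class MeasurementRecord:
    vehicle_id: str
    time: float        # time at which the values were measured
    yaw_rate: float    # attitude-angle values from the vehicle attitude angle sensor 34
    roll_rate: float
    pitch_rate: float


@dataclass
class ImageRecord:
    vehicle_id: str
    time: float        # time at which the frame was captured
    frame: bytes       # image information from the camera 35
```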
  • The instructor device 7 includes a receiving unit 50, a data storage unit 51, an evaluation period determination unit 52, an evaluation unit 53, a map storage unit 54, and a display control unit 55.
  • The receiving unit 50 is realized by the communication unit 43.
  • The data storage unit 51 and the map storage unit 54 are realized by the storage unit 42.
  • The evaluation period determination unit 52, the evaluation unit 53, and the display control unit 55 are functions realized by the CPU 41 executing the learning support program.
  • The receiving unit 50 receives the position data, the measurement data, and the image data.
  • The data storage unit 51 accumulates the position data, the measurement data, and the image data. The various data stored in the data storage unit 51 are time-series data.
  • The evaluation period determination unit 52 determines an evaluation period.
  • The evaluation period is the period targeted by the evaluation process relating to the driving skill of the motorcycle 3.
  • The evaluation period is defined by time information.
  • The evaluation unit 53 performs the evaluation process relating to the driving skill of the motorcycle 3 during the evaluation period. Through the evaluation process, the evaluation unit 53 acquires an evaluation result for each evaluation period.
  • The evaluation result is expressed as either a "good evaluation result" or a "bad evaluation result".
  • A "good evaluation result" represents a higher evaluation than a "bad evaluation result". Details of the evaluation period determination unit 52 and the evaluation unit 53 will be described later.
  • The map storage unit 54 stores a route map a indicating the course on which the motorcycle 3 travels.
  • The route map a is map information relating to the course at the driving school.
  • The display control unit 55 generates the various display data to be displayed on the display 44 and causes the display 44 to display them.
  • The display 44 is an example of the display unit in the present invention.
  • FIG. 6 is a diagram schematically showing a display example on the display 44.
  • The display 44 displays result display data Aa, Ab, and Ac.
  • The result display data Aa, Ab, and Ac show the evaluation results regarding the motorcycles 3a, 3b, and 3c, respectively.
  • When the result display data Aa, Ab, and Ac are not particularly distinguished, they are simply referred to as the "result display data A".
  • The result display data A includes an evaluation result and an evaluation position that are associated with each other.
  • That is, the result display data A indicates the relationship between the evaluation result and the evaluation position.
  • The evaluation position is a position where the motorcycle 3 traveled during the evaluation period.
  • The evaluation position is indicated by solid lines b1 and b2.
  • The evaluation result is indicated by the thickness of the lines b1 and b2.
  • A good evaluation result is indicated by a relatively thin solid line b1.
  • A bad evaluation result is highlighted by a relatively thick solid line b2.
  • The result display data A further includes a travel locus.
  • The travel locus is the path along which the motorcycle 3 has traveled.
  • The travel locus is indicated by a dotted line c.
  • The travel locus in the evaluation period has the same meaning as the evaluation position.
  • The result display data A further includes the route map a.
  • The route map a is associated with the evaluation position and the travel locus. Specifically, the evaluation positions (lines b1, b2) and the travel locus (line c) are superimposed on the route map a. In this way, the result display data A indicates the relationship among the evaluation position, the travel locus, and the route map a.
  • The display control unit 55 identifies the evaluation position based on the position data and the evaluation period, and generates one set of result display data A for each motorcycle 3. With reference to FIG. 5, an example of the process of generating the result display data A will be described.
  • First, the display control unit 55 identifies the evaluation positions. Specifically, the display control unit 55 receives the position data from the data storage unit 51 and receives evaluation periods t1 and t2 from the evaluation period determination unit 52. The display control unit 55 extracts the position data in the evaluation period t1 based on the position data and the evaluation period t1. In this specification, the position data in an evaluation period is referred to in particular as "evaluation period position data". The display control unit 55 then identifies the positions defined by the evaluation period position data as the evaluation position in the evaluation period t1, and likewise identifies the evaluation position in the evaluation period t2.
  • Next, the display control unit 55 associates the evaluation positions with the evaluation results. Specifically, the display control unit 55 receives from the evaluation unit 53 the evaluation results in the evaluation periods t1 and t2. The display control unit 55 associates the evaluation result in the evaluation period t1 with the evaluation position in the evaluation period t1, and similarly associates the evaluation result in the evaluation period t2 with the evaluation position in the evaluation period t2.
  • The display control unit 55 further generates the lines b1 and b2 based on the mutually associated evaluation positions and evaluation results (see FIG. 6). Specifically, the display control unit 55 generates a line b1 indicating an evaluation position associated with a good evaluation result, and generates a line b2 indicating an evaluation position associated with a bad evaluation result. In this way, the display control unit 55 changes the thickness of the line indicating the evaluation position according to the evaluation result at that position.
  • The display control unit 55 also identifies the travel locus based on the position data.
  • The display control unit 55 generates the dotted line c indicating the travel locus.
  • The display control unit 55 receives the route map a from the map storage unit 54.
  • The display control unit 55 superimposes the lines b1 and b2 and the line c on the route map a. As a result, the result display data A is generated.
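  • The steps above could be sketched as follows; this is a simplified illustration that reuses the record types assumed earlier, and the drawing layer and the handling of the route map a are omitted.
```python
def evaluation_positions(position_data, evaluation_period):
    """Extract the evaluation period position data and return the evaluation position."""
    start, end = evaluation_period
    return [(p.latitude, p.longitude) for p in position_data if start <= p.time <= end]


def build_result_display(position_data, periods_and_results):
    """Associate each evaluation position with its evaluation result and a line style."""
    travel_locus = [(p.latitude, p.longitude) for p in position_data]   # dotted line c
    lines = []
    for period, result in periods_and_results:
        width = 1 if result == "good evaluation result" else 4         # thin line b1 / thick line b2
        lines.append({"points": evaluation_positions(position_data, period),
                      "width": width,
                      "result": result})
    return {"locus": travel_locus, "lines": lines}   # to be overlaid on the route map a
```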
  • When the display control unit 55 generates a plurality of result display data Aa, Ab, and Ac, it displays them as a list on the display 44 (see FIG. 6).
  • The display 44 further displays a legend part B, a voice button part C, and a designation button part D.
  • The legend part B includes a description of the lines used in the result display data A.
  • The voice button part C and the designation button part D display buttons operated by the instructor (details will be described later). These parts B, C, and D are also generated by the display control unit 55.
  • FIG. 7 is a diagram schematically showing a display example of the display 44.
  • The display 44 displays detailed display data E.
  • The detailed display data E shows measurement data and image data in one evaluation period for one of the motorcycles 3.
  • The detailed display data E includes, for example, a graph Eg, a table Et, and a moving image Em.
  • The graph Eg and the table Et show the measurement data in the evaluation period. Specifically, the graph Eg shows the temporal change of the measured values during the evaluation period.
  • The vertical line shown in the central part of the graph Eg is a cursor indicating an instantaneous value on the time-series data.
  • The table Et shows the measured values (instantaneous values) in the evaluation period as characters (including numbers).
  • The moving image Em shows the image data in the evaluation period. Specifically, the moving image Em continuously displays the image data in the evaluation period.
  • Hereinafter, the measurement data in the evaluation period are referred to as "evaluation period measurement data", and the image data in the evaluation period are referred to as "evaluation period image data".
  • The display control unit 55 generates one set of detailed display data E for each evaluation period based on the evaluation period, the measurement data, and the image data. With reference to FIG. 5, an example of the process of generating the detailed display data E will be described.
  • The display control unit 55 receives the measurement data and the image data from the data storage unit 51 and receives the evaluation periods t1 and t2 from the evaluation period determination unit 52.
  • The display control unit 55 generates detailed display data E1 based on the measurement data, the image data, and the evaluation period t1, and generates detailed display data E2 based on the measurement data, the image data, and the evaluation period t2.
  • The display control unit 55 extracts the evaluation period measurement data in the evaluation period t1 from the measurement data.
  • The display control unit 55 generates the graph Eg and the table Et for the evaluation period t1 based on the evaluation period measurement data.
  • The display control unit 55 extracts the evaluation period image data in the evaluation period t1 from the image data.
  • The display control unit 55 generates the moving image Em for the evaluation period t1 based on the evaluation period image data.
  • The graph Eg, the table Et, and the moving image Em for the evaluation period t1 constitute the detailed display data E1 for the evaluation period t1.
  • The display control unit 55 temporally synchronizes the graph Eg, the table Et, and the moving image Em included in the detailed display data E. Specifically, the display control unit 55 causes the display 44 to simultaneously display the measurement values and the image data associated with the same time information.
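  • One way to realize this synchronization is sketched below, under the assumption that the measurement records are stored as a time-sorted list (as in the record types assumed above).
```python
import bisect


def value_at(measurements, t):
    """Return the measurement record whose time is closest to the playback time t."""
    times = [m.time for m in measurements]   # assumed sorted in ascending order
    i = bisect.bisect_left(times, t)
    if i == 0:
        return measurements[0]
    if i == len(times):
        return measurements[-1]
    before, after = measurements[i - 1], measurements[i]
    return before if t - before.time <= after.time - t else after


# While the moving image Em shows a frame captured at time t, the cursor on the graph Eg
# and the instantaneous values in the table Et are taken from value_at(measurements, t).
```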
  • Even if the display control unit 55 generates a plurality of detailed display data E1 and E2, it does not display them together on the display 44 (see FIG. 7).
  • The display 44 further displays a basic display part F, an evaluation position display part G, an image control part H, a switching button part I, and a voice button part J.
  • The basic display part F shows the vehicle identification information and the evaluation result related to the detailed display data E.
  • The evaluation position display part G shows the evaluation position related to the detailed display data E.
  • The image control part H, the switching button part I, and the voice button part J each display buttons operated by the instructor. These parts F to J are also generated by the display control unit 55.
  • The touch panel 45 receives various image control commands relating to the display of the moving image Em.
  • Each image control command is sent from the touch panel 45 to the display control unit 55.
  • The display control unit 55 controls the display of the moving image Em, such as playback and stop, based on the image control command.
  • The touch panel 45 also accepts a switching command relating to the display on the display 44.
  • The switching command is sent from the touch panel 45 to the display control unit 55.
  • The display control unit 55 causes the display 44 to display the result display data A and the like instead of the detailed display data E and the like based on the switching command.
  • That is, the display on the display 44 is switched from the display illustrated in FIG. 7 to the display illustrated in FIG. 6.
  • The lines b1 and b2 indicating the evaluation positions in the result display data A also serve as switching buttons.
  • When a line is tapped, the touch panel 45 receives a switching command and sends it to the display control unit 55.
  • The display control unit 55 switches the display on the display 44 based on the switching command.
  • The display 44 then displays, instead of the result display data A, the detailed display data E corresponding to the tapped evaluation position.
  • The touch panel 45 is an example of the switching command input unit in the present invention.
  • FIG. 8 is a block diagram illustrating a functional configuration of the learning support system 1.
  • FIG. 8 particularly relates to notification of a message from the instructor device 7 to the vehicle device 5.
  • The instructor device 7 includes a notification control unit 56 and a transmission unit 57.
  • The notification control unit 56 is a function realized by the CPU 41 executing the learning support program.
  • The transmission unit 57 is realized by the communication unit 43.
  • The microphone 46 receives the instructor's utterance as message information.
  • When the microphone 46 is to receive message information, the microphone 46 is turned on. Specifically, when the instructor operates the voice button part C or the voice button part J (see FIGS. 6 and 7), the touch panel 45 receives an ON command. The ON command is sent from the touch panel 45 to the notification control unit 56. The notification control unit 56 turns on the microphone 46 based on the ON command. After the microphone 46 is turned on, the instructor inputs the message information to the microphone 46.
  • The microphone 46 is an example of the message information input unit in the present invention.
  • The notification control unit 56 controls the operation of the microphone 46 and the transmission unit 57.
  • The notification control unit 56 further determines at least one vehicle device 5 as the notification destination of the message information. The process of determining the notification destination differs between the case where the voice button part C displayed together with the result display data A is operated and the case where the voice button part J displayed together with the detailed display data E is operated.
  • When the voice button part C is operated, the instructor also operates the designation button part D (see FIG. 6) displayed on the display 44 together with the voice button part C. Thereby, the touch panel 45 accepts a designation command for designating a notification destination.
  • The designation command is sent from the touch panel 45 to the notification control unit 56.
  • The notification control unit 56 determines the vehicle device 5 designated by the designation command as the notification destination.
  • When the voice button part J is operated, the notification control unit 56 identifies the motorcycle 3 related to the displayed detailed display data E and determines the vehicle device 5 provided in the identified motorcycle 3 as the notification destination.
  • The transmission unit 57 transmits the message information to the vehicle device 5 determined as the notification destination.
  • The transmitted message information is received by the vehicle device 5 that is the notification destination and is output by the speaker 38.
  • FIG. 9 is a detailed functional block diagram of the evaluation period determination unit 52 and the evaluation unit 53.
  • The instructor device 7 further includes an evaluation period condition storage unit 58 and an evaluation criterion storage unit 59.
  • The evaluation period condition storage unit 58 and the evaluation criterion storage unit 59 are realized by the storage unit 42.
  • The evaluation period condition storage unit 58 stores evaluation period condition information.
  • The evaluation criterion storage unit 59 stores filters, vehicle stability characteristic evaluation information, turning characteristic evaluation information, and comprehensive evaluation information.
  • The evaluation period determination unit 52 determines an evaluation period based on the measurement data. The process of determining an evaluation period is exemplified below.
  • The evaluation period determination unit 52 acquires the measurement data relating to the yaw rate from the data storage unit 51.
  • The evaluation period determination unit 52 acquires a lower limit value Xt and a minimum duration Ymin from the evaluation period condition storage unit 58.
  • The lower limit value Xt and the minimum duration Ymin are examples of the evaluation period condition information.
  • The evaluation period determination unit 52 determines, as an evaluation period, a period during which the absolute value of the yaw rate remains at or above the lower limit value Xt for at least the minimum duration Ymin.
  • The determined evaluation period thus corresponds to a period during which the motorcycle 3 is turning.
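  • A minimal sketch of this determination is given below; it assumes the yaw-rate measurement data are available as parallel lists of timestamps and values, and the variable names are illustrative.
```python
def find_evaluation_periods(times, yaw_rates, x_t, y_min):
    """Return (start, end) time spans where |yaw rate| >= Xt continuously for at least Ymin."""
    periods, start = [], None
    for t, w in zip(times, yaw_rates):
        if abs(w) >= x_t:
            if start is None:
                start = t                      # a candidate turning period begins
        else:
            if start is not None and t - start >= y_min:
                periods.append((start, t))     # long enough: keep it as an evaluation period
            start = None
    if start is not None and times[-1] - start >= y_min:
        periods.append((start, times[-1]))     # period still open at the end of the data
    return periods
```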
  • The evaluation unit 53 includes a component separation unit 61, a vehicle stability characteristic evaluation unit 62, a turning characteristic evaluation unit 63, and a comprehensive evaluation unit 64.
  • The component separation unit 61 extracts a low-frequency component and a high-frequency component of the evaluation period measurement data. The process of extracting the low-frequency and high-frequency components of the roll rate in an evaluation period is exemplified below.
  • The component separation unit 61 reads the roll rate measurement data from the data storage unit 51.
  • The component separation unit 61 extracts the evaluation period measurement data from the measurement data based on the evaluation period.
  • The component separation unit 61 reads a low-pass filter and a band-pass filter from the evaluation criterion storage unit 59.
  • The low-pass filter and the band-pass filter are examples of the filters.
  • The component separation unit 61 removes frequency components higher than a predetermined threshold frequency Fc1 from the evaluation period measurement data using the low-pass filter. Thereby, the component separation unit 61 obtains the low-frequency component of the roll rate in the evaluation period.
  • The component separation unit 61 uses the band-pass filter to remove from the evaluation period measurement data the low-frequency components at or below the threshold frequency Fc1 and the noise components at or above a predetermined threshold frequency Fc2 (Fc2 > Fc1). Thereby, the component separation unit 61 obtains the high-frequency component of the roll rate in the evaluation period.
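  • A sketch of this separation using standard Butterworth filters follows; the filter order, the use of SciPy, and zero-phase filtering are implementation assumptions, since the patent specifies only a low-pass filter below Fc1 and a band-pass filter between Fc1 and Fc2.
```python
from scipy.signal import butter, filtfilt


def separate_components(roll_rate, fs, fc1, fc2, order=2):
    """Split the evaluation period roll rate into low- and high-frequency components."""
    nyq = fs / 2.0
    b_lo, a_lo = butter(order, fc1 / nyq, btype="low")
    b_bp, a_bp = butter(order, [fc1 / nyq, fc2 / nyq], btype="band")
    low = filtfilt(b_lo, a_lo, roll_rate)    # component below Fc1
    high = filtfilt(b_bp, a_bp, roll_rate)   # component between Fc1 and Fc2 (noise above Fc2 removed)
    return low, high
```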
  • The vehicle stability characteristic evaluation unit 62 evaluates the driving skill from the viewpoint of vehicle stability characteristics based on the evaluation period measurement data.
  • The processing of the vehicle stability characteristic evaluation unit 62 is exemplified below.
  • The vehicle stability characteristic evaluation unit 62 acquires the low-frequency and high-frequency components of the yaw rate, roll rate, and pitch rate during the evaluation period from the component separation unit 61.
  • The ratio of the integral of the absolute value of the low-frequency component to the integral of the absolute value of the high-frequency component is defined as a stability characteristic index S.
  • The vehicle stability characteristic evaluation unit 62 calculates a stability characteristic index Syaw for the yaw rate, a stability characteristic index Sroll for the roll rate, and a stability characteristic index Spitch for the pitch rate. The vehicle stability characteristic evaluation unit 62 then calculates a weighted linear sum of the stability characteristic indexes Syaw, Sroll, and Spitch (hereinafter referred to as the "vehicle stability characteristic score Sv").
  • The functions for calculating the stability characteristic indexes Syaw, Sroll, and Spitch and the vehicle stability characteristic score Sv, and the like, are included in the vehicle stability characteristic evaluation information.
  • The vehicle stability characteristic evaluation unit 62 performs the above processing using the vehicle stability characteristic evaluation information read from the evaluation criterion storage unit 59.
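  • The stability characteristic indexes and the vehicle stability characteristic score could be computed as sketched below; the weights are illustrative placeholders, since the patent states only that Sv is a weighted linear sum of Syaw, Sroll, and Spitch.
```python
import numpy as np


def stability_index(low, high, dt):
    """S = integral of |low-frequency component| divided by integral of |high-frequency component|."""
    return np.trapz(np.abs(low), dx=dt) / np.trapz(np.abs(high), dx=dt)


def stability_score(s_yaw, s_roll, s_pitch, weights=(1.0, 1.0, 1.0)):
    """Vehicle stability characteristic score Sv as a weighted linear sum of the three indexes."""
    w_yaw, w_roll, w_pitch = weights
    return w_yaw * s_yaw + w_roll * s_roll + w_pitch * s_pitch
```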
  • The turning characteristic evaluation unit 63 evaluates the driving skill from the viewpoint of turning characteristics based on the evaluation period measurement data and the evaluation period position data.
  • The processing of the turning characteristic evaluation unit 63 is exemplified below.
  • The turning characteristic evaluation unit 63 acquires the low-frequency components of the roll angle and the pitch angle in the evaluation period from the component separation unit 61.
  • The turning characteristic evaluation unit 63 calculates the integral of the absolute value of the low-frequency component of the roll angle (hereinafter referred to as the "turning characteristic index Troll").
  • The turning characteristic evaluation unit 63 calculates the integral of the absolute value of the low-frequency component of the pitch rate (hereinafter referred to as the "turning characteristic index Tpitch").
  • The turning characteristic evaluation unit 63 extracts the evaluation period position data based on the evaluation period and the position data.
  • The turning characteristic evaluation unit 63 calculates the average vehicle speed Tspeed during the evaluation period based on the evaluation period position data.
  • The turning characteristic evaluation unit 63 further calculates a weighted linear sum of the turning characteristic indexes Troll and Tpitch and the average vehicle speed Tspeed (hereinafter referred to as the "turning characteristic score Tv").
  • The functions for calculating the turning characteristic indexes Troll and Tpitch, the average vehicle speed Tspeed, and the turning characteristic score Tv, and the like, are included in the turning characteristic evaluation information.
  • The turning characteristic evaluation information and the vehicle stability characteristic evaluation information define different evaluation criteria.
  • The turning characteristic evaluation unit 63 performs the above processing using the turning characteristic evaluation information read from the evaluation criterion storage unit 59.
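  • Similarly, the turning characteristic score could be sketched as follows; the weights are again illustrative placeholders.
```python
import numpy as np


def turning_score(roll_low, pitch_low, avg_speed, dt, weights=(1.0, 1.0, 1.0)):
    """Turning characteristic score Tv from Troll, Tpitch, and the average vehicle speed Tspeed."""
    t_roll = np.trapz(np.abs(roll_low), dx=dt)     # turning characteristic index Troll
    t_pitch = np.trapz(np.abs(pitch_low), dx=dt)   # turning characteristic index Tpitch
    w_roll, w_pitch, w_speed = weights
    return w_roll * t_roll + w_pitch * t_pitch + w_speed * avg_speed
```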
  • The comprehensive evaluation unit 64 calculates a weighted linear sum of the vehicle stability characteristic score Sv and the turning characteristic score Tv (hereinafter referred to as the "comprehensive characteristic score U").
  • The comprehensive evaluation unit 64 compares the comprehensive characteristic score U with a threshold value. When the comprehensive characteristic score U is higher than the threshold value, the comprehensive evaluation unit 64 determines the evaluation result to be a "good evaluation result". Otherwise, the comprehensive evaluation unit 64 determines the evaluation result to be a "bad evaluation result".
  • The function for calculating the comprehensive characteristic score U, the threshold for classifying the evaluation result according to the comprehensive characteristic score U, and the like are included in the comprehensive evaluation information.
  • The comprehensive evaluation unit 64 performs the above processing using the comprehensive evaluation information read from the evaluation criterion storage unit 59.
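  • The comprehensive evaluation then reduces to a weighted sum and a threshold comparison, as sketched below; the weights and the threshold are placeholders that, in the actual device, would be read from the comprehensive evaluation information.
```python
def comprehensive_result(s_v, t_v, w_s=1.0, w_t=1.0, threshold=0.0):
    """Classify the evaluation period as a good or bad evaluation result."""
    u = w_s * s_v + w_t * t_v   # comprehensive characteristic score U
    return "good evaluation result" if u > threshold else "bad evaluation result"
```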
  • FIG. 10 is a flowchart showing an operation of transmitting data.
  • The GPS receiving unit 33 detects the time and the position of the motorcycle 3 (step S1).
  • The vehicle attitude angle sensor 34 measures the attitude angle of the motorcycle 3 (step S2).
  • The camera 35 captures an image (step S3).
  • The CPU 31 reads the detection result of the GPS receiving unit 33 and determines whether the position and time were detected successfully (step S4). If it is determined to be successful, the process proceeds to steps S5, S6, and S8. Otherwise, the process returns to step S1.
  • The CPU 31 generates position data (step S5).
  • The CPU 31 reads the measurement result of the vehicle attitude angle sensor 34 and determines whether or not the measurement was successful (step S6). If it is determined to be successful, the CPU 31 generates measurement data (step S7); otherwise, step S7 is skipped.
  • The CPU 31 reads the imaging result of the camera 35 and determines whether or not the imaging was successful (step S8). If it is determined to be successful, the CPU 31 generates image data (step S9). Otherwise, step S9 is skipped and the process proceeds to step S10.
  • The communication unit 36 transmits the various data that have been generated (step S10).
  • The data to be transmitted always include the position data. The process then returns to step S1.
  • The position data is thus transmitted continually.
  • When measurement data and image data have been generated, they are transmitted together.
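  • The transmission loop could be summarized as in the sketch below; the sensor and communication objects passed as parameters are hypothetical placeholders used only to mirror steps S1 to S10.
```python
def transmit_loop(gps, attitude_sensor, camera, comm, vehicle_id):
    """Vehicle device 5 side: one pass of the loop corresponds to steps S1 to S10."""
    while True:
        fix = gps.read()                                   # steps S1 and S4
        if fix is None:
            continue                                       # position/time not detected: back to step S1
        payload = {"position": {"vehicle_id": vehicle_id,  # step S5
                                "time": fix["time"],
                                "position": fix["position"]}}
        attitude = attitude_sensor.read()                  # steps S2 and S6
        if attitude is not None:                           # step S7
            payload["measurement"] = {"vehicle_id": vehicle_id, "time": fix["time"], "values": attitude}
        frame = camera.capture()                           # steps S3 and S8
        if frame is not None:                              # step S9
            payload["image"] = {"vehicle_id": vehicle_id, "time": fix["time"], "frame": frame}
        comm.send(payload)                                 # step S10
```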
  • FIG. 11 is a flowchart showing an operation for displaying display data.
  • The receiving unit 50 receives the position data, the measurement data, and the image data (step S11).
  • The display control unit 55 identifies the travel locus based on the position data (step S12).
  • The evaluation period determination unit 52 performs the process of determining an evaluation period based on the measurement data (step S13). When an evaluation period is determined as a result of step S13, the process proceeds to steps S14 and S15. Otherwise, steps S14 and S15 are skipped and the process proceeds to steps S16 and S17.
  • The display control unit 55 identifies the evaluation position based on the evaluation period and the position data (step S14).
  • The evaluation unit 53 performs the evaluation process based on the evaluation period, the measurement data, and the position data, and acquires an evaluation result (step S15).
  • The display control unit 55 generates the result display data A based on the travel locus, the evaluation position, and the evaluation result (step S16).
  • The display control unit 55 generates the detailed display data E based on the evaluation period, the measurement data, and the image data (step S17). However, if no evaluation period is determined as a result of step S13, the display control unit 55 only updates the travel locus of the result display data A (step S16) and does not generate detailed display data E (step S17).
  • The display 44 displays the display data A and E generated in steps S16 and S17 (step S18). In step S18, the display 44 displays either the result display data A or the detailed display data E according to the operation of the instructor.
  • When displaying the result display data A, the display 44 displays all the result display data A generated in step S16 side by side.
  • The display 44 also displays the legend part B, the voice button part C, and the designation button part D together with the result display data A.
  • When displaying the detailed display data E, the display 44 displays any one of the detailed display data E generated in step S17. Further, the display 44 displays the basic display part F, the evaluation position display part G, the image control part H, the switching button part I, and the voice button part J together with the detailed display data E.
  • FIG. 12 is a flowchart showing an operation of notifying message information from the instructor device 7 to the vehicle device 5.
  • Steps S21 to S23 are operations of the instructor device 7, and steps S24 and S25 are operations of the vehicle device 5.
  • The notification control unit 56 turns on the microphone 46.
  • The microphone 46 acquires the instructor's utterance as message information (step S21).
  • The notification control unit 56 identifies the vehicle device 5 that is the notification destination in accordance with the operation of the designation button part D by the instructor or the control of the display 44 by the display control unit 55 (step S22).
  • The transmission unit 57 transmits the message information to the vehicle device 5 that is the notification destination (step S23).
  • The communication unit 36 of the vehicle device 5 receives the message information (step S24).
  • The speaker 38 of the vehicle device 5 outputs the message information (step S25).
  • the instructor apparatus 7 includes the display control unit 55 that generates the result display data A and the display 44 that displays the result display data A.
  • the result display data A shows the evaluation position and the evaluation result in the same evaluation period in association with each other. Therefore, the instructor can suitably confirm the position where the student M has driven the motorcycle 3 and the evaluation result regarding the driving skill at the position. Therefore, the instructor can appropriately instruct the student M.
  • the instructor can look back at the evaluation position and the evaluation result at any time by simply looking at the display 44.
  • the evaluation position and the evaluation result indicated by the result display data A differ greatly in nature from information conveyed by ordinary moving images and sound. For this reason, even if the instructor does not constantly monitor the display 44, the instructor can confirm the evaluation position and the evaluation result at an arbitrary timing, and can therefore check, without omission, the evaluation positions and evaluation results that should not be overlooked. Accordingly, the instructor can appropriately check the driving operation of the student M.
  • the instructor can conduct the lesson with greater latitude. For example, the instructor can take his or her eyes off the display 44 and look directly at the student M driving the motorcycle 3. This allows the instructor to grasp the driving operation of the student M in more detail.
  • the display control unit 55 changes the manner in which the evaluation position is displayed in the result display data A according to the evaluation result. Therefore, not only the evaluation position but also the evaluation result at the evaluation position can be suitably displayed. Specifically, the display control unit 55 changes the thickness of the line indicating the evaluation position according to the evaluation result. According to this, both the evaluation position and the evaluation result can be shown by one line.
  • the display control unit 55 emphasizes the bad evaluation result in the result display data A rather than the good evaluation result. Thereby, the instructor can confirm the bad evaluation result more easily.
  • the display control unit 55 indicates the evaluation position by a line drawn on the route map a in the result display data A. Therefore, the instructor can confirm the relative positional relationship between the runway and the evaluation position.
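  • The following snippet illustrates one way such a mapping could look; the two-category evaluation result follows the description above, while the concrete line widths and colors are invented for the sketch.
```python
# Illustrative mapping from an evaluation result to the drawing style of the
# line b that marks the evaluation position; widths and colors are assumptions.
def line_style(evaluation_result: str) -> dict:
    if evaluation_result == "bad":
        # A bad evaluation result is emphasized over a good one.
        return {"width": 6, "color": "red"}
    return {"width": 2, "color": "green"}


print(line_style("bad"))   # {'width': 6, 'color': 'red'}
print(line_style("good"))  # {'width': 2, 'color': 'green'}
```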
  • the display control unit 55 further indicates a travel locus in the result display data A. Therefore, the instructor can confirm the traveling locus.
  • the display control unit 55 causes the display 44 to display a plurality of result display data A in a list. Therefore, the instructor can simultaneously confirm the evaluation results regarding the driving skills of a plurality of students M.
  • the display control unit 55 generates detailed display data E, and the display 44 displays the detailed display data E.
  • the detailed display data E indicates evaluation period measurement data. Therefore, the state of the motorcycle 3 at the evaluation position can be confirmed in detail.
  • the detailed display data E includes a graph Eg indicating the evaluation period measurement data. Therefore, the instructor can easily confirm the temporal change in the measured value during the evaluation period.
  • the detailed display data E includes a table Et indicating the evaluation period measurement data. Therefore, the instructor can easily confirm the measurement value (instantaneous value) included in the evaluation period measurement data.
  • Detailed display data E indicates evaluation period image data. Therefore, the instructor can see the state of the motorcycle 3 at the evaluation position.
  • the detailed display data E includes a moving image Em indicating evaluation period image data. Therefore, the state of the motorcycle 3 at the evaluation position can be seen in detail.
  • the display control unit 55 displays the graph Eg, the table Et, and the moving image Em in synchronization. Therefore, the state of the motorcycle 3 at the evaluation position can be confirmed in even more detail.
  • the touch panel 45 receives a switching command by an operation by an instructor.
  • the display control unit 55 switches the display on the display 44 between the result display data A and the detailed display data E based on the switching command. Thereby, the instructor can promptly confirm the result display data A and the detailed display data E, respectively.
  • the line b indicating the evaluation position in the result display data A also serves as a switching button. Thereby, the instructor can easily select the detailed display data E at an arbitrary evaluation position.
  • Since the instructor device 7 includes the receiving unit 50, the position data of the motorcycle 3 can be received at a position away from the motorcycle 3. Therefore, the instructor can use the instructor device 7 at a position away from the motorcycle 3.
  • Since the instructor device 7 includes the microphone 46, the instructor can promptly input message information. Since the instructor device 7 includes the transmission unit 57, the instructor device 7 can give message information to the student M. As a result, the driving operation of the student M can be improved efficiently.
  • the display control unit 55 displays the designation button part D on the display 44 together with the result display data A, the touch panel 45 accepts a designation command according to the operation of the instructor, and the notification control unit 56 determines the notification destination based on the designation command. Thereby, the instructor can easily select the student M to whom the message information is notified.
  • the notification control unit 56 automatically determines the notification destination based on the control of the display 44 by the display control unit 55. Thereby, the operation by the instructor is reduced, and the instructor can notify the message information more promptly.
  • Since the evaluation unit 53 acquires the evaluation result based on the evaluation period, measurement data, and position data, the evaluation result is objective and quantitative. Therefore, the instructor can grasp the driving skill of the student M objectively and quantitatively. In addition, the instructor can properly grasp the transition of the driving skill of the student M. That is, the effect of the guidance by the instructor can be properly grasped.
  • the instructor device 7 includes the evaluation period determination unit 52 and the evaluation unit 53. Therefore, it is possible to prevent the process for determining the evaluation period and the evaluation process from varying among the plurality of motorcycles 3.
  • Since the position data, measurement data, and image data each include time information, the position information, measurement values, and image information can be suitably synchronized. Further, since the position data, measurement data, and image data each include vehicle identification information, the various data can be associated with each motorcycle 3. Therefore, the various data relating to the plurality of motorcycles 3 can be suitably managed by the single instructor device 7.
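  • As a small illustration of this bookkeeping, the sketch below groups incoming records by their vehicle identification information and orders them by their time information; the record layout is an assumption made for the sketch.
```python
# Sketch of grouping records by vehicle identification information and
# ordering them by time information; the record layout is an assumption.
from collections import defaultdict


def group_by_vehicle(records):
    grouped = defaultdict(list)
    for rec in records:
        grouped[rec["vehicle_id"]].append(rec)
    for recs in grouped.values():
        recs.sort(key=lambda r: r["time"])  # align on the shared time stamps
    return dict(grouped)


records = [
    {"vehicle_id": "3a", "time": 10.0, "kind": "position", "x": 1.0, "y": 2.0},
    {"vehicle_id": "3a", "time": 10.0, "kind": "measurement", "yaw_rate": 0.12},
    {"vehicle_id": "3b", "time": 10.0, "kind": "position", "x": 5.0, "y": 1.0},
]
print(group_by_vehicle(records)["3a"])
```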
  • When the detection by the GPS receiver 33 is successful, the vehicle device 5 generates and transmits position data even if the measurement by the vehicle attitude angle sensor 34 and the photographing by the camera 35 are not successful.
  • the instructor device 7 can update the traveling locus of the motorcycle 3 in substantially real time.
  • Since the vehicle device 5 includes the speaker 38, message information can be suitably presented to the student M.
  • the learning support system 1 includes a vehicle device 5 and an instructor device 7. Since the hardware configuration of the vehicle device 5 and the instructor device 7 is the same as that of the first embodiment, the description thereof is omitted.
  • the same components as those in the first embodiment are denoted by the same reference numerals, and detailed description thereof is omitted.
  • FIG. 13 is a block diagram illustrating a functional configuration of the learning support system 1 according to the second embodiment.
  • FIG. 13 shows a functional configuration particularly related to the display of the instructor apparatus 7.
  • the vehicle device 5 calculates an evaluation period and an evaluation result.
  • the vehicle device 5 includes a data storage unit 72, an evaluation period determination unit 73, and an evaluation unit 74 in addition to the data generation unit 71. These units 71, 73, and 74 are functions realized by the CPU 31 executing a program.
  • the data storage unit 72 is realized by the storage unit 32.
  • the data generation unit 71 generates position data, measurement data, and image data.
  • the data storage unit 72 stores position data, measurement data, and image data.
  • the evaluation period determination unit 73 and the evaluation unit 74 perform the same processing as the evaluation period determination unit 52 and the evaluation unit 53 of the first embodiment, respectively.
  • the communication unit 36 transmits position data, measurement data, image data, an evaluation period, and an evaluation result.
  • the instructor device 7 includes an evaluation result storage unit 76.
  • the evaluation result storage unit 76 is realized by the storage unit 42.
  • the evaluation result storage unit 76 accumulates the evaluation period and the evaluation result.
  • the display control unit 55 performs processing using the evaluation period and the evaluation result stored in the evaluation result storage unit 76 in addition to the various data stored in the data storage unit 51.
  • Since the instructor apparatus 7 does not include the evaluation period determination unit 52 and the evaluation unit 53 described in the first embodiment, the processing load of the CPU 41 included in the instructor apparatus 7 can be reduced.
  • the present invention is not limited to the above embodiment, and can be modified as follows.
  • the result display data A may be changed so as to indicate only a specific evaluation position and its evaluation result.
  • the result display data A may indicate an evaluation position and an evaluation result within a predetermined time.
  • the result display data A may indicate an evaluation position and an evaluation result within a predetermined travel distance.
  • FIG. 14 is a block diagram illustrating a functional configuration of the learning support system 1 according to a modified embodiment.
  • the instructor device 7 includes a monitoring condition storage unit 77.
  • the monitoring condition storage unit 77 stores monitoring condition information.
  • the display control unit 55 extracts an evaluation position and an evaluation result that match the monitoring condition information, and generates display data A based on the extracted evaluation position and the evaluation result.
  • the display data A shows the extracted evaluation position and evaluation result, and does not show the evaluation position and evaluation result that were not extracted. Therefore, the instructor can efficiently check the evaluation position and the evaluation result that match the monitoring condition information.
  • the monitoring condition information defines, for example, at least one of monitoring time (for example, 2 minutes) and monitoring distance.
  • When the monitoring condition information defines the monitoring time, the display control unit 55 identifies, as the monitoring period, the period from the reference time back to the time preceding it by the monitoring time. The display control unit 55 extracts the evaluation periods that fall within the monitoring period from the already determined evaluation periods. The display control unit 55 specifies the evaluation position and the evaluation result in each extracted evaluation period. The display control unit 55 generates result display data A indicating the specified evaluation positions and evaluation results.
  • the reference time is, for example, the time indicated by the latest time information included in the position data.
  • the reference time is, for example, a time when the display control unit 55 performs processing for generating the result display data A.
  • When the monitoring condition information defines the monitoring distance, the display control unit 55 specifies, as the monitoring section, the range of the travel locus within which the travel distance of the motorcycle 3 to the reference position is equal to or less than the monitoring distance.
  • the display control unit 55 extracts the evaluation position included in the monitoring section from the already specified evaluation position.
  • the display control unit 55 identifies an evaluation result related to the extracted evaluation position.
  • the display control unit 55 generates result display data A indicating the extracted evaluation position and the specified evaluation result.
  • the reference position is, for example, a position indicated by position information associated with the latest time information.
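  • The two filtering variants could be sketched as follows; the record shapes, the helper names, and the treatment of the travel locus as a list of (x, y) points are assumptions for illustration.
```python
# Illustrative filters for the monitoring time and the monitoring distance.
# Evaluation periods are (start, end) times; evaluation positions and the
# travel locus are (x, y) tuples. These shapes are assumptions.
import math


def within_monitoring_time(evaluation_periods, reference_time, monitoring_time):
    """Keep evaluation periods lying in [reference_time - monitoring_time, reference_time]."""
    earliest = reference_time - monitoring_time
    return [p for p in evaluation_periods if earliest <= p[1] <= reference_time]


def within_monitoring_distance(travel_locus, evaluation_positions, monitoring_distance):
    """Keep evaluation positions inside the trailing section of the travel locus
    whose accumulated travel distance to the reference (latest) position does
    not exceed the monitoring distance."""
    section = {travel_locus[-1]}
    accumulated = 0.0
    for newer, older in zip(travel_locus[:0:-1], travel_locus[-2::-1]):
        accumulated += math.hypot(newer[0] - older[0], newer[1] - older[1])
        if accumulated > monitoring_distance:
            break
        section.add(older)
    return [pos for pos in evaluation_positions if pos in section]


locus = [(0.0, 0.0), (10.0, 0.0), (20.0, 0.0), (30.0, 0.0)]
print(within_monitoring_time([(5.0, 8.0), (50.0, 55.0)], reference_time=60.0, monitoring_time=15.0))
print(within_monitoring_distance(locus, [(10.0, 0.0), (30.0, 0.0)], monitoring_distance=15.0))
```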
  • the vehicle attitude angle sensor 34 is exemplified as the vehicle state measurement unit, but the present invention is not limited thereto.
  • the vehicle state may be measured by various sensors.
  • the vehicle state measuring unit may be a sensor that measures the behavior of the vehicle (for example, the vehicle speed) and the state of the engine 22 (for example, the engine speed).
  • the vehicle state measuring unit may be a sensor that detects an operation (for example, a clutch operation) by the student M.
  • the vehicle device 5 may include a driver state measuring unit that measures the state of the student M (for example, the angle of the head) in addition to the vehicle state measuring unit. A specific example will be described below.
  • FIG. 15 is a side view showing a schematic configuration of the motorcycle 3 according to a modified embodiment.
  • the vehicle device 5 includes a steering angle sensor 81, a front wheel speed sensor 82, a rear wheel speed sensor 83, a front wheel brake pressure sensor 84, a rear wheel brake pressure sensor 85, a front wheel suspension stroke sensor 86, a rear wheel suspension stroke sensor 87, a clutch operation sensor 88, an engine speed sensor 89, a gear position sensor 90, a throttle opening sensor 91, and a head gyro sensor 92.
  • the steering angle sensor 81 measures the steering angle of the handle 16.
  • the front wheel speed sensor 82 measures the wheel speed of the front wheel 17.
  • the rear wheel speed sensor 83 measures the wheel speed of the rear wheel 28.
  • the front wheel brake pressure sensor 84 measures the pressure of the brake fluid for the front wheel 17.
  • the rear wheel brake pressure sensor 85 measures the pressure of the brake fluid for the rear wheel 28.
  • the front wheel suspension stroke sensor 86 measures the amount of expansion / contraction of the front fork 15.
  • the rear wheel suspension stroke sensor 87 measures the amount of expansion and contraction of the shock absorber 94.
  • the shock absorber 94 expands and contracts according to the swing of the swing arm 27 to absorb and attenuate the impact.
  • the clutch operation sensor 88 measures the operation of the clutch lever.
  • the engine speed sensor 89 measures the speed of the engine 22.
  • the gear position sensor 90 detects the shift position.
  • the throttle opening sensor 91 measures the throttle opening.
  • the head gyro sensor 92 measures the angle and angular velocity of the head of the student M.
  • the head gyro sensor 92 is attached to a helmet 95 worn by the student M.
  • Sensors 81 to 91 are examples of a vehicle state measuring unit in the present invention.
  • the sensor 92 is an example of a driver state measurement unit.
  • the vehicle device 5 further includes an external speaker 96 and a control unit 97.
  • the external speaker 96 is attached to the helmet 95.
  • FIG. 16 illustrates the hardware configuration of the vehicle device 5.
  • the control unit 97 includes an interface 98.
  • the interface 98 is electrically connected to the sensors 81 to 92 and the external speaker 96 by wire or wirelessly.
  • the measured values of the sensors 81 to 92 are respectively input to the control unit 97 through the interface 98.
  • the control unit 97 causes the external speaker 96 to output sound via the interface 98.
  • CPU 31 (data generation unit 71) generates measurement data relating to the state of the motorcycle 3 at each time and measurement data relating to the state of the student M at each time.
  • Each measurement data includes a measurement value acquired by each sensor 81 to 92 and time information associated with the measurement value.
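  • One possible shape of a single measurement record built from the sensors 81 to 92 is sketched below; the field names and example values are assumptions, and only the general idea (a shared time stamp plus one value per sensor) follows the description.
```python
# Illustrative layout of one measurement record from the sensors 81-92 of
# FIG. 15/16; field names and units are assumptions made for this sketch.
import time
from dataclasses import dataclass, asdict


@dataclass
class MeasurementRecord:
    time_info: float                 # time stamp shared with position and image data
    steering_angle: float            # sensor 81
    front_wheel_speed: float         # sensor 82
    rear_wheel_speed: float          # sensor 83
    front_brake_pressure: float      # sensor 84
    rear_brake_pressure: float       # sensor 85
    front_suspension_stroke: float   # sensor 86
    rear_suspension_stroke: float    # sensor 87
    clutch_operation: float          # sensor 88
    engine_speed: float              # sensor 89
    gear_position: int               # sensor 90
    throttle_opening: float          # sensor 91
    head_angle: float                # sensor 92 (driver state)


record = MeasurementRecord(time.time(), 2.0, 8.3, 8.1, 0.4, 0.2,
                           12.0, 9.5, 0.0, 3200.0, 2, 35.0, 5.0)
print(asdict(record))
```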
  • the communication unit 36 transmits measurement data together with position data.
  • the vehicular device 5 configured in this manner gives various measurement data to the instructor device 7.
  • the display control unit 55 generates detailed display data E based on various measurement data, and the display 44 displays the detailed display data E including various measurement data. Therefore, the instructor can confirm the vehicle state more accurately.
  • FIG. 17 is a diagram schematically showing a display example on the display 44.
  • the graph Eg shows the roll angle, the throttle opening, the steering angle, and the amount of expansion / contraction of the front fork 15.
  • Table Et shows the throttle opening, the vehicle speed, the roll angle, the steering angle, and the pressure of the brake fluid for the front wheels 17.
  • FIG. 17 shows the amount of expansion / contraction of the front fork 15 as “Fr stroke”, and the pressure of the brake fluid for the front wheel 17 as “Fr brake”.
  • the display control unit 55 changes the thickness of the lines b1 and b2 indicating the evaluation position according to the evaluation result, but is not limited thereto.
  • the display control unit 55 may change the manner in which the evaluation position is displayed according to the evaluation result.
  • the display control unit 55 may highlight the evaluation position according to the evaluation result.
  • the display control unit 55 may change at least one of the color, shading, and line type of the lines b1 and b2 according to the evaluation result.
  • the line type is, for example, a solid line, a dotted line, a one-dot chain line, or the like.
  • the display control unit 55 indicates the evaluation position by the lines b1 and b2, but is not limited thereto.
  • the evaluation position may be indicated by characters (including numbers).
  • the result display data A can indicate the evaluation position and the evaluation result in association with each other.
  • FIG. 18 is a diagram schematically illustrating a display example of the display 44.
  • the result display data A includes a table f in which the evaluation position information d and the evaluation result e are associated with each other.
  • the evaluation position information d is information (for example, a number) for identifying the evaluation position.
  • the legend part B includes a description regarding the evaluation position indicated by the evaluation position information d.
  • the display control unit 55 specifies the evaluation position based on the position data and the evaluation period, and then generates the table f by associating the evaluation position information d for identifying each evaluation position with the evaluation result e at that evaluation position.
  • In the above embodiments, the evaluation unit 53 (comprehensive evaluation unit 64) represents the evaluation result by two categories ("good evaluation result" and "bad evaluation result"), but is not limited thereto.
  • the evaluation unit 53 may represent the evaluation result by three or more categories.
  • the evaluation unit 53 may determine the total characteristic score U as an evaluation result.
  • the evaluation unit 53 may determine the vehicle stability characteristic score Sv and the turning characteristic score Tv as evaluation results.
  • the display control unit 55 may generate the result display data A as illustrated below.
  • FIG. 19 is a diagram schematically illustrating a display example of the display 44.
  • the result display data A includes a graph g.
  • the graph g includes two-dimensional coordinates.
  • the horizontal axis of the two-dimensional coordinates represents the vehicle stability characteristic score Sv, and the vertical axis represents the turning characteristic score Tv.
  • a mark h is plotted on the two-dimensional coordinates.
  • the mark h includes information (for example, a number) for identifying the evaluation position.
  • The coordinates of the mark h show the vehicle stability characteristic score Sv and the turning characteristic score Tv at the evaluation position indicated by the mark h.
  • the legend part B includes a description regarding the evaluation position indicated by the information included in the mark h.
  • the result display data A can indicate the evaluation position in association with two kinds of evaluation results at the evaluation position.
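  • A plotting sketch of such a graph g might look like the following; matplotlib and the made-up score values are assumptions used only to illustrate placing a numbered mark h at the coordinates (Sv, Tv) of each evaluation position.
```python
# Sketch of the graph g of FIG. 19: each evaluation position is a numbered
# mark whose coordinates are its scores (Sv, Tv). Score values are invented.
import matplotlib.pyplot as plt

evaluations = {1: (0.8, 0.6), 2: (0.4, 0.9), 3: (0.2, 0.3)}  # position id -> (Sv, Tv)

fig, ax = plt.subplots()
for position_id, (sv, tv) in evaluations.items():
    ax.scatter(sv, tv)
    ax.annotate(str(position_id), (sv, tv))  # the mark h carries the position id
ax.set_xlabel("vehicle stability characteristic score Sv")
ax.set_ylabel("turning characteristic score Tv")
plt.show()
```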
  • the result display data A includes the travel locus, but is not limited thereto. That is, the travel locus may be omitted from the result display data A.
  • the result display data A includes the route map a, but is not limited thereto. That is, the route map a may be omitted from the result display data A.
  • the touch panel 45, the microphone 46, and the fixed key 47 are illustrated as input devices of the instructor apparatus 7.
  • the present invention is not limited thereto.
  • the instructor apparatus 7 may include various input devices such as a mouse and a keyboard.
  • the microphone 46 that receives the instructor's utterance functions as a message information input unit, but is not limited thereto.
  • an input device other than the microphone 46 (for example, the touch panel 45 or the fixed key 47) may function as the message information input unit.
  • In the above embodiments, the message information is audio information, but the message information may be changed to image format information or text format information.
  • the touch panel 45 functions as a switching command input unit, but is not limited thereto.
  • the microphone 46 and the fixed key 47 may function as a switching command input unit.
  • the detailed display data E indicates the evaluation period measurement data and the evaluation period image data, but is not limited thereto.
  • the detailed display data E may indicate either evaluation period measurement data or evaluation period image data.
  • the detailed display data E includes the graph Eg, the table Et, and the moving image Em, but is not limited thereto. At least one of the graph Eg, the table Et, and the moving image Em may be omitted.
  • the evaluation period image data is indicated by the moving image Em, but is not limited thereto.
  • the evaluation period image data may be indicated by a still image.
  • the display control unit 55 may generate a still image using image information captured at a time representing the evaluation period.
  • the time representative of the evaluation period is, for example, the start time, the end time, or the temporal midpoint of the evaluation period.
  • the speaker 38 outputs message information, but the present invention is not limited to this.
  • the display 37 may present message information.
  • the external speaker 96 illustrated in FIGS. 15 and 16 may present message information.
  • the interface 98 is connected to the various sensors 81 to 92, but is not limited thereto.
  • the interface 98 may be connected to an ECU (Electronic Control Unit) included in the motorcycle 3.
  • According to this, measurement data possessed by the ECU can be used.
  • the vehicle device 5 may be provided separately from the ECU, or the vehicle device 5 may be realized by the ECU.
  • the time information included in the position data, measurement data, and image data is the time information detected by the GPS receiver 33, but is not limited thereto.
  • the time information included in the position data, measurement data, and image data may be changed to time information detected by a real time clock (not shown).
  • In this case, the vehicle device 5 further includes a real-time clock for detecting time, and the CPU 31 may generate the position data, measurement data, and image data by associating the time information obtained from the real-time clock with the position information, measurement values, and image information.
  • the real time clock is an example of a time detection unit in the present invention.
  • the position data, the measurement data, and the image data include the vehicle identification information, but the present invention is not limited to this.
  • each piece of data may be changed to include student ID information for identifying the students Ma, Mb, and Mc.
  • According to this, the position data, measurement data, and image data can be associated with one another by common student ID information.
  • In the above embodiments, the storage unit 32 of the vehicle device 5 stores the vehicle identification information in advance, but is not limited thereto.
  • the vehicle device 5 may include an input unit that receives at least one of vehicle identification information and student ID information by the operation of the student M.
  • the camera 35 images the front of the motorcycle 3, but the present invention is not limited to this.
  • the camera 35 may photograph the student M.
  • the camera 35 may be installed so that the back, face, or chest of the student M can be photographed.
  • the evaluation period determination unit 52 determines the evaluation period based on the measurement data, but is not limited thereto.
  • the evaluation period determination unit 52 may determine the evaluation period based on the position data.
  • the evaluation period determination unit 52 may determine the evaluation period based on the measurement data and the position data.
  • In the above embodiments, the evaluation period determination unit 52 uses the yaw rate as the measurement data, but is not limited thereto.
  • the evaluation period determination unit 52 may use a steering angle as measurement data.
  • In the above embodiments, the evaluation unit 53 acquires the evaluation result based on the evaluation period, measurement data, and position data, but is not limited thereto.
  • the evaluation unit 53 may acquire the evaluation result based on the evaluation period and the measurement data.
  • the evaluation unit 53 (turning characteristic evaluation unit 63) may calculate the average vehicle speed Tspeed based on at least one of the wheel speed of the front wheel 17 and the wheel speed of the rear wheel 28. According to this, the evaluation unit 53 can acquire an evaluation result without using position data.
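  • A minimal sketch of that variant, assuming wheel-speed samples of the form (time, front wheel speed, rear wheel speed), is shown below.
```python
# Sketch of computing the average vehicle speed Tspeed over the evaluation
# period from wheel-speed samples instead of position data; the sample format
# (time, front wheel speed, rear wheel speed) is an assumption.
def average_vehicle_speed(wheel_speed_samples, period):
    start, end = period
    in_period = [(front + rear) / 2.0
                 for t, front, rear in wheel_speed_samples if start <= t <= end]
    return sum(in_period) / len(in_period) if in_period else 0.0


samples = [(0.0, 10.0, 10.2), (0.5, 11.0, 11.1), (1.0, 12.0, 12.3)]
print(average_vehicle_speed(samples, (0.0, 1.0)))  # approximately 11.1
```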
  • the evaluation period determination unit 52 uses the turning movement period that satisfies a predetermined condition as the evaluation period, but is not limited thereto.
  • the evaluation period can be set as appropriate.
  • For example, the evaluation period may be a period in which the motorcycle 3 is traveling straight, a period in which the motorcycle 3 starts from a stopped state, or a period in which the motorcycle 3 comes to a stop while traveling.
  • In the above embodiments, the evaluation unit 53 acquires the evaluation result based on the measurement data regarding the state of the motorcycle 3 (hereinafter referred to as "vehicle state measurement data") and the evaluation period, but is not limited thereto.
  • the evaluation unit 53 may acquire an evaluation result based on measurement data related to the state of the student M (hereinafter referred to as “driver state measurement data”), vehicle state measurement data, and an evaluation period.
  • The driver state measurement data is, for example, the posture angle of the head or the eye movement.
  • the posture angle of the head is, for example, a head angle or a head angular velocity.
  • the eye movement is, for example, the rotation angle of the eyeball or the rotation angular velocity of the eyeball.
  • the posture angle of the head is generated based on the detection result of the head gyro sensor 92 illustrated in FIGS. 15 and 16.
  • the eye movement is generated based on the detection result of the eye movement sensor or the eye camera.
  • a specific processing example of the evaluation unit 53 will be described.
  • the evaluation unit 53 (specifically, the component separation unit 61) extracts a high-frequency component and a low-frequency component of the head angular velocity during the evaluation period from the head angular velocity using a filter.
  • the evaluation unit 53 calculates, as the head stability characteristic score SH, the ratio between the integrated value of the absolute value of the low-frequency component of the head angular velocity and the integrated value of the absolute value of the high-frequency component of the head angular velocity in the evaluation period.
  • The evaluation unit 53 (specifically, the comprehensive evaluation unit 64) determines the evaluation result based on the head stability characteristic score SH and the turning characteristic score Tv.
  • The filter used for the above-described processing, the function for calculating the head stability characteristic score SH, the function for determining the evaluation result, the threshold value, and the like can be set as appropriate.
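  • Assuming a simple moving-average low-pass filter (the actual filter, as noted, can be chosen as appropriate), the computation of the head stability characteristic score SH could be sketched as follows.
```python
# Sketch of the head stability characteristic score SH: split the head angular
# velocity over the evaluation period into low- and high-frequency components
# and take the ratio of the integrated absolute values. The moving-average
# filter and the window length are illustrative assumptions.
import math


def head_stability_score(head_angular_velocity, dt=0.01, window=25):
    n = len(head_angular_velocity)
    low = []
    for i in range(n):  # simple moving-average low-pass filter (assumed)
        lo, hi = max(0, i - window), min(n, i + window + 1)
        low.append(sum(head_angular_velocity[lo:hi]) / (hi - lo))
    high = [v - l for v, l in zip(head_angular_velocity, low)]
    integral_low = sum(abs(v) for v in low) * dt
    integral_high = sum(abs(v) for v in high) * dt
    return integral_low / integral_high if integral_high else float("inf")


# A slow, deliberate head turn (low-frequency motion with little jitter)
# yields a large SH.
samples = [math.sin(2 * math.pi * 0.2 * k * 0.01) for k in range(500)]
print(round(head_stability_score(samples), 1))
```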
  • In the above embodiments, the turning characteristic evaluation unit 63 uses the vehicle angle and the average vehicle speed to calculate the turning characteristic score Tv, but is not limited thereto.
  • the turning characteristic evaluation unit 63 may use various measurement data.
  • the turning characteristic evaluation unit 63 may use at least one of a vehicle angle, an average vehicle speed, a steering angle, and a caster angle.
  • In the above embodiments, the vehicle stability characteristic evaluation unit 62 uses the vehicle angular velocity to calculate the vehicle stability characteristic score Sv, but is not limited thereto.
  • the vehicle stability characteristic evaluation unit 62 may use at least one of a vehicle angle, a vehicle angular velocity, an average vehicle speed, a steering angle, and a caster angle.
  • the vehicle stability characteristic evaluation unit 62 may acquire the vehicle stability characteristic score Sv based only on the roll rate.
  • the learning support program described in each of the above embodiments may be recorded on a computer-readable recording medium.
  • the recording medium is, for example, a CD-ROM or DVD-ROM.
  • the learning support program may be stored in the server so that it can be transmitted (downloaded) via the network.
  • the learning support system 1 includes a plurality (three) of the vehicle devices 5, but is not limited thereto. That is, the learning support system 1 may be changed to include a single vehicle device 5.
  • the motorcycle 3 is exemplified as the vehicle, but is not limited thereto.
  • For example, the vehicle may be changed to a three-wheeled vehicle or a four-wheeled vehicle. The learning support system of each embodiment can be suitably applied to such vehicles as well.
  • the evaluation unit 53 may perform an evaluation process using a first index indicating the degree of smoothness of the steering operation and a second index based on the vehicle speed.
  • the first index is acquired based on measurement data such as a steering angle.
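  • The following sketch shows one hypothetical choice of such indices: a first index that grows as the steering operation becomes smoother (here, the inverse of the mean absolute change rate of the steering angle) and a second index that is simply the mean vehicle speed. The concrete formulas are assumptions, since the description only states which measurement data the indices are based on.
```python
# Hypothetical first index (smoothness of the steering operation) and second
# index (based on the vehicle speed); the formulas are illustrative only.
def steering_smoothness_index(steering_angles, dt=0.1):
    # Mean absolute rate of change of the steering angle; smoother steering
    # means a smaller rate and therefore an index closer to 1.
    changes = [abs(b - a) / dt for a, b in zip(steering_angles, steering_angles[1:])]
    mean_change = sum(changes) / len(changes) if changes else 0.0
    return 1.0 / (1.0 + mean_change)


def speed_index(vehicle_speeds):
    # Second index: here simply the mean vehicle speed over the period.
    return sum(vehicle_speeds) / len(vehicle_speeds) if vehicle_speeds else 0.0


print(steering_smoothness_index([0.0, 0.5, 1.0, 1.5]))   # gradual steering -> larger index
print(steering_smoothness_index([0.0, 5.0, -5.0, 5.0]))  # jerky steering -> smaller index
print(speed_index([8.0, 9.0, 10.0]))                     # 9.0
```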
  • the learning support program may cause the CPU 31 to execute various processes in cooperation with the operating system.
  • 77 ... Monitoring condition storage unit; M, Ma, Mb, Mc ... Students (drivers)

Abstract

The invention relates to an instructor device (7) comprising: a receiving unit (50) that receives position data concerning the position of a motorcycle at respective times; a display control unit (55) that, on the basis of the position data and of evaluation results relating to motorcycle driving skill during an evaluation period, specifies an evaluation position, which is a position reached by the motorcycle (3) during the evaluation period, and generates result display data (A) in which the evaluation results and the evaluation position are associated with each other; and a display (44) that displays the result display data (A).
PCT/JP2014/075527 2013-12-06 2014-09-25 Dispositif à l'usage d'un instructeur, système support d'instruction, programme de support d'instruction, support d'enregistrement et procédé de support d'instruction WO2015083420A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2015551414A JP6247704B2 (ja) 2013-12-06 2014-09-25 教官用装置、教習支援システム、教習支援プログラム、記録媒体および教習支援方法

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2013253383 2013-12-06
JP2013-253383 2013-12-06

Publications (1)

Publication Number Publication Date
WO2015083420A1 true WO2015083420A1 (fr) 2015-06-11

Family

ID=53273197

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/075527 WO2015083420A1 (fr) 2013-12-06 2014-09-25 Dispositif à l'usage d'un instructeur, système support d'instruction, programme de support d'instruction, support d'enregistrement et procédé de support d'instruction

Country Status (3)

Country Link
JP (1) JP6247704B2 (fr)
TW (1) TW201523550A (fr)
WO (1) WO2015083420A1 (fr)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107767726A (zh) * 2017-10-18 2018-03-06 孙顺东 一种驾校原地打舵练习用车轮托放装置
JP2019045693A (ja) * 2017-09-01 2019-03-22 株式会社セガゲームス 運転シミュレータ装置、運転シミュレート方法、及びそのプログラム
JPWO2018189841A1 (ja) * 2017-04-12 2019-11-07 川崎重工業株式会社 車両の会話情報出力装置、及び会話情報出力方法
US20220169253A1 (en) * 2020-11-30 2022-06-02 Subaru Corporation Vehicle travel locus transmission system and vehicle traffic control system
WO2023112092A1 (fr) * 2021-12-13 2023-06-22 ヤマハ発動機株式会社 Dispositif de traitement de données de véhicule à selle et procédé de traitement de données de véhicule à selle

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020100246A1 (fr) * 2018-11-15 2020-05-22 ヤマハ発動機株式会社 Dispositif et procédé de traitement de données de déplacement de véhicule à selle
WO2023281670A1 (fr) * 2021-07-07 2023-01-12 Yamaha Hatsudoki Kabushiki Kaisha Dispositif de support d'apprentissage pour véhicule à inclinaison émettant des données d'évaluation de conducteur de véhicule à inclinaison

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS59201083A (ja) * 1983-04-29 1984-11-14 株式会社デンソー 自動車運転練習システム
JPH06332370A (ja) * 1993-05-26 1994-12-02 Shinkai Kosan Kk 自動車教習所における映像無線教習装置
JP2009025733A (ja) * 2007-07-23 2009-02-05 Honda Motor Co Ltd 走行データ処理装置
JP2010072573A (ja) * 2008-09-22 2010-04-02 Toyota Motor Corp 運転評価装置
JP2011118601A (ja) * 2009-12-02 2011-06-16 Advanced Telecommunication Research Institute International 交通ハザードマップ生成装置

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8265342B2 (en) * 2009-04-23 2012-09-11 International Business Machines Corporation Real-time annotation of images in a human assistive environment

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS59201083A (ja) * 1983-04-29 1984-11-14 株式会社デンソー 自動車運転練習システム
JPH06332370A (ja) * 1993-05-26 1994-12-02 Shinkai Kosan Kk 自動車教習所における映像無線教習装置
JP2009025733A (ja) * 2007-07-23 2009-02-05 Honda Motor Co Ltd 走行データ処理装置
JP2010072573A (ja) * 2008-09-22 2010-04-02 Toyota Motor Corp 運転評価装置
JP2011118601A (ja) * 2009-12-02 2011-06-16 Advanced Telecommunication Research Institute International 交通ハザードマップ生成装置

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPWO2018189841A1 (ja) * 2017-04-12 2019-11-07 川崎重工業株式会社 車両の会話情報出力装置、及び会話情報出力方法
JP2019045693A (ja) * 2017-09-01 2019-03-22 株式会社セガゲームス 運転シミュレータ装置、運転シミュレート方法、及びそのプログラム
JP7006034B2 (ja) 2017-09-01 2022-01-24 株式会社セガ 運転シミュレータ装置、運転シミュレート方法、及びそのプログラム
CN107767726A (zh) * 2017-10-18 2018-03-06 孙顺东 一种驾校原地打舵练习用车轮托放装置
CN107767726B (zh) * 2017-10-18 2019-09-24 磐安县易驰机械厂 一种驾校原地打舵练习用车轮托放装置
US20220169253A1 (en) * 2020-11-30 2022-06-02 Subaru Corporation Vehicle travel locus transmission system and vehicle traffic control system
US11845436B2 (en) * 2020-11-30 2023-12-19 Subaru Corporation Vehicle travel locus transmission system and vehicle traffic control system
WO2023112092A1 (fr) * 2021-12-13 2023-06-22 ヤマハ発動機株式会社 Dispositif de traitement de données de véhicule à selle et procédé de traitement de données de véhicule à selle

Also Published As

Publication number Publication date
JP6247704B2 (ja) 2017-12-13
TW201523550A (zh) 2015-06-16
JPWO2015083420A1 (ja) 2017-03-16

Similar Documents

Publication Publication Date Title
JP6247704B2 (ja) 教官用装置、教習支援システム、教習支援プログラム、記録媒体および教習支援方法
US8364389B2 (en) Systems and methods for integrating a portable electronic device with a bicycle
JP4650028B2 (ja) 運転評価装置および運転評価システム
JP2021113046A (ja) 車体運動および乗員体験を制御するための方法およびシステム
JP7203035B2 (ja) 情報処理装置および情報処理方法
JP6002405B2 (ja) 車両用端末のためのユーザーインターフェース方法、装置、及びこれを具備する車両
JP5894131B2 (ja) 評価プログラム、記録媒体、評価方法、評価装置および車両
JP6708785B2 (ja) 走行経路提供システムおよびその制御方法、並びにプログラム
JP6218618B2 (ja) 運転支援装置、運転支援方法及び運転支援プログラム
JP6341735B2 (ja) 教習支援装置、教習支援プログラムおよび教習支援方法
WO2012077234A1 (fr) Système de collecte d'informations à l'usage d'un véhicule
JP4421668B2 (ja) 撮影制御装置、撮影制御方法、撮影制御プログラム、および記録媒体
WO2013099246A1 (fr) Dispositif de présentation d'informations de compétences de conduite
CN110023141A (zh) 用于在车辆转弯时调整虚拟相机的朝向的方法和系统
JP6086515B1 (ja) 運転技量評価装置、サーバ装置、運転技量評価システム、プログラムおよび運転技量評価方法
JP6996969B2 (ja) 運転支援装置、及び運転支援方法
CN112141010B (zh) 一种控制方法、装置、电子设备及存储介质
JP5042711B2 (ja) 運行管理装置
JP5090891B2 (ja) 安全運転教示システム
WO2015033446A1 (fr) Système d'aide à la course, et dispositif de visiocasque utilisé dans ce dernier
CN107042829A (zh) 车队跟随监控方法、装置及系统
US20210179131A1 (en) Driver assistance device, non-transitory storage medium storing driver assistance program, and driver assistance system
JP4076349B2 (ja) 端末装置、及び管理機関装置
JP2020130502A (ja) 情報処理装置および情報処理方法
JP2019121314A (ja) 判定装置、情報記録装置、判定方法、及び判定用プログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14868625

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2015551414

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14868625

Country of ref document: EP

Kind code of ref document: A1