WO2021229671A1 - Travel assistance device and travel assistance method - Google Patents

Travel assistance device and travel assistance method

Info

Publication number
WO2021229671A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
driving
moving body
support
support device
Prior art date
Application number
PCT/JP2020/018910
Other languages
French (fr)
Japanese (ja)
Inventor
悠司 濱田
佳明 安達
崇成 竹原
Original Assignee
三菱電機株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 三菱電機株式会社
Priority to JP2020552062A
Priority to PCT/JP2020/018910
Publication of WO2021229671A1

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00: Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02: Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00: Traffic control systems for road vehicles
    • G08G1/16: Anti-collision systems

Definitions

  • The present disclosure relates to a driving support device for an automobile, and particularly to a driving support device with a reduced processing load.
  • Patent Document 1 describes a technique for switching between executing automatic driving, not executing it, and changing its automation level according to the traffic volume of automatic driving vehicles.
  • In Patent Document 1, the execution, non-execution, or automation level of automatic driving is switched according to the traffic volume of automatic driving; however, because this depends on the degree of congestion of automatic driving vehicles, a self-driving vehicle must still make its own judgments and drive even in situations where automatic driving is difficult. In addition, since the determination of automatic driving uses all information that the own vehicle can acquire itself or via communication, the processing load may increase.
  • The present disclosure has been made to solve the above problems, and an object of the present disclosure is to provide a driving support device with a reduced processing load for realizing driving support.
  • The present disclosure relates to a travel support device that supports the travel of a mobile body. Based on moving body information, including at least the position, speed, and orientation of the moving body acquired by a moving body sensor mounted on the moving body, and sensing information, including at least the positions and speeds of obstacles around the moving body acquired by a peripheral recognition sensor mounted on the moving body, the device determines a running difficulty level, indicating the difficulty of running the moving body, in at least three stages, and sets the support information used for the running support of the moving body according to the determined running difficulty level.
  • Since the support information used for the driving support of the moving body is set according to the driving difficulty level, the information to be processed can be reduced in situations where the driving difficulty is low, reducing the processing load while ensuring safety and comfort; and in situations where the driving difficulty is high, the information to be processed can be increased, realizing efficient processing while ensuring safety and comfort.
  • FIG. 1 is a diagram illustrating the configuration of the driving support system according to Embodiment 1.
  • FIG. 2 is a block diagram showing the configuration of the server according to Embodiment 1.
  • FIG. 3 is a diagram showing examples of support targets and support information for each support level. FIG. 4 is a diagram showing an example of the support level management table. FIG. 5 is a diagram showing an example of the state transitions of the support level. FIG. 6 is a diagram showing an example of managing support-level switching with a tree structure.
  • FIG. 7 is a block diagram showing the configuration of the moving body according to Embodiment 1.
  • A flowchart showing the overall processing of the server according to Embodiment 1.
  • A diagram showing an example of the support level determination conditions in the server according to Embodiment 1.
  • Diagrams explaining examples of the determination of the driving difficulty level and the determination of the support level.
  • Flowcharts showing the transmission process and the reception process of the moving body according to Embodiment 1.
  • A diagram showing an application example of group determination in the server according to Embodiment 3.
  • A flowchart showing the overall processing of the server according to Embodiment 3.
  • A block diagram showing the configuration of the server according to Embodiment 5.
  • FIG. 1 is a diagram illustrating the configuration of a traveling support system 1000 according to the first embodiment. Note that, in the other drawings, configurations that are the same as or correspond to those in FIG. 1 are given the same reference numerals, and duplicate descriptions are omitted.
  • The configuration of the travel support system 1000 is also common to Embodiments 2 to 5, in which the systems are referred to as travel support systems 2000 to 5000, respectively.
  • the travel support system 1000 shown in FIG. 1 includes a server 101, a roadside communication device 102, a mobile body 103, and a roadside sensor 104.
  • The travel support system 1000 is composed of one or more units; the server 101 is connected to a plurality of roadside communication devices 102, and each roadside communication device 102 is connected to a plurality of mobile bodies 103 and a plurality of roadside sensors 104.
  • the moving body 103 will be described assuming a vehicle.
  • the server 101 and the roadside communication device 102 may be connected via the Internet network.
  • The server 101 may be integrated with the roadside communication device 102 and the roadside sensor 104. Further, in the first embodiment, the server 101 is provided separately from the mobile body 103 and is described as a travel support device that supports the travel of the mobile body 103, but the server 101 may instead be mounted on one of the mobile bodies 103.
  • By providing the server 101 separately from the mobile body 103, there are few restrictions on the size of the server 101, making it easy to increase its processing capacity.
  • FIG. 2 is a block diagram showing the configuration of the server 101 of the driving support system 1000 according to the first embodiment.
  • The server 101 includes a processor 100, a communication interface 10, a communication device 30, and a storage device 40. The server 101 is a computer server realized as an edge server, a cloud server, or the like, and may therefore also be referred to as an edge server or a cloud server.
  • the processor 100 communicates with the communication device 30 via the communication interface 10 and also acquires information such as map information from the storage device 40.
  • The processor 100 is composed of an IC (Integrated Circuit) that executes instructions described in a program to perform processes such as data transfer, calculation, processing, control, and management.
  • the processor 100 has an arithmetic circuit, a register for storing instructions and information, and a cache memory.
  • a processor such as a CPU (Central Processing Unit), a DSP (Digital Signal Processor), or a GPU (Graphics Processing Unit) is applied to the processor 100.
  • The storage device 40 stores the support level management table 5 and the map data 6, which will be described later.
  • For the storage device 40, a non-volatile or volatile semiconductor memory such as a RAM (Random Access Memory), ROM (Read Only Memory), flash memory, EPROM (Erasable Programmable Read Only Memory), EEPROM (Electrically Erasable Programmable Read Only Memory), SD (Secure Digital: registered trademark) memory card, CF (CompactFlash) memory, or NAND flash memory can be applied.
  • Alternatively, an HDD (Hard Disk Drive), an SSD (Solid State Drive), or a portable storage medium such as a magnetic disk, flexible disk, Blu-ray (registered trademark) disc, DVD (Digital Versatile Disc), optical disc, compact disc, or mini disc may be applied.
  • the communication device 30 is a device including a receiver for receiving data from the mobile body 103 or the roadside sensor 104 via the roadside communication device 102 and a transmitter for transmitting the data.
  • For example, a communication chip or a NIC (Network Interface Card) can be applied to the communication interface 10.
  • the communication interface 10 can use DSRC (Dedicated Short Range Communication) dedicated to vehicle communication and a communication protocol such as IEEE802.11p.
  • the communication interface 10 may use a communication line such as LTE (Long Term Evolution: registered trademark) or a 5th generation mobile communication system (5G).
  • the communication interface 10 may use a wireless LAN such as Bluetooth (registered trademark) or IEEE802.11a / b / g / n / ac.
  • The peripheral situation recognition unit 1 acquires, via the communication interface 10, moving body information such as the position, speed, direction, traveling route, driver state, driver viewpoint, number of occupants, and occupant state detected by the moving body sensor mounted on the moving body 103. It also acquires sensing information, such as the position, speed, attributes, detection accuracy, video, and point cloud of obstacles around the moving body 103 detected by the peripheral recognition sensor mounted on the moving body 103, as well as travel path information, composed of a travel locus and speed, that represents the action plan generated for the moving body 103 to drive automatically.
  • Further, the peripheral situation recognition unit 1 acquires sensing information, such as the position, speed, attributes, and detection accuracy of obstacles around the roadside sensor 104 detected by the roadside sensor 104, and roadside unit information such as the position and installation angle of the roadside sensor 104.
  • the roadside unit equipped with the roadside sensor 104 is also treated as a moving body, and the sensing information is integrated.
  • The traveling difficulty determination unit 2 aggregates the information on each moving body 103 based on the moving body information, sensing information, and travel path information acquired by the peripheral situation recognition unit 1, and integrates it with map information obtained from the map data 6 stored in the storage device 40. Further, the traveling difficulty determination unit 2 analyzes the traveling situation and the traveling scenario in consideration of the surroundings of each moving body based on the moving body information and sensing information, and calculates the risk of the surrounding situation, including the possibility of contact with surrounding moving bodies, based on a conflict index between moving bodies. It also analyzes the traffic condition from the number of moving bodies 103, their relative distances, relative speeds, and the like.
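The conflict index between moving bodies is not specified further in the text; a common choice for such an index is time-to-collision (TTC) computed from the relative distances and speeds mentioned above. The following sketch uses that assumption, and the function names and normalization threshold are illustrative, not part of the disclosure.

```python
# Hypothetical sketch: TTC as a conflict index, and a normalized risk of
# the surrounding situation taken over all pairs of moving bodies.

def time_to_collision(rel_distance_m: float, rel_speed_mps: float) -> float:
    """TTC between two moving bodies; infinite if the gap is not closing."""
    if rel_speed_mps <= 0.0:  # not approaching each other
        return float("inf")
    return rel_distance_m / rel_speed_mps

def surrounding_risk(pairs: list[tuple[float, float]],
                     ttc_threshold_s: float = 4.0) -> float:
    """Aggregate risk in [0, 1]: the worst (smallest) TTC over all
    (distance, closing speed) pairs, normalized against a threshold
    below which contact is considered likely (assumed value)."""
    worst_ttc = min((time_to_collision(d, v) for d, v in pairs),
                    default=float("inf"))
    if worst_ttc == float("inf"):
        return 0.0
    return min(1.0, ttc_threshold_s / worst_ttc)
```

For example, a pair of vehicles closing a 40 m gap at 10 m/s has a TTC of 4 s, which this sketch treats as maximal risk.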
  • The driving difficulty level indicates how difficult driving is, whether in manual driving by the driver or in automatic driving by the automatic driving system, and changes depending on the conditions of vehicles existing around the moving body 103, the road environment, the weather, the configuration of the automatic driving system, sensor performance, and the like.
  • the driving difficulty level changes in stages and is defined in at least three stages, and is classified into, for example, easy, normal, and difficult.
  • the driving difficulty level is different from the automatic driving level defined by SAE (Society of Automotive Engineers) International.
  • The automatic driving levels are: level 0, in which the driver performs all driving operations; level 1, in which the system supports either steering or acceleration and deceleration; level 2, in which the system supports both steering and acceleration and deceleration; level 3, in which the system performs all operations in specific places but the driver takes over in an emergency; level 4, in which the system performs all operations in specific places; and level 5, in which the system performs all operations regardless of location.
  • the driving difficulty determination unit 2 can calculate the driving difficulty based on the driving situation, the driving scenario, the risk of the surrounding situation, and the traffic situation.
  • The driving difficulty may be calculated by a rule-based algorithm using a pre-prepared table or conditional expressions, calculated as a cost function of each parameter, calculated based on a probability model, or calculated using artificial intelligence techniques such as machine learning or decision trees.
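As one of the options listed above, the difficulty could be computed as a cost function of the analyzed parameters. The sketch below is a hypothetical weighted sum; the choice of inputs, the weights, and the stage thresholds are all illustrative assumptions, not values from the disclosure.

```python
# Hypothetical weighted-cost sketch of the driving-difficulty calculation.

def difficulty_cost(num_vehicles: int, surrounding_risk: float,
                    scenario_complexity: float) -> float:
    """Combine traffic volume, surrounding risk (0..1), and scenario
    complexity (0..1) into one cost; larger means harder to drive."""
    W_TRAFFIC, W_RISK, W_SCENARIO = 0.5, 20.0, 10.0  # assumed weights
    return (W_TRAFFIC * num_vehicles
            + W_RISK * surrounding_risk
            + W_SCENARIO * scenario_complexity)

def difficulty_stage(cost: float) -> str:
    """Classify the cost into the three minimum stages named in the text
    (easy / normal / difficult); thresholds are assumed."""
    if cost < 10.0:
        return "easy"
    if cost < 25.0:
        return "normal"
    return "difficult"
```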
  • the support level determination unit 3 determines the support level based on the driving difficulty calculated by the driving difficulty determination unit 2.
  • The support level determination unit 3 selects support by control control or manual driving as the driving difficulty increases, and support by cooperative automatic driving, which reduces the amount of information to be processed, as the driving difficulty decreases. In addition, if the driving difficulty is extremely high, or if the moving body supports only manual driving, it determines that support is provided for manual driving.
  • the method of determining the support level from the driving difficulty level can be determined based on the support level management table 5 stored in the storage device 40.
  • the support level determination unit 3 notifies the recommended action generation unit 4 of the determined support level.
  • the support level is different from the automatic driving level defined by SAE International.
  • The support level is broadly classified into four stages: autonomous automatic driving, in which the moving body 103 travels automatically on its own; cooperative automatic driving, in which the moving body 103 travels automatically in cooperation with other moving bodies 103; control control, in which the server 101 controls automatic driving; and manual driving.
  • Cooperative automatic driving can be further classified into three stages depending on whether vehicle information, sensing information, or travel path information is used for support.
  • Control control can be classified into two stages: recommended action and vehicle control.
  • When the difficulty level is low, the determination can be made by a method with a low processing load. If the number of connected cars and self-driving vehicles increases in the future, the amount of information handled will increase dramatically, so performing only the minimum necessary data processing is effective.
  • The recommended action generation unit 4 generates support information according to the support level, based on the support level determined by the support level determination unit 3 and the moving body information, sensing information, and travel path information acquired by the peripheral situation recognition unit 1, and transmits it to the mobile body 103 via the communication interface 10.
  • the support information includes, for example, support information for manual driving, support information for cooperative automatic driving, and support information for control control.
  • No support information exists for autonomous automatic driving, since the moving body determines its driving on its own; the recommended action generation unit 4 instead notifies the moving body that autonomous automatic driving has been determined.
  • Support information for manual driving includes contact warnings with surrounding moving objects, recommended speeds, recommended lanes, recommended lane change timings, recommended right / left turn timings, and emergency vehicle approach warnings.
  • The support information for manual driving also includes additional information such as the generation point of the warning information and its effective time. Support information for manual driving is information displayed to the driver.
  • A recommended action is the action a moving body should take next, determined by anticipating and considering the actions of the moving bodies with respect to one another; it is an action that makes traffic flow smoothly, such as decelerating or having the other vehicle change lanes.
  • the support information for control control is the vehicle control information itself that controls the vehicle, such as the traveling speed and the traveling path information.
  • By transmitting vehicle control information to the moving body 103 and controlling the moving body 103 with it, control control or remote control of the moving body 103 can be performed.
  • The vehicle control information determines the traveling speed and the traveling lane so that the traffic flow is smooth, and by controlling the surrounding vehicles, lane changes and right and left turns can be realized efficiently. For example, setting the surrounding vehicles to the same speed avoids unnecessary acceleration and deceleration, and adjusting the inter-vehicle distance enables smooth lane changes.
  • Figure 3 shows an example of support targets and support information for support levels.
  • In autonomous automatic driving, since only the in-vehicle sensors of the target moving body 103 are used, there is no support information to be distributed from the server.
  • In cooperative automatic driving that utilizes other-vehicle information, among the moving body information collected by the server, the moving body information of moving bodies 103 existing around the target moving body 103 (information on nearby other vehicles) is distributed as support information.
  • In cooperative automatic driving that utilizes other-vehicle sensing information, the sensing information of moving bodies 103 existing around the target moving body 103 (sensing information of surrounding vehicles) is distributed as support information.
  • In cooperative automatic driving that utilizes other-vehicle path information, among the travel path information collected by the server, the travel path information of moving bodies 103 existing around the target moving body 103 (path information of surrounding vehicles) is distributed as support information.
  • In control control (first control control) that utilizes recommended action information, the server 101 considers the driving scenario of the supported moving body 103 and those of the surrounding moving bodies, and distributes recommended action information, such as the recommended speed, recommended lane, and recommended lane-change timing, as support information.
  • In control control (second control control) that utilizes vehicle control information, the server 101 distributes, as support information, vehicle control information that controls traffic by controlling not only the supported moving body 103 but all surrounding moving bodies as well.
  • In manual driving, the server distributes warning information and recommended action information for the driver of the supported moving body 103 as support information.
  • The support target for control control utilizing vehicle control information is the vehicle control unit of the supported moving body 103, and the support target for manual driving is a display unit visible to the driver of the supported moving body 103.
  • The support target at the other support levels is the automatic driving determination unit of the supported moving body 103.
  • The support level management table 5 is composed of, for example, a table that defines the support level, the driving difficulty level, and the cost, as shown in FIG. 4.
  • the support level can be broadly classified into autonomous automatic driving, cooperative automatic driving, control control, and manual driving, and cooperative automatic driving can be further classified into three stages according to the information to be supported.
  • Control control can be classified into two stages of distributing recommended behavior information and vehicle control information.
  • For cooperative automatic driving, the cases in which other-vehicle information, other-vehicle sensing information, and other-vehicle path information are distributed to the moving body 103 are shown.
  • Driving difficulty levels 1 to 7 are assigned, in order, to autonomous automatic driving, cooperative automatic driving that distributes other-vehicle information, cooperative automatic driving that distributes other-vehicle sensing information, cooperative automatic driving that distributes other-vehicle travel path information, control control that distributes recommended action information, control control that distributes vehicle control information, and manual driving.
  • the cost is prepared in advance according to the driving difficulty level.
  • For autonomous automatic driving at driving difficulty level 1, the cost (c) is less than 5; for cooperative automatic driving at level 2, it is 5 or more and less than 10; for cooperative automatic driving at level 3, 10 or more and less than 15; for cooperative automatic driving at level 4, 15 or more and less than 20; for control control at level 5, 20 or more and less than 25; for control control at level 6, 25 or more and less than 30; and for manual driving at level 7, 30 or more.
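The cost ranges stated for FIG. 4 can be captured directly in code. The sketch below maps a cost value c to the driving difficulty level and its support level using those ranges; the Python structure itself is, of course, only an illustration of the table.

```python
# Sketch of the FIG. 4 support level management table: upper cost bound,
# driving difficulty level, and support level, in ascending order.
SUPPORT_TABLE = [
    (5,  1, "autonomous automatic driving"),
    (10, 2, "cooperative automatic driving (other-vehicle information)"),
    (15, 3, "cooperative automatic driving (other-vehicle sensing information)"),
    (20, 4, "cooperative automatic driving (other-vehicle travel path information)"),
    (25, 5, "control control (recommended action information)"),
    (30, 6, "control control (vehicle control information)"),
]

def support_level(cost: float) -> tuple[int, str]:
    """Return (driving difficulty level, support level) for a cost value."""
    for upper, level, support in SUPPORT_TABLE:
        if cost < upper:
            return level, support
    return 7, "manual driving"  # a cost of 30 or more
```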
  • Although FIG. 4 shows an example in which the driving difficulty level is switched in seven stages, it may be switched in at least three stages (autonomous automatic driving, cooperative automatic driving, and control control), or it may be subdivided further and switched in eight or more stages.
  • FIG. 4 shows an example in which the support level management table is implemented as a table, but the present invention is not limited to this; for example, as shown in FIG. 5, the support level may be switched by a finite state machine.
  • FIG. 5 is a state transition diagram showing switching of support levels in a finite state machine.
  • Arrows indicate an increase in driving difficulty (Up) and a decrease in driving difficulty (Down) between the four states of autonomous automatic driving, cooperative automatic driving, control control, and manual driving.
  • Within cooperative automatic driving, driving difficulty Up and Down transitions are shown between the three states utilizing vehicle information, sensing information, and travel path information.
  • Within control control, driving difficulty Up and Down transitions are shown between the two states of recommended action and vehicle control.
  • Although the state is changed according to changes in the driving difficulty level in FIG. 5, the transitions are not limited to this and may be made based on a specific driving situation or driving scenario.
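The four top-level states of FIG. 5 and their Up/Down events can be sketched as a minimal finite state machine. This sketch assumes stepwise transitions along the chain toward or away from manual driving, and omits the sub-states within cooperative automatic driving and control control.

```python
# Minimal FSM sketch of FIG. 5: "Up" (difficulty increased) moves toward
# manual driving, "Down" (difficulty decreased) toward autonomous driving.
STATES = ["autonomous", "cooperative", "control", "manual"]

def next_support_level(current: str, event: str) -> str:
    """Transition between the four top-level states on an Up/Down event;
    stay in place at either end of the chain."""
    i = STATES.index(current)
    if event == "Up":
        i = min(i + 1, len(STATES) - 1)
    elif event == "Down":
        i = max(i - 1, 0)
    return STATES[i]
```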
  • Switching of support levels may also be managed by a tree structure such as the decision tree shown in FIG. 6. As shown in FIG. 6, from the four states of autonomous automatic driving, cooperative automatic driving, control control, and manual driving, cooperative automatic driving branches into the three states utilizing vehicle information, sensing information, and travel path information, and control control branches into the two states of recommended action and vehicle control, forming a tree-like classification.
  • the support level management table may be defined based on the risk of the surrounding situation and the action plan.
  • the driving difficulty and cost, and the judgment conditions of the risk of the surrounding situation may be updated by machine learning or deep learning.
  • Map data 6 has map information related to the map.
  • Map information is composed of a plurality of layered maps corresponding to predetermined scales, and includes road information, which is information about roads; lane information, which is information about the lanes constituting a road; and constituent line information, which is information about the lines constituting a lane.
  • Road information includes, for example, the road shape, latitude and longitude, curvature, slope, identifier, number of lanes, and line types, as well as road attributes such as general road, highway, and priority road.
  • the lane information includes, for example, the identifier of the lane constituting the road, the latitude, longitude and the center line of the lane.
  • the constituent line information includes the identifier of each line constituting the lane, the latitude and longitude of each line constituting the lane, and the line type and curvature of each line constituting the lane.
  • Road information is managed for each road, and lane information and constituent line information are managed for each lane.
  • Map information is used for navigation, driving support, automatic driving, and the like.
  • Further, the road information may include traffic regulation information (lane regulation, speed regulation, traffic regulation, chain regulation, etc.) that changes with time, tollgate regulation information (entrance/exit, tollgate), traffic congestion information (presence or absence of congestion, section, lane), traffic accident information (stopped vehicle, low-speed vehicle), obstacle information (fallen object, animal), road abnormality information (road damage, road surface abnormality), peripheral vehicle information, and the like.
  • FIG. 7 is a block diagram showing the configuration of the moving body 103 of the traveling support system 1000 according to the first embodiment.
  • The mobile body 103 includes a mobile body system 300 having a processor 200, a communication interface 20, a peripheral recognition interface 21, a vehicle sensor interface 22, and a vehicle control interface 23, as well as a peripheral recognition sensor 31, a moving body sensor 32, a vehicle control ECU 33, and a communication device 34.
  • Although the moving body 103 also includes actuators for automatic driving and the like, configurations not closely related to the embodiment are omitted.
  • The processor 200 communicates with the peripheral recognition sensor 31, the moving body sensor 32, the vehicle control ECU 33, and the communication device 34 via the communication interface 20, the peripheral recognition interface 21, the vehicle sensor interface 22, and the vehicle control interface 23, and acquires information.
  • The processor 200, which is composed of an IC, executes instructions described in a program to perform processes such as data transfer, calculation, processing, control, and management.
  • This realizes the functions represented as functional blocks: the peripheral situation recognition unit 41, the automatic driving determination unit 42, the vehicle control unit 43, and the support level switching unit 44; driving support information is notified to the driver via the display unit 45.
  • The peripheral situation recognition unit 41 has the same function as the peripheral situation recognition unit 1 mounted on the server 101. It acquires the sensing information of obstacles around the moving body 103 detected by the peripheral recognition sensor 31, the moving body information detected by the moving body sensor 32, and the moving body information and sensing information of surrounding moving bodies received by the communication device 34, via the vehicle sensor interface 22, the peripheral recognition interface 21, and the communication interface 20, respectively, and recognizes the travelable area by integrating them with map information managed internally in advance.
  • Based on the moving body information, sensing information, and map information integrated by the peripheral situation recognition unit 41, the automatic driving determination unit 42 determines actions such as lane keeping, lane changes, acceleration, and deceleration so as to drive safely without contacting obstacles or surrounding vehicles. Further, the automatic driving determination unit 42 generates vehicle control information, consisting of the travel path and speed, for the vehicle control unit 43 to control the steering, accelerator, and brake, and notifies the vehicle control unit 43.
  • the vehicle control unit 43 controls the accelerator, brake, and steering of the moving body 103 via the vehicle control interface 23 according to the vehicle control information notified from the automatic driving determination unit 42.
  • The support level switching unit 44 receives support information from the server 101 and determines the type of the received support information: if it is support information for manual driving, the display unit 45 is notified; if it is support information for cooperative automatic driving, the automatic driving determination unit 42 is notified; and if it is vehicle control information for control control, the vehicle control unit 43 is notified.
  • Alternatively, the support information for cooperative automatic driving may be notified to the peripheral situation recognition unit 41 instead of the automatic driving determination unit 42, and after the peripheral situation recognition unit 41 integrates all the moving body information and sensing information, it notifies the automatic driving determination unit 42.
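The dispatch performed by the support level switching unit 44 can be sketched as follows. The type tags and the handler method names (`show`, `update`, `apply`) are illustrative assumptions, not identifiers from the disclosure.

```python
# Hypothetical sketch of the support level switching unit 44: route the
# received support information to the display unit 45 (manual driving),
# the automatic driving determination unit 42 (cooperative driving), or
# the vehicle control unit 43 (vehicle control information).

def dispatch_support_info(info_type: str, payload: dict,
                          display_unit, driving_unit, control_unit) -> None:
    if info_type == "manual":           # warnings, recommended actions
        display_unit.show(payload)
    elif info_type == "cooperative":    # other-vehicle info / sensing / paths
        driving_unit.update(payload)
    elif info_type == "vehicle_control":
        control_unit.apply(payload)
    else:
        raise ValueError(f"unknown support information type: {info_type}")
```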
  • the display unit 45 displays the support information for manual driving notified from the support level switching unit 44, and notifies the driver.
  • as the display unit 45, a display such as a car navigation system, a head-up display, an AR (augmented reality) display, or a cockpit display panel may be used, and voice output may be combined to notify the driver.
  • the display unit 45 may change the display position according to the position of the support information. Further, the display unit 45 may display only when the driver does not recognize the driving support information.
  • the communication interface 20 is a device including a receiver that receives positioning data from peripheral moving bodies and base stations and detects the arrival angle of radio waves and the time required for transmission and reception, and a transmitter that transmits data.
  • a communication chip or NIC can be applied to the communication interface 20.
  • the communication interface 20 can use DSRC (Dedicated Short Range Communications), which is dedicated to vehicle communication, or a communication protocol such as IEEE 802.11p.
  • the communication interface 20 may use a mobile phone network such as LTE (registered trademark) or a 5th generation mobile communication system (5G).
  • the communication interface 20 may use a wireless LAN such as Bluetooth (registered trademark) or IEEE802.11a / b / g / n.
  • the communication device 34 is a communication device compatible with DSRC, LTE, and 5G; it passes received information to the mobile body system 300 in FIG. 7 and also has a function of transmitting information from the mobile body system 300 to the outside.
  • the peripheral recognition interface 21 is an interface for acquiring data from the peripheral recognition sensor 31 mounted on the mobile body 103.
  • Specific examples are sensor data acquisition LSI (Large Scale Integration), USB (Universal Serial Bus), and CAN (Controller Area Network) ports.
  • the peripheral recognition sensor 31 is a sensor capable of positioning, such as a millimeter wave radar, a monocular camera, a stereo camera, LiDAR (Light Detection and Ranging / Laser Imaging Detection and Ranging), sonar, or GPS (Global Positioning System).
  • the peripheral recognition sensor 31 also includes a DMS (Driver Monitoring System) and a drive recorder that monitor the driver inside the mobile body 103.
  • the vehicle sensor interface 22 is a device for connecting a moving body sensor 32 such as a GPS, a speed sensor, an acceleration sensor, and an orientation sensor to the processor 200. Further, the vehicle sensor interface 22 is a device for connecting an in-vehicle device such as an in-vehicle ECU, an EPS (Electric Power Steering), a car navigation system, and a cockpit to the processor 200. As a specific example, the vehicle sensor interface 22 is a sensor ECU (Electronic Control Unit).
  • the vehicle control interface 23 is a device for connecting the processor 200 to the vehicle control ECU 33 that controls the accelerator, brake, and steering.
  • the server 101 or the mobile body 103 of the travel support system may be implemented in a form integrated with or inseparable from the other components shown in the figures, or in a removable or separable form.
  • in FIGS. 2 and 7, only one processor 100 and one processor 200 are shown, respectively. However, there may be a plurality of processors 100 and 200, and the plurality of processors may execute the programs that realize each function in cooperation with each other.
  • the operation of the traveling support system 1000 according to the first embodiment also includes the operation of the travel support device and the travel support method.
  • the peripheral situation recognition unit 1 of the server 101 acquires the moving body information, the sensing information, and the traveling path information from the plurality of moving bodies 103 via the roadside machine, for example, in a cycle of 100 milliseconds (step S101).
  • in step S101, the type of information acquired from the moving body 103, the acquisition cycle, or the acquisition route is changed based on the driving difficulty level determined in step S103 or the support level determined in step S104.
  • when the driving difficulty level or the support level is low, only the moving body information is acquired; when the driving difficulty level or the support level is high, the moving body information, the sensing information, and the traveling path information are acquired.
  • for example, when the driving difficulty level or the support level is low, the information is acquired in a 200-millisecond cycle, and when the driving difficulty level or the support level is high, the information is acquired in a 20-millisecond cycle.
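The switching of acquired information types and acquisition cycles described above can be sketched as follows. The 200 ms / 20 ms values and the information types come from the text; the function and field names are illustrative.

```python
def acquisition_policy(level_is_high: bool) -> dict:
    """Select information types and acquisition cycle from the difficulty/support level."""
    if level_is_high:
        # difficult situation: acquire everything, frequently
        return {"types": ["moving_body", "sensing", "travel_path"], "cycle_ms": 20}
    # easy situation: moving body information only, infrequently
    return {"types": ["moving_body"], "cycle_ms": 200}

print(acquisition_policy(False))  # low level: moving body info only, 200 ms cycle
print(acquisition_policy(True))   # high level: all three types, 20 ms cycle
```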
  • although the server 101 acquires information from the mobile body 103 here, the information may instead be notified from the mobile body 103 to the server 101 after the server 101 notifies the mobile body 103 of the traveling difficulty level.
  • the roadside sensor information may be received from the roadside machine and processed in combination with the information of the moving body 103.
  • when the traveling difficulty level or the support level is low, sensing information such as the relative distance and relative speed primarily processed by the moving body 103 is acquired; when the traveling difficulty level or the support level is high, raw data such as images and point clouds acquired by the peripheral recognition sensor of the moving body 103 are acquired.
  • the driving difficulty determination unit 2 of the server 101 calculates the driving situation, the driving scenario, the risk of the surrounding situation, and the degree of congestion based on the received mobile body information, sensing information, and driving path information (step S102).
  • the driving situation is defined, with respect to a certain vehicle (moving body), by situation information about surrounding vehicles, such as the number and frequency of lane changes of vehicles existing around the vehicle, the relative distance and relative speed to the vehicles in front, behind, left, and right, the acceleration and the amount of change in deceleration of surrounding vehicles, and the degree of overlap of traveling paths, and by road situation information such as weather, accident occurrence sections, and traffic jam sections.
  • the "certain vehicle" here refers to each of the moving bodies 103 managed by the server 101, and the calculation is performed for each vehicle.
  • the driving scenario represents the behavior of the vehicle, such as lane change, right turn, left turn, merging, branching, temporary stop, signal stop, and ordinary driving, and is judged from the route information of the moving body information, the traveling path information, and the map information.
  • the risk of the surrounding situation is evaluated using conflict indices such as TTC (Time-to-Collision), the time until two vehicles come into contact when traveling in the same direction at their current speeds; PET (Post Encroachment Time), the time difference between one vehicle leaving the potential contact point and another vehicle entering it; and DRAC (Deceleration Rate to Avoid a Crash), the deceleration required to avoid contact.
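These conflict indices can be computed as below. TTC and PET follow the definitions in the text; the DRAC formula (relative speed squared over twice the gap) is the standard definition of that index and is assumed here, since the text only names it.

```python
def ttc(gap_m: float, closing_speed_mps: float) -> float:
    """Time-to-Collision: time until contact if both vehicles keep their speed."""
    return float("inf") if closing_speed_mps <= 0 else gap_m / closing_speed_mps

def pet(t_first_leaves_s: float, t_second_enters_s: float) -> float:
    """Post Encroachment Time: gap between one vehicle leaving the potential
    contact point and the other vehicle entering it."""
    return t_second_enters_s - t_first_leaves_s

def drac(gap_m: float, closing_speed_mps: float) -> float:
    """Deceleration Rate to Avoid a Crash (standard formula, assumed here):
    required deceleration of the following vehicle, v_rel^2 / (2 * gap)."""
    return 0.0 if closing_speed_mps <= 0 else closing_speed_mps ** 2 / (2 * gap_m)

print(ttc(30.0, 5.0))   # 6.0 s until contact
print(pet(12.0, 13.5))  # 1.5 s encroachment margin
print(drac(30.0, 5.0))  # ~0.417 m/s^2 required deceleration
```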
  • the degree of congestion is determined based on the number of vehicles passing a certain point within a certain time and the inter-vehicle time specified by the time interval between passing vehicles at that point.
  • the "certain point" here is each point where a vehicle exists and can be any arbitrary place.
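A minimal sketch of estimating congestion at a point from vehicle passing timestamps, assuming the inter-vehicle time is the interval between successive passings (the function name and return shape are illustrative):

```python
def congestion(passing_times_s):
    """Estimate congestion at a point: vehicle count within the observed window
    and mean inter-vehicle (headway) time between successive passings."""
    times = sorted(passing_times_s)
    headways = [b - a for a, b in zip(times, times[1:])]
    mean_headway = sum(headways) / len(headways) if headways else float("inf")
    return len(times), mean_headway

count, headway = congestion([0.0, 2.0, 3.0, 7.0])
print(count, headway)  # 4 vehicles, mean headway ~2.33 s
```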
  • the driving difficulty determination unit 2 of the server 101 determines the driving difficulty based on the driving situation, the driving scenario, the risk of the surrounding situation, and the congestion degree (step S103).
  • for example, a total cost is calculated based on the driving situation, the driving scenario, the risk of the surrounding situation, and the degree of congestion, and the driving difficulty level is determined from the total cost.
  • in other words, the driving difficulty level can be determined numerically.
  • FIG. 9 shows an example of the cost for the driving situation, the driving scenario, the risk of the surrounding situation, and the degree of congestion.
  • in addition, driver information such as the configuration and performance information of the various sensors, the driver state, the driver's awareness of danger, and the driver's line of sight is also included.
  • for example, the sensor configuration and performance information represents sensor performance as high, medium, or low; the driver state is represented by whether the driver is awake or sleeping; the driver's awareness of danger is represented by whether the driver recognizes danger (yes) or does not (no); and the driver's line of sight is represented by whether its direction is normal or looking aside.
  • the cost is set in three stages of 0, 1, and 2.
  • for example, when the driver recognizes danger, the cost is set to the lowest value (0) so that the driving difficulty level is lowered, and when the driver is sleeping, the cost is set to the highest value (2) so that the driving difficulty level is raised.
  • the number of cost levels may be fewer or greater than three.
  • the driving difficulty may be determined by summing up the costs of all items, or may be determined by the cost of at least one item.
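The total-cost method of steps S102 and S103 can be sketched as follows. The 0/1/2 item costs follow the three-stage scheme of FIG. 9; the cost-to-difficulty thresholds are hypothetical, chosen only to be consistent with the stated example that a total cost of 12 yields difficulty level 3.

```python
def total_cost(item_costs: dict) -> int:
    """Sum the per-item costs (each 0, 1, or 2 in the FIG. 9 scheme)."""
    return sum(item_costs.values())

def driving_difficulty(cost: int) -> int:
    """Map a total cost to a difficulty level. The ranges are hypothetical."""
    for threshold, level in ((5, 1), (10, 2), (15, 3)):
        if cost < threshold:
            return level
    return 4

costs = {"driving_situation": 2, "scenario": 2, "risk": 2, "congestion": 2,
         "sensor_performance": 1, "driver_state": 2, "crisis_awareness": 1}
print(total_cost(costs), driving_difficulty(total_cost(costs)))  # 12 3
```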
  • the driving difficulty determination unit 2 may determine the driving difficulty according to the restrictions of the server 101 in the driving support system or the automatic driving system and the restrictions of the in-vehicle system.
  • the restrictions of the server 101 are, for example, the processing performance of the server 101 and the number of vehicles being processed; when the processing performance is low or the number of vehicles being processed is large, the driving difficulty level is raised.
  • the restrictions of the in-vehicle system are, for example, the detection accuracy and the recognizable distance of the peripheral recognition sensor 31 mounted on the moving body 103; when the sensing distance is short or the accuracy is low, the driving difficulty level is raised.
  • the conditions and costs shown in FIG. 9 may be updated by learning the judgment conditions through machine learning or deep learning.
  • when the traveling difficulty level is low, the next determination timing is delayed or the determination cycle is lengthened; when the traveling difficulty level is high, the next determination timing is advanced or the determination cycle is shortened. Further, when the driving difficulty level is low, the range of information used for the determination is narrowed, and when it is high, the range is widened.
  • the driving difficulty may be weighted based on the results determined in the past, or may be determined based on the results within a certain period of time.
  • although this running difficulty level is determined for the moving body 103, the running difficulty level may instead be assigned to the position where the moving body 103 exists.
  • the support level determination unit 3 of the server 101 determines the support level according to the driving difficulty level notified from the driving difficulty level determination unit 2 and the support level management table (step S104). For example, in the support level management table shown in FIG. 4, when the cost is 12, the driving difficulty level is 3, and the support level indicates cooperative automatic driving (other vehicle sensor information).
  • here, the support level is determined based on the support level management table, but the determination may instead be made using a cost function or a probability model.
  • the support level may be weighted based on the results determined in the past, or may be determined based on the results within a certain period of time.
  • although the support level is determined for the moving body 103, the support level may instead be assigned to the position where the moving body 103 exists.
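A sketch of the support level lookup in step S104. The table below is hypothetical: the text confirms only that difficulty level 3 maps to cooperative automatic driving (other vehicle sensor information); the other rows are illustrative stand-ins for FIG. 4.

```python
# Hypothetical support level management table (FIG. 4 is not reproduced in the text).
SUPPORT_LEVELS = {
    1: "manual driving support (alarm / recommended action)",
    2: "cooperative automatic driving (other vehicle information)",
    3: "cooperative automatic driving (other vehicle sensor information)",
    4: "control (vehicle control information)",
}

def support_level(difficulty: int) -> str:
    """Look up the support level for a given driving difficulty level."""
    return SUPPORT_LEVELS[difficulty]

print(support_level(3))  # the one mapping stated in the text
```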
  • the support level determination unit 3 of the server 101 requests the recommended action generation unit 4 to generate support information based on the determined support level (step S105). Examples of support levels, support targets, and support information are as described with reference to FIG.
  • here, the other vehicle information, the other vehicle sensor information, and the other vehicle path information are distributed separately; however, when the other vehicle sensor information is distributed, the other vehicle information and the other vehicle sensor information may be integrated and distributed together.
  • the recommended action generation unit 4 may be requested to generate support information based on the driving difficulty level.
  • the generation cycle of the support information may be changed based on the support level or the driving difficulty level.
  • the server 101 controls not only the vehicle VA to be supported but also the vehicle VB.
  • for example, the vehicle VA changes lanes first, and the vehicle VB changes lanes afterwards.
  • the recommended action generation unit 4 of the server 101 distributes the generated support information to the mobile body 103 to be supported.
  • the generated support information may be distributed not only to the mobile body 103 to be supported but also to mobile bodies 103 in the vicinity of the support target, or the information may be distributed to the relevant point.
  • steps S101 to S106 described above may be sequentially processed each time data is received, or may be processed at regular time intervals.
  • steps S102 and S103 may be performed by the moving body 103, the traveling difficulty level notified from the moving body 103 to the server 101, and step S104 and subsequent steps performed on the server 101.
  • FIG. 12 shows a processing sequence when information is transmitted from the mobile body 103 to the server 101
  • FIG. 13 shows a processing sequence when the mobile body 103 receives information from the server 101.
  • the peripheral situational awareness unit 41 of the mobile body 103 acquires the mobile body information from the mobile body sensor 32 and acquires the sensing (sensor) information from the peripheral recognition sensor 31 (step S201).
  • the moving body information is information such as the position, speed, direction, acceleration, and traveling path of the moving body 103.
  • the sensing information is information such as the relative position, relative speed, relative angle, and type of detected objects, such as other moving bodies and obstacles existing in the vicinity, with the moving body 103 as the starting point. The sensing information may also handle raw data such as unprocessed video and point clouds.
  • the peripheral situation awareness unit 41 of the mobile body 103 transmits the acquired mobile body information and sensing information to the server 101 (step S202). Since step S202 is a process corresponding to the process of step S101 in the peripheral situation awareness unit 1 of the server 101, the information may be transmitted after receiving the information acquisition request from the server 101.
  • the mobile body information and the sensing information may also be transmitted to the peripheral mobile body. This is a configuration corresponding to the case where the peripheral mobile unit has a function of executing the processing of the server 101.
  • the automatic driving determination unit 42 of the moving body 103 creates an action plan for automatic driving, such as lane keeping and lane change, based on the moving body information and sensing information acquired by the peripheral situation recognition unit 41, and generates travel path information that realizes the action plan (step S203).
  • map information may also be used for this judgment. Here, an example of making the judgment without using the support information notified from the server 101 is shown.
  • in the action plan for automatic driving, the behavior judgment involves observing traffic rules, judging the driving along the road shape, and estimating the state and risk of surrounding vehicles; the action plan can be created using a rule-based method, an optimization-based method, a probability-based method, a learning-based method, or a method that integrates these.
  • the rule-based method makes decisions based on defined conditions and rules, such as a Finite State Machine or a Decision Tree, and enables robust processing for simple scenarios in which the defined situations are easy to handle.
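A minimal Finite State Machine of the kind the rule-based method uses might look like this; the states, events, and transition rules are illustrative, not taken from the patent.

```python
# Transition table: (current state, event) -> next state. Unknown events keep the state.
TRANSITIONS = {
    ("lane_keep", "slow_leader_and_gap_free"): "lane_change",
    ("lane_change", "change_completed"): "lane_keep",
    ("lane_keep", "obstacle_ahead"): "stop",
}

def step(state: str, event: str) -> str:
    """Apply one FSM transition; stay in the current state if no rule fires."""
    return TRANSITIONS.get((state, event), state)

s = "lane_keep"
for event in ["slow_leader_and_gap_free", "change_completed", "obstacle_ahead"]:
    s = step(s, event)
print(s)  # stop
```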
  • the optimization-based method defines a function over the data and finds the minimum or maximum value of that function, like a cost-based function, and is easy to implement, maintain, and test.
  • the probability-based method judges the result probabilistically from each input condition, such as the partially observable Markov decision process (POMDP) or Bayesian estimation, and makes it possible to formulate reliable scenarios.
  • the learning based method is a method of learning from past data and correctly predicting new inputs, such as reinforcement learning and machine learning, and can handle unknown situations that cannot be defined.
  • model predictive control is a control method that optimizes while predicting the future response at each time step.
  • a route planning algorithm is a method that randomly searches for routes and repeats the search until the goal is reached.
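The receding-horizon idea behind model predictive control can be illustrated with a toy speed controller: at each step, the acceleration whose predicted response over a short horizon best tracks a target speed is chosen, only the first move is applied, and the plan is recomputed. The model, cost function, and candidate set are all illustrative.

```python
def best_acceleration(v: float, v_target: float, horizon: int = 5, dt: float = 0.1) -> float:
    """Pick the candidate acceleration minimizing the predicted tracking cost."""
    candidates = [-2.0, -1.0, 0.0, 1.0, 2.0]  # m/s^2
    def cost(a: float) -> float:
        vv, c = v, 0.0
        for _ in range(horizon):       # simulate the simple model v' = v + a*dt
            vv += a * dt
            c += (vv - v_target) ** 2  # quadratic tracking cost
        return c
    return min(candidates, key=cost)

v = 10.0
for _ in range(3):                     # apply only the first move, then re-plan
    v += best_acceleration(v, 12.0) * 0.1
print(round(v, 1))  # 10.6, accelerating toward the 12 m/s target
```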
  • the automatic driving determination unit 42 of the mobile body 103 transmits the generated travel path information to the server 101 (step S204).
  • the travel path information may also be transmitted to surrounding moving objects. This is a configuration corresponding to the case where the peripheral mobile unit has a function of executing the processing of the server 101.
  • the vehicle control unit 43 of the mobile body 103 controls the travel of the mobile body 103 (vehicle) according to the travel path information generated by the automatic driving determination unit 42 (step S205).
  • however, when the driver is driving manually, the manual driving may be maintained.
  • sequence shown in FIG. 12 is a sequence in which data is transmitted to the server 101, so processing when data is received from the server 101 is not considered.
  • the support level switching unit 44 of the mobile body 103 waits until the support information is received from the server 101 (step S301), and when the support information is received, the process proceeds to step S302.
  • step S302 the support level switching unit 44 determines the support target of the received support information. If the support target is the vehicle control unit 43, the process proceeds to step S303. If the support target is the automatic driving determination unit 42, the process proceeds to step S305, and if the support target is the display unit 45, the process proceeds to step S308.
  • step S303 the support level switching unit 44 notifies the vehicle control unit 43 of the vehicle control information received as the support information.
  • the vehicle control unit 43 performs vehicle control via the vehicle control interface 23 according to the notified vehicle control information (step S304). After that, the process of step S301 and the like is repeated.
  • instead of simply using the notified vehicle control information, the vehicle control unit 43 may compare it with the vehicle control information generated by the automatic driving determination unit 42, judge safety and comfort, and determine which vehicle control information to use; if the vehicle control information generated by the automatic driving determination unit 42 is superior, automatic driving is carried out based on the vehicle control information generated by the automatic driving determination unit 42.
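The comparison described above can be sketched as follows; the safety/comfort scoring function is an assumption made for illustration, not the system's actual criterion.

```python
def score(plan: dict) -> float:
    """Illustrative safety/comfort score: a larger minimum gap is safer,
    and lower peak jerk is more comfortable."""
    return plan["min_gap_m"] - 0.5 * abs(plan["max_jerk"])

def choose(server_plan: dict, local_plan: dict) -> dict:
    """Use the server's vehicle control information only if it scores at least
    as well as the locally generated one."""
    return server_plan if score(server_plan) >= score(local_plan) else local_plan

server_plan = {"name": "server", "min_gap_m": 8.0, "max_jerk": 1.0}
local_plan = {"name": "local", "min_gap_m": 6.0, "max_jerk": 0.5}
print(choose(server_plan, local_plan)["name"])  # server (score 7.5 vs 5.75)
```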
  • step S305 the support level switching unit 44 notifies the automatic driving determination unit 42 of the automatic driving support information received as the support information.
  • the automatic driving support information is notified to the automatic driving determination unit 42 here, it may be notified to the surrounding situation recognition unit 41.
  • the automatic driving determination unit 42 generates a traveling path by using the notified automatic driving support information in addition to the moving body information and the sensing information, and notifies the vehicle control unit 43 (step S306).
  • by treating the automatic driving support information as information on moving bodies existing in the vicinity, in the same way as the surrounding vehicles and obstacles detected by the peripheral recognition sensor 31, information that the peripheral recognition sensor cannot detect can also be used.
  • in addition, the traveling paths of the moving bodies 103 existing in the vicinity can be recognized, so safety can be enhanced by judgments that take the behavior of surrounding moving bodies into account.
  • the vehicle control unit 43 performs vehicle control via the vehicle control interface 23 according to the travel path information notified from the automatic driving determination unit 42 (step S307). After that, the process of step S301 and the like is repeated.
  • step S308 the support level switching unit 44 notifies the display unit 45 of the alarm information and the recommended action information received as the support information.
  • the display unit 45 displays the support information and notifies the driver (step S309). After that, the process of step S301 and the like is repeated.
  • since the amount of information used for automatic driving changes based on the driving difficulty level in the driving support system 1000 of the first embodiment described above, the amount of information to be processed can be reduced in situations where driving is easy, reducing the processing load while ensuring safety and comfort, and the information to be processed can be increased in situations where driving is difficult, realizing efficient processing while ensuring safety and comfort.
  • further, since the information type to be used is determined in the server 101 based on the driving difficulty level, instead of each mobile body 103 determining the information type to use on its own, efficient support can be realized in consideration of distant situations that cannot be grasped by a mobile body alone.
  • further, since the type of information acquired from the mobile body 103, the acquisition frequency, or the communication route can be changed depending on the driving difficulty level or the support level, the amount of communication and the processing load can be reduced.
  • in the first embodiment, the server 101 determines the travel difficulty level and determines the information type to be provided to the mobile body 103; however, the mobile body 103 may instead determine the travel difficulty level and request the server to provide information.
  • FIG. 14 is a block diagram showing the configuration of the server 101 of the driving support system 2000 according to the second embodiment.
  • the server 101 includes a processor 100A, a communication interface 10, a communication device 30, and a storage device 40.
  • FIG. 15 is a block diagram showing the configuration of the moving body 103 of the traveling support system 2000 according to the second embodiment.
  • the mobile body 103 includes a mobile body system 300 having a processor 200A, a communication interface 20, a peripheral recognition interface 21, a vehicle sensor interface 22, and a vehicle control interface 23, a peripheral recognition sensor 31, and a mobile sensor 32. It includes a vehicle control ECU 33 and a communication device 34.
  • the processor 200A communicates with the peripheral recognition sensor 31, the mobile sensor 32, the vehicle control ECU 33, and the communication device 34 via the communication interface 20, the peripheral recognition interface 21, the vehicle sensor interface 22, and the vehicle control interface 23, and acquires information.
  • the processor 200A has a server communication unit 46 and a vehicle control unit 43 as functional blocks.
  • the server communication unit 46 acquires the sensing information of obstacles around the moving body 103 detected by the peripheral recognition sensor 31 and the moving body information detected by the moving body sensor 32 via the peripheral recognition interface 21 and the vehicle sensor interface 22, respectively; the acquired information is notified to the vehicle control unit 43 and transmitted to the server 101 via the communication interface 20.
  • in other words, the server communication unit 46 periodically transmits to the server 101 the sensing information of obstacles around the mobile body 103 detected by the peripheral recognition sensor 31 and the mobile body information detected by the mobile body sensor 32.
  • the processor 100A of the server 101 determines the automatic driving and transmits the vehicle control information to the moving body 103.
  • the processor 100A of the server 101 has, in addition to the configuration of the processor 100 shown in FIG. 2, an automatic driving determination unit 7 that receives the output of the recommended action generation unit 4; the recommended action generation unit 4 notifies the automatic driving determination unit 7 of the support information generated according to the support level.
  • the automatic driving determination unit 7 generates vehicle control information, such as a travel path, using the automatic driving support information notified from the recommended action generation unit 4 in addition to the sensing information of obstacles around the moving body 103 and the moving body information transmitted from the moving body 103. The generated vehicle control information is then transmitted to the mobile body 103 via the communication interface 10.
  • the server communication unit 46 of the moving body 103 notifies the vehicle control unit 43 of the vehicle control information received from the server 101, and the vehicle control unit 43 controls the accelerator, brake, and steering of the moving body 103 via the vehicle control interface 23 according to the notified vehicle control information, thereby performing automatic traveling.
  • according to the second embodiment, since the server 101 generates the vehicle control information for automatic driving based on the driving difficulty level, the information to be processed can be reduced in situations where driving is easy, reducing the processing load while ensuring safety and comfort, and the information to be processed can be increased in situations where driving is difficult, realizing efficient processing while ensuring safety and comfort.
  • further, since the server 101 generates the vehicle control information for automatic driving and remotely controls the moving body 103, the system configuration mounted on the moving body 103 can be simplified.
  • FIG. 16 is a block diagram showing the configuration of the server 101 of the driving support system 3000 according to the third embodiment.
  • the server 101 includes a processor 100B, a communication interface 10, a communication device 30, and a storage device 40.
  • as the configuration of the moving body 103, the configuration shown in FIG. 7 may be adopted.
  • the processor 100B of the server 101 has a group determination unit 13 connected to the support level determination unit 3 in addition to the configuration of the processor 100 shown in FIG. 2; based on the moving body information and the sensing information, the group determination unit 13 determines a plurality of moving bodies 103 moving in the same direction as a group.
  • the determination is made based on the inter-vehicle time, the relative distance, and the relative speed between the moving bodies.
  • in this way, the group determination unit 13 of the processor 100B determines a plurality of moving bodies moving in the same direction as one group, and notifies the support level determination unit 3 of the information of the determined moving body group.
  • the support level determination unit 3 applies the same support level to the same group based on the information of the mobile group.
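The group determination based on inter-vehicle time, relative distance, and relative speed, followed by applying the hardest member's level to the whole group, might be sketched like this; the thresholds, data layout, and difficulty values are assumptions.

```python
def form_groups(vehicles, max_headway_s=2.0, max_dv_mps=3.0):
    """Split a front-to-back ordered lane of vehicles into groups: a vehicle joins
    the group of the vehicle ahead while the inter-vehicle time and relative
    speed stay within the (assumed) thresholds."""
    groups = [[vehicles[0]]]
    for ahead, behind in zip(vehicles, vehicles[1:]):
        headway_s = (ahead["pos"] - behind["pos"]) / max(behind["speed"], 0.1)
        if headway_s <= max_headway_s and abs(behind["speed"] - ahead["speed"]) <= max_dv_mps:
            groups[-1].append(behind)
        else:
            groups.append([behind])
    return groups

lane = [{"id": "VA", "pos": 100.0, "speed": 10.0},
        {"id": "VB", "pos": 85.0, "speed": 10.0},
        {"id": "VC", "pos": 30.0, "speed": 12.0}]
groups = form_groups(lane)
print([[v["id"] for v in g] for g in groups])  # [['VA', 'VB'], ['VC']]

# apply the same support level to the group: use its hardest member's difficulty
difficulty = {"VA": 2, "VB": 3, "VC": 1}
print(max(difficulty[v["id"]] for v in groups[0]))  # 3
```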
  • FIG. 17 shows an example in which a plurality of moving objects are determined as a group and the same support level is applied.
  • FIG. 17 shows a state in which a plurality of vehicles including the vehicle VA to be supported are about to turn right at an intersection; the group determination unit 13 determines that the plurality of vehicles form vehicle group G1 and applies the same support level to the vehicles in the group.
  • similarly, a plurality of vehicles trying to turn right at the intersection exist in the opposite lane across the intersection; these vehicles are judged to form vehicle group G2, and the same support level is applied to the vehicles in that group.
  • for example, when the traffic light turns green, control is used to start the vehicle group at the same time, and when the light changes from green to red, control is used to decelerate the vehicle group simultaneously.
  • when the support level determination unit 3 determines the support level in step S404, if there are a plurality of moving bodies 103 moving in the same direction, the group determination unit 13 determines them as one group and notifies the support level determination unit 3 of the information of the moving body group (step S405).
  • based on the information of the moving body group notified from the group determination unit 13, the support level determination unit 3 extracts the support level of the moving body 103 having the highest driving difficulty in the group, and requests the recommended action generation unit 4 to generate support information for the moving bodies 103 of the same group based on that driving difficulty and support level (step S406).
  • the peripheral situation recognition unit 1 of the server 101 acquires the mobile body information, the sensing information, and the travel path information from the plurality of mobile bodies 103 via the roadside unit (step S401), and the group determination unit 13 By associating these information with the information of the determined mobile body group, each mobile body in the mobile body group can be specified, and the information can be distributed to each mobile body via the communication interface 10.
  • since each moving body is notified of the traveling difficulty level or the support level from the server 101, it can know its own traveling difficulty level or support level. When the traveling difficulty level or the support level of the plurality of moving bodies 103 included in one group is low, one moving body 103 in the group may collect the information of the moving bodies 103 and send it to the server 101; when the traveling difficulty level or the support level of the plurality of moving bodies 103 included in one group is high, each moving body in the group may transmit to the server individually.
  • In the state transition diagram showing the switching of the support level in the third embodiment, the state may change according to the driving difficulty, and transitions from autonomous automatic driving to cooperative automatic driving, and from cooperative automatic driving to control control, may be made by forming and leaving a group.
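The state switching described above can be sketched as a small transition table. The event names and the specific transitions listed are illustrative assumptions; the disclosure only fixes that difficulty changes and group formation/leaving can drive the transitions.

```python
# Sketch of support-level state transitions: difficulty changes and
# joining/leaving a group move a vehicle between autonomous driving,
# cooperative driving, and control control. The table is illustrative.

TRANSITIONS = {
    ("autonomous", "difficulty_up"): "cooperative",
    ("autonomous", "join_group"): "cooperative",
    ("cooperative", "difficulty_up"): "control",
    ("cooperative", "join_group"): "control",
    ("cooperative", "difficulty_down"): "autonomous",
    ("control", "difficulty_down"): "cooperative",
    ("control", "leave_group"): "cooperative",
}

def next_state(state, event):
    # Stay in the current state when no transition rule applies.
    return TRANSITIONS.get((state, event), state)

state = next_state("cooperative", "difficulty_down")
```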
  • By treating a plurality of moving bodies 103 moving in the same direction as one group, the driving support system 3000 of the third embodiment described above allows the moving bodies 103 in the same group to travel smoothly.
  • FIG. 19 is a block diagram showing the configuration of the server 101 of the driving support system 4000 according to the fourth embodiment.
  • The server 101 includes a processor 100C, a communication interface 10, a communication device 30, and a storage device 40A.
  • As the configuration of the moving body 103, the configuration shown in FIG. 7 may be adopted.
  • The processor 100C of the server 101 has a driving difficulty learning unit 14 connected to the driving difficulty determination unit 2. Based on the moving body information, the sensing information, the travel path information, and the driving difficulty determined by the driving difficulty determination unit 2, the driving difficulty learning unit 14 generates a driving difficulty determination model 15 and stores it in the storage device 40A.
  • The driving difficulty determination unit 2 reads the driving difficulty determination model 15 from the storage device 40A and determines the driving difficulty using the driving difficulty determination model 15.
  • As a result, the accuracy of the driving difficulty determination can be improved.
  • The driving difficulty determination model 15 evaluates the input data and sets the difficulty level; examples of such models include finite state machines, decision trees, and logistic regression analysis.
  • In the generation of the driving difficulty determination model 15, a sample of input data is prepared, and data analysis, model creation, and model evaluation are carried out repeatedly. If the result of the model evaluation is poor, the cause is analyzed and the model is corrected. For example, the cost is calculated based on the conditions shown in FIG. 9, a model for determining the driving difficulty is generated based on the cost ranges shown in FIG. 4, and if the generated model is not valid, the conditions in FIG. 9 and the cost ranges in FIG. 4 are modified.
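The generate-evaluate-correct loop above can be sketched as follows. The condition costs (a stand-in for FIG. 9), the cost ranges (a stand-in for FIG. 4), and the sample labels are all hypothetical numbers invented for illustration.

```python
# Sketch of the model generation loop: compute a cost from condition weights,
# map cost ranges to a difficulty level, evaluate on labeled samples, and
# correct the ranges when the evaluation result is poor. Values are illustrative.

COND_COST = {"merge": 3, "intersection": 2, "rain": 1}   # FIG. 9 stand-in

def difficulty(conds, ranges):
    cost = sum(COND_COST.get(c, 0) for c in conds)
    for name, (lo, hi) in ranges.items():                # FIG. 4 stand-in
        if lo <= cost <= hi:
            return name
    return "difficult"

def evaluate(samples, ranges):
    hits = sum(difficulty(conds, ranges) == label for conds, label in samples)
    return hits / len(samples)

samples = [(["rain"], "easy"),
           (["merge", "rain"], "normal"),
           (["merge", "intersection", "rain"], "difficult")]
ranges = {"easy": (0, 1), "normal": (2, 3), "difficult": (4, 99)}

acc = evaluate(samples, ranges)        # poor result: only 2 of 3 samples correct
if acc < 1.0:                          # analyze the cause and correct the model
    ranges["normal"] = (2, 4)          # e.g. shift a cost-range boundary
    ranges["difficult"] = (5, 99)
acc_after = evaluate(samples, ranges)  # re-evaluate the corrected model
```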
  • The driving difficulty learning unit 14 may generate the driving difficulty determination model 15 based on the moving body information, the sensing information, the travel path information, the driving situation, the driving scenario, the risk of the surrounding situation, and the traffic situation.
  • The driving support system 4000 of the fourth embodiment described above learns the driving difficulty on the server 101, generates and updates the driving difficulty determination model 15, and feeds it back to the next determination, making it possible to improve determination accuracy and speed up processing.
  • FIG. 20 is a block diagram showing the configuration of the moving body 103 of the driving support system 5000 according to the fifth embodiment.
  • The moving body 103 includes a storage device 40, and the processor 200A has a driving difficulty determination unit 2, a support level determination unit 3, and a recommended action generation unit 4.
  • The driving difficulty determination unit 2, the support level determination unit 3, and the recommended action generation unit 4 have the same functions as those of the server 101 of the first embodiment shown in FIG. 2. That is, the driving difficulty determination unit 2 aggregates the information of the moving body 103 based on the moving body information, the sensing information, and the travel path information acquired by the peripheral situation recognition unit 1, and integrates it with the map information acquired from the map data 6 stored in the storage device 40.
  • The support level determination unit 3 determines the support level based on the driving difficulty calculated by the driving difficulty determination unit 2.
  • Based on the support level determined by the support level determination unit 3 and the moving body information, the sensing information, and the travel path information acquired by the peripheral situation recognition unit 1, the recommended action generation unit 4 generates support information according to the support level. If the support information is support information for manual driving, the display unit 45 is notified; if it is support information for cooperative automatic driving, the automatic driving judgment unit 42 is notified; and if it is vehicle control information for control control, the vehicle control unit 43 is notified.
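The routing of support information inside the moving body can be sketched as follows. The unit names mirror the text (display unit 45, automatic driving judgment unit 42, vehicle control unit 43), but the dictionary keys and callable interface are hypothetical illustration only.

```python
# Sketch: route support information to the display unit (manual driving),
# the automatic driving judgment unit (cooperative driving), or the vehicle
# control unit (control control). Keys and callables are illustrative.

def route_support_info(info_type, payload, units):
    """units maps 'display', 'auto_judgment', and 'vehicle_control'
    to callables (stand-ins for units 45, 42, and 43)."""
    if info_type == "manual":
        units["display"](payload)
    elif info_type == "cooperative":
        units["auto_judgment"](payload)
    elif info_type == "control":
        units["vehicle_control"](payload)

log = []
units = {k: (lambda p, k=k: log.append(k))
         for k in ("display", "auto_judgment", "vehicle_control")}
route_support_info("cooperative", {"recommended_speed": 40}, units)
```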
  • The driving support system 5000 of the fifth embodiment described above determines the driving difficulty and the support level within the moving body 103 itself, thereby determining which information is used for automatic driving, and a reduction in the processing load can be expected. For example, when autonomous automatic driving, or cooperative automatic driving using only vehicle information, is selected, the processing load can be reduced because it is not necessary to process the sensor information and the path information received from the surrounding moving bodies.
  • The embodiments can be freely combined, and each embodiment can be appropriately modified or omitted.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Mathematical Physics (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)

Abstract

The present disclosure relates to a travel assistance device for assisting the travel of a mobile object. The travel assistance device determines a travel difficulty representing the difficulty of travel of the mobile object in at least three levels, on the basis of mobile object information including at least the location, speed, and orientation of the mobile object acquired by a mobile object sensor mounted on the mobile object, and sensing information including at least the location and speed of an obstacle around the mobile object acquired by a surroundings recognition sensor mounted on the mobile object, and sets assistance information, which is used for assisting the travel of the mobile object in accordance with the determined travel difficulty.

Description

Driving support device and driving support method
The present disclosure relates to a driving support device for an automobile, and particularly to a driving support device with a reduced processing load.
In recent years, autonomous automatic driving systems, such as collision avoidance braking that autonomously avoids or mitigates danger and adaptive cruise control (ACC) that travels at a constant speed, based on cameras and peripheral monitoring sensors such as millimeter-wave radar mounted on automobiles, have begun to spread. In addition, studies are being conducted on road-vehicle cooperative automatic driving systems that realize more sophisticated automatic driving by using wireless communication to exchange information with surrounding vehicles and roadside units, thereby acquiring information that cannot be detected by the vehicle's own peripheral monitoring sensors as well as real-time dynamic map information held by the infrastructure, such as accidents and regulations.
Automatic driving systems cannot appropriately select between autonomous automatic driving and cooperative automatic driving according to the surrounding situation, and a method is being considered in which cooperative automatic driving is selected only when information can be obtained by wireless communication.
For example, Patent Document 1 describes a technique for switching the implementation, non-implementation, and automation level of automatic driving according to the traffic volume of automatic driving vehicles.
Japanese Unexamined Patent Publication No. 2017-151041
In Patent Document 1, the implementation, non-implementation, or automation level of automatic driving is switched according to the traffic volume of automatic driving vehicles; however, this depends on the degree of congestion of automatic driving vehicles, and even in situations where automatic driving is difficult, each automatic driving vehicle needs to make its own judgment and travel. In addition, since the information used for automatic driving judgments includes both information acquired by the own vehicle and information acquired by communication, the processing load may become high.
In general, as the number of automatic driving vehicles, the number of communicating vehicles, and the amount of communicated information increase in the future, the amount of information to be processed to realize driving support will also increase, raising the problem of a growing processing load.
The present disclosure has been made to solve the above problems, and an object of the present disclosure is to provide a driving support device with a reduced processing load for realizing driving support.
The present disclosure relates to a driving support device that supports the traveling of a moving body. The driving support device determines a driving difficulty representing the difficulty of traveling of the moving body in at least three levels, based on moving body information including at least the position, speed, and orientation of the moving body acquired by a moving body sensor mounted on the moving body, and sensing information including at least the position and speed of obstacles around the moving body acquired by a peripheral recognition sensor mounted on the moving body, and sets support information used for supporting the traveling of the moving body according to the determined driving difficulty.
According to the driving support device of the present disclosure, the support information used for supporting the traveling of the moving body is set according to the driving difficulty. Therefore, in situations where the driving difficulty is low, the information to be processed can be reduced to lower the processing load while ensuring safety and comfort, and in situations where the driving difficulty is high, the information to be processed can be increased to realize efficient processing while ensuring safety and comfort.
A diagram illustrating the configuration of the driving support system according to the first embodiment.
A block diagram showing the configuration of the server according to the first embodiment.
A diagram showing examples of support targets and support information for each support level.
A diagram showing an example of the support level management table.
A diagram showing an example of state transitions of the support level.
A diagram showing an example of managing the switching of the support level with a tree structure.
A block diagram showing the configuration of the moving body according to the first embodiment.
A flowchart showing the overall processing of the server according to the first embodiment.
A diagram showing an example of the support level determination conditions in the server according to the first embodiment.
A diagram illustrating an example of the determination of the driving difficulty and the determination of the support level.
A diagram illustrating an example of the determination of the driving difficulty and the determination of the support level.
A flowchart showing the transmission processing of the moving body according to the first embodiment.
A flowchart showing the reception processing of the moving body according to the first embodiment.
A block diagram showing the configuration of the server according to the second embodiment.
A block diagram showing the configuration of the moving body according to the second embodiment.
A block diagram showing the configuration of the server according to the third embodiment.
A diagram showing an application example of group determination in the server according to the third embodiment.
A flowchart showing the overall processing of the server according to the third embodiment.
A block diagram showing the configuration of the server according to the fourth embodiment.
A block diagram showing the configuration of the server according to the fifth embodiment.
<Embodiment 1>
<Structure>
FIG. 1 is a diagram illustrating the configuration of the driving support system 1000 according to the first embodiment. In FIG. 1, components that are the same as or correspond to those in other figures are given the same reference numerals, and duplicate description is omitted. The configuration of the driving support system 1000 is also common to the second to fifth embodiments, in which it will be described as driving support systems 2000 to 5000, respectively.
The driving support system 1000 shown in FIG. 1 includes a server 101, roadside communication devices 102, moving bodies 103, and roadside sensors 104. Each component of the driving support system 1000 may be provided in one or more units; the server 101 is connected to a plurality of roadside communication devices 102, and each roadside communication device 102 is connected to a plurality of moving bodies 103 and a plurality of roadside sensors 104. In the first embodiment, the moving body 103 is described assuming a vehicle. The server 101 and the roadside communication devices 102 may be connected via the Internet.
The server 101 may be integrated with the roadside communication device 102 and the roadside sensor 104. Further, in the first embodiment, the server 101 is provided separately from the moving bodies 103 and is described as a driving support device that supports the traveling of the moving bodies 103, but the server 101 may instead be mounted on one of the moving bodies 103.
By providing the server 101 separately from the moving bodies 103, there are few restrictions on the size of the server 101, and it is easy to increase the processing capacity of the server 101.
FIG. 2 is a block diagram showing the configuration of the server 101 of the driving support system 1000 according to the first embodiment. As shown in FIG. 2, the server 101 includes a processor 100, a communication interface 10, a communication device 30, and a storage device 40. Since the server 101 is a computer server realized as an edge server, a cloud server, or the like, the server 101 may also be referred to as an edge server or a cloud server.
The processor 100 communicates with the communication device 30 via the communication interface 10, and acquires information such as map information from the storage device 40.
The processor 100 is composed of an IC (Integrated Circuit) that executes instructions described in a program to perform processing such as data transfer, calculation, processing, control, and management. By executing the instructions, the processor 100 realizes the functions of the peripheral situation recognition unit 1, the driving difficulty determination unit 2, the support level determination unit 3, and the recommended action generation unit 4, which are represented as functional blocks.
The processor 100 has an arithmetic circuit, registers for storing instructions and information, and a cache memory. A processor such as a CPU (Central Processing Unit), a DSP (Digital Signal Processor), or a GPU (Graphics Processing Unit) is applied as the processor 100.
The storage device 40 stores the support level management table 5 and the map data 6, which will be described later.
For the storage device 40, a non-volatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), an EEPROM (Electrically Erasable Programmable Read Only Memory), an SD (Secure Digital: registered trademark) card, a memory card, a CF (Compact Flash) memory, or a NAND flash memory can be applied. A portable storage medium such as an HDD (Hard Disk Drive), an SSD (Solid State Drive), a magnetic disk, a flexible disk, a Blu-ray (registered trademark) disc, a DVD (Digital Versatile Disc), an optical disc, a compact disc, or a mini disc may also be applied.
The communication device 30 includes a receiver that receives data from the moving bodies 103 or the roadside sensors 104 via the roadside communication device 102, and a transmitter that transmits data.
For the communication interface 10, for example, a communication chip or an NIC (Network Interface Card) can be applied.
The communication interface 10 can use DSRC (Dedicated Short Range Communication) dedicated to vehicle communication and communication protocols such as IEEE 802.11p.
The communication interface 10 may also use a communication line such as LTE (Long Term Evolution: registered trademark) or the 5th generation mobile communication system (5G).
The communication interface 10 may also use Bluetooth (registered trademark) or a wireless LAN such as IEEE 802.11a/b/g/n/ac.
Next, the processor 100 will be further described with reference to FIGS. 1 and 2. The peripheral situation recognition unit 1 acquires, via the communication interface 10, moving body information such as the position, speed, orientation, travel route, driver state, driver viewpoint, number of occupants, and occupant states detected by the moving body sensor mounted on the moving body 103. It also acquires sensing information, such as the position, speed, attributes, detection accuracy, video, and point clouds of obstacles around the moving body 103, detected by the peripheral recognition sensor mounted on the moving body 103, and travel path information composed of a travel trajectory and speed indicating the action plan that the moving body 103 generates for automatic driving.
The peripheral situation recognition unit 1 also acquires sensing information, such as the position, speed, attributes, and accuracy of obstacles around the roadside sensor 104, detected by the roadside sensor 104, and roadside unit information such as the position and installation angle of the roadside sensor 104. The roadside unit equipped with the roadside sensor 104 is also treated as a moving body, and its sensing information is integrated.
The driving difficulty determination unit 2 aggregates the information of the moving bodies 103 based on the moving body information, the sensing information, and the travel path information acquired by the peripheral situation recognition unit 1, and integrates it with the map information acquired from the map data 6 stored in the storage device 40. The driving difficulty determination unit 2 also analyzes the driving situation and the driving scenario in consideration of the surrounding circumstances of each moving body based on the moving body information and the sensing information, and calculates the risk of the surrounding situation, including the possibility of contact with surrounding moving bodies, based on conflict indices between the moving bodies. Further, the driving difficulty determination unit 2 analyzes the traffic situation from the number of moving bodies 103, their relative distances, relative speeds, and the like.
Here, the driving difficulty indicates the difficulty of traveling in manual driving by a driver or automatic driving by an automatic driving system, and it varies with the conditions of vehicles around the moving body 103, the road environment, the weather, the configuration of the automatic driving system, sensor performance, and the like. The driving difficulty changes in stages and is defined in at least three levels, classified, for example, into easy, normal, and difficult.
The driving difficulty is different from the automatic driving levels defined by SAE (Society of Automotive Engineers) International. The automatic driving levels are: level 0, in which the driver performs all operations; level 1, in which the system supports one of steering, acceleration, and deceleration; level 2, in which the system supports steering together with acceleration and deceleration; level 3, in which the system performs all operations in specific places and the driver operates in emergencies; level 4, in which the system performs all operations in specific places; and level 5, in which the system performs all operations without limitation of place.
The driving difficulty determination unit 2 can calculate the driving difficulty based on the driving situation, the driving scenario, the risk of the surrounding situation, and the traffic situation.
In addition, since the driving situation, the driving scenario, and the like are taken into consideration in calculating (determining) the difficulty, the driving difficulty of the vehicle to be supported can be estimated with high accuracy.
The driving difficulty may be calculated by a rule-based algorithm that makes a judgment from a table or conditional expressions prepared in advance, may be calculated as a cost function from each parameter, may be calculated based on a probability model, or may be calculated using artificial intelligence such as machine learning or decision trees.
In the first embodiment, a rule-based method is described in which a cost is calculated from driving situations prepared in advance, and the support level is determined from the total cost. The calculation of the driving difficulty will be further described later.
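The rule-based method above can be sketched as follows. The situations, their cost weights, and the thresholds mapping total cost to a level are hypothetical placeholders for the pre-prepared tables, not values from the disclosure.

```python
# Sketch of the rule-based method: each pre-prepared driving situation
# contributes a cost, and the total cost selects the level. Illustrative values.

SITUATION_COST = {"lane_change": 2, "oncoming_traffic": 2,
                  "pedestrian_nearby": 3, "night": 1}

def total_cost(situations):
    return sum(SITUATION_COST.get(s, 0) for s in situations)

def level_from_cost(cost):
    if cost <= 2:
        return "easy"
    if cost <= 5:
        return "normal"
    return "difficult"

lvl = level_from_cost(total_cost(["lane_change", "pedestrian_nearby"]))
```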
The support level determination unit 3 determines the support level based on the driving difficulty calculated by the driving difficulty determination unit 2.
The higher the driving difficulty, the more the support level determination unit 3 selects manual driving or control by control control, and as the driving difficulty decreases, it provides support by cooperative automatic driving with a reduced amount of information to be processed. If the driving difficulty is very high, or if the vehicle only supports manual driving, support by manual driving is selected. The method of determining the support level from the driving difficulty can be based on the support level management table 5 stored in the storage device 40. The support level determination unit 3 notifies the recommended action generation unit 4 of the determined support level.
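A minimal stand-in for a lookup in support level management table 5 might look like the following. The table entries and the "very_difficult" label are hypothetical; the disclosure only fixes the four support stages and the manual-driving-only fallback.

```python
# Sketch: determine the support level from the driving difficulty via a
# management table, falling back to manual driving when the difficulty is
# very high or the vehicle supports manual driving only. Entries illustrative.

SUPPORT_TABLE = {"easy": "autonomous",
                 "normal": "cooperative",
                 "difficult": "control"}

def support_level(difficulty, manual_only=False):
    if manual_only or difficulty == "very_difficult":
        return "manual"
    return SUPPORT_TABLE[difficulty]
```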
The support level is different from the automatic driving levels defined by SAE International. The support level can be roughly classified into four stages: autonomous automatic driving, in which the moving body 103 travels autonomously; cooperative automatic driving, in which the moving body 103 travels automatically in cooperation with other moving bodies 103; control control, in which the moving body travels automatically under the control of the server 101; and manual driving.
Cooperative automatic driving can be classified into three stages depending on the type of information used for support: vehicle information, sensing information, and travel path information. Control control can be classified into two stages: recommended action and vehicle control.
By setting multiple stages as the support level, when the difficulty is low, the judgment can be made by a method with a low processing load. If the number of connected cars and automatic driving vehicles increases in the future, the amount of information to be handled will increase dramatically, so performing only the minimum necessary data processing is effective.
For switching the support level, traffic conditions such as the conflict index, the degree of overlap of travel paths, and the degree of congestion, as well as the vehicle's action plan, can be used. This makes it possible to increase the amount of information handled in more difficult driving situations and to continue traveling in automatic driving.
The recommended action generation unit 4 generates support information according to the support level based on the support level determined by the support level determination unit 3 and the moving body information, the sensing information, and the travel path information acquired by the peripheral situation recognition unit 1, and transmits it to the moving body 103 via the communication interface 10.
Here, the support information includes, for example, support information for manual driving, support information for cooperative automatic driving, and support information for control control. No support information exists for autonomous automatic driving because the moving body judges its traveling by itself, but the recommended action generation unit 4 notifies the moving body that autonomous automatic driving has been determined.
The support information for manual driving includes contact warnings with surrounding moving bodies, recommended speed, recommended lane, recommended lane change timing, recommended right/left turn timing, emergency vehicle approach warnings, and the like. The support information for manual driving also includes additional information such as the point where the warning occurs and its effective time. The support information for manual driving is information displayed to the driver.
 協調型自動運転向けの支援情報としては、移動体情報、センシング情報、走行パス情報の他に、推奨速度、推奨走行車線、推奨車線変更タイミング、推奨右左折タイミングなどの推奨行動情報がある。 As support information for cooperative autonomous driving, in addition to moving object information, sensing information, and driving path information, there is recommended behavior information such as recommended speed, recommended driving lane, recommended lane change timing, and recommended right / left turn timing.
 推奨行動とは、移動体間の行動を予測および考慮して、移動体間で次にとるべき行動であり、例えば、車両が並走している状況において車線変更するために、一方の車両を減速させ、他方の車両を車線変更させるなど、走行をスムーズにするための行動である。 The recommended action is the action to be taken next between the moving bodies in anticipation and consideration of the action between the moving bodies. It is an action to make the vehicle run smoothly, such as decelerating and changing the lane of the other vehicle.
 管制制御向けの支援情報としては、走行速度、走行パス情報などの車両を制御する車両制御情報そのものである。車両制御情報を移動体103に送信して移動体103を制御することで、移動体103の管制制御または遠隔制御を行うことができる。車両制御情報は、交通流がスムーズに流れるための走行速度、走行車線を決定して、周辺車両も制御することで車線変更および右左折も効率的に実現することができる。例えば、周辺車両も同じ速度に設定することで不必要な加速および減速を回避することができ、車間距離を調整して車線変更をスムーズに実現することができる。 The support information for control control is the vehicle control information itself that controls the vehicle, such as the traveling speed and the traveling path information. By transmitting vehicle control information to the moving body 103 to control the moving body 103, control control or remote control of the moving body 103 can be performed. The vehicle control information determines the traveling speed and the traveling lane for the smooth flow of the traffic flow, and by controlling the surrounding vehicles, the lane change and the right / left turn can be efficiently realized. For example, by setting the speeds of peripheral vehicles to the same speed, unnecessary acceleration and deceleration can be avoided, and the inter-vehicle distance can be adjusted to smoothly change lanes.
 FIG. 3 shows examples of the support target and support information for each support level. As shown in FIG. 3, autonomous automatic driving uses only the on-board sensors of the target moving body 103, so no support information is distributed from the server.
 In cooperative automatic driving that uses other-vehicle information (first cooperative automatic driving), the server distributes as support information, from the moving body information it has collected, the moving body information of moving bodies 103 present around the target moving body 103 (other-vehicle information for the vicinity).
 In cooperative automatic driving that uses other-vehicle sensor information (second cooperative automatic driving), the server distributes, from the sensing information it has collected, the sensing information of moving bodies 103 present around the target moving body 103 (sensing information of surrounding vehicles).
 In cooperative automatic driving that uses other-vehicle path information (third cooperative automatic driving), the server distributes as support information, from the path information it has collected, the travel path information of moving bodies 103 present around the target moving body 103 (path information of surrounding vehicles).
 In traffic control that uses recommended action information (first traffic control), the server 101 considers the driving scenario of the supported moving body 103 and the driving scenarios of the surrounding moving bodies, and distributes recommended action information such as recommended speed, recommended lane, and recommended lane-change timing as support information.
 In traffic control that uses vehicle control information (second traffic control), the server 101 distributes, as support information, vehicle control information that controls traffic by governing the travel not only of the supported moving body 103 but of all surrounding moving bodies.
 In manual driving, the server distributes warning information and recommended action information to the driver of the supported moving body 103 as support information.
 The support target of traffic control that uses vehicle control information is the vehicle control unit of the supported moving body 103, and the support target of manual driving is a display unit visible to the driver of the supported moving body 103. At the other support levels, the support target is the automatic driving determination unit of the supported moving body 103.
 The support level management table 5 is composed of, for example, a table that associates support levels with driving difficulty levels and costs, as shown in FIG. 4. As also shown in FIG. 3, the support levels can be broadly classified into autonomous automatic driving, cooperative automatic driving, traffic control, and manual driving; cooperative automatic driving can be further divided into three stages according to the information supplied, and traffic control into two stages, which distribute recommended action information and vehicle control information, respectively. For cooperative automatic driving, the cases in which other-vehicle information, other-vehicle sensor information, and other-vehicle path information are distributed to the moving body 103 are shown.
 As shown in FIG. 4, driving difficulty levels 1 to 7 are defined in the order of autonomous automatic driving, cooperative automatic driving that distributes other-vehicle information, cooperative automatic driving that distributes other-vehicle sensing information, cooperative automatic driving that distributes other-vehicle travel path information, traffic control that distributes recommended action information, traffic control that distributes vehicle control information, and manual driving.
 As explained above, the costs are prepared in advance in accordance with the driving difficulty levels. For example, the cost (c) is less than 5 for autonomous automatic driving at difficulty 1; 5 or more and less than 10 for cooperative automatic driving at difficulty 2; 10 or more and less than 15 for cooperative automatic driving at difficulty 3; 15 or more and less than 20 for cooperative automatic driving at difficulty 4; 20 or more and less than 25 for traffic control at difficulty 5; 25 or more and less than 30 for traffic control at difficulty 6; and 30 or more for manual driving at difficulty 7.
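The cost-to-difficulty mapping just described can be expressed as a simple table lookup. The following is a minimal sketch, not part of the embodiment: the function name and list structure are assumptions for illustration, while the thresholds follow the cost ranges given for FIG. 4.

```python
# Hypothetical sketch of the support level management table of FIG. 4:
# each entry is (exclusive cost upper bound, difficulty level, support level).
SUPPORT_LEVEL_TABLE = [
    (5,  1, "autonomous automatic driving"),
    (10, 2, "cooperative automatic driving (other-vehicle information)"),
    (15, 3, "cooperative automatic driving (other-vehicle sensing information)"),
    (20, 4, "cooperative automatic driving (other-vehicle travel path information)"),
    (25, 5, "traffic control (recommended action information)"),
    (30, 6, "traffic control (vehicle control information)"),
]

def lookup_support_level(cost):
    """Return (driving difficulty, support level) for a total cost c."""
    for upper, difficulty, level in SUPPORT_LEVEL_TABLE:
        if cost < upper:
            return difficulty, level
    return 7, "manual driving"  # a cost of 30 or more
```

For example, a total cost of 12 falls in the 10-to-15 range and therefore selects difficulty level 3.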
 Although FIG. 4 shows an example in which the driving difficulty is switched in seven stages, it may instead be switched in as few as three stages (autonomous automatic driving, cooperative automatic driving, and traffic control), or subdivided further into eight or more stages.
 FIG. 4 shows an example in which the support level management table is implemented as a table, but the invention is not limited to this; for example, the support level may be switched by a finite state machine, as shown in FIG. 5.
 FIG. 5 is a state transition diagram showing support level switching with a finite state machine. In FIG. 5, arrows indicate increases (Up) and decreases (Down) in driving difficulty between the four states of autonomous automatic driving, cooperative automatic driving, traffic control, and manual driving. Within cooperative automatic driving, difficulty Up and Down transitions are shown between the three states that use vehicle information, sensing information, and travel path information, and within traffic control, difficulty Up and Down transitions are shown between the two states of recommended action and vehicle control.
 Although in FIG. 5 the state transitions are driven by changes in driving difficulty, the transitions are not limited to this and may instead be based on specific driving situations or driving scenarios.
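A state machine of the kind shown in FIG. 5 could be sketched as follows. This is a hypothetical illustration only: the state names are assumptions, and only the Up/Down transitions along the difficulty ordering are modeled (the scenario-based transitions mentioned above are omitted).

```python
# Hypothetical sketch of the FIG. 5 state machine: the states are the support
# levels in difficulty order, and "Up"/"Down" events step along that order.
STATES = [
    "autonomous",            # autonomous automatic driving
    "coop_vehicle_info",     # cooperative: other-vehicle information
    "coop_sensing_info",     # cooperative: other-vehicle sensing information
    "coop_path_info",        # cooperative: other-vehicle travel path information
    "control_recommended",   # traffic control: recommended action
    "control_vehicle",       # traffic control: vehicle control
    "manual",                # manual driving
]

class SupportLevelFSM:
    def __init__(self, state="autonomous"):
        self.state = state

    def on_difficulty_change(self, event):
        """event is 'Up' (difficulty rose) or 'Down' (difficulty fell)."""
        i = STATES.index(self.state)
        if event == "Up" and i < len(STATES) - 1:
            i += 1
        elif event == "Down" and i > 0:
            i -= 1
        self.state = STATES[i]
        return self.state
```

The end states clamp: a Down event in the autonomous state, or an Up event in the manual state, leaves the state unchanged.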
 Furthermore, support level switching may be managed with a tree structure such as the decision tree shown in FIG. 6. As shown in FIG. 6, of the four states of autonomous automatic driving, cooperative automatic driving, traffic control, and manual driving, cooperative automatic driving branches into the three states that use vehicle information, sensing information, and travel path information, and traffic control branches into the two states of recommended action and vehicle control, forming a tree-shaped classification.
 In addition to driving difficulty and cost, the support level management table may be defined based on the risk of the surrounding situation and on action plans. The judgment conditions for driving difficulty, cost, and surrounding-situation risk may also be updated through machine learning or deep learning.
 The map data 6 holds map information. The map information consists of a plurality of maps layered according to predetermined scales, and includes road information, which concerns roads; lane information, which concerns the lanes that make up a road; and constituent line information, which concerns the lines that make up a lane.
 The road information includes, for example, the shape, latitude, longitude, curvature, and gradient of the road, the road identifier, the number of lanes, and the line types, as well as information on road attributes such as general road, expressway, and priority road.
 The lane information includes, for example, the identifiers of the lanes that make up a road and information on the latitude, longitude, and center line of each lane.
 The constituent line information includes the identifier of each line that makes up a lane, the latitude and longitude of each such line, and information on its line type and curvature.
 Road information is managed per road, while lane information and constituent line information are managed per lane. The map information is used for navigation, driving support, automatic driving, and the like.
 The road information may also be defined as a dynamic map including time-varying traffic regulation information (lane restrictions, speed restrictions, traffic restrictions, chain requirements, etc.), tollgate information (entrances/exits, tollgates), traffic congestion information (presence of congestion, section, lane), traffic accident information (stopped vehicles, slow vehicles), obstacle information (fallen objects, animals), road anomaly information (road damage, road surface anomalies), surrounding vehicle information, and so on.
 FIG. 7 is a block diagram showing the configuration of the moving body 103 of the driving support system 1000 according to the first embodiment. As shown in FIG. 7, the moving body 103 includes a moving body system 300, which has a processor 200, a communication interface 20, a peripheral recognition interface 21, a vehicle sensor interface 22, and a vehicle control interface 23, together with a peripheral recognition sensor 31, a moving body sensor 32, a vehicle control ECU 33, and a communication device 34. The moving body 103 also includes actuators and other components for automatic driving, but components with little relevance to this embodiment are omitted.
 The processor 200 communicates with the peripheral recognition sensor 31, the moving body sensor 32, the vehicle control ECU 33, and the communication device 34 via the peripheral recognition interface 21, the vehicle sensor interface 22, the vehicle control interface 23, and the communication interface 20, and acquires information from them.
 Like the processor 100 mounted in the server 101 shown in FIG. 2, the processor 200 is an IC that executes instructions described in programs to perform processing such as data transfer, calculation, processing, control, and management. By executing these instructions, it realizes the functions represented as the functional blocks of the surrounding situation recognition unit 41, the automatic driving determination unit 42, the vehicle control unit 43, and the support level switching unit 44, and notifies the driver of driving support information via the display unit 45.
 The surrounding situation recognition unit 41 has the same functions as the surrounding situation recognition unit 1 mounted in the server 101. It acquires the sensing information on obstacles around the moving body 103 detected by the peripheral recognition sensor 31, the moving body information detected by the moving body sensor 32, and the moving body information and sensing information of surrounding moving bodies received by the communication device 34, via the peripheral recognition interface 21, the vehicle sensor interface 22, and the communication interface 20, respectively, and integrates them with map information managed internally in advance to recognize the drivable area.
 Based on the moving body information, sensing information, and map information integrated by the surrounding situation recognition unit 41, the automatic driving determination unit 42 creates an action plan, covering lane keeping, lane changes, acceleration, deceleration, and the like, for driving safely without contacting obstacles or surrounding vehicles. The automatic driving determination unit 42 also generates vehicle control information, namely a travel path and speed that the vehicle control unit 43 uses to control the steering, accelerator, and brakes, and notifies the vehicle control unit 43 of it.
 The vehicle control unit 43 controls the accelerator, brakes, and steering of the moving body 103 via the vehicle control interface 23 in accordance with the vehicle control information notified by the automatic driving determination unit 42.
 The support level switching unit 44 receives support information from the server 101 and determines its type: if the support information is for manual driving, it notifies the display unit 45; if it is for cooperative automatic driving, it notifies the automatic driving determination unit 42; and if it is vehicle control information for traffic control, it notifies the vehicle control unit 43. Alternatively, the support information for cooperative automatic driving may be passed to the surrounding situation recognition unit 41 instead of the automatic driving determination unit 42, so that the surrounding situation recognition unit 41 first integrates all the moving body information and sensing information and then notifies the automatic driving determination unit 42.
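The dispatching performed by the support level switching unit 44 can be sketched as below. The type tags and handler interfaces are assumptions for illustration; the embodiment does not prescribe a concrete message format.

```python
# Hypothetical sketch of the dispatch in the support level switching unit 44;
# the "kind" tags and handler method names are assumptions.
def dispatch_support_info(info, display_unit, judgment_unit, control_unit):
    kind = info["kind"]
    if kind == "manual":            # support information for manual driving
        display_unit.show(info)     # -> display unit 45 (shown to the driver)
    elif kind == "cooperative":     # support info for cooperative automatic driving
        judgment_unit.update(info)  # -> automatic driving determination unit 42
    elif kind == "control":         # vehicle control information (traffic control)
        control_unit.apply(info)    # -> vehicle control unit 43
    else:
        raise ValueError(f"unknown support information kind: {kind}")
    return kind
```

In the alternative configuration described above, the "cooperative" branch would instead route to the surrounding situation recognition unit 41.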
 The display unit 45 displays the manual driving support information notified by the support level switching unit 44 and thereby notifies the driver. The display unit 45 may be a display such as a car navigation system, a head-up display, an AR (augmented reality) system, or a display panel such as a cockpit panel, and audio output may be combined with the display to notify the driver. The display unit 45 may also change the display position according to the position referenced by the support information, and may display the driving support information only when the driver has not recognized it.
 The communication interface 20 is a device that includes a receiver, which receives positioning data from surrounding moving bodies and base stations and detects the angle of arrival of radio waves and the time required for transmission and reception, and a transmitter, which transmits data.
 The communication interface 20 may be implemented, for example, as a communication chip or a NIC. The communication interface 20 can use DSRC, which is dedicated to vehicle communication, and communication protocols such as IEEE 802.11p.
 The communication interface 20 may also use a mobile phone network such as LTE (registered trademark) or the 5th generation mobile communication system (5G).
 The communication interface 20 may also use Bluetooth (registered trademark) or a wireless LAN such as IEEE 802.11a/b/g/n.
 The communication device 34 is a communication device that supports DSRC, LTE, and 5G. Although in FIG. 7 it is configured to pass received information to the moving body system 300, it also has the function of transmitting information from the moving body system 300 to the outside.
 The peripheral recognition interface 21 is an interface for acquiring data from the peripheral recognition sensor 31 mounted in the moving body 103. Specific examples are a sensor data acquisition LSI (Large Scale Integration) and USB (Universal Serial Bus) or CAN (Controller Area Network) ports.
 The peripheral recognition sensor 31 is a sensor capable of positioning, such as a millimeter wave radar, a monocular camera, a stereo camera, LiDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging), sonar, or GPS (Global Positioning System). The peripheral recognition sensor 31 also includes a DMS (Driver Monitoring System), which monitors the driver inside the moving body 103, and a drive recorder.
 The vehicle sensor interface 22 is a device for connecting moving body sensors 32 such as a GPS, a speed sensor, an acceleration sensor, and an orientation sensor to the processor 200. The vehicle sensor interface 22 is also a device for connecting in-vehicle equipment such as an in-vehicle ECU, EPS (Electric Power Steering), a car navigation system, and a cockpit to the processor 200. A specific example of the vehicle sensor interface 22 is a sensor ECU (Electronic Control Unit).
 The vehicle control interface 23 is a device for connecting the vehicle control ECU 33, which controls the accelerator, brakes, and steering, to the processor 200.
 The server 101 or the moving body 103 of the driving support system may be implemented in a form integrated with, or inseparable from, the other illustrated components, or in a removable or separable form.
 In FIGS. 2 and 7, only one processor 100 and one processor 200 are shown. However, there may be a plurality of processors 100 and 200, and the plurality of processors may cooperate in executing the programs that realize each function.
 <Operation>
 Hereinafter, the operation of the driving support system 1000 according to the first embodiment will be described with reference to FIGS. 8 to 13. The operation of the driving support system 1000 of the first embodiment also encompasses the operation of the driving support device and the driving support method.
 First, the overall operation of the server 101 in the driving support system 1000 according to the first embodiment will be described with reference to the flowchart shown in FIG. 8.
 The surrounding situation recognition unit 1 of the server 101 acquires moving body information, sensing information, and travel path information from the plurality of moving bodies 103 via roadside units, for example at a 100-millisecond cycle (step S101).
 In step S101, the type of information acquired from the moving bodies 103, the acquisition cycle, or the acquisition route is changed based on the driving difficulty determined in step S103 or the support level determined in step S104. For example, when the driving difficulty or support level is low, only moving body information is acquired; when it is high, moving body information, sensing information, and travel path information are acquired. Likewise, information may be acquired at a 200-millisecond cycle when the driving difficulty or support level is low, and at a 20-millisecond cycle when it is high.
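As a hypothetical illustration of this adaptive acquisition, the information types and cycles from the example above could be selected as follows (the function and key names are assumptions; only the concrete values come from the text):

```python
# Hypothetical sketch of the step S101 acquisition policy: a high driving
# difficulty or support level widens the information set and shortens the cycle.
def acquisition_policy(support_level_high):
    if support_level_high:
        return {
            "types": ["moving_body_info", "sensing_info", "travel_path_info"],
            "cycle_ms": 20,   # acquire every 20 ms when difficulty is high
        }
    return {"types": ["moving_body_info"], "cycle_ms": 200}  # low difficulty
```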
 Although an example is shown here in which the server 101 acquires information from the moving bodies 103, the server 101 may instead first notify a moving body 103 of the driving difficulty, after which the moving body 103 reports its information to the server 101. Roadside sensor information may also be received from roadside units and processed in combination with the information from the moving bodies 103.
 Further, when the driving difficulty or support level is low, pre-processed sensing information such as relative distances and relative speeds computed on the moving body 103 is acquired; when the driving difficulty or support level is high, raw data such as video and point clouds captured by the peripheral recognition sensor of the moving body 103 is acquired.
 The driving difficulty determination unit 2 of the server 101 calculates the driving situation, the driving scenario, the risk of the surrounding situation, and the degree of congestion based on the received moving body information, sensing information, and travel path information (step S102).
 The driving situation is defined, with respect to a given vehicle (moving body), by situation information concerning surrounding vehicles, such as the number and frequency of lane changes by vehicles around it, the relative distances and relative speeds to the vehicles ahead of, behind, and beside it, the amounts of change in acceleration and deceleration of surrounding vehicles, and the degree of overlap between travel paths, together with road situation information such as weather, accident sections, and congested sections. Here, the given vehicle is each of the moving bodies 103 managed by the server 101, and the calculation is performed for every vehicle.
 The driving scenario indicates vehicle behavior such as lane changes, right turns, left turns, merging, diverging, temporary stops, stops at signals, and slow driving, and is determined from the route information in the moving body information, the travel path information, and the map information.
 The risk of the surrounding situation is judged using conflict indices such as TTC (Time-to-Collision), the time required for two vehicles traveling in the same direction at their current speeds to come into contact; PET (Post Encroachment Time), the difference between the time at which one vehicle enters a conflict point and the time at which another vehicle reaches that point; and DRAC (Deceleration Rate to Avoid a Crash), the minimum deceleration rate that a following vehicle must apply to avoid contact with the preceding vehicle.
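Assuming a simplified one-dimensional car-following geometry, the three conflict indices named above can be sketched as follows (units, function names, and the exact formulations are assumptions; the embodiment does not fix them):

```python
# Hypothetical sketches of the TTC, PET, and DRAC conflict indices.
# Distances in metres, speeds in m/s, times in seconds.
def ttc(gap_m, follower_speed, leader_speed):
    """Time-to-Collision: time until contact at current speeds (None if the gap is opening)."""
    closing = follower_speed - leader_speed
    return gap_m / closing if closing > 0 else None

def pet(leaving_time, arriving_time):
    """Post Encroachment Time: interval between the first vehicle leaving the
    conflict point and the second vehicle reaching it."""
    return arriving_time - leaving_time

def drac(gap_m, follower_speed, leader_speed):
    """Deceleration Rate to Avoid a Crash (m/s^2) required of the following vehicle."""
    closing = follower_speed - leader_speed
    return (closing ** 2) / (2.0 * gap_m) if closing > 0 else 0.0
```

For example, a 30 m gap closing at 10 m/s gives a TTC of 3 s, and a 25 m gap closing at 10 m/s gives a DRAC of 2 m/s².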
 The degree of congestion is judged, for a given point, based on the number of vehicles passing it within a fixed time and on the headway time defined by the intervals between passing vehicles. Here, a given point is any point at which a vehicle is present, that is, any arbitrary location.
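A minimal sketch of this congestion measure, assuming timestamped passage observations at the point (the function name and data layout are assumptions), might look like:

```python
# Hypothetical sketch of the congestion measure: the number of vehicles passing
# a point within a window, plus the mean headway time between passages.
def congestion_metrics(pass_times, window_start, window_end):
    """pass_times: sorted timestamps (s) at which vehicles passed the point."""
    in_window = [t for t in pass_times if window_start <= t <= window_end]
    count = len(in_window)
    headways = [b - a for a, b in zip(in_window, in_window[1:])]
    mean_headway = sum(headways) / len(headways) if headways else None
    return count, mean_headway
```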
 The driving difficulty determination unit 2 of the server 101 determines the driving difficulty based on the driving situation, the driving scenario, the risk of the surrounding situation, and the degree of congestion (step S103).
 For example, a total cost is calculated from the driving situation, the driving scenario, the risk of the surrounding situation, and the degree of congestion, and the driving difficulty is determined from the total cost. This allows the driving difficulty to be determined numerically.
 FIG. 9 shows an example of the costs assigned to the driving situation, the driving scenario, the risk of the surrounding situation, and the degree of congestion. In FIG. 9, in addition to the items exemplified earlier, the driving situation also includes information on the configuration and performance of the various sensors, and driver information such as driver state, driver risk awareness, and driver gaze. The sensor configuration and performance information expresses, for example, sensor performance as high, medium, or low; the driver state expresses whether the driver is awake or asleep; driver risk awareness expresses whether the driver recognizes a danger (yes) or does not (no); and driver gaze expresses whether the direction of the driver's gaze is normal or the driver is looking aside.
 図9ではコストを0、1、2の3段階で設定しており、ドライバが覚醒状態であるような場合には、走行難易度が下がるようにコストは最低(0)とし、ドライバが睡眠状態であるような場合には、走行難易度が上がるようにコストは最高(2)としている。なお、コストの階数は3段階よりも少なくてもよいし、多くてもよい。走行難易度は全項目のコストを合計して判定してもよいし、少なくとも1つ以上の項目のコストで判断してもよい。 In FIG. 9, the cost is set in three stages of 0, 1, and 2. When the driver is awake, the cost is set to the lowest (0) so that the driving difficulty is lowered, and the driver is in a sleeping state. In such a case, the cost is set to the highest (2) so that the driving difficulty level increases. The number of floors of the cost may be less than or greater than the three levels. The driving difficulty may be determined by summing up the costs of all items, or may be determined by the cost of at least one item.
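The cost-summation step can be sketched as follows. The item names and the 0/1/2 cost assignments below are illustrative assumptions standing in for the actual values of FIG. 9, and the cost-to-difficulty thresholds are likewise assumed:

```python
# Illustrative per-item cost table (stand-in for FIG. 9; values assumed).
COSTS = {
    "driver_state":   {"awake": 0, "sleeping": 2},
    "driver_gaze":    {"normal": 0, "aside": 2},
    "sensor_quality": {"high": 0, "medium": 1, "low": 2},
    "congestion":     {"low": 0, "medium": 1, "high": 2},
}

def total_cost(situation: dict) -> int:
    """Sum the per-item costs for the observed situation (step S103)."""
    return sum(COSTS[item][value] for item, value in situation.items())

def driving_difficulty(cost: int) -> int:
    """Map the total cost to a difficulty level (thresholds are assumptions)."""
    if cost <= 2:
        return 1
    if cost <= 5:
        return 2
    return 3
```

An alert driver with good sensors on an empty road sums to cost 0 (difficulty 1), while a sleeping driver with poor sensors in heavy congestion sums to a high cost (difficulty 3).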
 In addition to the information shown in FIG. 9, the driving difficulty determination unit 2 may determine the driving difficulty according to the constraints of the server 101 and of the in-vehicle system in the driving support system or automatic driving system.
 Here, the constraints of the server 101 are, for example, the processing performance of the server 101 and the number of vehicles being processed; if the processing performance is low or the number of vehicles being processed is large, the driving difficulty is raised. The constraints of the in-vehicle system are, for example, the detection accuracy and the recognizable distance of the peripheral recognition sensor 31 mounted on the moving body 103; if the sensing distance is short or the accuracy is low, the driving difficulty is raised.
 The conditions and costs shown in FIG. 9 may also be updated by learning the determination conditions through machine learning or deep learning.
 Further, when the driving difficulty of a given moving body 103 is low, the next determination timing is delayed or the determination cycle is lengthened; when the driving difficulty is high, the next determination timing is advanced or the determination cycle is shortened. Likewise, the range of information used for the determination is narrowed when the driving difficulty is low and widened when it is high.
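The adaptation of determination cycle and information range to the difficulty level can be sketched as below; the scaling factors and the base values are assumptions, since the description states only the direction of the adjustment:

```python
def next_determination(difficulty: int, base_period: float = 1.0,
                       base_range: float = 100.0) -> tuple:
    """Return (determination period in s, information range in m).

    Low difficulty -> longer period and narrower range; high difficulty ->
    shorter period and wider range. All numeric factors are assumptions.
    """
    if difficulty <= 1:    # easy: judge less often, use less information
        return base_period * 2.0, base_range * 0.5
    if difficulty == 2:    # nominal case
        return base_period, base_range
    return base_period * 0.5, base_range * 2.0   # hard: judge more often
```

This is one simple way to realize the load reduction described above: easy situations are re-evaluated half as often over half the range.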
 Furthermore, the driving difficulty may be determined by weighting results determined in the past, or based on the results within a fixed period of time.
 Although the driving difficulty is determined here for the moving body 103, it may instead be assigned to the position at which the moving body 103 is present.
 The support level determination unit 3 of the server 101 determines the support level according to the driving difficulty notified from the driving difficulty determination unit 2 and the support level management table (step S104). For example, in the support level management table shown in FIG. 4, a cost of 12 corresponds to a driving difficulty of 3, and the support level indicates cooperative automatic driving (other-vehicle sensor information). Although the determination here is made based on the support level management table, a cost function or a probabilistic model may be used instead of the table.
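A table lookup of this kind can be sketched as follows. Only one row is stated in the description (cost 12, difficulty 3, cooperative automatic driving with other-vehicle sensor information); the remaining rows and the cost-to-difficulty ranges are assumptions for illustration:

```python
# Assumed support level management table (stand-in for FIG. 4).
SUPPORT_LEVEL_TABLE = {
    1: "autonomous automatic driving",
    2: "cooperative automatic driving (other-vehicle information)",
    3: "cooperative automatic driving (other-vehicle sensor information)",
    4: "traffic control (vehicle control information)",
}

def difficulty_from_cost(cost: int) -> int:
    """Map a total cost to a difficulty level (ranges are assumptions)."""
    for level, upper in ((1, 4), (2, 8), (3, 12)):
        if cost <= upper:
            return level
    return 4

def support_level(cost: int) -> str:
    """Look up the support level for the given total cost (step S104)."""
    return SUPPORT_LEVEL_TABLE[difficulty_from_cost(cost)]
```

Under these assumed ranges, a cost of 12 yields difficulty 3 and the cooperative automatic driving level, matching the example given for FIG. 4.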
 Similarly to the driving difficulty, the support level may be determined by weighting past results or based on the results within a fixed period of time.
 Although the support level is determined for the moving body 103, it may instead be assigned to the position at which the moving body 103 is present.
 The support level determination unit 3 of the server 101 requests the recommended action generation unit 4 to generate support information based on the determined support level (step S105). Examples of support levels, support targets, and support information are as described with reference to FIG. 3.
 Although the example of FIG. 3 shows other-vehicle information, other-vehicle sensor information, and other-vehicle path information being distributed separately, the other-vehicle information may be integrated when distributing the other-vehicle sensor information, and the other-vehicle information and other-vehicle sensor information may be integrated when distributing the other-vehicle path information.
 The recommended action generation unit 4 may also be requested to generate support information based on the driving difficulty instead of the support level. The generation cycle of the support information may be changed based on the support level or the driving difficulty.
 Here, an example of determining the driving difficulty and the support level will be described with reference to FIGS. 10 and 11.
 For example, as shown in FIG. 10, when the road is not congested and the relative distance between the vehicle VA to be supported and the surrounding vehicles is large, the cost is small; the driving difficulty is therefore low, and autonomous automatic driving is recommended for the vehicle VA. On the other hand, as shown in FIG. 11, when many surrounding vehicles are in close proximity and the vehicle VA and the vehicle VB traveling side by side both attempt to change lanes, the cost becomes large; the driving difficulty is therefore high, and traffic control using vehicle control information is recommended. In this traffic control, the server 101 controls the travel not only of the vehicle VA to be supported but also of the vehicle VB. In the example of FIG. 11, the travel is controlled so that the vehicle VA changes lanes first and the vehicle VB changes lanes afterwards.
 The recommended action generation unit 4 of the server 101 distributes the generated support information to the moving body 103 to be supported. The generated support information may be distributed not only to the moving body 103 to be supported but also to moving bodies 103 in its vicinity.
 When the driving difficulty or the support level is associated with the position at which the moving body 103 is present, the information may be distributed to that point.
 The processing of steps S101 to S106 described above may be performed sequentially each time data is received, or at fixed time intervals.
 Alternatively, the driving difficulty of steps S102 and S103 may be determined on the moving body, the driving difficulty may be notified from the moving body 103 to the server 101, and step S104 and subsequent steps may be performed on the server 101.
 Next, the transmission and reception operations of the moving body 103 in the driving support system 1000 according to the first embodiment will be described with reference to FIGS. 12 and 13. Specifically, FIG. 12 shows the processing sequence when the moving body 103 transmits information to the server 101, and FIG. 13 shows the processing sequence when the moving body 103 receives information from the server 101.
 First, the processing sequence when the moving body 103 transmits information to the server 101 will be described with reference to the flowchart shown in FIG. 12. As shown in FIG. 12, the surrounding situation recognition unit 41 of the moving body 103 acquires moving body information from the moving body sensor 32 and sensing (sensor) information from the peripheral recognition sensor 31 (step S201).
 The moving body information is information such as the position, speed, heading, acceleration, and travel route of the moving body 103. The sensing information is information such as the relative position, relative speed, and relative angle of other moving bodies and obstacles present around the moving body 103, taking the moving body 103 as the origin, together with the type of each detected object. The sensing information may also include raw data such as unprocessed video and point clouds.
 The surrounding situation recognition unit 41 of the moving body 103 transmits the acquired moving body information and sensing information to the server 101 (step S202). Since step S202 corresponds to the processing of step S101 in the surrounding situation recognition unit 1 of the server 101, the information may be transmitted after an information acquisition request is received from the server 101.
 Although the information is transmitted to the server 101 in the above description, the moving body information and sensing information may also be transmitted to surrounding moving bodies. This configuration corresponds to the case where a surrounding moving body has the function of performing the processing of the server 101.
 The automatic driving determination unit 42 of the moving body 103 creates an action plan for automatic driving, such as lane keeping or lane changing, based on the moving body information and sensing information acquired by the surrounding situation recognition unit 41, and generates travel path information for realizing the action plan (step S203).
 Here, map information may be used in addition to the moving body information and sensing information that the vehicle can acquire on its own. This example also shows a determination made without using the support information notified from the server 101.
 In creating an action plan for automatic driving, behavioral decisions are made while observing traffic rules, judging travel along the road shape, and estimating the states and risks of surrounding vehicles; the action plan can be created by a rule-based method, an optimization method, a probabilistic method, a learning-based method, or a method integrating several of these.
 The rule-based method makes decisions based on predefined conditions and rules, as in a Finite State Machine or a Decision Tree; it handles predefined situations easily and provides robust processing for simple scenarios.
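A minimal finite-state-machine sketch of such a rule-based behavior decision follows; the states and transition conditions are illustrative assumptions, not ones given in the description:

```python
# Assumed behavior states and transition rules for illustration only.
TRANSITIONS = {
    ("lane_keep", "slow_vehicle_ahead"): "lane_change",
    ("lane_keep", "obstacle_ahead"):     "brake",
    ("lane_change", "change_complete"):  "lane_keep",
    ("brake", "path_clear"):             "lane_keep",
}

def step(state: str, event: str) -> str:
    """Return the next behavior state; undefined events keep the state."""
    return TRANSITIONS.get((state, event), state)
```

Because every transition is an explicit rule, behavior in the defined situations is easy to verify, which is the robustness property noted above.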
 The optimization-based method defines a function over the data and finds its minimum or maximum value, as in a cost-based function; it is easy to implement, maintain, and test.
 The probabilistic method takes each condition as input and judges the result probabilistically, as in a partially observable Markov decision process (POMDP) or Bayesian estimation; scenarios can be formalized reliably.
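As a minimal sketch of the Bayesian-estimation flavor of this method, the probability of a risk hypothesis can be updated from an observed condition; the prior and the likelihoods below are illustrative assumptions:

```python
def bayes_update(prior: float, p_obs_given_risk: float,
                 p_obs_given_safe: float) -> float:
    """Posterior P(risk | observation) after one observation, by Bayes' rule.

    prior: P(risk); the other arguments are P(observation | risk) and
    P(observation | safe). All numeric values used here are assumptions.
    """
    num = p_obs_given_risk * prior
    den = num + p_obs_given_safe * (1.0 - prior)
    return num / den

# An observation 4x more likely under the risky hypothesis raises a 10%
# prior risk estimate to roughly 31%.
posterior = bayes_update(0.1, 0.8, 0.2)
```

Chaining such updates over successive conditions is the essence of judging the result probabilistically from each input condition.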
 The learning-based method learns from past data to correctly predict new inputs, as in reinforcement learning or machine learning, and can handle unknown situations that cannot be defined in advance.
 An example of a method integrating multiple methods is a combination of the rule-based and learning-based methods: the parts that do not require learning are handled by rules, and operation is guaranteed by rule-based constraints. Testing is easy, and the scope of responsibility can be divided.
 Generation of the travel path information for automatic driving can be realized using model predictive control, a route planning algorithm such as a rapidly exploring random tree, or the like. Model predictive control is a control method that optimizes while predicting the future response at each time step, and the route planning algorithm is a method that searches routes randomly, repeating the search until the goal is reached.
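A toy sketch of the rapidly exploring random tree idea follows (model predictive control is omitted for brevity); the step size, bounds, and goal tolerance are assumptions, and obstacle checking is left out:

```python
import math
import random

def rrt(start, goal, step=1.0, goal_tol=1.0, bounds=(0.0, 10.0),
        max_iters=5000, seed=0):
    """Toy 2D rapidly exploring random tree: grow the tree toward random
    samples until a node lands within goal_tol of the goal, then return the
    path from start to that node (or None if the iteration budget runs out).
    """
    rng = random.Random(seed)
    nodes = [start]
    parent = {0: None}
    for _ in range(max_iters):
        sample = (rng.uniform(*bounds), rng.uniform(*bounds))
        # Extend from the tree node nearest to the random sample.
        i = min(range(len(nodes)), key=lambda k: math.dist(nodes[k], sample))
        nx, ny = nodes[i]
        d = math.dist((nx, ny), sample)
        t = min(1.0, step / d) if d > 0 else 0.0
        new = (nx + t * (sample[0] - nx), ny + t * (sample[1] - ny))
        nodes.append(new)
        parent[len(nodes) - 1] = i
        if math.dist(new, goal) <= goal_tol:
            path, k = [], len(nodes) - 1
            while k is not None:          # walk parents back to the start
                path.append(nodes[k])
                k = parent[k]
            return path[::-1]
    return None
```

The random sampling and repetition until the goal is reached is exactly the search behavior the description attributes to the route planning algorithm.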
 The automatic driving determination unit 42 of the moving body 103 transmits the generated travel path information to the server 101 (step S204). The travel path information may also be transmitted to surrounding moving bodies. This configuration corresponds to the case where a surrounding moving body has the function of performing the processing of the server 101.
 The vehicle control unit 43 of the moving body 103 controls the travel of the moving body 103 (vehicle) according to the travel path information generated by the automatic driving determination unit 42 (step S205). Although an example in which the automatic driving determination unit 42 controls the vehicle is shown here, manual driving may be maintained when the driver is driving manually.
 Since the sequence shown in FIG. 12 is a sequence for transmitting data to the server 101, processing when data is received from the server 101 is not considered here.
 Next, the processing sequence when the moving body 103 receives information from the server 101 will be described with reference to the flowchart shown in FIG. 13.
 The support level switching unit 44 of the moving body 103 waits until support information is received from the server 101 (step S301), and proceeds to step S302 upon reception.
 In step S302, the support level switching unit 44 determines the support target of the received support information. If the support target is the vehicle control unit 43, the process proceeds to step S303; if it is the automatic driving determination unit 42, to step S305; and if it is the display unit 45, to step S308.
 In step S303, the support level switching unit 44 notifies the vehicle control unit 43 of the vehicle control information received as the support information.
 The vehicle control unit 43 performs vehicle control via the vehicle control interface 23 according to the notified vehicle control information (step S304). Thereafter, the processing from step S301 onward is repeated.
 Although the vehicle control unit 43 uses the notified vehicle control information here, it may compare it with the vehicle control information generated by the automatic driving determination unit 42, judge safety and comfort, and decide whether or not to use the notified vehicle control information; if the vehicle control information generated by the automatic driving determination unit 42 is superior, autonomous automatic travel is performed based on that information.
 In step S305, the support level switching unit 44 notifies the automatic driving determination unit 42 of the automatic travel support information received as the support information. Although the automatic travel support information is notified to the automatic driving determination unit 42 here, it may instead be notified to the surrounding situation recognition unit 41.
 The automatic driving determination unit 42 generates a travel path using the notified automatic travel support information in addition to the moving body information and sensing information, and notifies the vehicle control unit 43 (step S306).
 When other-vehicle information or other-vehicle sensing information is received as the support information, it is treated as information on moving bodies present in the surroundings, in the same way as surrounding vehicles or obstacles detected by the peripheral recognition sensor 31; information that the peripheral recognition sensor cannot detect can thereby also be utilized.
 When other-vehicle path information is received as the support information, the travel paths of moving bodies 103 present in the surroundings can be recognized, so safety can be enhanced by decisions that take the behavior of surrounding moving bodies into account.
 The vehicle control unit 43 performs vehicle control via the vehicle control interface 23 according to the travel path information notified from the automatic driving determination unit 42 (step S307). Thereafter, the processing from step S301 onward is repeated.
 In step S308, the support level switching unit 44 notifies the display unit 45 of the alarm information and recommended action information received as the support information.
 The display unit 45 displays the support information and notifies the driver (step S309). Thereafter, the processing from step S301 onward is repeated.
<Effect>
 In the driving support system 1000 of the first embodiment described above, the amount of information used for automatic driving changes based on the driving difficulty. In situations where driving is easy, the information to be processed can be reduced, lowering the processing load while ensuring safety and comfort; in situations where driving is difficult, the information to be processed can be increased, realizing efficient processing while ensuring safety and comfort.
 Also, since the amount of information to be processed changes according to the driving difficulty, unnecessarily complicated processing can be avoided, reducing the processing load and speeding up processing.
 In addition, rather than each moving body 103 individually deciding which information types to use, the server 101 decides the information types to use based on the driving difficulty, so efficient support can be realized that takes into account distant situations that a moving body cannot grasp on its own.
 Furthermore, the amount of communication and the processing load can be reduced by changing the type of information acquired from the moving body 103, the frequency of acquiring it, or the communication route used for acquisition, according to the driving difficulty or the support level.
<Modification>
 In the driving support system 1000 of the first embodiment described above, the server 101 determines the driving difficulty and decides the information types to provide to the moving body 103; alternatively, the moving body 103 may determine the driving difficulty and request the server to provide the information.
<Embodiment 2>
<Configuration>
 Next, the second embodiment will be described with reference to FIGS. 14 and 15. In the driving support system 2000 (FIG. 1) according to the second embodiment, the same reference numerals are given to the same parts as in the driving support system 1000 of the first embodiment, and duplicated detailed descriptions are omitted.
 FIG. 14 is a block diagram showing the configuration of the server 101 of the driving support system 2000 according to the second embodiment. As shown in FIG. 14, the server 101 includes a processor 100A, a communication interface 10, a communication device 30, and a storage device 40.
 FIG. 15 is a block diagram showing the configuration of the moving body 103 of the driving support system 2000 according to the second embodiment. As shown in FIG. 15, the moving body 103 includes a moving body system 300 having a processor 200A, a communication interface 20, a peripheral recognition interface 21, a vehicle sensor interface 22, and a vehicle control interface 23, together with a peripheral recognition sensor 31, a moving body sensor 32, a vehicle control ECU 33, and a communication device 34.
 The processor 200A communicates with the peripheral recognition sensor 31, the moving body sensor 32, the vehicle control ECU 33, and the communication device 34 via the communication interface 20, the peripheral recognition interface 21, the vehicle sensor interface 22, and the vehicle control interface 23, and acquires information.
 The processor 200A has a server communication unit 46 and a vehicle control unit 43 as functional blocks. The server communication unit 46 acquires the sensing information of obstacles around the moving body 103 detected by the peripheral recognition sensor 31 and the moving body information detected by the moving body sensor 32 via the peripheral recognition interface 21 and the vehicle sensor interface 22, respectively, notifies the vehicle control unit 43 of the acquired information, and transmits it to the server 101 via the communication interface 20.
 In the moving body 103 of the second embodiment, the server communication unit 46 periodically transmits to the server 101 the sensing information of obstacles around the moving body 103 detected by the peripheral recognition sensor 31 and the moving body information detected by the moving body sensor 32.
 In the driving support system 2000 of the second embodiment, the processor 100A of the server 101 makes the automatic driving decisions and transmits vehicle control information to the moving body 103.
 That is, as shown in FIG. 14, the processor 100A of the server 101 has, in addition to the configuration of the processor 100 shown in FIG. 2, an automatic driving determination unit 7 that receives the output of the recommended action generation unit 4, and the recommended action generation unit 4 notifies the automatic driving determination unit 7 of the support information generated according to the support level.
 The automatic driving determination unit 7 generates vehicle control information, such as a travel path, using the automatic travel support information notified from the recommended action generation unit 4 in addition to the sensing information of obstacles around the moving body 103 and the moving body information transmitted from the moving body 103. The generated vehicle control information is then transmitted to the moving body 103 via the communication interface 10.
 The server communication unit 46 of the moving body 103 notifies the vehicle control unit 43 of the vehicle control information received from the server 101, and the vehicle control unit 43 controls the accelerator, brake, and steering of the moving body 103 via the vehicle control interface 23 according to the notified vehicle control information to perform automatic travel.
<Effect>
 In the driving support system 2000 of the second embodiment described above, the server 101 generates the vehicle control information for automatic driving based on the driving difficulty. In situations where driving is easy, the information to be processed can be reduced, lowering the processing load while ensuring safety and comfort; in situations where driving is difficult, the information to be processed can be increased, realizing efficient processing while ensuring safety and comfort.
 Further, since the server 101 generates the vehicle control information for automatic driving and remotely controls the moving body 103, the system configuration mounted on the moving body 103 can be simplified.
<Embodiment 3>
<Configuration>
 Next, the third embodiment will be described with reference to FIGS. 16 to 18. In the driving support system 3000 according to the third embodiment, the same reference numerals are given to the same parts as in the driving support system 1000 of the first embodiment, and duplicated detailed descriptions are omitted.
 FIG. 16 is a block diagram showing the configuration of the server 101 of the driving support system 3000 according to the third embodiment. As shown in FIG. 16, the server 101 includes a processor 100B, a communication interface 10, a communication device 30, and a storage device 40. The configuration shown in FIG. 7 may be adopted for the moving body 103.
 The processor 100B of the server 101 has, in addition to the configuration of the processor 100 shown in FIG. 2, a group determination unit 13 connected to the support level determination unit 3; based on the moving body information and sensing information, the group determination unit 13 determines that a plurality of moving bodies 103 moving in the same direction form a group. The group is determined based on the inter-vehicle time, relative distance, and relative speed between the moving bodies.
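A sketch of such a group determination follows; the headway, relative-speed, and heading thresholds, and the simple along-the-lane data model, are assumptions not given in the description:

```python
def form_groups(vehicles, max_headway=2.0, max_rel_speed=2.0):
    """Group moving bodies travelling in the same direction, as in group
    determination unit 13. `vehicles` is a list of dicts sorted by position
    along the lane, each with 'pos' (m), 'speed' (m/s), and 'heading' (deg).
    All thresholds are illustrative assumptions.
    """
    groups = []
    for v in vehicles:
        if groups:
            prev = groups[-1][-1]
            gap = v["pos"] - prev["pos"]
            # Inter-vehicle (headway) time to the vehicle ahead.
            headway = gap / v["speed"] if v["speed"] > 0 else float("inf")
            same_dir = abs(v["heading"] - prev["heading"]) < 10.0
            close = (headway <= max_headway and
                     abs(v["speed"] - prev["speed"]) <= max_rel_speed)
            if same_dir and close:
                groups[-1].append(v)
                continue
        groups.append([v])
    return groups
```

Vehicles with a short headway, small relative speed, and matching heading fall into one group, to which the support level determination unit 3 can then apply a single support level.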
 実施の形態3の走行支援システム3000においては、プロセッサ100Bの群判定部13が、同一方向に移動する複数の移動体を1つの群として判定し、判定した移動体群の情報を支援レベル判定部3に通知する。支援レベル判定部3では、移動体群の情報に基づいて、同じ群に対しては、同じ支援レベルを適用する。 In the traveling support system 3000 of the third embodiment, the group determination unit 13 of the processor 100B determines a plurality of moving objects moving in the same direction as one group, and the information of the determined moving body group is used as a support level determination unit. Notify 3. The support level determination unit 3 applies the same support level to the same group based on the information of the mobile group.
FIG. 17 shows an example in which a plurality of moving bodies are determined as a group and the same support level is applied. FIG. 17 shows a state in which a plurality of vehicles, including the vehicle VA to be supported, are about to turn right at an intersection; the group determination unit 13 determines these vehicles to be vehicle group G1 and applies the same support level to the vehicles in the group. In the lane on the opposite side of the intersection there are also a plurality of vehicles about to turn right; these are determined to be vehicle group G2, and the same support level is applied to the vehicles in that group.
Further, when the light color and cycle of the traffic signal at the intersection can be acquired by the roadside sensor 104 (FIG. 1) or the moving body sensor 32 mounted on the moving body 103, control is performed as follows: when the light changes from red to green, or in the case of an arrow signal, the vehicle group is made to start simultaneously under traffic control, and when the light changes from green to red, the vehicle group is decelerated simultaneously under traffic control.
<Operation>
Next, the overall operation of the server 101 in the driving support system 3000 according to the third embodiment will be described with reference to the flowchart shown in FIG. 18. Steps S401 to S404 and step S407 shown in FIG. 18 are the same as steps S101 to S104 and step S106 in the flowchart shown in FIG. 8, so their description is omitted.
After the support level determination unit 3 determines the support level in step S404, if there are a plurality of moving bodies 103 moving in the same direction, the group determination unit 13 determines them as one group and notifies the support level determination unit 3 of the information on the moving body group (step S405).
Based on the moving body group information notified by the group determination unit 13, the support level determination unit 3 extracts the support level of the moving body 103 with the highest driving difficulty in the group, and requests the recommended action generation unit 4 to generate support information for the moving bodies 103 of the same group based on that same driving difficulty and support level (step S406).
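The rule in step S406, under which every vehicle in a group inherits the support level derived from the highest driving difficulty found in that group, might look like this in outline. The difficulty scale and the difficulty-to-level mapping are purely illustrative stand-ins; the actual correspondence is defined by the embodiment:

```python
# Hypothetical difficulty scale: 1 (easy) .. 5 (hard), and a sample
# mapping from difficulty to support level (level names follow claim 5;
# where manual driving fits depends on the embodiment's policy).
SUPPORT_LEVELS = {
    1: "autonomous automatic driving",
    2: "cooperative automatic driving",
    3: "cooperative automatic driving",
    4: "traffic control",
    5: "traffic control",
}

def group_support_level(difficulties_by_vehicle):
    """Apply the group's worst-case (highest) difficulty to all members."""
    worst = max(difficulties_by_vehicle.values())
    level = SUPPORT_LEVELS[worst]
    return {vid: level for vid in difficulties_by_vehicle}
```

A group containing one vehicle at difficulty 4 would thus place every member, including those at lower difficulty, under the same support level.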
By aligning the support levels within the same group, when recommended behavior information and vehicle control information are notified, the traveling speed, lane change timing, and the like of each moving body in the group can be matched, making the traffic flow smooth.
Note that the peripheral situation recognition unit 1 of the server 101 acquires the moving body information, the sensing information, and the travel path information from the plurality of moving bodies 103 via the roadside unit (step S401). By matching this information against the information on the determined moving body group, the group determination unit 13 can identify each moving body in the group and distribute the information to each moving body via the communication interface 10.
Further, since each moving body is notified of its driving difficulty or support level by the server 101, it can know its own driving difficulty or support level. When the driving difficulty or support level of the plurality of moving bodies 103 in one group is low, one moving body 103 in the group may collect the information of the moving bodies 103 and transmit it to the server 101; when the driving difficulty or support level of the plurality of moving bodies 103 in one group is high, each moving body in the group may transmit its information to the server individually.
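This selective uplink strategy, in which a single representative reports for a low-difficulty group while every member reports in a high-difficulty group, reduces to a simple branch. The function below is a minimal sketch; the representative-selection rule (first member) is an assumption, not specified in the text:

```python
def plan_uplink(group, difficulty_is_high):
    """Decide which vehicles in the group report to the server.

    Low difficulty/support level: one representative aggregates the
    group's information and sends it; high: every member sends its own.
    """
    if difficulty_is_high:
        return list(group)   # every vehicle reports individually
    return [group[0]]        # a single representative reports
```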
As shown in FIG. 5, the state transition diagram representing the switching of the support level in the third embodiment may transition states according to the driving difficulty, or may transition from autonomous automatic driving to cooperative automatic driving, and from cooperative automatic driving to traffic control, according to the formation of and departure from a group.
<Effect>
The driving support system 3000 of the third embodiment described above treats a plurality of moving bodies 103 moving in the same direction as a group, so that the moving bodies 103 in the same group can travel smoothly.
<Embodiment 4>
<Structure>
Next, the fourth embodiment will be described with reference to FIG. 19. In the driving support system 4000 according to the fourth embodiment, parts identical to those of the driving support system 1000 of the first embodiment are given the same reference numerals, and duplicate detailed description is omitted.
FIG. 19 is a block diagram showing the configuration of the server 101 of the driving support system 4000 according to the fourth embodiment. As shown in FIG. 19, the server 101 includes a processor 100C, a communication interface 10, a communication device 30, and a storage device 40A. The configuration shown in FIG. 7 may be adopted as the configuration of the moving body 103.
In addition to the configuration of the processor 100 shown in FIG. 2, the processor 100C of the server 101 has a driving difficulty learning unit 14 connected to the driving difficulty determination unit 2. Based on the moving body information, the sensing information, the travel path information, and the driving difficulty determined by the driving difficulty determination unit 2, the driving difficulty learning unit 14 generates a driving difficulty determination model 15 and stores it in the storage device 40A.
The driving difficulty determination unit 2 reads the driving difficulty determination model 15 from the storage device 40A and determines the driving difficulty using the model.
By learning the driving difficulty on the server 101 to generate the driving difficulty determination model 15, and updating the model 15 stored in the storage device 40A, the determination of the driving difficulty can be improved and made more sophisticated.
The driving difficulty determination model 15 is a model that performs determination and evaluation on input data and sets a difficulty level. Finite state machines, decision trees, and logistic regression analysis are examples of such models.
In generating the driving difficulty determination model 15, samples of input data are prepared, and data analysis, model creation, and model evaluation are carried out repeatedly. If the result of the model evaluation is poor, the cause is analyzed and the model is corrected. For example, a model is generated that calculates a cost based on the conditions shown in FIG. 9 and determines the driving difficulty based on the cost ranges shown in FIG. 4; if the generated model is not valid, the conditions in FIG. 9 and the cost ranges in FIG. 4 are revised.
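The cost-based determination mentioned here, in which per-condition costs are summed and the total is mapped to a difficulty level by cost ranges in the spirit of FIGS. 9 and 4, could be prototyped as below. Since those figures are not reproduced in this text, every concrete cost and range boundary is invented for illustration:

```python
# Hypothetical per-condition costs (stand-ins for the conditions of FIG. 9).
CONDITION_COSTS = {
    "lane_change": 2,
    "right_turn": 3,
    "merging": 3,
    "high_congestion": 2,
    "contact_risk": 4,
}

def difficulty_from_cost(total_cost):
    """Map a total cost to a difficulty level (stand-in for FIG. 4 ranges)."""
    if total_cost <= 2:
        return 1   # low difficulty
    if total_cost <= 5:
        return 2   # medium difficulty
    return 3       # high difficulty

def judge_difficulty(active_conditions):
    """Sum the costs of the active conditions and map the total to a level."""
    total = sum(CONDITION_COSTS[c] for c in active_conditions)
    return difficulty_from_cost(total)
```

Model correction in the sense of the paragraph above then amounts to revising the entries of `CONDITION_COSTS` and the range boundaries in `difficulty_from_cost` when evaluation against sample data shows poor results.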
Note that the driving difficulty learning unit 14 may also generate the driving difficulty determination model 15 based on the moving body information, the sensing information, the travel path information, the driving situation, the driving scenario, the risk of the surrounding situation, and the traffic situation.
<Effect>
The driving support system 4000 of the fourth embodiment described above learns the driving difficulty on the server 101, generates the driving difficulty determination model 15, and updates it so that it can be fed back to the next determination, making it possible to improve the accuracy of the difficulty determination and speed up the processing.
<Embodiment 5>
<Structure>
Next, the fifth embodiment will be described with reference to FIG. 20. In the driving support system 5000 according to the fifth embodiment, parts identical to those of the driving support system 1000 of the first embodiment are given the same reference numerals, and duplicate detailed description is omitted.
FIG. 20 is a block diagram showing the configuration of the moving body 103 of the driving support system 5000 according to the fifth embodiment. As shown in FIG. 20, the moving body 103 includes a storage device 40, and its processor 200A has the driving difficulty determination unit 2, the support level determination unit 3, and the recommended action generation unit 4.
The driving difficulty determination unit 2, the support level determination unit 3, and the recommended action generation unit 4 have the same functions as those of the server 101 of the first embodiment shown in FIG. 2. That is, the driving difficulty determination unit 2 aggregates the information of the moving body 103 based on the moving body information, the sensing information, and the travel path information acquired by the peripheral situation recognition unit 1, and integrates it with the map information obtained from the map data 6 stored in the storage device 40. The support level determination unit 3 determines the support level based on the driving difficulty calculated by the driving difficulty determination unit 2.
The recommended action generation unit 4 generates support information according to the support level, based on the support level determined by the support level determination unit 3 and on the moving body information, the sensing information, and the travel path information acquired by the peripheral situation recognition unit 1. If the support information is support information for manual driving, it notifies the display unit 45; if the support information is support information for cooperative automatic driving, it notifies the automatic driving determination unit 42; and if the support information is vehicle control information for traffic control, it notifies the vehicle control unit 43.
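The routing of generated support information in this on-board configuration, to the display for manual driving, to the automatic driving determination unit for cooperative automatic driving, and to the vehicle control unit for traffic control, can be pictured as a simple dispatch. The unit names mirror the text; the callable interfaces and kind labels are assumptions:

```python
def dispatch_support_info(kind, payload,
                          display_unit, auto_drive_unit, vehicle_control_unit):
    """Route support information to the consumer that matches its kind."""
    if kind == "manual":          # support info for manual driving -> display unit 45
        display_unit(payload)
    elif kind == "cooperative":   # support info for cooperative automatic driving -> unit 42
        auto_drive_unit(payload)
    elif kind == "control":       # vehicle control info for traffic control -> unit 43
        vehicle_control_unit(payload)
    else:
        raise ValueError(f"unknown support information kind: {kind}")
```

In practice each consumer would be a method on the corresponding on-board component; here plain callables stand in for them.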
<Effect>
The driving support system 5000 of the fifth embodiment described above determines the driving difficulty and the support level in the moving body 103 itself, thereby deciding which information to use for automatic driving, so a reduction in processing load can be expected. For example, when autonomous automatic driving or cooperative automatic driving using vehicle information is selected, the sensor information and path information received from surrounding moving bodies need not be processed, which reduces the processing load.
Although the present disclosure has been described in detail, the above description is, in all aspects, illustrative, and the present disclosure is not limited thereto. It is understood that innumerable modifications not illustrated can be envisioned without departing from the scope of the present disclosure.
Within the scope of the disclosure, the embodiments may be freely combined, and each embodiment may be modified or omitted as appropriate.

Claims (14)

  1.  A driving support device that supports the travel of a moving body, wherein
     the driving support device,
     based on moving body information including at least the position, speed, and heading of the moving body, acquired by a moving body sensor mounted on the moving body, and
     sensing information including at least the positions and speeds of obstacles around the moving body, acquired by a peripheral recognition sensor mounted on the moving body,
     determines a driving difficulty level representing the difficulty of travel of the moving body in at least three levels, and sets, according to the determined driving difficulty level, support information used for travel support of the moving body.
  2.  The driving support device according to claim 1, wherein
     the driving support device changes the type of information acquired from the moving body information and the sensing information according to the driving difficulty level.
  3.  The driving support device according to claim 1, wherein
     the driving support device changes the frequency with which the moving body information and the sensing information are acquired according to the driving difficulty level.
  4.  The driving support device according to claim 1, wherein
     the driving support device acquires, from the moving body, travel path information including a travel trajectory for when the moving body performs automatic driving, and
     determines the driving difficulty level based on the moving body information, the sensing information, and the travel path information.
  5.  The driving support device according to claim 4, comprising:
     a peripheral situation recognition unit that acquires the moving body information, the sensing information, and the travel path information;
     a driving difficulty determination unit that determines the driving difficulty level based on the moving body information, the sensing information, and the travel path information acquired by the peripheral situation recognition unit;
     a support level determination unit that determines, based on the driving difficulty level, the support level of the moving body in at least four levels: autonomous automatic driving in which the moving body travels automatically on its own, cooperative automatic driving in which the moving body travels automatically in cooperation with surrounding moving bodies, traffic control in which the moving body travels automatically under the control of the driving support device, and manual driving; and
     a recommended action generation unit that generates the support information according to the support level, based on the support level, the moving body information, the sensing information, and the travel path information, and transmits the support information to the moving body.
  6.  The driving support device according to claim 5, wherein
     the driving difficulty determination unit calculates, based on the moving body information, the sensing information, and the travel path information,
     a driving situation including the positional relationship between vehicles, the presence or absence of lane changes, and changes in acceleration and deceleration,
     a driving scenario including lane changes, right and left turns, and merging,
     a risk of the surrounding situation including the possibility of the moving body contacting the surrounding moving bodies, and
     a congestion degree including the number of the surrounding moving bodies and the inter-vehicle time,
     and determines the driving difficulty level from the driving situation, the driving scenario, the risk of the surrounding situation, and the congestion degree.
  7.  The driving support device according to claim 6, wherein
     the driving difficulty determination unit determines the driving difficulty level based on the total of costs set according to the driving situation, the driving scenario, the risk of the surrounding situation, and the congestion degree.
  8.  The driving support device according to claim 5, wherein
     the support level determination unit
     classifies the cooperative automatic driving into three levels:
     first cooperative automatic driving that uses moving body information acquired from the surrounding moving bodies,
     second cooperative automatic driving that uses sensing information acquired from the surrounding moving bodies, and
     third cooperative automatic driving that uses travel path information acquired from the surrounding moving bodies,
     and classifies the traffic control into two levels:
     first traffic control that notifies the moving body of recommended actions including a recommended speed, a recommended lane, and a recommended lane change timing, and
     second traffic control that controls the speed and travel path of the moving body.
  9.  The driving support device according to claim 5, further comprising
     an automatic driving determination unit that creates, based on the moving body information and the sensing information, an action plan including lane keeping, lane changes, acceleration, and deceleration for the moving body to travel without contacting obstacles and the surrounding moving bodies, and generates travel path information that realizes the action plan, wherein
     the automatic driving determination unit notifies the moving body of the travel path information, and
     the moving body is controlled according to the travel path information.
  10.  The driving support device according to claim 5, further comprising
     a group determination unit that determines, based on the moving body information and the sensing information, a plurality of moving bodies moving in the same direction as one group, wherein
     the driving difficulty level and the support level are made the same for the plurality of moving bodies included in the one group.
  11.  The driving support device according to claim 5, further comprising
     a driving difficulty learning unit that receives the moving body information, the sensing information, the travel path information, and the driving difficulty level as input, and generates a driving difficulty determination model that performs determination and evaluation of the driving difficulty level, wherein
     the driving difficulty determination unit determines the driving difficulty level using the driving difficulty determination model.
  12.  The driving support device according to claim 1, wherein
     the driving support device is provided separately from the moving body.
  13.  The driving support device according to claim 1, wherein
     the driving support device is mounted on the moving body.
  14.  A driving support method for supporting the travel of a moving body, comprising:
     determining a driving difficulty level representing the difficulty of travel of the moving body in at least three levels, based on moving body information including at least the position, speed, and heading of the moving body, acquired by a moving body sensor mounted on the moving body, and sensing information including at least the positions and speeds of obstacles around the moving body, acquired by a peripheral recognition sensor mounted on the moving body;
     setting, according to the determined driving difficulty level, support information used for travel support of the moving body; and
     supporting the travel of the moving body using the support information.
PCT/JP2020/018910, filed 2020-05-12 — Travel assistance device and travel assistance method (WO2021229671A1)
