WO2024080191A1 - Control device for autonomous vehicle, program, signal control device, traffic light device, traffic light system, signal control program, information notification device, and information notification program - Google Patents

Control device for autonomous vehicle, program, signal control device, traffic light device, traffic light system, signal control program, information notification device, and information notification program

Info

Publication number
WO2024080191A1
Authority
WO
WIPO (PCT)
Prior art keywords
intersection
information
vehicle
autonomous vehicle
traffic
Prior art date
Application number
PCT/JP2023/036092
Other languages
English (en)
Japanese (ja)
Inventor
正義 孫 (Masayoshi Son)
Original Assignee
ソフトバンクグループ株式会社 (SoftBank Group Corp.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2022172347A (published as JP2024058513A)
Application filed by ソフトバンクグループ株式会社 (SoftBank Group Corp.)
Publication of WO2024080191A1

Classifications

    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00 - Traffic control systems for road vehicles
    • G08G1/16 - Anti-collision systems

Definitions

  • This disclosure relates to a control device, a program, a signal control device, a signal device, a signal system, a signal control program, an information notification device, and an information notification program for an autonomous vehicle.
  • Patent document 1 describes a vehicle with an autonomous driving function.
  • a control device for controlling a vehicle, the control device including an information acquisition unit that acquires a plurality of pieces of information detected by a sensor installed at a traffic light, and a control unit that controls the vehicle using the plurality of pieces of information acquired by the information acquisition unit and a trained model.
  • the control unit may control the vehicle in units of one billionth of a second using the plurality of pieces of information and the trained model.
  • the control unit may control the vehicle using a plurality of pieces of information detected by a sensor installed in the traffic light and the trained model when the vehicle is entering an intersection where the traffic light is installed, and may control the vehicle using a plurality of pieces of information detected by a sensor installed in the vehicle and the trained model when the vehicle is traveling on a road other than the intersection.
  • the control unit may control the vehicle using the multiple pieces of information detected by the sensor installed in the traffic light and the trained model when the vehicle is entering the intersection and the output value of the trained model when multiple pieces of information detected by the sensor installed in the traffic light are input to the trained model matches the output value of the trained model when multiple pieces of information detected by the sensor installed in the vehicle are input to the trained model.
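Purely as an illustration of the selection logic described in the preceding items, the following minimal Python sketch shows a control-source choice: the traffic light's sensors are used only when the vehicle is entering the intersection and the trained model produces the same output for both sensor sources. All names and types are assumptions; the patent does not specify an implementation.

```python
# Minimal sketch (assumed names/types, not the patent's actual implementation).
from typing import Callable, Sequence

def select_control_inputs(
    entering_intersection: bool,
    signal_sensor_info: Sequence[float],
    vehicle_sensor_info: Sequence[float],
    trained_model: Callable[[Sequence[float]], tuple],
) -> Sequence[float]:
    """Return the sensor information the controller should feed to the model."""
    if entering_intersection:
        # Cross-check: both sources must yield matching model outputs.
        if trained_model(signal_sensor_info) == trained_model(vehicle_sensor_info):
            return signal_sensor_info
    # Default: on ordinary roads, or when the outputs disagree, use on-board sensors.
    return vehicle_sensor_info
```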
  • a program for causing a computer to function as the information acquisition unit and the control unit.
  • a signal control device includes a first acquisition unit that acquires traffic conditions around the intersection from sensors installed around the intersection, a second acquisition unit that acquires a driving plan of an autonomous vehicle that is scheduled to pass through the intersection, a determination unit that determines whether a delay will occur in the driving plan of the autonomous vehicle when the autonomous vehicle passes through the intersection based on the traffic conditions acquired by the first acquisition unit, and a control unit that controls the traffic lights at the intersection so as to suppress the delay when the determination unit determines that the delay will occur.
  • the control unit may control the traffic lights at the intersection so that they remain green while the autonomous vehicle is passing through the intersection when the determination unit determines that the delay will occur.
  • the traffic lights at the intersection are controlled to suppress delays in the driving plan of the autonomous vehicle by keeping the traffic lights at the intersection green while the autonomous vehicle passes through the intersection. This ensures safety when the autonomous vehicle passes through the intersection while suppressing the time the traffic lights at the intersection are green from becoming longer than necessary, compared to when control is performed such as extending the time the traffic lights at the intersection are green for a certain period of time.
  • the autonomous vehicle in which the control unit controls the traffic lights at the intersection so as to suppress the delay may be an autonomous vehicle with a preset urgency level equal to or greater than a predetermined value.
  • According to this aspect, it is possible to prevent delays in the driving plans of autonomous vehicles with an urgency level equal to or higher than the predetermined value, and, by reducing the number of times the traffic lights at an intersection are controlled, to reduce the number of other vehicles whose driving may be affected by that control.
  • the signal control device may further include a cooperative control unit that controls the traffic lights of multiple intersections through which the autonomous vehicle is scheduled to pass in sequence so as to suppress the delay when the determination unit determines that the delay will occur.
  • the traffic lights at multiple intersections that the autonomous vehicle is scheduled to pass through in sequence are each controlled, making it possible to eliminate the delay in the driving plan of the autonomous vehicle while the autonomous vehicle passes through the multiple intersections in sequence.
  • a traffic light device includes the traffic light control device and the traffic light, and is provided at each intersection.
  • Since the signal control device is included, it is possible to prevent delays in the driving plan of an autonomous vehicle.
  • a traffic light system includes the traffic light devices provided at a plurality of intersections, and a cooperative control device that controls the traffic lights at a plurality of intersections through which the autonomous vehicle is scheduled to pass in sequence, when the determination unit of any of the plurality of traffic light devices determines that the delay will occur, so as to suppress the delay.
  • Since the cooperative control device is included, it is possible to eliminate delays in the driving plan of the autonomous vehicle while it passes through the multiple intersections in sequence.
  • a signal control program causes a computer to execute processing including acquiring traffic conditions around an intersection from sensors installed around the intersection, acquiring a driving plan for an autonomous vehicle that is scheduled to pass through the intersection, determining whether or not a delay will occur in the driving plan for the autonomous vehicle when the autonomous vehicle passes through the intersection based on the acquired traffic conditions, and controlling the traffic lights at the intersection to reduce the delay if it is determined that a delay will occur.
  • This aspect makes it possible to prevent delays in the driving plan of an autonomous vehicle.
  • an information notification device includes an acquisition unit that acquires traffic conditions around an intersection from a sensor installed around the intersection, a generation unit that generates notification information for an autonomous vehicle that is about to enter the intersection based on the traffic conditions acquired by the acquisition unit, and a display control unit that displays the notification information generated by the generation unit as code information on a display unit installed around the intersection.
  • traffic conditions around the intersection are obtained from sensors installed around the intersection, and notification information is generated for an autonomous vehicle about to enter the intersection based on the obtained traffic conditions around the intersection.
  • the generated notification information is then displayed as code information on a display unit installed around the intersection. This allows the autonomous vehicle to obtain the notification information by photographing the display unit displaying the code information and decoding the code information contained in the photographed image.
  • notification information can be notified to an autonomous vehicle without using a mobile communications network, so information can be notified to an autonomous vehicle without being affected by the communication conditions of the mobile communications network.
  • the generating unit may generate, as the notification information, information including driving instruction information that instructs each of a plurality of autonomous vehicles that are about to enter the intersection to drive.
  • the notification information includes multiple pieces of driving instruction information that instruct each of the multiple autonomous vehicles about to enter the intersection to drive, and this notification information is displayed as code information on the display unit. In this way, by displaying a single piece of notification information as code information on the display unit, driving instructions can be given to each of the multiple autonomous vehicles about to enter the intersection.
  • the generating unit may generate the driving instruction information taking into consideration traffic conditions in blind spot areas around the intersection that are blind spots for the autonomous vehicle.
  • the traffic conditions in the blind spot area around the intersection that is a blind spot for the autonomous vehicle are taken into consideration. This makes it possible to give driving instructions to the autonomous vehicle that take into account the traffic conditions in the blind spot area that is a blind spot for the autonomous vehicle.
  • the display control unit may cause the display unit to display a two-dimensional code as the code information.
  • a two-dimensional code is displayed on the display unit as code information, so the amount of notification information that can be displayed on the display unit as code information can be increased compared to an embodiment in which a one-dimensional code is displayed as code information.
  • a traffic light device includes the information notification device and a traffic light, and is provided at each intersection.
  • Since the information notification device is included, information can be notified to an autonomous vehicle without being affected by the communication conditions of the mobile communication network.
  • an information notification program causes a computer to execute a process including acquiring traffic conditions around an intersection from a sensor installed around the intersection, generating notification information for an autonomous vehicle that is about to enter the intersection based on the acquired traffic conditions, and displaying the generated notification information as code information on a display unit installed around the intersection.
  • information can be sent to autonomous vehicles without being affected by the communication conditions of the mobile communication network.
  • A diagram for explaining a blind spot of a vehicle.
  • A diagram for explaining a sensor installed in a traffic light.
  • A schematic diagram of Perfect Bell Curves.
  • A block diagram showing an example of the functional configuration of the Central Brain.
  • A diagram for explaining a trained model.
  • A block diagram showing an example of the hardware configuration of a computer that functions as the Central Brain and a control device.
  • A block diagram showing a schematic configuration of a signal control system according to a second embodiment.
  • A flowchart showing an example of the signal control process.
  • A timing chart for explaining the operation of the signal control process.
  • A flowchart showing another example of the signal control process.
  • A block diagram showing a schematic configuration of an information notification system according to a third embodiment.
  • A front view showing a traffic light and a display unit according to a third embodiment.
  • Figure 1 shows an overview of the risk prediction capabilities of the AI of the ultra-high performance autonomous driving according to this embodiment.
  • multiple types of sensor information are converted into AI data and stored in the cloud.
  • the AI predicts and judges the best mix of situations every nanosecond, optimizing the operation of the vehicle.
  • FIG. 2 shows a schematic diagram of the Central Brain in the ultra-high performance autonomous driving according to this embodiment.
  • the Central Brain is an example of a control device that controls a Level 6 autonomous vehicle.
  • Level 6 is a level of automated driving above Level 5, which represents full automation. Although Level 5 is fully automated, its performance is comparable to that of a human driver, so accidents can still occur; Level 6 denotes a level at which the chance of an accident is lower than at Level 5.
  • examples of sensors installed in the vehicle include radar, LiDAR, high-pixel, telephoto, ultra-wide-angle, 360-degree, high-performance cameras, vision recognition, minute sounds, ultrasound, vibration, infrared rays, ultraviolet rays, electromagnetic waves, temperature, humidity, spot AI weather forecasts, high-precision multi-channel GPS, low-altitude satellite information, and long-tail incident AI data.
  • Long-tail incident AI data is trip data for vehicles that have Level 5 implemented.
  • Sensor information collected from the multiple types of sensors includes shifts in the center of gravity of body weight, road material, outside air temperature and humidity, the up, down, side, and diagonal inclination angles of a slope, how frozen the road is, the amount of moisture, the material, wear condition, and air pressure of each tire, road width, whether or not overtaking is prohibited, oncoming vehicles, the models and cruising states of the vehicles in front and behind, and surrounding conditions (birds, animals, soccer balls, wrecked vehicles, earthquakes, fires, wind, typhoons, heavy rain, light rain, blizzards, fog, etc.); in this embodiment, these detections are performed every nanosecond.
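The following is a hypothetical sketch of how one nanosecond's worth of these detections might be grouped; all field names and units are illustrative assumptions, not a schema given in the patent.

```python
# Illustrative only: a hypothetical per-nanosecond snapshot of a few detections above.
from dataclasses import dataclass

@dataclass(frozen=True)
class SensorSnapshot:
    timestamp_ns: int                              # nanosecond tick of the detection
    center_of_gravity_shift: tuple[float, float, float]
    road_material: str
    outside_temperature_c: float
    outside_humidity_pct: float
    slope_angles_deg: tuple[float, float, float]   # up/down, side, diagonal
    road_freeze_level: float
    moisture_amount: float
    tire_air_pressure_kpa: tuple[float, ...]       # one entry per tire
    road_width_m: float
    overtaking_prohibited: bool
```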
  • the Central Brain may use this information to match the weather forecast with the highest accuracy rate for the entire road + minimum spot by AI.
  • the Central Brain may also use this information to match with the location information of other vehicles.
  • the Central Brain may also use this information to match with the best estimated vehicle type (matching the remaining battery and speed for that journey every nanosecond).
  • the Central Brain may also use this information to match with the mood of the music, etc., that the passengers are listening to.
  • the Central Brain may also use this information to instantly rearrange the conditions to change the desired mood.
  • the Central Brain may, for example, upload AI data to the cloud when the vehicle is charging.
  • a Data Lake may be formed, and the AI may analyze the data and upload it to the cloud in a constantly updated state.
  • a sensor mounted on a vehicle can detect objects such as other vehicles at long distances in a straight line along the direction of travel, but at intersections and other locations there are blind spots where the sensor cannot detect objects.
  • the solid-line rectangle surrounded by the dash-dotted rectangle represents the vehicle on which the sensor is mounted, and the dash-dotted arrow represents the direction of travel of that vehicle.
  • the shaded area represents the blind spot where the sensor mounted on the vehicle cannot detect objects.
  • sensors 110 capable of communicating with the Central Brain of a Level 6 autonomous vehicle are installed at all traffic lights 100 in the city.
  • sensors 110 include radar, LiDAR, and high-pixel, telephoto, ultra-wide-angle, 360-degree, high-performance digital cameras.
  • the sensor 110 is installed at the top of the traffic light 100, but the installation location of the sensor 110 is not limited to the top of the traffic light 100.
  • the sensor 110 may be installed on the side of the traffic light 100 or on the pillar part of the traffic light 100.
  • the sensor 110 collects information detected in areas that are blind spots for autonomous vehicles, and transmits information about road conditions to Level 6 autonomous vehicles via wireless communication.
  • the Central Brain acquires multiple pieces of information detected by sensors 110 installed in the traffic lights 100, and uses the acquired information and AI to control the vehicle.
  • Central Brain may use both software and hardware as a method to optimize vehicle traffic.
  • Central Brain uses AI to best mix multiple pieces of information detected by sensors 110 installed on traffic lights 100, cloud-stored information, and vehicle sensor information, and the AI makes decisions every nanosecond to realize automatic driving that meets the passengers' needs.
  • the vehicle micro-controls the motor's rotation output every billionth of a second (every nanosecond).
  • the vehicle is equipped with electricity and a motor that can communicate and be controlled in nanoseconds.
  • AI predicts crises, making it possible to make a perfect stop without the need for braking and without spilling a cup of water. It also consumes low power and does not generate brake friction.
  • Figure 5 shows an outline of the Perfect Speed Control achieved by the control of the Central Brain in this embodiment.
  • The principle shown in Figure 5 is an index for calculating the braking distance of the vehicle, and control is governed by this basic equation. Because the system in this embodiment has ultra-high-performance input data, its calculations trace a clean bell curve.
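The basic equation itself is not reproduced in the extracted text. Presumably it is the standard braking-distance relation:

$$ d = \frac{v^{2}}{2\mu g} $$

where $d$ is the braking distance, $v$ the vehicle speed, $\mu$ the tire-road friction coefficient, and $g$ the gravitational acceleration.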
  • Figure 6 shows a schematic diagram of the Perfect Bell Curves achieved by the control of the Central Brain in this embodiment.
  • the computational speed required to realize ultra-high performance autonomous driving is 1 million TOPS.
  • the Central Brain may realize Perfect Cruise Control.
  • the Central Brain may perform control according to the wishes of the occupants aboard the vehicle. Examples of passenger preferences include “shortest time,” “longest battery remaining,” “I want to avoid car sickness as much as possible,” “I want to feel the most G-forces (safely),” “I want to enjoy the scenery with a mix of the above,” “I want to experience a different scenery than last time,” “For example, I want to retrace the memories of a road I took with someone years ago,” “I want to minimize the chance of an accident,” etc.
  • the Central Brain consults with passengers about various other conditions, and executes the perfect mix with the vehicle based on the number of passengers, weight, position, and shift in the center of gravity of the weight (calculated every nanosecond), detection of the road material every nanosecond, detection of the outside air temperature every nanosecond, detection of the outside air humidity every nanosecond, and the total of the above conditions selected every nanosecond.
  • the Central Brain may consider and execute things like "up, down, side, and diagonal slope of the road," "matching with the weather forecast with the highest accuracy rate for the entire route + the smallest spot by AI," "matching with the location information of other cars every nanosecond," "matching with the best estimated car model (matching the remaining battery and speed on that route every nanosecond)," "matching with the mood of the music the passengers are listening to, etc.," "instantaneous reconfiguration of conditions when the desired mood changes," "estimation of the optimal mix of the road's freezing condition, moisture content, wear of the material of each tire (4, 2, 8, 16 tires, etc.), air pressure, and the remaining road," "lane width, angle, and whether it is a no-passing lane on the road at that time," "vehicle models in the oncoming lane and in front and behind and the cruising state of those cars (every nanosecond)," and "best mix of all other conditions."
  • The position to take within the width of each lane differs from vehicle to vehicle and is not the center; it varies with the speed, angle, and road information at the time. For example, the system performs best-probability inference matching every nanosecond against flying birds, animals, oncoming cars, flying soccer balls, children, accident vehicles, earthquakes, fires, wind, typhoons, heavy rain, light rain, blizzards, fog, and other influences.
  • a perfect match is then performed using the capabilities of the current version of the Central Brain and the latest updated information of the brain cloud accumulated up to that point.
  • ultra-high-performance autonomous driving requires 1 million TOPS, along with the best battery power management and an AI-synchronized burst-chilling function for temperature control at that time.
  • Figures 7 to 13 are schematic diagrams of Perfect Cruising.
  • FIG. 14 is a block diagram showing an example of the functional configuration of the Central Brain.
  • the Central Brain includes an information acquisition unit 30, a judgment unit 32, an inference unit 34, and a control unit 36.
  • a trained model 40 is stored in a storage device included in the Central Brain. The trained model 40 realizes the functions of the AI.
  • the trained model 40 receives sensor information detected by various sensors as input, and outputs indexed values (hereinafter referred to as "index values") related to vehicle control as control information for controlling the operation of the vehicle.
  • the trained model 40 is a model obtained by machine learning, more specifically, deep learning.
  • the information acquisition unit 30 acquires multiple pieces of information detected by sensors mounted on the vehicle.
  • the information acquisition unit 30 also acquires multiple pieces of information detected by sensors 110 installed in the traffic lights 100.
  • the determination unit 32 determines whether the vehicle is entering an intersection where a traffic light 100 is installed, or whether the vehicle is traveling on a road other than an intersection. For this determination, the determination unit 32 uses, for example, the position information of the vehicle measured by a GPS device mounted on the vehicle and map information. Note that the determination unit 32 may use, for this determination, an image of the surroundings of the vehicle taken by a digital camera included in the sensor group mounted on the vehicle. Furthermore, the determination unit 32 may determine that the vehicle is entering an intersection when communication with the sensor 110 installed on the traffic light 100 becomes possible.
  • the inference unit 34 inputs the multiple pieces of information detected by the sensor 110 installed on the traffic light 100 acquired by the information acquisition unit 30 to the trained model 40.
  • the inference unit 34 inputs the multiple pieces of information detected by the sensor mounted on the vehicle acquired by the information acquisition unit 30 to the trained model 40.
  • the trained model 40 outputs multiple index values according to the multiple pieces of input information.
  • the index values are an example of output values of the trained model 40.
  • the inference unit 34 infers an index value based on multiple sensor information.
  • This inference unit 34 can obtain accurate index values by using the computing power of Level 6 to perform multivariate analysis, using the integral method of formula (1) below (see formula (2)), on data collected every nanosecond from the large number of sensor groups. More specifically, while calculating the integral value of the delta values of various ultra-high-resolution inputs with the computing power of Level 6, it can obtain the indexed value of each variable at the edge, in real time, and obtain the most probable value of the result that will occur in the next nanosecond.
  • DL indicates deep learning
  • A, B, C, D, ..., N indicate air resistance, road resistance, road elements (e.g., garbage), and slip coefficient, etc.
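Formulas (1) and (2) themselves are not reproduced in the extracted text. A plausible reconstruction from the surrounding definitions (an assumption, not the patent's exact notation) is

$$ \text{(1)}\qquad V(t) = \int \mathrm{DL}\bigl(A, B, C, D, \ldots, N\bigr)\,dt $$

$$ \text{(2)}\qquad \hat{y}_{t+1\,\mathrm{ns}} = \arg\max_{y}\, P\bigl(y \mid V(t)\bigr) $$

where (1) integrates the deep-learning estimate over the delta values of the variables to obtain the index value $V(t)$, and (2) takes the most probable result for the next nanosecond.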
  • the indexed values of each variable obtained by the inference unit 34 can be further refined by increasing the number of Deep Learning rounds. For example, more accurate index values can be calculated using a huge amount of data such as tires, motor rotation, steering angle, road material, weather, garbage, effects of quadratic deceleration, slippage, and steering and speed control methods for losing balance and regaining balance.
  • the control unit 36 may execute driving control of the vehicle based on the multiple index values identified by the inference unit 34.
  • This control unit 36 may be capable of realizing automatic driving control of the vehicle.
  • the control unit 36 may obtain the most probabilistic value of the result that will occur in the next nanosecond from the multiple index values, and perform driving control of the vehicle taking into consideration the probabilistic value.
  • This control may be performed, for example, using a lookup table in which combinations of multiple index values are associated with control parameters that control the driving of the vehicle.
  • This control may also be performed, for example, using a trained model that uses multiple index values as input and outputs control parameters that control the driving of the vehicle. Examples of the control parameters include parameters that control the speed, acceleration, and traveling direction of the vehicle.
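As an illustration of the lookup-table variant described above, the following sketch maps quantized index-value combinations to control parameters. The table keys, quantization step, and ControlParams fields are assumptions made for illustration; the patent does not specify them.

```python
# A minimal sketch of the lookup-table control variant (assumed keys and fields).
from dataclasses import dataclass

@dataclass(frozen=True)
class ControlParams:
    speed_mps: float        # target speed
    accel_mps2: float       # target acceleration
    heading_deg: float      # target traveling direction

def quantize(index_values: list[float], step: float = 0.1) -> tuple[int, ...]:
    """Quantize continuous index values so they can serve as table keys."""
    return tuple(round(v / step) for v in index_values)

# Hypothetical table mapping quantized index-value combinations to parameters.
LOOKUP: dict[tuple[int, ...], ControlParams] = {
    (10, 0, 3): ControlParams(speed_mps=8.3, accel_mps2=0.0, heading_deg=0.0),
    (10, 5, 3): ControlParams(speed_mps=5.0, accel_mps2=-1.2, heading_deg=0.0),
}

def control_from_indices(index_values: list[float]) -> ControlParams | None:
    return LOOKUP.get(quantize(index_values))
```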
  • the Central Brain repeatedly executes the process of the flowchart shown in Figure 16.
  • In step S10, the determination unit 32 determines whether the vehicle is entering an intersection. If the determination in step S10 is positive, the process proceeds to step S12. In step S12, the information acquisition unit 30 acquires multiple pieces of information detected by the sensor 110 installed in the traffic light 100.
  • In step S14, the inference unit 34 infers multiple index values by inputting the multiple pieces of information detected by the sensor 110 installed in the traffic light 100 and obtained in step S12 into the trained model 40, as described above.
  • In step S16, the control unit 36 executes driving control of the host vehicle based on the multiple index values identified in step S14, as described above.
  • If it is determined in step S10 that the vehicle is traveling on a road other than an intersection, the determination in step S10 is negative, and the process proceeds to step S18.
  • In step S18, the information acquisition unit 30 acquires multiple pieces of information detected by sensors mounted on the vehicle.
  • In step S20, the inference unit 34 infers multiple index values by inputting the multiple pieces of information detected by the sensors mounted on the vehicle and acquired in step S18 into the trained model 40, as described above.
  • In step S22, the control unit 36 executes driving control of the vehicle based on the multiple index values identified in step S20, as described above.
  • When the vehicle is entering the intersection and the output value of the trained model 40 for the multiple pieces of information detected by the sensor 110 installed in the traffic light 100 matches the output value for the multiple pieces of information detected by the sensor mounted on the host vehicle, the control unit 36 may control the host vehicle using the information detected by the sensor 110 and the trained model 40. In this case, it is possible to prevent the behavior of the host vehicle from changing suddenly when it enters the intersection. If the output values do not match, the control unit 36 continues to control the host vehicle using the information detected by the sensor mounted on the host vehicle and the trained model 40.
  • sensors 110 capable of communicating with the Central Brain of a Level 6 autonomous vehicle are installed at all traffic lights 100 in the city.
  • the Central Brain of a Level 6 autonomous vehicle can obtain information about blind spots from the sensors 110.
  • FIG. 17 shows a schematic diagram of an example of a hardware configuration of a computer 1200 functioning as a Central Brain, which is an example of a control device.
  • a program installed on the computer 1200 can cause the computer 1200 to function as one or more "parts" of the device according to this embodiment, or to execute operations or one or more "parts” associated with the device according to this embodiment, and/or to execute a process or steps of the process according to this embodiment.
  • Such a program can be executed by the CPU 1212 to cause the computer 1200 to execute specific operations associated with some or all of the blocks of the flowcharts and block diagrams described in this specification.
  • the computer 1200 includes a CPU 1212, a RAM 1214, and a graphics controller 1216, which are connected to each other by a host controller 1210.
  • the computer 1200 also includes input/output units such as a communication interface 1222, a storage device 1224, a DVD drive, and an IC card drive, which are connected to the host controller 1210 via an input/output controller 1220.
  • the DVD drive may be a DVD-ROM drive, a DVD-RAM drive, etc.
  • the storage device 1224 may be a hard disk drive, a solid state drive, etc.
  • the computer 1200 also includes a ROM 1230 and a legacy input/output unit such as a keyboard, which are connected to the input/output controller 1220 via an input/output chip 1240.
  • the CPU 1212 operates according to the programs stored in the ROM 1230 and the RAM 1214, thereby controlling each unit.
  • the graphics controller 1216 obtains image data that the CPU 1212 generates in a frame buffer provided in the RAM 1214 or in the controller itself, and displays the image data on the display device 1218.
  • the communication interface 1222 communicates with other electronic devices via a network.
  • the storage device 1224 stores programs and data used by the CPU 1212 in the computer 1200.
  • the DVD drive reads programs or data from a DVD-ROM or the like and provides them to the storage device 1224.
  • the IC card drive reads programs and data from an IC card and/or writes programs and data to an IC card.
  • ROM 1230 stores therein a boot program or the like executed by computer 1200 upon activation, and/or a program that depends on the hardware of computer 1200.
  • I/O chip 1240 may also connect various I/O units to I/O controller 1220 via USB ports, parallel ports, serial ports, keyboard ports, mouse ports, etc.
  • the programs are provided by a computer-readable storage medium such as a DVD-ROM or an IC card.
  • the programs are read from the computer-readable storage medium, installed in storage device 1224, RAM 1214, or ROM 1230, which are also examples of computer-readable storage media, and executed by CPU 1212.
  • the information processing described in these programs is read by computer 1200, and brings about cooperation between the programs and the various types of hardware resources described above.
  • In this way, an apparatus or method specific to the intended use of the computer 1200 may be constructed through such operations on, or processing of, information.
  • CPU 1212 may execute a communication program loaded into RAM 1214 and instruct communication interface 1222 to perform communication processing based on the processing described in the communication program.
  • communication interface 1222 reads transmission data stored in a transmission buffer area provided in RAM 1214, storage device 1224, a DVD-ROM, or a recording medium such as an IC card, and transmits the read transmission data to the network, or writes received data received from the network to a reception buffer area or the like provided on the recording medium.
  • the CPU 1212 may also cause all or a necessary portion of a file or database stored in an external recording medium such as the storage device 1224, a DVD drive (DVD-ROM), an IC card, etc. to be read into the RAM 1214, and perform various types of processing on the data on the RAM 1214. The CPU 1212 may then write back the processed data to the external recording medium.
  • CPU 1212 may perform various types of processing on data read from RAM 1214, including various types of operations, information processing, conditional judgment, conditional branching, unconditional branching, information search/replacement, etc., as described throughout this disclosure and specified by the instruction sequence of the program, and write back the results to RAM 1214.
  • CPU 1212 may also search for information in a file, database, etc. in the recording medium.
  • CPU 1212 may search for an entry whose attribute value of the first attribute matches a specified condition from among the multiple entries, read the attribute value of the second attribute stored in the entry, and thereby obtain the attribute value of the second attribute associated with the first attribute that satisfies a predetermined condition.
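A small concrete instance of this search, with a hypothetical record layout:

```python
# Find the entry whose first attribute satisfies a condition, then read its
# second attribute. The record layout here is hypothetical.
entries = [
    {"vehicle_id": "ABC-123", "urgency": 7},
    {"vehicle_id": "XYZ-987", "urgency": 2},
]

def lookup(entries, key_attr, predicate, value_attr):
    for entry in entries:
        if predicate(entry[key_attr]):
            return entry[value_attr]
    return None

urgency = lookup(entries, "vehicle_id", lambda v: v == "ABC-123", "urgency")  # -> 7
```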
  • the above-described programs or software modules may be stored in a computer-readable storage medium on the computer 1200 or in the vicinity of the computer 1200.
  • a recording medium such as a hard disk or RAM provided in a server system connected to a dedicated communication network or the Internet can be used as a computer-readable storage medium, thereby providing the programs to the computer 1200 via the network.
  • the blocks in the flowcharts and block diagrams in this embodiment may represent stages of a process where an operation is performed or "parts" of a device responsible for performing the operation. Particular stages and “parts" may be implemented by dedicated circuitry, programmable circuitry provided with computer-readable instructions stored on a computer-readable storage medium, and/or a processor provided with computer-readable instructions stored on a computer-readable storage medium.
  • the dedicated circuitry may include digital and/or analog hardware circuitry and may include integrated circuits (ICs) and/or discrete circuits.
  • the programmable circuitry may include reconfigurable hardware circuitry including AND, OR, XOR, NAND, NOR, and other logical operations, flip-flops, registers, and memory elements, such as, for example, field programmable gate arrays (FPGAs) and programmable logic arrays (PLAs).
  • a computer-readable storage medium may include any tangible device capable of storing instructions that are executed by a suitable device, such that a computer-readable storage medium having instructions stored thereon comprises an article of manufacture that includes instructions that can be executed to create means for performing the operations specified in the flowchart or block diagram.
  • Examples of computer-readable storage media may include electronic storage media, magnetic storage media, optical storage media, electromagnetic storage media, semiconductor storage media, and the like.
  • Computer-readable storage media may include floppy disks, diskettes, hard disks, random access memories (RAMs), read-only memories (ROMs), erasable programmable read-only memories (EPROMs or flash memories), electrically erasable programmable read-only memories (EEPROMs), static random access memories (SRAMs), compact disk read-only memories (CD-ROMs), digital versatile disks (DVDs), Blu-ray disks, memory sticks, integrated circuit cards, and the like.
  • the computer readable instructions may include either assembler instructions, instruction set architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state setting data, or source or object code written in any combination of one or more programming languages, including object-oriented programming languages such as Smalltalk (registered trademark), JAVA (registered trademark), C++, etc., and conventional procedural programming languages such as the "C" programming language or similar programming languages.
  • the computer-readable instructions may be provided, either locally or over a local area network (LAN) or a wide area network (WAN) such as the Internet, to a processor of a general-purpose computer, special-purpose computer, or other programmable data processing apparatus, or to programmable circuitry, so that the processor or the programmable circuitry executes the computer-readable instructions to create means for performing the operations specified in the flowcharts or block diagrams.
  • processors include computer processors, processing units, microprocessors, digital signal processors, controllers, microcontrollers, etc.
  • FIG. 18 shows a traffic light system 10 according to the second embodiment.
  • the traffic light system 10 includes a plurality of traffic light devices 12 installed at each intersection of roads, a plurality of autonomous vehicles 16, and a traffic light control device 22.
  • the traffic light device 12 includes the traffic light 100 and sensor 110 described in the first embodiment, and a wireless communication unit 14 for wireless communication with the traffic light control device 22.
  • the sensor 110 in the second embodiment is capable of detecting traffic conditions such as an emergency vehicle (e.g., a police vehicle, an ambulance, a fire engine, etc. traveling with a siren sounding) about to pass through the intersection where the traffic light device 12 is installed.
  • the autonomous vehicle 16 includes a driving plan creation unit 18 and a wireless communication unit 20 for wireless communication with the signal control device 22.
  • the driving plan creation unit 18 is realized by the Central Brain described in the first embodiment executing a specified program.
  • the driving plan creation unit 18 is triggered when the destination of the autonomous vehicle 16 is set, and performs processing to subdivide the route to the set destination into driving plans such as going straight at intersections and turning right or left, and to create a driving plan that also specifies the scheduled execution time of each driving plan.
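The following sketch illustrates the kind of subdivided driving plan described here: the route is broken into per-intersection actions, each with a scheduled execution time. Class and field names are illustrative assumptions, not the patent's data format.

```python
# Illustrative driving-plan structure (assumed names, not the patent's format).
from dataclasses import dataclass
from datetime import datetime

@dataclass(frozen=True)
class PlanStep:
    intersection_id: str
    action: str                 # "straight", "turn_right", or "turn_left"
    scheduled_time: datetime    # planned time of executing/passing

@dataclass
class DrivingPlan:
    vehicle_id: str
    steps: list[PlanStep]

plan = DrivingPlan(
    vehicle_id="AV-16",
    steps=[
        PlanStep("intersection-A", "straight", datetime(2024, 1, 1, 9, 0, 0)),
        PlanStep("intersection-B", "turn_right", datetime(2024, 1, 1, 9, 3, 30)),
    ],
)
```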
  • the Central Brain controls the autonomous vehicle 16 to drive autonomously according to the driving plan created by the driving plan creation unit 18.
  • the signal control device 22 includes a CPU, memories such as ROM and RAM, a non-volatile storage unit such as an HDD and SSD, and a wireless communication unit 43.
  • a signal control program is stored in the storage unit.
  • the signal control device 22 functions as a first acquisition unit 24, a second acquisition unit 26, a determination unit 28, a control unit 41, and a cooperative control unit 42 by the CPU executing the signal control program, and performs the signal control process (FIG. 19) described below.
  • the signal control device 22 is an example of a signal control device in this disclosure.
  • the first acquisition unit 24 acquires traffic conditions around the intersection from sensors 110 installed around the intersection.
  • the second acquisition unit 26 acquires a driving plan for the autonomous vehicle 16 that is scheduled to pass through the intersection.
  • the determination unit 28 determines whether a delay in the driving plan of the autonomous vehicle 16 will occur when the autonomous vehicle 16 passes through the intersection, based on the traffic conditions around the intersection acquired by the first acquisition unit 24.
  • the control unit 41 controls the traffic lights 100 at the intersection so as to suppress delays in the driving plan of the autonomous vehicle 16 when the autonomous vehicle 16 passes through the intersection.
  • the cooperative control unit 42 controls each of the traffic lights 100 at multiple intersections through which the autonomous vehicle 16 is scheduled to pass in sequence so as to suppress delays in the driving plan of the autonomous vehicle 16.
  • the signal control device 22 constantly monitors the position and speed of each autonomous vehicle 16 by periodically communicating with each autonomous vehicle 16 traveling on the road. The signal control device 22 then performs the signal control process shown in FIG. 19 when an autonomous vehicle 16 approaches within a predetermined distance of an intersection where a signal device 12 is installed (hereinafter referred to as the intersection to be controlled).
  • In step 50 of the signal control process, the first acquisition unit 24 of the signal control device 22 acquires from the sensor 110 the traffic conditions at the intersection to be controlled, such as whether an emergency vehicle is about to pass through the intersection to be controlled.
  • the second acquisition unit 26 acquires a driving plan from the autonomous vehicle 16 that is scheduled to pass through the controlled intersection.
  • the driving plan acquired by the second acquisition unit 26 from the autonomous vehicle 16 includes information on the planned driving (straight ahead/left turn/right turn) of the autonomous vehicle 16 at the controlled intersection and the planned execution time of the driving plan (scheduled time of passing through the controlled intersection).
  • FIG. 20 shows an example of a driving plan for the autonomous vehicle 16 acquired by the second acquisition unit 26, labeled as the "initial driving plan."
  • This "initial driving plan" is a driving plan that allows the autonomous vehicle 16 to pass through the intersection to be controlled without waiting at the traffic light, while the traffic light 100 at the intersection to be controlled is green.
  • In step 54, the determination unit 28 calculates the time at which the autonomous vehicle 16 will pass through the intersection to be controlled, based on the traffic conditions at the intersection to be controlled acquired by the first acquisition unit 24 in step 50.
  • In step 56, the determination unit 28 determines whether the intersection passing time calculated in step 54 is delayed by a predetermined time or more from the driving plan of the autonomous vehicle 16 (the scheduled time to pass through the intersection to be controlled).
  • If the determination in step 56 is negative, step 58 is skipped and the signal control process ends.
  • On the other hand, if, for example, an emergency vehicle is about to pass through the controlled intersection, the time required to pass through the controlled intersection will include the time spent waiting for the emergency vehicle to pass and the time spent waiting at the traffic light, as shown as an example in FIG. 20 as the "actual driving plan estimated from surrounding traffic conditions."
  • In this case, the time calculated in step 54 is delayed by the predetermined time or more with respect to the driving plan of the autonomous vehicle 16 (the scheduled time to pass through the controlled intersection), the determination in step 56 is affirmative, and the process proceeds to step 58.
  • In step 58, the control unit 41 controls the traffic light 100 at the controlled intersection so that it is kept green while the autonomous vehicle 16 passes through the controlled intersection (see also "traffic light color after control" in FIG. 20), and ends the signal control process.
  • As a result, the time required to pass through the controlled intersection is shortened by the time that would have been spent waiting at the traffic light (see also "delay suppression (t2)" in FIG. 20), and delays in the driving plan of the autonomous vehicle 16 are suppressed.
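The following condensed Python sketch illustrates steps 54 through 58. The helper functions, example wait times, and the threshold value are assumptions made for illustration; the patent specifies only the decision logic.

```python
# Condensed sketch of steps 54-58 (assumed helpers and values).
from datetime import datetime, timedelta

DELAY_THRESHOLD_S = 30.0  # the "predetermined time"; its actual value is not specified

def estimate_passing_time(wait_for_emergency_s: float, wait_for_light_s: float,
                          scheduled: datetime) -> datetime:
    # step 54: predicted passing time = schedule plus the waits implied by
    # the surrounding traffic conditions (simplified stand-in).
    return scheduled + timedelta(seconds=wait_for_emergency_s + wait_for_light_s)

def needs_green_hold(scheduled: datetime, predicted: datetime) -> bool:
    # step 56: is the passing time delayed by the predetermined time or more?
    return (predicted - scheduled).total_seconds() >= DELAY_THRESHOLD_S

scheduled = datetime(2024, 1, 1, 9, 3, 30)
predicted = estimate_passing_time(40.0, 25.0, scheduled)
if needs_green_hold(scheduled, predicted):
    print("step 58: hold traffic light 100 green while the vehicle passes")
```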
  • In step 60, the cooperative control unit 42 determines whether or not the delay in the travel plan of the autonomous vehicle 16 has been resolved following the control of the traffic light 100 at the intersection to be controlled in step 58. If the determination in step 60 is positive, the signal control process ends.
  • If the determination in step 60 is negative, the process proceeds to step 62.
  • In step 62, the cooperative control unit 42 controls the traffic light 100 at the next intersection so that it is maintained at a green light while the autonomous vehicle 16 passes through that intersection.
  • After step 62, the process returns to step 60, and steps 60 and 62 are repeated until the determination in step 60 is positive. In this way, the traffic lights 100 at the multiple intersections that the autonomous vehicle 16 passes through in sequence are cooperatively controlled so that delays in the driving plan of the autonomous vehicle 16 are eliminated.
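A sketch of this cooperative loop (steps 60 and 62) follows; the per-intersection time saving and helper names are illustrative assumptions.

```python
# Sketch of the cooperative control loop across successive intersections.
def cooperative_control(remaining_delay_s: float, upcoming_intersections: list[str],
                        savings_per_green_hold_s: float = 20.0) -> list[str]:
    """Return the intersections whose lights were held green."""
    controlled = []
    for intersection in upcoming_intersections:        # next intersection in sequence
        if remaining_delay_s <= 0:                     # step 60: delay resolved?
            break
        controlled.append(intersection)                # step 62: hold this light green
        remaining_delay_s -= savings_per_green_hold_s  # time saved by skipping the red
    return controlled

print(cooperative_control(35.0, ["B", "C", "D"]))  # -> ['B', 'C']
```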
  • the first acquisition unit 24 of the signal control device 22 acquires the traffic conditions around the intersection to be controlled from the sensor 110 installed around the intersection to be controlled, and the second acquisition unit 26 acquires the driving plan of the autonomous vehicle 16 that is scheduled to pass through the intersection to be controlled.
  • the determination unit 28 determines whether or not a delay will occur in the driving plan of the autonomous vehicle 16 when the autonomous vehicle 16 passes through the intersection to be controlled, based on the traffic conditions acquired by the first acquisition unit 24. Then, when the determination unit 28 determines that the delay will occur, the control unit 41 controls the traffic light 100 of the intersection to be controlled so that the delay is suppressed. This makes it possible to suppress delays in the driving plan of the autonomous vehicle 16, and to suppress the imposition of a large load, such as re-creating a driving plan, on the on-board computer (Central Brain) that performs autonomous driving control, etc., while driving.
  • the control unit 41 controls the traffic light 100 at the controlled intersection so that the traffic light 100 at the controlled intersection is maintained at a green light while the autonomous vehicle 16 passes through the controlled intersection. This ensures safety when the autonomous vehicle 16 passes through the controlled intersection, while preventing the traffic light at the controlled intersection from being green for an unnecessarily long time, compared to when control is performed such as extending the time that the traffic light 100 at the controlled intersection is green for a certain period of time.
  • the cooperative control unit 42 controls the traffic lights 100 at multiple intersections through which the autonomous vehicle 16 is scheduled to pass in sequence so as to suppress the delay ( FIG. 21 ). This makes it possible to eliminate delays in the driving plan of the autonomous vehicle 16 while the autonomous vehicle 16 passes through multiple intersections in sequence.
  • the process of controlling the traffic lights 100 at the intersection to be controlled when the driving plan of the autonomous vehicle 16 is delayed is performed for all autonomous vehicles 16 passing through the intersection to be controlled.
  • an urgency level may be set in advance for each autonomous vehicle 16, and the process of controlling the traffic lights 100 at the intersection to be controlled when the driving plan of the autonomous vehicle 16 is delayed may be performed for autonomous vehicles 16 with an urgency level equal to or higher than a predetermined value. In this way, for example, by setting the urgency level of an autonomous vehicle 16 transporting a sick person to a predetermined value or higher, it is possible to preferentially suppress delays in the driving plan of the autonomous vehicle 16.
  • a case where the autonomous vehicle 16 encounters an emergency vehicle at an intersection has been described as an example of a traffic situation in which a delay in the driving plan of the autonomous vehicle 16 occurs.
  • the present disclosure is not limited to this, and other examples of traffic situations in which a delay in the driving plan of the autonomous vehicle 16 occurs include a case where a pedestrian is present that interferes with the autonomous vehicle 16 when the autonomous vehicle 16 turns right or left at an intersection.
  • one signal control device 22 is provided for multiple signal devices 12, but the present disclosure is not limited to this.
  • a signal control device 22 including each functional unit (first acquisition unit 24, second acquisition unit 26, judgment unit 28, and control unit 41) other than the cooperative control unit 42 may be provided for each intersection corresponding to each signal device 12.
  • the device (signal device 12 and signal control device 22) provided for each intersection is an example of a signal device according to the present disclosure.
  • one cooperative control device functioning as the cooperative control unit 42 may be provided for multiple traffic light devices (signal device 12 and signal control device 22).
  • the traffic light system 10 in the embodiment in which this cooperative control device is provided is an example of a traffic light system according to the present disclosure.
  • FIG. 22 shows an information notification system 210 according to the third embodiment.
  • the information notification system 210 includes a plurality of traffic light devices 211 installed at each intersection of a road, and a plurality of autonomous vehicles 224 traveling on the road.
  • the traffic light device 211 includes the traffic light 100 and sensor 110 described in the first embodiment, a display unit 212, and an information notification device 214.
  • In the first embodiment, the sensor 110 was configured to be capable of wireless communication with the Central Brain of the autonomous vehicle 224; in this third embodiment, the sensor 110 may omit the function of wireless communication with the autonomous vehicle 224, etc.
  • the display unit 212 is installed near the traffic light 100, and has a resolution that allows it to display a specified two-dimensional code. Note that while FIG. 23 shows only one display unit 212, a display unit 212 (and traffic light 100) is provided for each road with a different approach direction to the intersection. For example, as shown in FIG. 24, in the case of an intersection where a road running in an east-west direction intersects with a road running in a north-south direction, a separate display unit 212 is provided for each of the approach directions to the intersection: "E (East)", “W (West)", “S (South)” and "N (North)".
  • the information notification device 214 includes a CPU, memory such as ROM or RAM, and a non-volatile storage unit such as an HDD or SSD, and an information notification program is stored in the storage unit.
  • the information notification device 214 functions as an acquisition unit 216, a generation unit 218, and a display control unit 220 by the CPU executing the information notification program, and performs the information notification process (FIG. 25) described below.
  • the information notification device 214 is an example of an information notification device related to the present disclosure.
  • the acquisition unit 216 acquires traffic conditions around the intersection from sensors 110 installed around the intersection.
  • the generation unit 218 generates notification information for an autonomous vehicle 224 about to enter the intersection based on the traffic conditions around the intersection acquired by the acquisition unit 216.
  • the display control unit 220 then displays the notification information generated by the generation unit 218 as code information (a two-dimensional code in this third embodiment) on a display unit 212 installed around the intersection.
  • the autonomous vehicle 224 includes a camera 226 capable of capturing an image of the display unit 212, and an autonomous driving control unit 228.
  • the autonomous driving control unit 228 is realized by the Central Brain described in the first embodiment executing a predetermined program.
  • the autonomous driving control unit 228 acquires notification information by decoding code information displayed in an area of the image captured by the camera 226 that corresponds to the display unit 212.
  • the autonomous driving control unit 228 (Central Brain) then controls the autonomous vehicle 224 to travel autonomously in accordance with the acquired notification information (more specifically, driving instruction information for the vehicle contained in the notification information).
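One way this decoding step could look is shown below, using OpenCV's QRCodeDetector as a stand-in (the patent does not name a specific decoder, and the JSON payload format is an illustration choice, not the patent's).

```python
# Sketch: decode the code information from a captured camera frame.
import json
import cv2  # pip install opencv-python

def read_notification(frame) -> dict | None:
    """Decode the two-dimensional code in a captured frame into notification info."""
    detector = cv2.QRCodeDetector()
    data, points, _ = detector.detectAndDecode(frame)
    if not data:
        return None  # no code found in this frame
    return json.loads(data)  # assumes the payload is JSON (an illustration choice)

# frame = camera.capture()          # hypothetical camera API
# info = read_notification(frame)
# if info is not None:
#     my_instruction = info["instructions"].get(MY_VEHICLE_ID)
```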
  • the information notification process shown in FIG. 25 is a process for an autonomous vehicle 224 that enters an intersection from a specific entry direction (hereinafter, referred to as entry direction X), and the information notification device 214 also performs the information notification process of FIG. 25 for entry directions other than entry direction X.
  • In step 250 of the information notification process, the acquisition unit 216 of the information notification device 214 acquires from the sensor 110 the traffic conditions at the intersection where the traffic light device 211 is installed (hereinafter simply referred to as the "intersection") and its surroundings.
  • the generation unit 218 identifies an autonomous vehicle 224 that is about to enter the intersection from the entry direction X based on the traffic conditions acquired in step 250, and identifies information about each of the identified autonomous vehicles 224 (ID, position, vehicle speed, direction of travel (straight ahead/right turn/left turn), etc.).
  • As the ID of the autonomous vehicle 224, for example, the character string written on the number plate (license plate) can be used.
  • the direction of travel of the autonomous vehicle 224 can be identified, for example, from whether or not the turn signal lamp is flashing.
  • The autonomous vehicle 224 is provided with a lamp at a position that can be seen from the outside, such as on the roof, and the lamp is lit while the autonomous driving control unit 228 is performing autonomous driving.
  • For each vehicle about to enter the intersection from entry direction X, the generation unit 218 therefore determines whether a lamp is provided on the roof or the like and whether that lamp is lit, thereby identifying the autonomous vehicles 224 about to enter the intersection from entry direction X (a filtering sketch follows this item).
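As an illustration of this identification step, the filter below keeps only vehicles that approach from entry direction X with the autonomous-driving lamp lit. The `DetectedVehicle` record and its fields are assumptions made for the sketch, not the patent's data model:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class DetectedVehicle:
    plate_id: str               # character string read from the number plate
    approach_dir: str           # e.g. "N", "S", "E", "W"
    distance_m: float           # distance to the stop line
    speed_mps: float            # current vehicle speed
    turn_signal: Optional[str]  # "left", "right", or None (straight ahead)
    roof_lamp_on: bool          # lamp signalling that autonomous driving is active

def identify_autonomous_vehicles(detections: List[DetectedVehicle],
                                 entry_dir: str) -> List[DetectedVehicle]:
    """Step 252 (sketch): vehicles entering from entry_dir with the lamp lit."""
    return [v for v in detections
            if v.approach_dir == entry_dir and v.roof_lamp_on]
```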
  • In step 254, the generation unit 218 identifies, based on the traffic conditions acquired in step 250, the traffic conditions in the blind spot area (e.g., the area shown by diagonal lines in FIG. 3) that is a blind spot for a vehicle entering the intersection from entry direction X.
  • The traffic conditions in this blind spot area include, for example, the presence or absence, number, positions, traveling directions, and moving speeds of vehicles, pedestrians, and other traffic participants in the blind spot area; a point-in-region sketch of this step follows.
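Step 254 can be pictured as a point-in-region test over the traffic participants detected by the sensors 110. The sketch below uses a standard ray-casting point-in-polygon test; the polygon coordinates and participant tuples are illustrative assumptions:

```python
from typing import List, Tuple

Point = Tuple[float, float]

def point_in_polygon(x: float, y: float, poly: List[Point]) -> bool:
    """Classic ray-casting test: count edge crossings of a ray cast from (x, y)."""
    inside = False
    j = len(poly) - 1
    for i in range(len(poly)):
        xi, yi = poly[i]
        xj, yj = poly[j]
        if (yi > y) != (yj > y) and x < (xj - xi) * (y - yi) / (yj - yi) + xi:
            inside = not inside
        j = i
    return inside

def blind_spot_participants(participants: List[Tuple[str, float, float]],
                            blind_spot: List[Point]) -> List[Tuple[str, float, float]]:
    """Step 254 (sketch): keep (label, x, y) records inside the blind-spot area."""
    return [p for p in participants if point_in_polygon(p[1], p[2], blind_spot)]

# Example: one pedestrian inside a square blind-spot region, one cyclist outside.
area = [(0.0, 0.0), (10.0, 0.0), (10.0, 10.0), (0.0, 10.0)]
print(blind_spot_participants([("pedestrian", 5.0, 5.0), ("cyclist", 20.0, 1.0)], area))
```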
  • In step 256, the generation unit 218 generates driving instruction information for each autonomous vehicle 224, based on the information identified in step 252 about the autonomous vehicles 224 about to enter the intersection from entry direction X and on the traffic conditions in the blind spot area identified in step 254.
  • The generation unit 218 determines, for each autonomous vehicle 224 about to enter the intersection from entry direction X whose direction of travel is "straight ahead", whether it can pass through the intersection while the light is green if it keeps its current vehicle speed.
  • For a first autonomous vehicle 224 determined to be able to pass through the intersection while the light is green, the generation unit 218 generates driving instruction information instructing it to "keep the current vehicle speed while traveling"; for a second autonomous vehicle 224 determined to be unable to do so, it generates driving instruction information instructing it to "slow down and stop before the intersection".
  • The driving instruction information for each autonomous vehicle 224 includes the ID of the corresponding autonomous vehicle 224.
  • The generation unit 218 also determines, for each autonomous vehicle 224 about to enter the intersection from entry direction X whose direction of travel is "turn right" or "turn left", whether it would interfere with pedestrians or the like in the blind spot when turning.
  • The generation unit 218 then generates driving instruction information instructing a third autonomous vehicle 224, determined not to interfere with pedestrians or the like in the blind spot when turning, to "slowly pass through the crosswalk when turning right or left", and driving instruction information instructing a fourth autonomous vehicle 224, determined to interfere with pedestrians or the like in the blind spot when turning, to "stop temporarily before the crosswalk when turning right or left"; a sketch combining these decisions follows.
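Putting steps 252 to 256 together, a per-vehicle decision might look like the sketch below. The pass-through criterion (time to clear the intersection versus remaining green time, plus a margin) is a plausible assumption made for this sketch; the patent does not prescribe a formula:

```python
from collections import namedtuple

Vehicle = namedtuple("Vehicle", "plate_id distance_m speed_mps turn_signal")

def can_clear_on_green(distance_m: float, speed_mps: float,
                       green_remaining_s: float, margin_s: float = 1.0) -> bool:
    """Rough kinematic check: reach and clear the intersection before the
    green phase ends, with a safety margin (hypothetical criterion)."""
    if speed_mps <= 0.0:
        return False
    return distance_m / speed_mps + margin_s <= green_remaining_s

def instruction_for(v: Vehicle, blind_spot_occupied: bool,
                    green_remaining_s: float) -> str:
    """Step 256 (sketch): map one identified vehicle to a driving instruction."""
    if v.turn_signal is None:  # travelling straight ahead
        if can_clear_on_green(v.distance_m, v.speed_mps, green_remaining_s):
            return "keep the current vehicle speed while traveling"
        return "slow down and stop before the intersection"
    if blind_spot_occupied:    # turning right or left
        return "stop temporarily before the crosswalk when turning right or left"
    return "slowly pass through the crosswalk when turning right or left"

print(instruction_for(Vehicle("ABC-1234", 40.0, 12.0, None),
                      blind_spot_occupied=False, green_remaining_s=8.0))
# 40 m / 12 m/s is about 3.3 s; with the 1 s margin that is under 8 s,
# so the vehicle keeps its current speed.
```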
  • In step 258, the display control unit 220 generates a two-dimensional code that encodes the notification information, including the driving instruction information generated in step 256, for each autonomous vehicle 224 about to enter the intersection from entry direction X. Then, in step 260, the display control unit 220 displays the two-dimensional code generated in step 258 on the display unit 212 for entry direction X and ends the information notification process; an encoding sketch follows.
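A minimal sketch of steps 258 and 260, assuming the third-party Python `qrcode` package and an illustrative JSON payload keyed by vehicle ID (the patent does not specify the payload format):

```python
import json

import qrcode  # third-party: pip install "qrcode[pil]"

def build_code_image(instructions_by_id: dict):
    """Step 258 (sketch): encode the notification information as a QR code."""
    payload = json.dumps({"instructions": instructions_by_id},
                         separators=(",", ":"))  # compact JSON keeps the code small
    return qrcode.make(payload)

# Step 260 (sketch): render the image that the display unit 212 would show.
img = build_code_image({
    "ABC-1234": "keep the current vehicle speed while traveling",
    "XYZ-5678": "slow down and stop before the intersection",
})
img.save("display_unit_212.png")
```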
  • When the code information is displayed on the display unit 212 to notify the autonomous vehicles 224, if the color of the traffic light 100 changes from green to yellow to red, the code information is updated to match the new color of the traffic light 100.
  • The timing for changing the code information displayed on the display unit 212 may coincide with the color change of the traffic light 100, or may precede it by a predetermined time.
  • The autonomous driving control unit 228 obtains the notification information by decoding the code information displayed in the area of the image captured by the camera 226 that corresponds to the display unit 212.
  • The autonomous driving control unit 228 then uses its own ID to extract the driving instruction information addressed to its own vehicle from the obtained notification information, and controls the autonomous vehicle 224 so that it travels autonomously in accordance with the extracted driving instruction information (a decoding sketch follows).
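On the vehicle side, the decoding step can be sketched with OpenCV's built-in QR detector. The payload schema and the vehicle ID `MY_ID` are the same illustrative assumptions as in the encoding sketch:

```python
import json

import cv2  # third-party: pip install opencv-python

MY_ID = "ABC-1234"  # hypothetical: the string on this vehicle's number plate

def extract_own_instruction(frame, my_id: str = MY_ID):
    """Decode the QR code in a camera 226 frame and return the driving
    instruction addressed to this vehicle, or None if absent."""
    data, points, _ = cv2.QRCodeDetector().detectAndDecode(frame)
    if not data:
        return None  # no readable code in this frame
    payload = json.loads(data)
    return payload.get("instructions", {}).get(my_id)

frame = cv2.imread("display_unit_212.png")  # e.g. the image saved by the encoding sketch
if frame is not None:
    print(extract_own_instruction(frame))
```

Returning None when no code is readable lets the controller simply retry on the next camera frame rather than act on stale data.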
  • As a result, the first autonomous vehicle 224 described above is controlled to "keep the current vehicle speed while traveling" and the second autonomous vehicle 224 to "slow down and stop before the intersection", each in accordance with the driving instruction information for its own vehicle.
  • Likewise, the third autonomous vehicle 224 described above is controlled to "slowly pass through the crosswalk when turning right or left" and the fourth autonomous vehicle 224 to "stop temporarily before the crosswalk when turning right or left", each in accordance with the driving instruction information for its own vehicle.
  • As described above, in this third embodiment the acquisition unit 216 of the information notification device 214 acquires the traffic conditions around the intersection from the sensors 110 installed around the intersection, and the generation unit 218 generates notification information for an autonomous vehicle about to enter the intersection based on those traffic conditions.
  • The display control unit 220 displays the notification information generated by the generation unit 218 as code information on the display unit 212 installed around the intersection. This makes it possible to notify the autonomous vehicle 224 of the notification information without using a mobile communication network, and therefore without being affected by the communication conditions of such a network.
  • The generation unit 218 generates, as the notification information, information including driving instruction information for each of the multiple autonomous vehicles 224 about to enter the intersection. By displaying a single piece of notification information as code information on the display unit 212, driving instructions can thus be given to each of those vehicles.
  • The generation unit 218 also generates the driving instruction information taking into account the traffic conditions in the blind spot area around the intersection that is a blind spot from the autonomous vehicle 224, so the driving instructions can reflect conditions that the vehicle itself cannot observe.
  • The display control unit 220 causes the display unit 212 to display a two-dimensional code as the code information, which increases the amount of notification information that can be displayed compared with a one-dimensional code.
  • Although the third embodiment describes a mode in which the driving instruction information is generated taking into account the traffic conditions in the blind spot area, the present disclosure is not limited to this; information indicating the condition of the blind spot area may itself be included in the notification information as blind spot area information.
  • The blind spot area information may be included in the notification information only at intersections with poor visibility, where blind spots arise for the vehicle's on-board sensors.
  • Also, in the third embodiment the notification information is displayed on the display unit 212 as a two-dimensional code, which is one example of the code information in this disclosure; the code information may instead be something else, such as a one-dimensional barcode.
  • Furthermore, although the information notification device 214 is attached to the traffic light 100 to form part of the traffic light device 211 in the third embodiment, the present disclosure is not limited to this; the information notification device 214 can also be installed together with the sensor 110 at an intersection where no traffic light 100 is installed, or at a junction where multiple roads merge and no traffic light 100 is installed.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)

Abstract

The present invention relates to a control device that controls a vehicle, comprising: an information acquisition unit that acquires a plurality of pieces of information detected by a sensor installed on a traffic light; and a control unit that controls the vehicle using a trained model and the plurality of pieces of information acquired by the information acquisition unit.
PCT/JP2023/036092 2022-10-14 2023-10-03 Control device for autonomous vehicle, program, signal control device, traffic light device, traffic light system, signal control program, information notification device, and information notification program WO2024080191A1 (fr)

Applications Claiming Priority (8)

Application Number: JP2022-165875 (JP2022165875); Priority Date: 2022-10-14
Application Number: JP2022-172347 (JP2022172347A, published as JP2024058513A); Priority Date: 2022-10-14; Filing Date: 2022-10-27; Title: "Control device and program for autonomous vehicle" (ja)
Application Number: JP2023-036942 (JP2023036942A, published as JP2024058543A); Priority Date: 2022-10-14; Filing Date: 2023-03-09; Title: "Signal control device, traffic light device, system, and program" (ja)
Application Number: JP2023-041210 (JP2023041210A, published as JP2024058545A); Priority Date: 2022-10-14; Filing Date: 2023-03-15; Title: "Information notification device, traffic light device, and program" (ja)

Publications (1)

Publication Number Publication Date
WO2024080191A1 (fr)

Family

ID=90669180

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/036092 WO2024080191A1 (fr) 2022-10-14 2023-10-03 Control device for autonomous vehicle, program, signal control device, traffic light device, traffic light system, signal control program, information notification device, and information notification program

Country Status (1)

Country Link
WO (1) WO2024080191A1 (fr)


Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2020535572A (ja) * 2017-09-25 2020-12-03 Continental Automotive Systems, Inc. System and method for self-calibration of infrastructure sensors
WO2019077685A1 (fr) * 2017-10-17 2019-04-25 Honda Motor Co., Ltd. Operation model generation system, vehicle in an operation model generation system, processing method, and program

Similar Documents

Publication Publication Date Title
US11755025B2 (en) Guiding vehicles through vehicle maneuvers using machine learning models
CN110641472B (zh) 基于神经网络的用于自主车辆的安全监控系统
US20230418299A1 (en) Controlling autonomous vehicles using safe arrival times
US12001958B2 (en) Future trajectory predictions in multi-actor environments for autonomous machine
CN111247495B (zh) 用于自动驾驶车辆的低速场景的行人交互系统
US10606270B2 (en) Controlling an autonomous vehicle using cost maps
EP3605491A1 (fr) Utilisation d'un sous-système de véhicules à conduite autonome (adv) pour des patrouilles de voitures de police
CN111328411B (zh) 用于自动驾驶车辆的行人概率预测系统
CN111380534B (zh) 用于自动驾驶车辆的基于st图学习的方法
CN111476079B (zh) 将用于对象检测的地图特征与lidar合并的全面且有效的方法
CN110621541B (zh) 用于生成轨迹以操作自动驾驶车辆的方法和系统
WO2018232681A1 (fr) Prédiction de circulation basée sur des images de carte pour conduite autonome
JP2018116705A (ja) ブレーキライトを利用して自動運転車両と追従車両との間の距離を保持する方法
CN109085818B (zh) 基于车道信息控制自动驾驶车辆的车门锁的方法和系统
CN114945493A (zh) 协作式交通工具前灯引导
CN111857118A (zh) 对停车轨迹分段以控制自动驾驶车辆停车
CN111259712B (zh) 用于车辆行为预测的压缩环境特征的表示
CN112041773A (zh) 自动驾驶车辆的规划和控制之间的通信协议
WO2020132938A1 (fr) Procédés de filtrage d'obstacles pour système de planification d'évitement de collision dans un véhicule à conduite autonome
WO2024080191A1 (fr) Dispositif de commande pour véhicule autonome, programme, dispositif de commande de signal, dispositif de feu de signalisation, système de feu de signalisation, programme de commande de signal, dispositif de notification d'informations et programme de notification d'informations
JP2024058543A (ja) 信号制御装置、信号機装置、システムおよびプログラム
WO2024075657A1 (fr) Régulateur de vitesse parfait
JP2024054017A (ja) Perfect Cruise Control
WO2024090288A1 (fr) Réglage de redondance pour commande cérébrale centrale
JP2024054006A (ja) Perfect Cruise Control

Legal Events

Code 121: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 23877196; Country of ref document: EP; Kind code of ref document: A1)