WO2018127962A1 - Notification control device and notification control method - Google Patents

Notification control device and notification control method

Info

Publication number
WO2018127962A1
Authority
WO
WIPO (PCT)
Prior art keywords
event
unit
notification control
primary
recognition
Application number
PCT/JP2017/000193
Other languages
English (en)
Japanese (ja)
Inventor
真規 塚田
下谷 光生
Original Assignee
三菱電機株式会社
Application filed by 三菱電機株式会社 filed Critical 三菱電機株式会社
Priority to JP2018560292A priority Critical patent/JP6661796B2/ja
Priority to PCT/JP2017/000193 priority patent/WO2018127962A1/fr
Publication of WO2018127962A1 publication Critical patent/WO2018127962A1/fr

Classifications

    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00: Traffic control systems for road vehicles
    • G08G1/16: Anti-collision systems

Definitions

  • The present invention relates to a notification control apparatus and a notification control method for notifying a driver of the situation around a host vehicle.
  • Techniques that notify the driver of the situation around the host vehicle are known (Patent Documents 1, 2, and 3).
  • However, the techniques of Patent Documents 1, 2, and 3 merely notify the driver of the actual situation; they do not notify the driver of a situation that may occur after it. They therefore cannot be said to give the driver proper notification.
  • The present invention has been made to solve this problem, and its object is to provide a notification control device and a notification control method capable of appropriately notifying the driver.
  • To achieve this, the notification control apparatus includes: a surrounding information acquisition unit that acquires at least surrounding information indicating the situation around the host vehicle; a primary event recognition unit that recognizes, from the acquired surrounding information, a primary event, which is the situation of an object or person existing around the host vehicle; a secondary event estimation unit that estimates, according to a predetermined rule, a secondary event, which is a situation that can occur after the recognized primary event and has a causal relationship with it; and a notification control unit that performs control to notify the estimated secondary event.
  • The notification control method acquires at least surrounding information indicating the situation around the host vehicle, recognizes from the acquired surrounding information a primary event, which is the situation of an object or person existing around the host vehicle, estimates according to a predetermined rule a secondary event, which is a situation that can occur after the recognized primary event and has a causal relationship with it, and performs control to notify the secondary event.
  • Because the notification control device includes a peripheral information acquisition unit that acquires at least peripheral information indicating the situation around the host vehicle, a primary event recognition unit that recognizes from the acquired peripheral information a primary event, which is the situation of an object or person existing around the host vehicle, a secondary event estimation unit that estimates according to a predetermined rule a secondary event, which is a situation that can occur after the recognized primary event and has a causal relationship with it, and a notification control unit that performs control to notify the estimated secondary event, it can appropriately notify the driver.
  • Likewise, because the notification control method acquires at least peripheral information indicating the situation around the host vehicle, recognizes from it a primary event, which is the situation of an object or person existing around the host vehicle, estimates according to a predetermined rule a secondary event, which is a situation that can occur after the recognized primary event and has a causal relationship with it, and notifies the secondary event, it can appropriately notify the driver.
  • FIG. 1 is a block diagram showing an example of the configuration of the notification control apparatus 1 according to the first embodiment. Note that FIG. 1 shows the minimum necessary components constituting the notification control apparatus according to the present embodiment.
  • The notification control device 1 includes a peripheral information acquisition unit 2, a primary event recognition unit 3, a secondary event estimation unit 4, and a notification control unit 5.
  • The peripheral information acquisition unit 2 acquires peripheral information indicating at least the situation around the host vehicle.
  • The primary event recognition unit 3 recognizes, from the peripheral information acquired by the peripheral information acquisition unit 2, a primary event, which is the situation of an object or person existing around the host vehicle.
  • The primary event recognized by the primary event recognition unit 3 may be a situation in which the object or person is stationary, or one in which the object or person is moving.
  • The secondary event estimation unit 4 estimates, according to a predetermined rule, a secondary event, which is a situation that can occur after the primary event recognized by the primary event recognition unit 3 and has a causal relationship with the situation of the primary event.
  • The notification control unit 5 performs control to notify the secondary event estimated by the secondary event estimation unit 4.
  • FIG. 2 is a block diagram showing an example of the configuration of the notification control device 6, another configuration of the present embodiment.
  • The notification control device 6 includes the peripheral information acquisition unit 2, the primary event recognition unit 3, the secondary event estimation unit 4, the notification control unit 5, a host vehicle information acquisition unit 7, a map information acquisition unit 8, and a warning image generation unit 9.
  • The peripheral information acquisition unit 2 is connected to a periphery recognition device 10.
  • The notification control unit 5 is connected to a HUD (Head-Up Display) 11.
  • The periphery recognition device 10 recognizes at least the situation around the host vehicle and outputs the recognized situation to the peripheral information acquisition unit 2 as the peripheral information.
  • The periphery recognition device 10 is installed in the host vehicle and consists of, for example, a camera, a millimeter-wave radar, or a combination of the two.
  • The periphery recognition device 10 is not limited to a camera or a millimeter-wave radar; any device that recognizes the situation around the host vehicle may be used.
  • The peripheral information acquisition unit 2 acquires the peripheral information output from the periphery recognition device 10.
  • The host vehicle information acquisition unit 7 acquires host vehicle information, which is various information about the host vehicle, via an in-vehicle network provided in the host vehicle.
  • For example, the host vehicle information acquisition unit 7 acquires the current position information of the host vehicle from GPS (Global Positioning System) as host vehicle information.
  • The map information acquisition unit 8 consists of a storage device such as a hard disk drive (HDD) or a semiconductor memory, for example, and acquires and stores map information.
  • The map information acquisition unit 8 may also acquire map information from the outside.
  • For example, it may acquire the map information by downloading it from an external server or the like via a communication network, or by reading it from a storage medium such as a memory.
  • The primary event recognition unit 3 recognizes a primary event, which is the situation of an object or person, from an image serving as the peripheral information acquired by the peripheral information acquisition unit 2.
  • The primary event recognition unit 3 may also recognize the primary event based on the current position information of the host vehicle acquired by the host vehicle information acquisition unit 7 and the map information acquired by the map information acquisition unit 8.
  • The primary event recognition unit 3 may further recognize the primary event by combining the peripheral information, the host vehicle information, and the map information in any way.
  • The secondary event estimation unit 4 estimates the secondary event from the primary event recognized by the primary event recognition unit 3 according to a predetermined rule.
  • The predetermined rule is held as data by the secondary event estimation unit 4; as an example, it associates primary events with secondary events as shown in FIG. 3.
  • For example, when the primary event is a ball jumping out, the secondary event estimation unit 4 estimates "a child jumping out" as the secondary event. That is, it estimates that when a ball pops out, a child will pop out after it to retrieve the ball.
  • In this case, the primary event recognition unit 3 can recognize the ball from, for example, an image captured by the camera serving as the periphery recognition device 10.
  • The primary event recognition unit 3 is not limited to recognizing the movement of the ball as the primary event; it may also recognize a stationary ball as the primary event.
  • When the primary event is a stopped vehicle, the secondary event estimation unit 4 estimates "a person or motorcycle jumping out from behind the vehicle" as the secondary event. That is, it estimates that when there is a stopped vehicle, a person or a motorcycle will jump out from behind it.
  • In this case, the primary event recognition unit 3 can recognize the stopped vehicle from, for example, an image captured by the camera serving as the periphery recognition device 10.
  • When the primary event is a railroad crossing, the secondary event estimation unit 4 estimates "a train passing" as the secondary event. That is, it estimates that when there is a railroad crossing, a train will pass.
  • In this case, the primary event recognition unit 3 can recognize the railroad crossing from, for example, an image captured by the camera serving as the periphery recognition device 10.
  • Alternatively, the primary event recognition unit 3 can recognize the railroad crossing based on the current position information of the host vehicle acquired by the host vehicle information acquisition unit 7 and the map information acquired by the map information acquisition unit 8.
  • When the primary event is a decrease in the number of lanes on the road on which the host vehicle is traveling, the secondary event estimation unit 4 estimates "a vehicle cutting in" as the secondary event. That is, it estimates that when the lanes decrease, another vehicle will cut into the lane in which the host vehicle is traveling.
  • In this case, the primary event recognition unit 3 can recognize the lane decrease from, for example, an image captured by the camera serving as the periphery recognition device 10.
  • Alternatively, the primary event recognition unit 3 can recognize the lane decrease based on the current position information of the host vehicle acquired by the host vehicle information acquisition unit 7 and the map information acquired by the map information acquisition unit 8.
  • The primary event associated with the secondary event "a vehicle cutting in" is not limited to a lane decrease; any situation in which another vehicle is expected to cut into the lane in which the host vehicle is traveling, such as an expressway junction, may serve as the primary event.
  • When the primary event is a person holding an object on an overhead road that intersects the road on which the host vehicle is traveling, the secondary event estimation unit 4 estimates "the object falling" as the secondary event.
  • Intersecting overhead roads include, for example, pedestrian bridges. That is, when there is a person holding an object on a pedestrian bridge crossing the road on which the host vehicle is traveling, the secondary event estimation unit 4 estimates that the object the person is holding will fall onto that road. In this case, the primary event recognition unit 3 can recognize the person holding the object on the intersecting overhead road from, for example, an image captured by the camera serving as the periphery recognition device 10.
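The rule described above can be pictured as a simple lookup table that associates each recognized primary event with the secondary event it predicts, in the spirit of the FIG. 3 data. The following Python sketch is purely illustrative; the event strings and the helper name `estimate_secondary_event` are assumptions for this example, not part of the patent.

```python
# Hypothetical sketch of the FIG. 3 rule data: each primary event is
# associated with the secondary event estimated to follow it.
SECONDARY_EVENT_RULES = {
    "ball jumping out": "child jumping out",
    "stopped vehicle": "person or motorcycle jumping out from behind the vehicle",
    "railroad crossing": "train passing",
    "lane decrease": "vehicle cutting in",
    "person holding an object on an overhead road": "object falling",
}

def estimate_secondary_event(primary_event):
    """Look up the secondary event for a recognized primary event.

    Returns None when no rule applies, i.e. no secondary event is notified.
    """
    return SECONDARY_EVENT_RULES.get(primary_event)
```

Because the rule is just data, it can be replaced or extended without changing the estimation logic, which matches how the fourth embodiment later updates this data from a server.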
  • The warning image generation unit 9 generates a primary event warning image indicating the primary event recognized by the primary event recognition unit 3, and a secondary event warning image indicating the secondary event estimated by the secondary event estimation unit 4.
  • The notification control unit 5 performs control to display the primary event warning image or the secondary event warning image generated by the warning image generation unit 9 on the HUD 11.
  • The primary event warning image or secondary event warning image is displayed on the HUD 11 using AR (Augmented Reality) technology; the driver sees it superimposed on the scenery visible through the windshield of the host vehicle.
  • The notification control unit 5 may also perform control to display the primary event warning image and the secondary event warning image on the HUD 11 simultaneously.
  • FIG. 4 is a block diagram illustrating an example of a hardware configuration of the notification control device 6. The same applies to the notification control apparatus 1.
  • The functions of the peripheral information acquisition unit 2, the primary event recognition unit 3, the secondary event estimation unit 4, the notification control unit 5, the host vehicle information acquisition unit 7, the map information acquisition unit 8, and the warning image generation unit 9 in the notification control device 6 are realized by a processing circuit. That is, the notification control device 6 includes a processing circuit for acquiring the peripheral information, recognizing the primary event, estimating the secondary event, controlling the notification, acquiring the host vehicle information, acquiring the map information, and generating the primary event warning image and the secondary event warning image.
  • The processing circuit is, for example, the processor 12 (also called a central processing unit, processing unit, arithmetic unit, microprocessor, microcomputer, or DSP (Digital Signal Processor)) that executes a program stored in the memory 13.
  • The functions of the peripheral information acquisition unit 2, the primary event recognition unit 3, the secondary event estimation unit 4, the notification control unit 5, the host vehicle information acquisition unit 7, the map information acquisition unit 8, and the warning image generation unit 9 in the notification control device 6 are realized by software, firmware, or a combination of software and firmware.
  • Software or firmware is described as a program and stored in the memory 13.
  • The processing circuit realizes the function of each unit by reading out and executing the program stored in the memory 13. That is, the notification control device 6 includes the memory 13 for storing a program that, when executed, results in the steps of acquiring the peripheral information, recognizing the primary event, estimating the secondary event, controlling the notification, acquiring the host vehicle information, acquiring the map information, and generating the primary event warning image and the secondary event warning image.
  • It can also be said that these programs cause a computer to execute the procedures or methods of the peripheral information acquisition unit 2, the primary event recognition unit 3, the secondary event estimation unit 4, the notification control unit 5, the host vehicle information acquisition unit 7, the map information acquisition unit 8, and the warning image generation unit 9.
  • The memory may be a non-volatile or volatile semiconductor memory such as RAM (Random Access Memory), ROM (Read Only Memory), flash memory, EPROM (Erasable Programmable Read Only Memory), or EEPROM (Electrically Erasable Programmable Read Only Memory); a magnetic disk, flexible disk, optical disc, compact disc, mini disc, or DVD (Digital Versatile Disc); or any other storage medium that may be used in the future.
  • FIG. 5 is a flowchart showing an example of the operation of the notification control device 6.
  • In the following description, the periphery recognition device 10 is assumed to be a camera that photographs the area ahead of the host vehicle.
  • In step S11, the peripheral information acquisition unit 2 acquires an image captured by the periphery recognition device 10 as the peripheral information.
  • In step S12, the primary event recognition unit 3 recognizes the primary event from the image acquired by the peripheral information acquisition unit 2.
  • Here, the primary event recognition unit 3 recognizes a ball existing ahead of the host vehicle as the primary event.
  • In step S13, the warning image generation unit 9 generates a primary event warning image indicating the primary event recognized by the primary event recognition unit 3.
  • The notification control unit 5 then performs control to display the primary event warning image generated by the warning image generation unit 9 on the HUD 11.
  • FIG. 6 is a diagram illustrating an example in which the primary event warning images are displayed on the HUD 11.
  • The primary event warning image 14 is an image that highlights the ball itself.
  • The primary event warning image 15 is an image that warns of the presence of an object obstructing travel ahead of the host vehicle; in the example of FIG. 6, it warns of the presence of the ball.
  • The primary event warning image 14 and the primary event warning image 15 need not both be displayed as in FIG. 6; either one may be displayed alone.
  • In step S14, the secondary event estimation unit 4 estimates the secondary event from the primary event recognized by the primary event recognition unit 3. Specifically, the secondary event estimation unit 4 estimates the secondary event based on data such as that shown in FIG. 3. Here, the secondary event estimation unit 4 estimates "a child jumping out" as the secondary event.
  • In step S15, the warning image generation unit 9 generates a secondary event warning image indicating the secondary event estimated by the secondary event estimation unit 4.
  • The notification control unit 5 then performs control to display the secondary event warning image generated by the warning image generation unit 9 on the HUD 11.
  • FIG. 7 is a diagram showing an example in which the secondary event warning image is displayed on the HUD 11.
  • The secondary event warning image 16 is an image of a child jumping out in front of the host vehicle.
  • The secondary event warning image 16 is displayed at a predetermined position.
  • The position at which the secondary event warning image 16 is displayed may also be set arbitrarily by the user.
  • As described above, the notification control devices 1 and 6 notify not only the primary event, which is the actual situation, but also the secondary event that can occur after it, and can therefore appropriately notify the driver. The driver can thus respond appropriately, taking the secondary event into consideration.
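The flow of steps S11 to S15 described above amounts to: acquire an image, recognize the primary event, notify it, estimate the secondary event by rule, and notify that as well. The sketch below is a minimal illustration under assumed names (the `recognize` callback and the tuple-based notification list are inventions for this example); the actual HUD rendering is outside its scope.

```python
def notification_cycle(image, recognize, rules):
    """One cycle of the first-embodiment flow (steps S11 to S15), sketched.

    image: stands in for the peripheral information acquired in step S11.
    recognize: callable standing in for the primary event recognition
        unit 3; returns a primary event name or None (step S12).
    rules: dict standing in for the FIG. 3 primary-to-secondary data.
    """
    notifications = []

    primary = recognize(image)  # step S12: recognize the primary event
    if primary is None:
        return notifications    # nothing around the host vehicle to warn about

    # Step S13: generate and display the primary event warning.
    notifications.append(("primary", primary))

    # Step S14: estimate the secondary event from the primary event by rule.
    secondary = rules.get(primary)
    if secondary is not None:
        # Step S15: generate and display the secondary event warning.
        notifications.append(("secondary", secondary))
    return notifications
```

With a recognizer that reports a ball, one cycle yields both a primary and a secondary notification, mirroring FIGs. 6 and 7.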
  • FIG. 8 is a block diagram showing an example of the configuration of the notification control device 17 according to the second embodiment.
  • The notification control device 17 is characterized by including a secondary event position estimation unit 18. The other configurations and operations are the same as in the first embodiment, so their detailed description is omitted here.
  • The secondary event position estimation unit 18 estimates the occurrence position of the secondary event estimated by the secondary event estimation unit 4, based on the position at which the primary event recognition unit 3 recognized the primary event.
  • The warning image generation unit 9 generates a secondary event warning image corresponding to the occurrence position estimated by the secondary event position estimation unit 18.
  • The function of the secondary event position estimation unit 18 in the notification control device 17 is realized by a processing circuit. That is, the notification control device 17 includes a processing circuit for estimating the occurrence position of the secondary event.
  • The processing circuit is, for example, the processor 12 that executes a program stored in the memory 13, as shown in FIG. 4.
  • The function of the secondary event position estimation unit 18 in the notification control device 17 is realized by software, firmware, or a combination of software and firmware.
  • Software or firmware is described as a program and stored in the memory 13.
  • The processing circuit realizes the function of the secondary event position estimation unit 18 by reading out and executing the program stored in the memory 13. That is, the notification control device 17 includes the memory 13 for storing a program that, when executed, results in the step of estimating the occurrence position of the secondary event. It can also be said that this program causes a computer to execute the procedure or method of the secondary event position estimation unit 18.
  • FIG. 9 is a flowchart showing an example of the operation of the notification control device 17. Steps S21 to S24 in FIG. 9 correspond to steps S11 to S14 in FIG. 5, so only steps S25 and S26 are described below.
  • As before, the periphery recognition device 10 is assumed to be a camera that photographs the area ahead of the host vehicle.
  • In step S25, the secondary event position estimation unit 18 estimates a predetermined position near the position at which the primary event recognition unit 3 recognized the primary event in the image acquired by the peripheral information acquisition unit 2 as the occurrence position of the secondary event.
  • In step S26, the warning image generation unit 9 generates a secondary event warning image corresponding to the occurrence position of the secondary event estimated by the secondary event position estimation unit 18.
  • The notification control unit 5 then performs control to display the secondary event warning image generated by the warning image generation unit 9 on the HUD 11.
  • FIG. 10 is a diagram illustrating an example in which the secondary event warning image is displayed on the HUD 11.
  • The secondary event warning image 16 is displayed at the occurrence position estimated by the secondary event position estimation unit 18.
  • In FIG. 10, a child following the ball and jumping out ahead of the host vehicle is depicted.
  • As described above, the notification control device 17 displays the secondary event at the position where it is most likely to occur, so the driver can grasp the secondary event more accurately than with the first embodiment.
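Step S25 amounts to placing the secondary event warning at a predetermined position near where the primary event was recognized. A minimal sketch, assuming image pixel coordinates and an invented fixed offset (the patent only says "a predetermined position near" the primary event and does not specify an offset):

```python
def estimate_secondary_event_position(primary_pos, offset=(40, 0)):
    """Estimate where in the image to place the secondary event warning.

    primary_pos: (x, y) pixel position at which the primary event, e.g.
    the ball, was recognized. The secondary event warning, e.g. the child
    following the ball, is placed at a predetermined offset from it;
    (40, 0) is a made-up example value, not taken from the patent.
    """
    x, y = primary_pos
    dx, dy = offset
    return (x + dx, y + dy)
```

A real implementation would presumably choose the offset per rule (a falling object lands below the person; a child follows the ball's trajectory), but the predetermined-offset idea is the same.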
  • FIG. 11 is a block diagram illustrating an example of the configuration of the notification control device 19 according to the third embodiment.
  • The notification control device 19 is characterized by including a recognition range restriction unit 20. The other configurations and operations are the same as in the second embodiment, so their detailed description is omitted here.
  • The recognition range restriction unit 20 restricts the recognition range of the primary event recognition unit 3 to a predetermined range containing the occurrence position of the secondary event estimated by the secondary event position estimation unit 18.
  • The primary event recognition unit 3 recognizes the primary event within the range restricted by the recognition range restriction unit 20. For example, when recognizing the primary event from an image, the primary event recognition unit 3 performs recognition within the restricted region of the image.
  • The function of the recognition range restriction unit 20 in the notification control device 19 is realized by a processing circuit. That is, the notification control device 19 includes a processing circuit for restricting the recognition range of the primary event recognition unit 3 to a predetermined range containing the occurrence position of the secondary event estimated by the secondary event position estimation unit 18.
  • The processing circuit is, for example, the processor 12 that executes a program stored in the memory 13, as shown in FIG. 4.
  • The function of the recognition range restriction unit 20 in the notification control device 19 is realized by software, firmware, or a combination of software and firmware.
  • Software or firmware is described as a program and stored in the memory 13.
  • The processing circuit realizes the function of the recognition range restriction unit 20 by reading out and executing the program stored in the memory 13. That is, the notification control device 19 includes the memory 13 for storing a program that, when executed, results in the step of restricting the recognition range of the primary event recognition unit 3 to a predetermined range containing the occurrence position of the secondary event estimated by the secondary event position estimation unit 18. It can also be said that this program causes a computer to execute the procedure or method of the recognition range restriction unit 20.
  • FIG. 12 is a flowchart showing an example of the operation of the notification control device 19. Steps S31 to S36 in FIG. 12 correspond to steps S21 to S26 in FIG. 9, so only steps S37 and S38 are described below.
  • As before, the periphery recognition device 10 is assumed to be a camera that photographs the area ahead of the host vehicle.
  • In step S37, the notification control device 19 determines whether to restrict the recognition range of the primary event recognition unit 3.
  • This determination may be made according to the resolution of the image acquired by the peripheral information acquisition unit 2. Specifically, when the resolution of the image is high, the processing load on the primary event recognition unit 3 is large, so the device may determine to restrict the recognition range; when the resolution is low, the processing load is small, so the device may determine not to restrict it.
  • Whether to restrict the recognition range of the primary event recognition unit 3 may also be set arbitrarily by the user. When the recognition range is to be restricted, the process proceeds to step S38; otherwise, it returns to step S31.
  • In step S38, the recognition range restriction unit 20 restricts the recognition range of the primary event recognition unit 3 to a predetermined range containing the occurrence position of the secondary event estimated by the secondary event position estimation unit 18. Specifically, the recognition range restriction unit 20 restricts the recognition range to, for example, the recognition range 21 shown in FIG. 13. Note that the recognition range 21 in FIG. 13 is shown for explanation only and is not actually displayed on the HUD 11.
  • As a result, the efficiency with which the primary event recognition unit 3 recognizes the primary event improves, the secondary event can be displayed quickly, and the primary event recognition unit 3 can recognize the primary event smoothly.
  • In the above, the primary event recognition unit 3 recognizes the primary event within the range restricted by the recognition range restriction unit 20 for all images acquired by the peripheral information acquisition unit 2, but the present invention is not limited to this.
  • For example, the primary event recognition unit 3 may recognize the primary event in every image within the restricted range, while outside the restricted range it recognizes the primary event only in images acquired by the peripheral information acquisition unit 2 at predetermined intervals.
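Steps S37 and S38 together amount to: if the image is high-resolution, clip recognition to a window around the estimated secondary event position to reduce processing load; otherwise scan the full frame. A minimal sketch, assuming an (x, y) pixel position, a square window, and an invented resolution threshold (the patent specifies neither the window size nor a numeric threshold):

```python
def recognition_window(image_size, event_pos, half_size=100,
                       resolution_threshold=1_000_000):
    """Return the (left, top, right, bottom) region that primary event
    recognition should scan, per steps S37 and S38.

    If the image has fewer pixels than resolution_threshold (a made-up
    value), recognition is not restricted and the full frame is returned
    (step S37, low load). Otherwise the range is restricted to a window
    around event_pos, clipped to the image bounds (step S38).
    """
    width, height = image_size
    if width * height < resolution_threshold:
        return (0, 0, width, height)  # low resolution: no restriction

    x, y = event_pos
    left = max(0, x - half_size)
    top = max(0, y - half_size)
    right = min(width, x + half_size)
    bottom = min(height, y + half_size)
    return (left, top, right, bottom)
```

Clipping to the image bounds matters when the estimated position lies near an edge, as with a ball recognized at the side of the road.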
  • FIG. 14 is a block diagram showing an example of the configuration of the notification control device 22 according to the fourth embodiment.
  • The notification control device 22 is characterized by including a communication unit 23. The other configurations and operations are the same as in the third embodiment, so their detailed description is omitted here.
  • The communication unit 23 is communicably connected to each of an image recognition learning server 24 and a secondary event estimation data server 25 provided outside the notification control device 22, and acquires the parameters and data described later from them.
  • The image recognition learning server 24 performs machine learning on the parameters used when the primary event recognition unit 3 recognizes the primary event from an image, so as to improve the recognition accuracy of the primary event recognition unit 3.
  • The secondary event estimation data server 25 accumulates the data used when the secondary event estimation unit 4 estimates the secondary event.
  • The image recognition learning server 24 and the secondary event estimation data server 25 may be provided separately or integrated into one.
  • The function of the communication unit 23 in the notification control device 22 is realized by a processing circuit. That is, the notification control device 22 includes a processing circuit for communicating with each of the image recognition learning server 24 and the secondary event estimation data server 25.
  • The processing circuit is, for example, the processor 12 that executes a program stored in the memory 13, as shown in FIG. 4.
  • The function of the communication unit 23 in the notification control device 22 is realized by software, firmware, or a combination of software and firmware.
  • Software or firmware is described as a program and stored in the memory 13.
  • The processing circuit realizes the function of the communication unit 23 by reading out and executing the program stored in the memory 13. That is, the notification control device 22 includes the memory 13 for storing a program that, when executed, results in the step of communicating with each of the image recognition learning server 24 and the secondary event estimation data server 25. It can also be said that this program causes a computer to execute the procedure or method of the communication unit 23.
  • FIG. 15 is a flowchart showing an example of the operation of the notification control device 22. Steps S41 to S48 in FIG. 15 correspond to steps S31 to S38 in FIG. 12. In step S47 of FIG. 15, if the recognition range of the primary event recognition unit 3 is not to be restricted, the process proceeds to step S49. Step S49 is described below.
  • As before, the periphery recognition device 10 is assumed to be a camera that photographs the area ahead of the host vehicle.
  • step S49 the communication unit 23 acquires, from the image recognition learning server 24, parameters used when the primary event recognition unit 3 recognizes the primary event from the image.
  • the parameter used by the primary event recognition unit 3 is updated to the parameter acquired by the communication unit 23.
  • the communication unit 23 acquires data used when the secondary event estimation unit 4 estimates the secondary event from the secondary event estimation data server 25.
  • the data used by the second event estimation unit 4 is updated to the data acquired by the communication unit 23. Note that the process of step S49 is not limited to being performed after step S47 or step S48, and may be performed at an arbitrary timing.
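The update performed in step S49 can be sketched as follows. This is an illustrative sketch only; the class and method names (`NotificationControlDevice`, `StubServer`, `latest_parameters`, `latest_data`) are assumptions for the example and do not appear in the patent.

```python
class StubServer:
    """Stand-in for the image recognition learning server 24 or the
    secondary event estimation data server 25."""
    def __init__(self, payload):
        self.payload = payload

    def latest_parameters(self):
        # Newest machine-learned recognition parameters.
        return self.payload

    def latest_data(self):
        # Newest primary-event -> secondary-event association data.
        return self.payload


class NotificationControlDevice:
    """Holds the local copies that units 3 and 4 work from."""
    def __init__(self, recognition_params, estimation_data):
        self.recognition_params = recognition_params  # used by primary event recognition unit 3
        self.estimation_data = estimation_data        # used by secondary event estimation unit 4

    def step_s49(self, image_server, data_server):
        # Replace the local parameters and data with the newest
        # copies obtained via the communication unit 23.
        self.recognition_params = image_server.latest_parameters()
        self.estimation_data = data_server.latest_data()
```

As the text notes, this update need not follow step S47 or S48; it may run at an arbitrary timing.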
  • FIG. 16 is a flowchart showing an example of the operations of the image recognition learning server 24 and the secondary event estimation data server 25.
  • In step S51, the image recognition learning server 24 performs machine learning on the parameters used when the primary event recognition unit 3 recognizes the primary event from the image, so as to improve the recognition accuracy of the primary event recognition unit 3. The machine-learned parameters are then updated.
  • The secondary event estimation data server 25 accumulates the data used when the secondary event estimation unit 4 estimates the secondary event. Specifically, the secondary event estimation data server 25 accumulates data associating primary events with secondary events as shown in FIG. 3, for example, and updates the data as necessary.
  • In step S53, the image recognition learning server 24 transmits the parameters used when the primary event recognition unit 3 recognizes the primary event from the image to the communication unit 23 of the notification control device 22. Similarly, the secondary event estimation data server 25 transmits the data used when the secondary event estimation unit 4 estimates the secondary event to the communication unit 23 of the notification control device 22. Note that the image recognition learning server 24 and the secondary event estimation data server 25 may transmit the parameters and the data simultaneously or separately. Further, the timing at which the image recognition learning server 24 and the secondary event estimation data server 25 transmit the parameters and the data may be a predetermined timing or an arbitrary timing set by the user.
  • As described above, by updating the parameters used when the primary event recognition unit 3 recognizes the primary event from the image, the recognition accuracy of the primary event by the primary event recognition unit 3 can be improved. Moreover, by updating the data used when the secondary event estimation unit 4 estimates the secondary event, the accuracy with which the secondary event estimation unit 4 estimates the secondary event can be improved.
  • Although FIG. 14 illustrates the case where the notification control device 19 shown in FIG. 11 further includes the communication unit 23, the present invention is not limited to this.
  • FIG. 17 is a block diagram showing an example of the configuration of the notification control device 26 according to the fifth embodiment.
  • The notification control device 26 includes a vehicle control unit 27. Since the other configurations and operations are the same as those in the first embodiment, detailed description thereof is omitted here.
  • The vehicle control unit 27 controls the driving operation of the host vehicle based on the secondary event estimation result by the secondary event estimation unit 4. Specifically, the vehicle control unit 27 controls each operation of the accelerator, the brake, and the steering wheel of the host vehicle.
  • The function of the vehicle control unit 27 in the notification control device 26 is realized by a processing circuit. That is, the vehicle control unit 27 includes a processing circuit for controlling the driving of the host vehicle based on the secondary event estimation result by the secondary event estimation unit 4.
  • The processing circuit is, for example, the processor 12 that executes a program stored in the memory 13 as shown in FIG.
  • The function of the vehicle control unit 27 in the notification control device 26 may also be realized by software, firmware, or a combination of software and firmware.
  • Software or firmware is described as a program and stored in the memory 13.
  • The processing circuit implements the function of the vehicle control unit 27 by reading and executing the program stored in the memory 13. That is, the notification control device 26 includes the memory 13 for storing a program that, when executed, results in the execution of a step of controlling the driving of the host vehicle based on the secondary event estimation result by the secondary event estimation unit 4. It can also be said that this program causes a computer to execute the procedure or method of the vehicle control unit 27.
  • FIG. 18 is a flowchart showing an example of the operation of the notification control device 26. Note that steps S61 to S65 in FIG. 18 correspond to steps S11 to S15 in FIG. Hereinafter, step S66 will be described.
  • Here, it is assumed that the periphery recognition device 10 is a camera that captures images ahead of the host vehicle.
  • In step S66, the vehicle control unit 27 controls the driving operation of the host vehicle based on the secondary event estimation result by the secondary event estimation unit 4. Specifically, the vehicle control unit 27 controls the driving operation of the host vehicle by, for example, reducing the steering reaction force in the direction that avoids the secondary event, steering in the direction that avoids the secondary event, or applying the brake early. For example, referring to FIG. 3, when the secondary event is “interrupt vehicle”, the vehicle control unit 27 performs control to decelerate the host vehicle.
  • Note that the vehicle control unit 27 may also control the driving operation of the host vehicle based on the recognition result by the primary event recognition unit 3. For example, referring to FIG. 3, when the primary event is “ball jumping out”, the vehicle control unit 27 may control the steering operation so as to avoid the ball.
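The control decisions described for step S66 and the note above can be illustrated as a small dispatch function. The function and the action names are hypothetical; the patent describes the behavior but does not define a software interface.

```python
def control_action(primary_event=None, secondary_event=None):
    """Choose a driving operation from the recognized primary event
    and the estimated secondary event (illustrative sketch).

    The estimated secondary event is checked first, since step S66
    bases vehicle control on the estimation result; the primary
    event is used as a fallback, per the note about unit 3.
    """
    if secondary_event == "interrupt vehicle":
        return "decelerate"        # apply the brake early (step S66 example)
    if primary_event == "ball jumping out":
        return "steer to avoid"    # steering operation avoids the ball
    return "no intervention"       # assumed default; not stated in the text
```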
  • As described above, the notification control device 26 can not only give notification of the secondary event but also assist the driving operation of the host vehicle.
  • Although FIG. 17 illustrates the case where the notification control device 6 shown in FIG. 2 further includes the vehicle control unit 27, the present invention is not limited to this.
  • The vehicle control unit 27 may be provided in each of the notification control device 17 shown in FIG. 8, the notification control device 19 shown in FIG. 11, and the notification control device 22 shown in FIG. 14.
  • In the above, the case where the primary event recognition unit 3 recognizes the primary event has been described, but the present invention is not limited to this.
  • For example, the primary event recognized by another vehicle may be acquired from the other vehicle by inter-vehicle communication.
  • Alternatively, the primary event recognized by infrastructure equipment, such as a camera installed on the road, may be acquired from the infrastructure equipment by road-to-vehicle communication.
  • In this way, the host vehicle can estimate the secondary event using a primary event concerning a place that cannot be captured by the periphery recognition device 10 provided in the host vehicle.
  • Further, the secondary event estimated by the secondary event estimation unit 4 may be transmitted to another vehicle.
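A minimal sketch of combining primary events from the host vehicle's own recognition, inter-vehicle communication, and road-to-vehicle communication might look as follows. The merge policy (simple de-duplication in arrival order) and the function name are assumptions for illustration; the patent only states that events from these sources may be acquired.

```python
def collect_primary_events(own_events, v2v_events, r2v_events):
    """Merge primary events from the host vehicle's own recognition,
    inter-vehicle (V2V) communication, and road-to-vehicle (R2V)
    communication, so that the secondary event estimation can also
    use events about places the host vehicle's sensors cannot see."""
    merged = []
    for events in (own_events, v2v_events, r2v_events):
        for event in events:
            if event not in merged:  # drop duplicates reported by several sources
                merged.append(event)
    return merged
```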
  • In Embodiments 1 to 5, the case where the camera serving as the periphery recognition device 10 captures the area ahead of the host vehicle has been described, but the present invention is not limited to this.
  • The camera serving as the periphery recognition device 10 may capture all directions around the host vehicle or any particular direction.
  • The notification control device described above is applicable not only to a vehicle-mounted navigation device, that is, a car navigation device, but also to a navigation device constructed as a system by appropriately combining a PND (Portable Navigation Device) that can be mounted on a vehicle with a server, and to devices other than navigation devices. In this case, each function or each component of the notification control device is distributed among the functions that construct the system.
  • For example, the functions of the notification control device can be arranged in a server.
  • For example, the vehicle may be provided with the periphery recognition device 10 and the HUD 11, and the server 28 may be provided with the peripheral information acquisition unit 2, the primary event recognition unit 3, the secondary event estimation unit 4, and the notification control unit 5. With such a configuration, a notification control system can be constructed.
  • The same applies to the notification control device 17 shown in FIG. 8, the notification control device 19 shown in FIG. 11, the notification control device 22 shown in FIG. 14, and the notification control device 26 shown in FIG.
  • Alternatively, the functions of the notification control device can be distributed between a server and the mobile communication terminal 30.
  • For example, the vehicle may be provided with the periphery recognition device 10 and the HUD 11; the server 29 may be provided with the peripheral information acquisition unit 2, the primary event recognition unit 3, the secondary event estimation unit 4, the own vehicle information acquisition unit 7, the map information acquisition unit 8, and the alarm image generation unit 9; and the mobile communication terminal 30 may be provided with the notification control unit 5. With such a configuration, a notification control system can be constructed.
  • The same applies to the notification control device 17 shown in FIG. 8, the notification control device 19 shown in FIG. 11, the notification control device 22 shown in FIG. 14, and the notification control device 26 shown in FIG.
  • Further, software for executing the operations in the above embodiments may be incorporated in, for example, the server or the mobile communication terminal.
  • That is, the notification control method described above acquires peripheral information indicating at least the situation around the host vehicle; recognizes, from the acquired peripheral information, a primary event that is a situation of an object or a person existing around the host vehicle; estimates, according to a predetermined rule, a secondary event that can occur after the recognized primary event and has a causal relationship with the situation of the primary event; and performs control for giving notification of the estimated secondary event.
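The four steps summarized above (acquire, recognize, estimate, notify) can be sketched end to end. The stand-in recognizer, the dictionary-based rule table, and the example rule pairing "ball jumping out" → "child jumping out" are illustrative assumptions; only the step structure comes from the text.

```python
# Illustrative rule data: primary event -> causally related secondary
# event that may occur after it (the pairing is a hypothetical example
# in the spirit of FIG. 3, not taken from the patent).
RULES = {"ball jumping out": "child jumping out"}

def recognize_primary_event(peripheral_info):
    """Stand-in for the primary event recognition unit 3: recognize
    the primary event from acquired peripheral information."""
    return peripheral_info.get("detected")

def notification_control(peripheral_info, rules=RULES):
    """Run the method's four steps on one acquired observation."""
    primary = recognize_primary_event(peripheral_info)  # recognize primary event
    secondary = rules.get(primary)                      # estimate per predetermined rule
    if secondary is not None:
        return "notify: " + secondary                   # control notification of secondary event
    return None                                         # nothing to notify
```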

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)

Abstract

The purpose of the present invention is to provide a notification control device and a notification control method that enable appropriate notification to be given to a driver. A notification control device according to the present invention includes: a peripheral information acquisition unit that acquires peripheral information indicating at least the situation around a host vehicle; a primary event recognition unit that recognizes, from the peripheral information acquired by the peripheral information acquisition unit, a primary event that is a situation concerning an object or a person present around the host vehicle; a secondary event estimation unit that estimates, according to a predetermined rule, a secondary event that is a situation that can occur after the primary event recognized by the primary event recognition unit and that has a causal relationship with the situation of the primary event; and a notification control unit that performs control for giving notification of the secondary event estimated by the secondary event estimation unit.
PCT/JP2017/000193 2017-01-06 2017-01-06 Dispositif et procédé de commande de rapport WO2018127962A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2018560292A JP6661796B2 (ja) 2017-01-06 2017-01-06 報知制御装置および報知制御方法
PCT/JP2017/000193 WO2018127962A1 (fr) 2017-01-06 2017-01-06 Dispositif et procédé de commande de rapport

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/000193 WO2018127962A1 (fr) 2017-01-06 2017-01-06 Dispositif et procédé de commande de rapport

Publications (1)

Publication Number Publication Date
WO2018127962A1 true WO2018127962A1 (fr) 2018-07-12

Family

ID=62791235

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/000193 WO2018127962A1 (fr) 2017-01-06 2017-01-06 Dispositif et procédé de commande de rapport

Country Status (2)

Country Link
JP (1) JP6661796B2 (fr)
WO (1) WO2018127962A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112241677A (zh) * 2019-07-17 2021-01-19 本田技研工业株式会社 信息提供装置、信息提供方法及存储介质
US20220126818A1 (en) * 2020-10-28 2022-04-28 Toyota Research Institute, Inc. Systems and methods for identifying high-risk driving situations from driving data

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014021421A1 (fr) * 2012-08-03 2014-02-06 クラリオン株式会社 Dispositif de calcul de paramètres d'appareil de prise de vues, système de navigation, et procédé de calcul de paramètres d'appareil de prise de vues
JP2014203349A (ja) * 2013-04-08 2014-10-27 スズキ株式会社 車両運転支援装置
JP2016192104A (ja) * 2015-03-31 2016-11-10 キヤノンマーケティングジャパン株式会社 情報処理装置、その制御方法及びプログラム

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006072830A (ja) * 2004-09-03 2006-03-16 Aisin Aw Co Ltd 運転支援システム及び運転支援モジュール
JP2006259895A (ja) * 2005-03-15 2006-09-28 Omron Corp 移動体の発進制御装置


Also Published As

Publication number Publication date
JP6661796B2 (ja) 2020-03-11
JPWO2018127962A1 (ja) 2019-06-27

Similar Documents

Publication Publication Date Title
KR102070530B1 (ko) 모션 계획에 기초한 자율 주행 차량의 운행 방법 및 시스템
JP6894471B2 (ja) 自動運転車(adv)のサブシステムによるパトロールカーのパトロール
US11040726B2 (en) Alarm system of autonomous driving vehicles (ADVs)
EP3324332B1 (fr) Procédé et système pour prédire le comportement de la circulation de véhicule pour des véhicules autonomes pour prendre des décisions de pilotage
US10816973B2 (en) Utilizing rule-based and model-based decision systems for autonomous driving control
US20190071101A1 (en) Driving assistance method, driving assistance device which utilizes same, autonomous driving control device, vehicle, driving assistance system, and program
US10183641B2 (en) Collision prediction and forward airbag deployment system for autonomous driving vehicles
JP6552992B2 (ja) 情報処理装置、車載装置および情報処理方法
JP6412070B2 (ja) 運転支援装置及び運転支援方法
KR20190013688A (ko) 자율 주행을 위한 지도 이미지에 기반한 교통 예측
JP6305650B2 (ja) 自動運転装置及び自動運転方法
US11508161B2 (en) Driving support system and server device
US11092458B2 (en) Navigation system with operation obstacle alert mechanism and method of operation thereof
WO2018127962A1 (fr) Dispositif et procédé de commande de rapport
WO2019131388A1 (fr) Dispositif, système et procédé d'assistance à la conduite, et support d'enregistrement dans lequel est stocké un programme d'assistance à la conduite
JP2020147148A (ja) 情報処理装置及び情報処理装置を備える自動走行制御システム
US10543852B2 (en) Environmental driver comfort feedback for autonomous vehicle
JP2019016227A (ja) 情報処理方法、情報処理装置及び情報処理プログラム
JP7047001B2 (ja) 交通リスク低減プログラム、情報処理装置及び方法
JP2018169945A (ja) 運転支援装置、運転支援方法及び運転支援プログラム
KR20210102212A (ko) 화상 처리 장치, 화상 처리 방법 및 화상 처리 시스템
US20240051569A1 (en) Long-term evolution computing platform for autonomous vehicles based on shell and nut architecture
JP7050606B2 (ja) 警告制御装置、ナビゲーションシステム、及び、警告制御プログラム
JP7417891B2 (ja) 報知制御装置、報知装置、報知制御方法、報知制御プログラム、および、車両情報送信装置
US12017585B2 (en) Abnormality detection device, abnormality detection method, program, and information processing system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17889571

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2018560292

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17889571

Country of ref document: EP

Kind code of ref document: A1