EP3288005B1 - Dispositif de commande en cas d'occultation - Google Patents

Dispositif de commande en cas d'occultation

Info

Publication number
EP3288005B1
EP3288005B1 (application EP15889886.6A)
Authority
EP
European Patent Office
Prior art keywords
host vehicle
proportion
close observation
blind spot
observation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
EP15889886.6A
Other languages
German (de)
English (en)
Other versions
EP3288005A1 (fr)
EP3288005A4 (fr)
Inventor
Masanori YOSHIHIRA
Seigo Watanabe
Norimasa Kishi
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nissan Motor Co Ltd
Original Assignee
Nissan Motor Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nissan Motor Co Ltd filed Critical Nissan Motor Co Ltd
Publication of EP3288005A1 publication Critical patent/EP3288005A1/fr
Publication of EP3288005A4 publication Critical patent/EP3288005A4/fr
Application granted granted Critical
Publication of EP3288005B1 publication Critical patent/EP3288005B1/fr
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R21/00 Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
    • B60R21/01 Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents
    • B60R21/013 Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents including means for detecting collisions, impending collisions or roll-over
    • B60R21/0132 Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents including means for detecting collisions, impending collisions or roll-over responsive to vehicle motion parameters, e.g. to vehicle longitudinal or transversal deceleration or speed value
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00 Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/08 Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W30/095 Predicting travel path or likelihood of collision
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F3/147 Digital output to display device; Cooperation and interconnection of the display device with other functional units using display panels
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems
    • G08G1/167 Driving aids for lane monitoring, lane changing, e.g. blind spot detection
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00 Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/12 Mirror assemblies combined with other articles, e.g. clocks
    • B60R2001/1253 Mirror assemblies combined with other articles, e.g. clocks with cameras, video cameras or video screens
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R21/00 Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks

Definitions

  • the present invention relates to an occlusion control device.
  • a driving assistance device has been known for predicting a risk of contact between a host vehicle and an obstacle around the host vehicle in a case where the host vehicle is traveling under driving behavior according to at least one or more normative behavior candidates, the normative behavior candidates being candidates for normative driving behavior of the host vehicle for a situation around the host vehicle (see Patent Literature 1).
  • Patent Literature 1 calculates a potential contact risk and determines the driving behavior based on the contact risk.
  • Patent Literature 1 Japanese Patent Application Publication No. 2011-096105
  • in Patent Literature 1, the proportion of the blind spot for the host vehicle with respect to a region on a map, which affects the contact risk, is not taken into account. Thus, driving behavior excessively focusing on safety may be determined, and this may make surrounding people uncomfortable.
  • US 2011/0102195 A1 relates to a system for providing assistance information when a vehicle attempts to enter a priority road, wherein a visibility determination is performed by comparing object information detected by on-board sensors with object information received from an exterior infrastructure facility installed near the intersection.
  • An object of the present invention is to provide an occlusion control device that suppresses driving behavior excessively reducing a risk of contact with a moving object.
  • the occlusion control device 1 calculates proportion of a blind spot 33 for a host vehicle 41 with respect to a close observation detecting criterion (31) set on a map, determines coping behavior of the host vehicle 41 based on this proportion, and performs driving assistance for the host vehicle 41 according to this coping behavior.
  • the close observation detecting criterion (31) is a criterion set in one or more close observation regions in which presence or absence of a moving object should be closely observed while the host vehicle 41 is traveling.
  • the close observation detecting criterion (31) is illustrated with one or more frames (close observation frames 31; Figs.
  • the blind spot 33 for the host vehicle 41 also includes a blind spot for a camera or a laser sensor mounted in the host vehicle.
  • the coping behavior of the host vehicle 41 at least includes coping behavior that takes account of the blind spot 33 for the host vehicle 41 (blind spot coping behavior) and normal coping behavior that does not take account of the blind spot 33 for the host vehicle 41 (normal coping behavior).
  • the occlusion control device 1 includes a GPS 11, a map database 12, a vehicle-mounted camera 13 and a laser sensor 14, an operating unit 15, a close observation frame database 16, and an arithmetic circuit 17a.
  • the GPS 11 is an example of a host vehicle position detector that detects a current position and an orientation of the host vehicle.
  • the GPS 11 receives radio waves from NAVSTAR satellites of the global positioning system and determines the position and the orientation of the host vehicle in real time.
  • the host vehicle position detector may be a yaw rate sensor and a vehicle speed sensor for performing odometry (self position estimation) or may be used in combination with the GPS 11.
  • the map database 12 is an example of a map storage for storing map data showing shapes of roads on which the host vehicle can travel.
  • the close observation frame database 16 stores data on a position and a size of the close observation frame 31 (see Fig. 4 ), which is an example of the close observation detecting criterion, on the map.
  • Map data in which the close observation frame is set in advance may be stored in one database.
  • the close observation detecting criterion including the close observation frame is set in the one or more close observation regions on the map in which presence or absence of the moving object should be closely observed.
  • the "moving object" includes a vehicle and a light vehicle that are traveling or standing on the road as well as a pedestrian.
  • the vehicle-mounted camera 13 and the laser sensor 14 are an example of an obstacle detector that detects positions of obstacles around the host vehicle.
  • the vehicle-mounted camera 13 is mounted in the host vehicle and takes an image of surroundings of the host vehicle to obtain a surroundings image.
  • the arithmetic circuit 17a analyzes the surroundings image to determine presence or absence and the position of the obstacle.
  • the "obstacle" includes the buildings 36, 37 present around the host vehicle (see Fig. 4 ) as well as a wall, a tree, and a signboard that are fixed on the ground; the "obstacle" also includes the above-mentioned moving object.
  • the laser sensor 14 emits pulses of laser light and detects light reflected from the obstacle, thereby detecting a distance and a direction from the host vehicle to the obstacle.
  • the operating unit 15 includes members for receiving an instruction from the occupant of the host vehicle, such as a touch panel arranged on an instrument panel, a steering switch, and a microphone.
  • the arithmetic circuit 17a uses information on the host vehicle position, the map, the obstacle, and the close observation frame to calculate proportion of the blind spot 33 for the host vehicle 41 with respect to the close observation frame 31. Then, the arithmetic circuit 17a determines the coping behavior of the host vehicle 41 based on this proportion and performs a series of computing processing for performing the driving assistance for the host vehicle 41.
  • the arithmetic circuit 17a is a general-purpose microcomputer including a CPU, a RAM, a ROM, a memory, and an input/output control circuit.
  • a computer program in which the series of arithmetic processes is described is installed in the microcomputer in advance. By executing the computer program, the microcomputer constructs multiple processing circuits for executing the above-mentioned series of arithmetic processes.
  • the multiple processing circuits constructed by the arithmetic circuit 17a are described later by reference to Fig. 2 .
  • the arithmetic circuit 17a includes a scene understanding unit 21 and a driving assistance unit 22.
  • the scene understanding unit 21 calculates proportion of the blind spot for the host vehicle and determines the coping behavior of the host vehicle based on this proportion.
  • the driving assistance unit 22 performs the driving assistance for the host vehicle.
  • the driving assistance may be autonomous driving control in which the driving assistance unit 22 drives various actuators so as to proactively perform all driving operations, including steering operation and pedal operation.
  • alternatively, the driving operation that the driver should perform may be indicated to the driver via the senses, such as hearing, sight, and touch.
  • the scene understanding unit 21 includes a map obtaining unit 23, a route calculator 24, a close observation frame obtaining unit 25, a sensing range calculator 26, a blind spot calculator 27, a visibility proportion calculator 28 (proportion calculator), and a proportion determining unit 29 (behavior determining unit).
  • the route calculator 24 calculates a scheduled traveling route 51 (see Fig. 4 ) from the current position of the host vehicle determined by the GPS 11 to a destination that the operating unit 15 receives. Note that, in the embodiments, a description for a case where the occlusion control device 1 has a function of computing the scheduled traveling route by itself is given. However, the occlusion control device 1 may obtain the scheduled traveling route 51 calculated by another device from outside.
  • the map obtaining unit 23 obtains the map data according to the scheduled traveling route 51 from the map database 12.
  • a digital map can be used as the map data.
  • the digital map includes curb information indicating a position of a curb or road network information.
  • the curb information is used for calculating a travelable region of the host vehicle.
  • the road network information is used for finding a region where the host vehicle 41 can travel at the later-mentioned time.
  • the close observation frame obtaining unit 25 obtains data on the position and the size of the close observation frame 31 on the map from the close observation frame database 16.
  • the map obtaining unit 23 uses the obtained data on the close observation frame 31 to generate map data in which the close observation frame 31 is set. In this way, the map obtaining unit 23 can obtain the map data with the close observation frame 31 set in the one or more close observation regions in which presence or absence of the moving object should be closely observed.
  • the sensing range calculator 26 calculates a sensing range 32 (see Fig. 4 ) on the map.
  • the "sensing range 32" represents a range in which the vehicle-mounted camera 13 and the laser sensor 14 can detect the obstacle if there is no obstacle around the host vehicle 41.
  • the sensing range 32 can be calculated for each of the vehicle-mounted camera 13 and the laser sensor 14, since the sensing range 32 is determined by the mounting positions and angles of the vehicle-mounted camera 13 and the laser sensor 14 with respect to the host vehicle 41.
  • the sensing range 32 on the map can be calculated based on the current position and orientation of the host vehicle 41 as well as the map data.
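As a concrete illustration of how the sensing range calculator 26 can place the sensing range 32 on the map, a minimal sketch follows. It approximates the range as a circular sector derived from the host-vehicle pose, written in Python with the shapely geometry library; neither the language, the library, nor the numeric values appear in the patent, and the field of view, reach, and sampling count are illustrative stand-ins for the mounting position and angle of the vehicle-mounted camera 13 / laser sensor 14.

```python
import math
from shapely.geometry import Polygon  # assumed third-party geometry library

def sensing_range_polygon(x, y, heading, fov_deg=90.0, max_range=40.0, n=24):
    """Approximate the sensing range 32 as a circular sector on the map.

    (x, y) and heading are the host-vehicle pose from the GPS 11;
    fov_deg, max_range and n are illustrative values, not taken from
    the patent.
    """
    half = math.radians(fov_deg) / 2.0
    pts = [(x, y)]                      # sector apex at the sensor position
    for i in range(n + 1):              # sample the outer arc of the sector
        a = heading - half + i * (2.0 * half / n)
        pts.append((x + max_range * math.cos(a), y + max_range * math.sin(a)))
    return Polygon(pts)
```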
  • the blind spot calculator 27 calculates presence or absence of the blind spot 33 for the host vehicle 41 created by the obstacle and the range of the blind spot 33. Within the blind spots for the host vehicle 41 created by the obstacle (e.g., the buildings 36, 37), the blind spot calculator 27 detects a part that overlaps with the sensing range 32 as the blind spot 33 for the host vehicle.
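One possible geometric reading of the blind spot calculator 27 is sketched below: every edge of an obstacle polygon (such as the buildings 36, 37) casts a quadrilateral shadow away from the sensor position, and the union of those shadows clipped to the sensing range 32 is taken as the blind spot 33. This construction and the use of shapely are assumptions for illustration, not the patent's stated algorithm.

```python
from shapely.geometry import Polygon
from shapely.ops import unary_union

def blind_spot(sensor_xy, obstacles, sensing_range, far=1000.0):
    """Region of the sensing range 32 occluded by obstacle polygons.

    obstacles and sensing_range are shapely Polygons; far is simply a
    radius larger than the sensing range.
    """
    sx, sy = sensor_xy

    def project(p):
        # push a point radially outward from the sensor to distance `far`
        dx, dy = p[0] - sx, p[1] - sy
        d = max((dx * dx + dy * dy) ** 0.5, 1e-9)
        return (sx + dx * far / d, sy + dy * far / d)

    shadows = []
    for obs in obstacles:
        ring = list(obs.exterior.coords)
        for a, b in zip(ring, ring[1:]):            # every obstacle edge
            quad = Polygon([a, b, project(b), project(a)]).buffer(0)
            shadows.append(quad)                    # shadow cast by the edge
    return unary_union(shadows).intersection(sensing_range)
```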
  • the visibility proportion calculator 28 calculates proportion of the blind spot 33 for the host vehicle 41 with respect to the close observation frame 31 (the close observation detecting criterion). For example, with respect to the entire area of the close observation frame 31, the visibility proportion calculator 28 calculates proportion of the area of the close observation frame that overlaps with the blind spot 33 for the host vehicle.
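The area-based proportion used by the visibility proportion calculator 28 then reduces to an intersection-over-area computation; a minimal sketch, again assuming shapely polygons:

```python
def blind_spot_proportion(close_observation_frame, blind_spot):
    """Share of the close observation frame 31 hidden by the blind spot 33.

    Both arguments are shapely Polygons; the result lies in 0.0 .. 1.0,
    so 0.20 corresponds to the 20% example used later in the description.
    """
    if close_observation_frame.is_empty or close_observation_frame.area == 0.0:
        return 0.0
    overlap = close_observation_frame.intersection(blind_spot)
    return overlap.area / close_observation_frame.area
```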
  • the proportion determining unit 29 determines the coping behavior of the host vehicle 41. Specifically, based on the above-mentioned proportion, the proportion determining unit 29 selects either the blind spot coping behavior or the normal coping behavior.
  • in Fig. 4 , an example where the close observation detecting criterion is the close observation frames 31 surrounding the outer peripheries of the close observation regions is described.
  • Figs. 4(a), 4(c), and 4(d) show the host vehicle 41 traveling along the scheduled traveling route 51 that turns right at an intersection of a trifurcate road where three roads converge. Since the buildings 36 and 37 as obstacles stand on both sides of the host vehicle 41 entering the intersection from a traveling direction 43, the blind spots 33 for the host vehicle 41 are created by the buildings 36 and 37. As described above, the blind spots 33 are calculated by the blind spot calculator 27.
  • the close observation frame obtaining unit 25 obtains the close observation frames 31 in which presence or absence of the moving object should be closely observed for the scheduled traveling route 51 turning right at the intersection.
  • the visibility proportion calculator 28 calculates proportion of each area of the close observation frames 31 that overlaps with the corresponding blind spot 33 for the host vehicle 41.
  • the close observation frames 31 are set in areas that tend to be the blind spots 33 for the host vehicle 41 because of the buildings 36 and 37.
  • while the close observation frames 31 do not overlap with the blind spots 33, the visibility proportion calculator 28 performs no proportion calculating operation; thus, the proportion is 0%.
  • the driving assistance unit 22 performs the normal coping behavior.
  • once an overlap appears, the visibility proportion calculator 28 starts calculating the proportion.
  • when the proportion of the area of each close observation frame 31 overlapping with the blind spot 33 for the host vehicle 41 becomes greater than a predetermined value (e.g., 20%), the proportion determining unit 29 selects the blind spot coping behavior as the coping behavior of the host vehicle 41.
  • until then, the vehicle speed of the host vehicle 41 is a speed based on the normal coping behavior.
  • under the blind spot coping behavior, the host vehicle 41 first decelerates to zero. Thereafter, the host vehicle 41 crawls forward until the visibility proportion decreases to a value smaller than or equal to the predetermined value and moves to a position where it can view the entirety of the close observation frames 31. Returning to the normal coping behavior, the host vehicle 41 stops temporarily and then resumes the operation for turning right.
  • next, an example of an occlusion control method according to the first embodiment is described.
  • the occlusion control method is performed repeatedly at a predetermined cycle and ends with the termination of the occlusion control device.
  • in step S01, the map obtaining unit 23 obtains the map data from the map database 12.
  • the processing proceeds to step S03; based on the information on the position and the destination of the host vehicle 41, the route calculator 24 computes the scheduled traveling route 51 of the host vehicle.
  • the map data of an area according to the scheduled traveling route 51 may be obtained after the route computing. This makes it possible to reduce an amount of the obtained data.
  • in step S05, the close observation frame obtaining unit 25 obtains the data on the positions and the sizes of the close observation frames 31 on the map from the close observation frame database 16.
  • the map obtaining unit 23 uses the obtained data on the close observation frames 31 to generate the map data in which the close observation frames 31 are set.
  • the processing proceeds to step S07; the sensing range calculator 26 computes the sensing range 32 on the map based on the current position and orientation of the host vehicle as well as the map data.
  • the blind spot calculator 27 detects a part that overlaps with the sensing range 32 as the blind spot 33 for the host vehicle. Specifically, the blind spot calculator 27 obtains position information of the buildings 36, 37 present around the vehicle that is detected by the vehicle-mounted camera 13 and the laser sensor 14. By comparing the sensing range 32 and the positions of the buildings 36, 37, the blind spot calculator 27 can calculate the blind spots 33 for the host vehicle that overlap with the sensing range 32.
  • in step S11, the scene understanding unit 21 determines whether the close observation frames 31 and the blind spots 33 of the host vehicle overlap with each other. If they overlap with each other (YES in S11), the processing proceeds to step S13. If they do not overlap with each other (NO in S11), the processing returns to step S07. In step S13, with respect to the entire area of the close observation frame 31, the visibility proportion calculator 28 calculates the proportion of the area of each close observation frame that overlaps with the blind spot 33 for the host vehicle.
  • Ts: a predetermined starting threshold
  • when the proportion becomes greater than the starting threshold (Ts), the coping behavior switches from the normal coping behavior to the blind spot coping behavior (S19). This makes the vehicle speed decelerate from a usual speed to zero. Performing such blind spot coping behavior lowers the contact risk in a scene with bad visibility and allows safe traveling. Then, the blind spot coping behavior is continued until the proportion becomes smaller than the predetermined ending threshold (Te). Specifically, as shown in Fig. 10 , the host vehicle crawls forward until the proportion becomes smaller than the predetermined ending threshold (Te). When the proportion becomes smaller than the predetermined ending threshold (Te), the host vehicle is stopped.
  • the starting threshold (Ts) and the ending threshold (Te) may be the same value; however, the ending threshold (Te) is desirably smaller than the starting threshold (Ts) so that hysteresis is provided, as sketched below. This improves the stability of the vehicle control system. For example, when the starting threshold (Ts) is set to 10%, the ending threshold (Te) may be set to 5%.
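A minimal sketch of this switching logic with the 10% / 5% example values; the function and state names are illustrative, but the hysteresis (Te smaller than Ts) follows the description:

```python
NORMAL = "normal coping behavior"
BLIND_SPOT = "blind spot coping behavior"

def update_coping_behavior(current, proportion, ts=0.10, te=0.05):
    """Switch between coping behaviors with hysteresis (Ts > Te)."""
    if current == NORMAL and proportion > ts:
        return BLIND_SPOT   # start coping with the blind spot (S19)
    if current == BLIND_SPOT and proportion < te:
        return NORMAL       # visibility recovered; resume normal control
    return current
```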
  • in a case where the moving object is detected in the region with good visibility while the vehicle is moving to make the proportion smaller than the predetermined ending threshold (Te), the vehicle is stopped. Alternatively, if the moving speed of the moving object is slow, the vehicle may be accelerated to pass in front of the moving object.
  • Te: a predetermined ending threshold
  • Determining the coping behavior of the host vehicle 41 based on the proportion of the blind spot 33 for the host vehicle 41 with respect to the close observation frame 31 (close observation detecting criterion) can suppress the driving behavior excessively focusing on safety.
  • the driving behavior excessively focusing on safety can be suppressed, and discomfort that surrounding people feel can be lessened.
  • the visibility proportion calculator 28 calculates the proportion of the area of the close observation frame 31 that overlaps with the blind spot 33 of the host vehicle 41. Since the coping behavior is determined according to the area ratio between the blind spot 33 and the frame surrounding the outer periphery of the close observation region, risk computing with high accuracy can be achieved with a simple model.
  • the proportion determining unit 29 decides to start the coping behavior that takes account of the blind spot 33 for the host vehicle 41. Switching the traveling control based on the starting threshold (Ts) as a boundary can achieve appropriate control for each scene. Performing the coping behavior excessively reducing the contact risk is suppressed.
  • as the vehicle speed of the host vehicle 41 increases, the starting threshold (Ts) is set smaller (see the sketch below). In this way, sudden speed control or steering control at the start of the control is suppressed, and thus the control can be started smoothly.
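A possible reading of this speed dependence (claim 4 states only that the starting threshold is lowered as the host-vehicle speed increases); the linear interpolation and all numeric values below are assumptions:

```python
def starting_threshold(speed_kmh, t_slow=0.20, t_fast=0.05,
                       v_slow=10.0, v_fast=60.0):
    """Lower the starting threshold Ts as the host-vehicle speed rises."""
    if speed_kmh <= v_slow:
        return t_slow
    if speed_kmh >= v_fast:
        return t_fast
    f = (speed_kmh - v_slow) / (v_fast - v_slow)   # 0..1 between the anchors
    return t_slow + f * (t_fast - t_slow)
```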
  • the driving assistance unit 22 allows the host vehicle to move until the proportion becomes smaller than the predetermined ending threshold (Te), and allows the host vehicle to stop thereafter. In this way, safe traveling can be performed even in a situation with bad visibility.
  • the close observation detecting criterion may be a close observation point group including multiple close observation points 42 provided in the close observation region.
  • the visibility proportion calculator 28 calculates proportion of the number of the close observation points 42 that overlap with the blind spots 33 for the host vehicle 41.
  • other configurations and operation procedures are the same as those in the first embodiment; thus, descriptions thereof are omitted.
  • the close observation points 42 are arranged irregularly within the close observation region.
  • density distribution of the close observation points 42 within the close observation region changes according to magnitude of the contact risk.
  • the close observation points 42 are distributed in high density at: a position where the moving object is likely to exist; a position with bad visibility; or a position where the risk of having the contact between the moving object and the host vehicle 41 is high if the moving object exists. This makes it possible to calculate appropriate proportion according to the contact risk.
  • the visibility proportion calculator 28 simply counts the number of the close observation points 42 that overlap with the blind spot 33 for the host vehicle 41.
  • the starting threshold (Ts) and the ending threshold (Te) can be determined based on the number of the close observation points 42.
  • the proportion may also be calculated, like the first embodiment, by using the area ratio between the overlapping part and a close observation frame surrounding an outer periphery of the close observation point 42. That is, as far as the computing processing load allows, each close observation point 42 can be used as one of the close observation frames 31 in Fig. 4 .
  • the proportion can be simply obtained from the number of the close observation points by setting the close observation point group including the multiple close observation points 42 provided in the close observation region as the close observation detecting criterion.
  • risk computing with high accuracy can be achieved without increasing calculation load.
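The point-count variant of the second embodiment can be sketched as follows (shapely again assumed; the close observation points 42 are passed as (x, y) tuples):

```python
from shapely.geometry import Point

def point_based_proportion(close_observation_points, blind_spot):
    """Proportion = hidden close observation points 42 / all points."""
    if not close_observation_points:
        return 0.0
    hidden = sum(1 for p in close_observation_points
                 if blind_spot.contains(Point(p)))
    return hidden / len(close_observation_points)
```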
  • the close observation detecting criterion may be a line bundle including one or more close observation line segments 44 provided in the close observation region.
  • the visibility proportion calculator 28 calculates proportion of the lengths of the close observation line segments 44 that overlap with the blind spot 33 for the host vehicle 41.
  • other configurations and operation procedures are the same as those in the first embodiment; thus, descriptions thereof are omitted.
  • the lengths, the number, and arrangement of the close observation line segments 44 can be changed arbitrarily. According to the magnitude of the contact risk, density distribution of the close observation line segments 44 within the close observation region is changed. That is, the close observation line segments 44 are distributed in high density at: a position where the moving object is likely to exist; a position with bad visibility; or a position where the risk of having the contact between the moving object and the host vehicle 41 is high if the moving object exists. This makes it possible to calculate appropriate proportion according to the contact risk.
  • the visibility proportion calculator 28 may simply obtain the length of the close observation line segments 44 that overlap with the blind spot 33 for the host vehicle 41.
  • the starting threshold (Ts) and the ending threshold (Te) can be determined based on the length of the close observation line segment 44.
  • the proportion can be simply obtained from the length of the close observation line segment 44 by setting the one or more close observation line segments 44 provided in the close observation region as the close observation detecting criterion.
  • risk computing with high accuracy can be achieved without increasing calculation load.
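Likewise, a sketch of the length-based proportion of the third embodiment, with each close observation line segment 44 given as a list of coordinates and the blind spot 33 as a shapely Polygon:

```python
from shapely.geometry import LineString

def line_based_proportion(close_observation_segments, blind_spot):
    """Proportion = hidden length of the line segments 44 / total length."""
    total = hidden = 0.0
    for coords in close_observation_segments:
        seg = LineString(coords)
        total += seg.length
        hidden += seg.intersection(blind_spot).length
    return hidden / total if total > 0.0 else 0.0
```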
  • the close observation frame 31 shown in Fig. 4 can be interpreted as a line bundle including four close observation line segments 44 provided in the close observation region.
  • the visibility proportion calculator 28 may just calculate proportion of the length of the close observation frame 31 that overlaps with the blind spot 33 for the host vehicle 41 with respect to the total length of the close observation frame 31.
  • the whole close observation frame 31 may be one line segment including one curved line as shown in Fig. 7 .
  • the close observation frame 31 is not limited to be a quadrangle as shown in Fig. 4 ; for example, the close observation frame 31 may be an oval 45, a true circle, a triangle, or a polygon with five or more corners.
  • proportion of the blind spot that overlaps with the area surrounded by the frame may be calculated like the first embodiment.
  • the close observation detecting criterion may include weighting that is changed according to a position on the map.
  • proportion of the blind spot for the host vehicle with respect to the close observation detecting criterion changes according to the weighting applied to the position on the map where the close observation detecting criterion overlaps with the blind spot for the host vehicle.
  • the close observation detecting criterion is the close observation frame
  • proportion of the area where the close observation frame and the blind spot for the host vehicle overlap with each other with respect to the entire area of the close observation frame changes according to the weighting.
  • Fig. 8 shows an example of a close observation frame 48 to which weighting is applied.
  • the close observation frame 48 includes overlapped multiple (three in Fig. 8 ) ovals with different sizes.
  • each oval is applied with information on accident probability Z as an example of the weighting.
  • an inner smaller oval is applied with information indicating an accident probability Z that is higher than that associated with an outer larger oval.
  • the contact risk is higher in a case where the blind spot 33 for the host vehicle 41 overlaps with the inner smaller oval than in a case where the blind spot 33 for the host vehicle 41 overlaps with the outer larger oval, and the proportion (visibility proportion) becomes greater.
  • the information for the accident probability Z may be obtained by downloading past accident data (46, 47) from an accident information management server as shown in Fig. 8(a) , for example.
  • a position on the map to which a high accident probability Z is applied may be not only a position where an accident occurred in the past, but also another position that is similar to the position where the accident occurred. For example, if there was an accident of hitting another vehicle when turning right or left at an intersection, the high accident probability Z can also be applied to a position where a similar accident can occur when entering a different intersection, or the same intersection from another direction.
  • a value of the accident probability Z may be changed according to a detail of the accident.
  • the accident probability Z of a fatal accident may be higher than that of a property damage accident.
  • the distribution of the accident probability Z shown in Fig. 8(c) can be calculated as a Gaussian distribution, for example.
  • proportion of the blind spot 33 for the host vehicle 41 with respect to the close observation detecting criterion is changed according to the weighting (the accident probability Z) applied to the position on the map that overlaps with the blind spot 33 for the host vehicle 41. This makes it possible to accurately evaluate the contact risk, and to calculate appropriate proportion according to the contact risk.
  • the close observation detecting criterion is made of the oval frame.
  • the close observation line segment and the close observation point can also be weighted in the same way.
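One way to realise this weighting is to sample the close observation region and weight every sample by an accident probability Z; the Gaussian form follows the description, while the sampling approach, the sigma value, and the function names are assumptions:

```python
import math
from shapely.geometry import Point

def accident_probability(p, accident_xy, sigma=5.0):
    """Assumed Gaussian model of the accident probability Z around a past
    accident position (the description mentions a Gaussian distribution)."""
    d2 = (p[0] - accident_xy[0]) ** 2 + (p[1] - accident_xy[1]) ** 2
    return math.exp(-d2 / (2.0 * sigma ** 2))

def weighted_proportion(sample_points, blind_spot, accident_xy, sigma=5.0):
    """Weighted proportion: occluding a high-risk area raises the value
    more than occluding a low-risk one."""
    total = hidden = 0.0
    for p in sample_points:
        w = accident_probability(p, accident_xy, sigma)
        total += w
        if blind_spot.contains(Point(p)):
            hidden += w
    return hidden / total if total > 0.0 else 0.0
```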
  • the close observation detecting criterion may be changed according to a date and time, an environment around the host vehicle 41, or a way of motion of the moving object present around the host vehicle 41.
  • the close observation detecting criterion is changed according to the date and time.
  • the close observation detecting criterion is set wider.
  • the starting threshold and the ending threshold of the blind spot coping control are decreased. This lowers the contact risk and improves safety.
  • a side closer to the host vehicle 41 and a side farther from the host vehicle 41 may be respectively applied with divided close observation frames (31aa, 31ab).
  • the close observation frame 31ab in the closer side is set for detecting a pedestrian
  • the close observation frame 31aa in the farther side is set for detecting a two-wheel vehicle.
  • Different close observation detecting criteria can be set for a case where the other vehicle 53 is standing at a stop line on an entry of the intersection and for a case where the other vehicle 53 is traveling around the stop line.
  • the close observation detecting criterion is the close observation frame
  • as the speed of the other vehicle 53 becomes higher, a close observation frame with a shorter length in the traveling direction of the other vehicle 53 is set.
  • when the other vehicle 53 is standing, a close observation frame with a long length in the traveling direction of the other vehicle 53 is set.
  • the reason why the length of the close observation frame is made long when the other vehicle 53 is standing is that a long-distance view is needed for predicting a two-wheel vehicle coming into the close observation region. A sketch of this speed-dependent frame length follows.
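The sketch announced above chooses a frame length from the other vehicle's state. Only the qualitative relation (long when standing, shorter as speed rises) comes from the description; the linear decay and all numbers are assumptions:

```python
def observation_frame_length(other_vehicle_speed_kmh, standing_length=30.0,
                             min_length=5.0, k=0.4):
    """Length of the close observation frame along the other vehicle's
    traveling direction (illustrative values in metres)."""
    if other_vehicle_speed_kmh <= 0.0:          # standing at the stop line
        return standing_length
    return max(min_length, standing_length - k * other_vehicle_speed_kmh)
```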
  • the starting threshold and the ending threshold of the blind spot coping control are decreased. This makes it possible to lower the contact risk and to improve safety.
  • Safety can be improved under a situation such as a situation where the road is crowded on holiday or during commuting time, at sunset time, or at night time, where it is difficult for the other vehicle to confirm safety around the vehicle.
  • the blind spot coping control for the speed of the host vehicle 41 has been described as shown in Fig. 4(b) ; however, the blind spot coping control can be applied not only to the speed of the host vehicle 41 but also to the scheduled traveling route 51 of the host vehicle 41.
  • the host vehicle 41 turns right at the intersection along the scheduled traveling route 51.
  • the host vehicle 41 starts to move toward the left side before entering the intersection and turns right at the intersection along a scheduled traveling route 52 that is on the outer side of the scheduled traveling route 51.
  • the blind spot that overlaps with the close observation region 31a can be decreased when entering into the intersection. This makes it possible to allow the host vehicle 41 to pass through the intersection safely without excessive deceleration control.
  • the blind spot coping control for the scheduled traveling route of the host vehicle 41 can be performed.
  • a close observation region 31c exists on the left side of the host vehicle 41.
  • the host vehicle 41 starts to move toward the right side before entering the intersection and goes straight through the intersection along the scheduled traveling route 52 that is on the right side of the scheduled traveling route 51. In this way, the blind spot that overlaps with the close observation region 31c can be decreased when entering the intersection.
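The route-side coping control can be pictured as choosing, among candidate scheduled traveling routes, the one that minimises the blind-spot proportion over the close observation regions; a minimal sketch in which the scoring function is supplied by the caller (for instance built from blind_spot_proportion above):

```python
def choose_route(candidate_routes, proportion_for_route):
    """Pick the candidate route (e.g. route 51 or the shifted route 52)
    with the smallest blind-spot proportion."""
    return min(candidate_routes, key=proportion_for_route)

# Illustrative use (names are hypothetical):
# best = choose_route([route_51, route_52],
#                     lambda r: worst_case_proportion(r))
```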
  • Patent Literature 1 calculates a potential contact risk and determines the driving behavior based on the contact risk.
  • Patent Literature 1 cannot take account of the static condition and thus determines that the host vehicle will collide with the motionless object.
  • thus, the host vehicle cannot accelerate, since it cannot expect that there is no moving object.
  • the embodiments take account of proportion of the blind spot for the host vehicle with respect to the close observation region on the map, which affects the contact risk.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Mechanical Engineering (AREA)
  • Theoretical Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Traffic Control Systems (AREA)
  • Navigation (AREA)

Claims (5)

  1. Occlusion control device (1), characterized in that it comprises:
    a vehicle control unit (22) that carries out traveling control of a host vehicle (41); and
    a scene understanding unit (21) that determines a coping behavior of the host vehicle (41),
    wherein
    the scene understanding unit (21) comprises
    a map obtaining unit (23) that obtains map data according to a scheduled traveling route;
    a route calculator (24) that either calculates or obtains the scheduled traveling route;
    a close observation frame obtaining unit (25) that obtains data on the position and the size of observation regions on the map, the observation regions being regions in which the presence or absence of a moving object should be observed;
    a sensing range calculator (26) that calculates a sensing range (32) on the map based on a position and an orientation of the host vehicle (41);
    a blind spot calculator (27) that calculates, as the blind spot (33) of the host vehicle (41), a region of the sensing range (32) that overlaps with the blind spot (33) of the host vehicle (41) created by obstacles;
    a proportion calculator (28) that calculates a proportion of the observation regions that overlap with the blind spot (33) for the host vehicle (41) with respect to the entire area of the observation regions; and
    a behavior determining unit (29) that decides to start a blind spot coping behavior for decelerating the host vehicle (41) to zero speed when the proportion becomes greater than a predetermined starting threshold;
    wherein
    the vehicle control unit (22) controls the speed of the host vehicle (41) so as to decelerate the host vehicle (41) to zero speed when the behavior determining unit (29) decides to start the blind spot coping behavior,
    the map obtaining unit (23) uses the data on the observation regions obtained by the close observation frame obtaining unit (25) and the obtained map data to generate map data in which the observation regions are set, and
    the proportion calculator (28) calculates the proportion if the observation regions and the blind spot (33) of the host vehicle (41) overlap with each other.
  2. Occlusion control device (1) according to claim 1, wherein
    a plurality of observation points (42) is provided as the observation regions, and
    the proportion calculator (28) calculates, as the proportion of the observation regions, the proportion of the number of observation points that overlap with the blind spot (33) for the host vehicle (41) with respect to the number of all the observation points.
  3. Occlusion control device (1) according to claim 1, wherein
    one or more observation line segments (44) are provided as the observation regions, and
    the proportion calculator (28) calculates, as the proportion of the observation regions, the proportion of the lengths of the observation line segments that overlap with the blind spot (33) for the host vehicle (41) with respect to the total value of the lengths of all the observation line segments.
  4. Occlusion control device (1) according to claim 1, wherein the starting threshold is lowered as the speed of the host vehicle (41) increases.
  5. Occlusion control device (1) according to claim 1 or 4, wherein
    the behavior determining unit (29) determines whether or not the proportion falls below a predetermined ending threshold, and
    the vehicle control unit (22) allows the host vehicle (41) to crawl forward until the proportion becomes smaller than the predetermined ending threshold, and stops the host vehicle (41) when the proportion becomes smaller than the predetermined ending threshold.
EP15889886.6A 2015-04-23 2015-04-23 Dispositif de commande en cas d'occultation Active EP3288005B1 (fr)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2015/062406 WO2016170647A1 (fr) 2015-04-23 2015-04-23 Dispositif de commande en cas d'occultation

Publications (3)

Publication Number Publication Date
EP3288005A1 EP3288005A1 (fr) 2018-02-28
EP3288005A4 EP3288005A4 (fr) 2018-08-01
EP3288005B1 true EP3288005B1 (fr) 2021-04-07

Family

ID=57143820

Family Applications (1)

Application Number Title Priority Date Filing Date
EP15889886.6A Active EP3288005B1 (fr) 2015-04-23 2015-04-23 Dispositif de commande en cas d'occultation

Country Status (10)

Country Link
US (1) US9988007B2 (fr)
EP (1) EP3288005B1 (fr)
JP (1) JP6428928B2 (fr)
KR (2) KR20190038675A (fr)
CN (1) CN107533803B (fr)
BR (1) BR112017022775B1 (fr)
CA (1) CA2983682C (fr)
MX (1) MX360379B (fr)
RU (1) RU2663261C1 (fr)
WO (1) WO2016170647A1 (fr)

Families Citing this family (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6038423B1 (ja) * 2016-01-28 2016-12-07 三菱電機株式会社 事故確率計算装置、事故確率計算方法及び事故確率計算プログラム
US20190011913A1 (en) * 2017-07-05 2019-01-10 GM Global Technology Operations LLC Methods and systems for blind spot detection in an autonomous vehicle
KR102198809B1 (ko) * 2017-12-26 2021-01-05 한국자동차연구원 객체 추적 시스템 및 방법
JP7121532B2 (ja) 2018-04-27 2022-08-18 株式会社小松製作所 積込機械の制御装置及び積込機械の制御方法
JP7182376B2 (ja) * 2018-05-14 2022-12-02 日産自動車株式会社 運転支援方法及び運転支援装置
CN108805042B (zh) * 2018-05-25 2021-10-12 武汉东智科技股份有限公司 道路区域监控视频被树叶遮挡的检测方法
EP3588006B1 (fr) * 2018-06-28 2021-08-11 Continental Automotive GmbH Détermination des distances de visibilité basée sur le champ de vision dynamique d'un véhicule
CN110874549A (zh) * 2018-08-31 2020-03-10 奥迪股份公司 目标视野确定方法、系统、计算机设备和存储介质
CN109624858B (zh) * 2019-01-04 2021-02-02 斑马网络技术有限公司 外后视镜的图像显示方法及装置
US11433894B2 (en) * 2019-03-27 2022-09-06 Nissan Motor Co., Ltd. Driving assistance method and driving assistance device
JP7148453B2 (ja) * 2019-04-19 2022-10-05 トヨタ自動車株式会社 運転支援システム
JP2020179808A (ja) * 2019-04-26 2020-11-05 トヨタ自動車株式会社 車両制御装置
JP7282167B2 (ja) * 2019-05-08 2023-05-26 三菱電機株式会社 運転支援装置および運転支援方法
JP7295012B2 (ja) * 2019-12-24 2023-06-20 日立Astemo株式会社 車両制御システム、および、車両制御方法
CN111556295A (zh) * 2020-05-12 2020-08-18 新石器慧通(北京)科技有限公司 可移动监控云台的控制方法、装置及无人车
DE102020130069B4 (de) 2020-11-13 2024-05-23 Audi Aktiengesellschaft Steuerung eines Kraftfahrzeugs bei teilweiser Sichtfeldverdeckung
US11733054B2 (en) * 2020-12-11 2023-08-22 Motional Ad Llc Systems and methods for implementing occlusion representations over road features
JP7487658B2 (ja) 2020-12-24 2024-05-21 トヨタ自動車株式会社 駐車支援装置

Family Cites Families (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0724742B2 (ja) * 1988-07-25 1995-03-22 テルモ株式会社 ポリプロピレン多孔質中空糸膜およびその製造方法
JP2552728B2 (ja) * 1989-05-31 1996-11-13 富士通株式会社 赤外線監視システム
US5724475A (en) * 1995-05-18 1998-03-03 Kirsten; Jeff P. Compressed digital video reload and playback system
US6532038B1 (en) * 1999-08-16 2003-03-11 Joseph Edward Haring Rail crossing video recorder and automated gate inspection
DE10132681C1 (de) * 2001-07-05 2002-08-22 Bosch Gmbh Robert Verfahren zur Klassifizierung von einem Hindernis anhand von Precrashsensorsignalen
JP4335651B2 (ja) * 2003-12-03 2009-09-30 富士通テン株式会社 周辺監視装置
US7671725B2 (en) * 2006-03-24 2010-03-02 Honda Motor Co., Ltd. Vehicle surroundings monitoring apparatus, vehicle surroundings monitoring method, and vehicle surroundings monitoring program
JP4760562B2 (ja) 2006-06-19 2011-08-31 日産自動車株式会社 車両用周辺情報提示装置及び車両用周辺情報提示方法
KR101075615B1 (ko) 2006-07-06 2011-10-21 포항공과대학교 산학협력단 주행 차량의 운전자 보조 정보 생성 장치 및 방법
US8068986B1 (en) * 2007-04-27 2011-11-29 Majid Shahbazi Methods and apparatus related to sensor signal sniffing and/or analysis
US8265800B2 (en) * 2007-08-20 2012-09-11 Raytheon Company Unmanned vehicle message conversion system
US8031062B2 (en) * 2008-01-04 2011-10-04 Smith Alexander E Method and apparatus to improve vehicle situational awareness at intersections
JP5146142B2 (ja) * 2008-06-20 2013-02-20 トヨタ自動車株式会社 飲酒状態検出装置
JP4527204B2 (ja) * 2008-09-26 2010-08-18 パナソニック株式会社 死角車両検出装置及びその方法
JP5228928B2 (ja) * 2009-01-13 2013-07-03 トヨタ自動車株式会社 運転支援装置
JP4784659B2 (ja) * 2009-02-16 2011-10-05 トヨタ自動車株式会社 車両用周辺監視装置
JP5613398B2 (ja) * 2009-10-29 2014-10-22 富士重工業株式会社 交差点運転支援装置
JP5407764B2 (ja) 2009-10-30 2014-02-05 トヨタ自動車株式会社 運転支援装置
JP2011118482A (ja) * 2009-11-30 2011-06-16 Fujitsu Ten Ltd 車載装置および認知支援システム
JP5269755B2 (ja) * 2009-12-10 2013-08-21 株式会社日立製作所 人横断支援車両システム及び人横断支援方法
US8384534B2 (en) * 2010-01-14 2013-02-26 Toyota Motor Engineering & Manufacturing North America, Inc. Combining driver and environment sensing for vehicular safety systems
DE102010015079A1 (de) * 2010-04-15 2011-10-20 Valeo Schalter Und Sensoren Gmbh Verfahren zum Anzeigen eines Bildes auf einer Anzeigeeinrichtung in einem Fahrzeug. Fahrerassistenzsystem und Fahrzeug
WO2011129014A1 (fr) * 2010-04-16 2011-10-20 トヨタ自動車株式会社 Dispositif de support de conduite
JP2011248870A (ja) * 2010-04-27 2011-12-08 Denso Corp 死角領域検出装置、死角領域検出プログラム、および死角領域検出方法
JP5601930B2 (ja) 2010-08-09 2014-10-08 本田技研工業株式会社 車両用表示装置
US9058247B2 (en) * 2010-09-08 2015-06-16 Toyota Jidosha Kabushiki Kaisha Risk potential calculation apparatus
US9987506B2 (en) * 2010-12-15 2018-06-05 Robert Marcus UAV—or personal flying device—delivered deployable descent device
JP5779768B2 (ja) * 2012-09-14 2015-09-16 株式会社吉澤 織物
US10029621B2 (en) * 2013-05-16 2018-07-24 Ford Global Technologies, Llc Rear view camera system using rear view mirror location
US9335178B2 (en) * 2014-01-28 2016-05-10 GM Global Technology Operations LLC Method for using street level images to enhance automated driving mode for vehicle

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
None *

Also Published As

Publication number Publication date
BR112017022775A2 (pt) 2018-07-10
CA2983682C (fr) 2018-06-12
MX360379B (es) 2018-10-31
US20180118144A1 (en) 2018-05-03
CN107533803B (zh) 2019-03-12
WO2016170647A1 (fr) 2016-10-27
CN107533803A (zh) 2018-01-02
KR20190038675A (ko) 2019-04-08
BR112017022775B1 (pt) 2023-11-07
MX2017013408A (es) 2018-01-30
JP6428928B2 (ja) 2018-11-28
JPWO2016170647A1 (ja) 2018-04-05
US9988007B2 (en) 2018-06-05
EP3288005A1 (fr) 2018-02-28
KR20170129251A (ko) 2017-11-24
CA2983682A1 (fr) 2016-10-27
RU2663261C1 (ru) 2018-08-03
EP3288005A4 (fr) 2018-08-01
KR102009585B1 (ko) 2019-08-09

Similar Documents

Publication Publication Date Title
EP3288005B1 (fr) Dispositif de commande en cas d'occultation
US11692848B2 (en) Map information system
CN107867289B (zh) 行驶辅助装置和行驶辅助方法
US9527529B2 (en) Method for operating a driver assistance system, and driver assistance system
CA3077124C (fr) Procede et appareil d'aide a la conduite
US20160194003A1 (en) Vehicle travelling control device
US10795374B2 (en) Vehicle control device
EP3925845B1 (fr) Procédé de prédiction d'action d'un autre véhicule et dispositif de prédiction d'action d'un autre véhicule
KR101999079B1 (ko) 표시 장치의 제어 방법 및 표시 장치
US10324472B2 (en) Vehicle control device
JPWO2019043847A1 (ja) 走行制御装置、車両および走行制御方法
EP3660455B1 (fr) Procédé d'aide au déplacement et dispositif d'aide au déplacement
EP4134288B1 (fr) Procédé d'estimation de comportement de véhicule, procédé de commande de véhicule et dispositif d'estimation de comportement de véhicule
JP7124784B2 (ja) 車両制御装置
US12036978B2 (en) Driving assistance method and driving assistance device
US20210291736A1 (en) Display control apparatus, display control method, and computer-readable storage medium storing program
JP7334107B2 (ja) 車両制御方法及び車両制御装置
JP2020199787A (ja) 車両の走行制御方法及び走行制御装置
EP4360998A1 (fr) Dispositif d'aide à la conduite, procédé d'aide à la conduite et support d'informations non transitoire

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20171110

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

A4 Supplementary search report drawn up and despatched

Effective date: 20180703

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
RIC1 Information provided on ipc code assigned before grant

Ipc: G06K 9/00 20060101ALI20180627BHEP

Ipc: B60R 21/0132 20060101ALI20180627BHEP

Ipc: G06F 3/147 20060101ALI20180627BHEP

Ipc: B60W 30/095 20120101ALI20180627BHEP

Ipc: B60R 1/12 20060101ALI20180627BHEP

Ipc: G08G 1/16 20060101AFI20180627BHEP

Ipc: B60R 21/00 20060101ALI20180627BHEP

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20190411

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

INTG Intention to grant announced

Effective date: 20200417

GRAJ Information related to disapproval of communication of intention to grant by the applicant or resumption of examination proceedings by the epo deleted

Free format text: ORIGINAL CODE: EPIDOSDIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

INTC Intention to grant announced (deleted)
GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

INTG Intention to grant announced

Effective date: 20201210

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: AT

Ref legal event code: REF

Ref document number: 1380711

Country of ref document: AT

Kind code of ref document: T

Effective date: 20210415

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602015067950

Country of ref document: DE

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG9D

REG Reference to a national code

Ref country code: NL

Ref legal event code: MP

Effective date: 20210407

Ref country code: AT

Ref legal event code: MK05

Ref document number: 1380711

Country of ref document: AT

Kind code of ref document: T

Effective date: 20210407

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210407

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210407

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210407

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210707

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210407

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210407

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210708

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210807

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210407

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210407

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210407

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210809

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210407

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210707

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210423

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602015067950

Country of ref document: DE

REG Reference to a national code

Ref country code: BE

Ref legal event code: MM

Effective date: 20210430

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210407

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210407

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210407

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210407

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210407

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210406

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210407

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210407

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210430

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210430

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

26N No opposition filed

Effective date: 20220110

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210423

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210807

Ref country code: AL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210407

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210407

Ref country code: BE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210430

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO

Effective date: 20150423

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210407

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210407

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20240321

Year of fee payment: 10

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: FR

Payment date: 20240320

Year of fee payment: 10

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: TR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210407

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20240320

Year of fee payment: 10

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210407