WO2017009934A1 - Traffic light recognition device and traffic light recognition method - Google Patents
Traffic light recognition device and traffic light recognition method
- Publication number
- WO2017009934A1 (PCT/JP2015/070042)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- vehicle
- traffic light
- image
- detection area
- self
- Prior art date: 2015-07-13
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/02—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
- B60W40/04—Traffic conditions
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0246—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
- G06V20/584—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads of vehicle lights or traffic lights
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/09623—Systems involving the acquisition of information from passive traffic signs by means mounted on the vehicle
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo, light or radio wave sensitive means, e.g. infrared sensors
- B60W2420/403—Image sensing, e.g. optical camera
Definitions
- The present invention relates to a traffic light recognition device and a traffic light recognition method for recognizing traffic lights present on the traveling path of a vehicle.
- In Patent Document 1 (Japanese Patent Laid-Open No. 2007-241469), the position of a traffic light present on the traveling path of a vehicle is estimated based on map information and a self-position estimation result, a detection area for the traffic light is set in an image captured by a camera, and the traffic light within the detection area is subjected to image processing to detect its lighting state.
- However, when a time difference occurs between the timing at which the self-position estimation result is updated and the timing at which the image is captured, the detection area may be set at a position corresponding to a past self-position, and the traffic light may fall outside the detection area.
- The present invention has been made to solve this conventional problem, and an object thereof is to provide a traffic light recognition device and a traffic light recognition method capable of appropriately setting the detection area of a traffic light even when a time difference occurs between the timing at which the self-position estimation result is updated and the timing at which an image is captured by the camera.
- To achieve this object, a traffic light recognition device according to one aspect of the present invention includes an imaging unit that captures an image around a vehicle, a map information acquisition unit that acquires map information around the vehicle, a self-position detection unit that detects the self-position of the vehicle, and a traffic light position estimation unit that estimates the position of a traffic light on the image based on the self-position and the map information. It further includes a vehicle behavior estimation unit that estimates the behavior of the vehicle, a traffic light detection area setting unit that sets a detection area for the traffic light on the image based on the position of the traffic light on the image and the amount of displacement of that position caused by the vehicle behavior, and a traffic light recognition unit that detects the traffic light from the detection area.
- Similarly, a traffic light recognition method according to one aspect captures an image around a vehicle, acquires map information around the vehicle, detects the self-position of the vehicle on a map, and estimates the position of a traffic light on the image based on the self-position and the map information. The behavior of the vehicle is then estimated, a detection area for the traffic light is set on the image based on the estimated position of the traffic light on the image and the amount of displacement of that position caused by the vehicle behavior, and the traffic light is detected from the detection area.
- FIG. 1 is a block diagram showing the configuration of a traffic light recognition device according to the first embodiment of the present invention and its peripheral devices.
- FIG. 2 is a block diagram showing the traffic light recognition device 100 of FIG. 1 in detail.
- The traffic light recognition device 100 is mounted on a vehicle 51 and receives map information D02, camera information D03, self-position information D05, vehicle behavior information D07, and image data D09 from various devices mounted on the vehicle 51. It outputs lighting color information D04, which indicates the lighting color of a traffic light, to a subsequent device (not shown).
- The lighting color information D04 is used, for example, for automated driving control.
- The camera information D03 is information on the installation position of the camera 11 (see FIG. 2) with respect to the vehicle 51.
- The area around the vehicle imaged by the camera 11 can be estimated based on the camera information D03.
- The map information D02 is provided from a map database containing map data of the path on which the vehicle travels (map information around the vehicle).
- The map information D02 includes the position information of targets such as ground landmarks present on the travel path, as well as the position information of traffic lights.
- The traffic light recognition device 100 includes a camera 11 (imaging unit), a self-position detection unit 12, a traffic light position estimation unit 13, a vehicle behavior estimation unit 14, a traffic light detection area calculation unit 15 (traffic light detection area setting unit), a traffic light recognition unit 16, a map information acquisition unit 17, and a landmark information acquisition unit 18.
- The camera 11 is, for example, a digital camera equipped with a solid-state image sensor such as a CCD or CMOS sensor, and captures digital images of the area around the traveling path of the vehicle 51.
- The camera 11 outputs each captured image to the traffic light recognition unit 16 as image data D09.
- The camera 11 also stores information on its installation position with respect to the vehicle 51 and outputs it to the traffic light detection area calculation unit 15 as camera information D03.
- The information on the installation position of the camera 11 can be obtained, for example, by placing a calibration mark at a known position relative to the vehicle 51 and calculating the installation position from the mark's position in an image captured by the camera 11.
- The map information acquisition unit 17 acquires, from a map database containing map data of the path on which the vehicle travels (map information around the vehicle), the position information of targets such as ground landmarks present near the travel route and the position information of traffic lights.
- The map information D02 is output to the self-position detection unit 12 and the traffic light position estimation unit 13.
- The landmark information acquisition unit 18 is, for example, a vehicle-mounted sensing camera or a laser radar; it recognizes ground landmarks (road markings such as lane marks, stop lines, and characters, as well as curbs, traffic lights, and signs) and acquires information on their positions relative to the vehicle 51. This information is output to the self-position detection unit 12 as landmark information D01.
- The self-position detection unit 12 acquires the landmark information D01 and the map information D02, detects the current position of the vehicle 51 on the map based on this information, and outputs it as self-position information D05.
- The landmark information D01 contains the relative positional relationship of each ground landmark with respect to the vehicle 51. Accordingly, the current position of the vehicle 51 on the map can be detected by comparing this relative position with the position information of the same ground landmark contained in the map information D02, as sketched below.
- Here, "position" includes both coordinates and attitude. Specifically, the position of a ground landmark includes its coordinates and attitude, and the position of the vehicle 51 includes its coordinates and attitude.
- The self-position detection unit 12 outputs, as the self-position information D05, the coordinates (x, y, z) in a reference coordinate system and the attitude (yaw, pitch, roll), i.e., the rotation about each coordinate axis.
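As a minimal 2D sketch of this landmark-matching step (assuming a single landmark whose map pose and vehicle-relative pose are both known; the function name and the planar simplification are illustrative, not from the patent):

```python
import math

def self_position_from_landmark(lm_map, lm_rel):
    """Recover the vehicle pose (x, y, yaw) on the map from one ground landmark.

    lm_map: (x, y, yaw) of the landmark in map coordinates (map information D02).
    lm_rel: (x, y, yaw) of the same landmark relative to the vehicle (landmark
            information D01), with x forward and y left of the vehicle.
    """
    lx, ly, lyaw = lm_map
    rx, ry, ryaw = lm_rel
    yaw = lyaw - ryaw  # vehicle heading in the map frame
    # Subtract the relative offset, rotated into map coordinates, from the
    # landmark position to obtain the vehicle position.
    x = lx - (math.cos(yaw) * rx - math.sin(yaw) * ry)
    y = ly - (math.sin(yaw) * rx + math.cos(yaw) * ry)
    return x, y, yaw

# Example: a stop line at (10, 5) facing 0 rad, observed 8 m ahead of the vehicle.
print(self_position_from_landmark((10.0, 5.0, 0.0), (8.0, 0.0, 0.0)))  # (2.0, 5.0, 0.0)
```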
- The traffic light position estimation unit 13 estimates the position of the traffic light relative to the vehicle 51 based on the map information D02 and the self-position information D05.
- In the map information D02, the position of each traffic light present on the travel path of the vehicle 51 is registered as coordinates. The position of a traffic light relative to the vehicle 51 can therefore be calculated from the coordinates of the traffic light and the coordinates and attitude of the vehicle 51.
- The traffic light position estimation unit 13 outputs the calculated relative position as traffic light relative position information D06.
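A minimal 2D sketch of this relative-position calculation (the inverse of the landmark transform above; the function name is illustrative):

```python
import math

def traffic_light_relative(light_map_xy, vehicle_pose):
    """Position of a traffic light in the vehicle frame (x forward, y left).

    light_map_xy: (x, y) of the traffic light from the map information D02.
    vehicle_pose: (x, y, yaw) of the vehicle from the self-position info D05.
    """
    vx, vy, yaw = vehicle_pose
    dx, dy = light_map_xy[0] - vx, light_map_xy[1] - vy
    # Rotate the map-frame offset by -yaw to express it in the vehicle frame.
    rel_x = math.cos(-yaw) * dx - math.sin(-yaw) * dy
    rel_y = math.sin(-yaw) * dx + math.cos(-yaw) * dy
    return rel_x, rel_y

# A light 30 m north of a vehicle heading due north (yaw = pi/2) is 30 m ahead.
print(traffic_light_relative((0.0, 30.0), (0.0, 0.0, math.pi / 2)))  # ~(30.0, 0.0)
```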
- The vehicle behavior estimation unit 14 uses various information such as the traveling speed, yaw rate, and steered angle (steering amount) of the vehicle 51 to calculate the time difference (hereinafter referred to as "ΔT") from the timing at which the self-position estimation result is updated to the timing at which an image is captured, and to calculate the amount of movement of the vehicle 51 during this time difference ΔT. This movement amount is output as vehicle behavior information D07. The method of calculating the movement amount is described later.
- The steered angle mentioned above is the angle by which the steered wheels of the vehicle tilt left or right with respect to the straight-ahead direction. As a steering amount, it can be replaced, for example, by the amount of movement of the spline shaft of the rack mechanism in a rack-and-pinion type steering system.
- The traffic light detection area calculation unit 15 corrects the relative position between the vehicle 51 and the traffic light based on the camera information D03, the traffic light relative position information D06, and the vehicle behavior information D07, and then sets the detection area for the traffic light in the image captured by the camera 11 based on the camera information D03 and the corrected relative position. This detection area is output to the traffic light recognition unit 16 as traffic light detection area information D08.
- When the timing at which an image is captured by the camera 11 does not coincide with the timing at which the self-position estimation result of the vehicle 51 is updated by the self-position detection unit 12, the position of the traffic light present in the image cannot be estimated accurately because of the time difference ΔT between the two, and as a result the detection area cannot be set accurately.
- For example, when the self-position estimation result is updated at time t1 shown in FIG. 4 and an image is captured at time t11 after the time difference ΔT has elapsed, the vehicle 51 moves during the time difference ΔT, so a detection area set without accounting for this movement is shifted from the position of the traffic light in the image.
- Therefore, the traffic light detection area calculation unit 15 estimates the movement amount of the vehicle 51 during the time difference ΔT based on the vehicle behavior information D07 estimated by the vehicle behavior estimation unit 14, and from this movement amount estimates the self-position of the vehicle 51 at time t11 (see FIG. 4).
- The detection area is then set in consideration of this movement amount. That is, the traffic light detection area is set based on the position of the traffic light estimated from the map information D02 and the self-position information D05, and on the amount of displacement of that estimated position caused by the vehicle behavior.
- Since the camera 11 is fixed to the vehicle 51, real-space coordinates can be converted into image coordinates based on the installation information in the camera information D03. The position of the traffic light in the image can thus be estimated, and the detection area can be set accordingly.
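One way to sketch this conversion is the standard pinhole camera model below. The focal lengths, image center, and camera height are made-up example values; the patent only states that the fixed installation information in D03 makes such a conversion possible.

```python
def project_to_image(rel, fx=1000.0, fy=1000.0, cx=640.0, cy=360.0, cam_h=1.2):
    """Project a vehicle-frame point (x forward, y left, z up, meters) to pixels.

    Assumes, for simplicity, a camera at the vehicle origin mounted cam_h meters
    above the road and looking straight ahead (a stand-in for camera info D03).
    """
    x, y, z = rel
    if x <= 0:
        return None  # behind the camera, not visible
    u = cx - fx * y / x            # leftward offset maps to the left of the image
    v = cy - fy * (z - cam_h) / x  # height above the camera maps upward
    return u, v

# A traffic light 30 m ahead, 2 m to the left, 5.5 m above the road surface.
print(project_to_image((30.0, 2.0, 5.5)))  # ~(573.3, 216.7)
```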
- The traffic light recognition unit 16 shown in FIG. 2 recognizes the lighting color of the traffic light based on the image data D09 captured by the camera 11 and the traffic light detection area information D08. Specifically, image processing for recognizing the lighting color of the traffic light is performed on the detection area set in the image data D09.
- A signal lamp of a traffic light can be detected, for example, by detecting light that blinks in synchronization with the AC cycle of the commercial power supply, or by judging the similarity of features such as the red, green, and yellow hues and the round shape of the lamps.
- Any known image processing method for detecting traffic lights can be applied.
- The traffic light recognition process is performed not on the entire image data D09 captured by the camera 11 but only on the detection area set as a part of it, which reduces the information processing load of signal detection and allows the traffic light to be detected quickly. The lighting color information D04 is then output to the subsequent device.
- The signal recognition process is not limited to the methods above, and other methods may be employed.
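As an illustration of the hue-similarity approach mentioned above, the sketch below counts red, green, and yellow pixels inside the detection area using OpenCV; the HSV thresholds are rough guesses that would need tuning for a real camera, and the function name is illustrative.

```python
import cv2
import numpy as np

# Rough HSV ranges for signal lamp colors (assumed values, not from the patent).
HSV_RANGES = {
    "red":    [((0, 120, 120), (10, 255, 255)), ((170, 120, 120), (180, 255, 255))],
    "yellow": [((20, 120, 120), (35, 255, 255))],
    "green":  [((60, 120, 120), (95, 255, 255))],
}

def lamp_color(image_bgr, region):
    """Return the dominant lamp color inside detection area R1, or None.

    region: (x, y, w, h) of the detection area in pixels (cf. info D08).
    """
    x, y, w, h = region
    hsv = cv2.cvtColor(image_bgr[y:y + h, x:x + w], cv2.COLOR_BGR2HSV)
    counts = {}
    for color, ranges in HSV_RANGES.items():
        mask = np.zeros(hsv.shape[:2], np.uint8)
        for lo, hi in ranges:
            mask |= cv2.inRange(hsv, np.array(lo), np.array(hi))
        counts[color] = int(cv2.countNonZero(mask))
    best = max(counts, key=counts.get)
    return best if counts[best] > 0.01 * w * h else None  # ignore tiny blobs
```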
- In step S11, the vehicle behavior estimation unit 14 acquires the cycle at which the self-position estimation result of the vehicle 51 computed by the self-position detection unit 12 is updated and the cycle at which images are captured by the camera 11, and calculates the time difference ΔT between the timing at which the self-position estimation result is updated and the timing at which an image is captured.
- For example, the self-position information D05 is acquired at every cycle T1, at times t1, t2, t3, and so on, while an image is captured at times t11, t13, t15, and so on. The time difference ΔT at each capture timing is then calculated; specifically, ΔT is calculated for each of the times t11, t13, t15, ... with respect to the times t1, t3, t5, ..., which are the self-position update timings immediately preceding them.
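A small sketch of this pairing follows; the concrete timestamps are example values (100 ms update cycle, 130 ms capture cycle), which also illustrates the non-multiple-cycle case discussed later with FIG. 9.

```python
import bisect

def time_differences(update_times, capture_times):
    """For each image capture, dT to the immediately preceding pose update."""
    out = []
    for tc in capture_times:
        i = bisect.bisect_right(update_times, tc) - 1  # last update at or before tc
        if i >= 0:
            out.append(tc - update_times[i])
    return out

updates = [0.0, 0.1, 0.2, 0.3, 0.4, 0.5]   # self-position updates, seconds
captures = [0.13, 0.26, 0.39, 0.52]        # image captures, seconds
print(time_differences(updates, captures))  # approx. [0.03, 0.06, 0.09, 0.02]
```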
- In step S12, the vehicle behavior estimation unit 14 calculates the movement amount of the vehicle 51 from the self-position acquired at time t1 in FIG. 4B, based on the traveling speed and the yaw rate (or the steered angle) of the vehicle 51, and then the self-position of the vehicle 51 at time t11, after the time difference ΔT has elapsed, is calculated. That is, if a detection area were set using the image captured at time t11 in FIG. 4A together with the self-position information D05 updated at time t1 in FIG. 4B, the detection area could not be set at an appropriate position because the vehicle 51 moves in the meantime. Therefore, the vehicle behavior estimation unit 14 calculates the movement amount of the vehicle 51 during the time difference ΔT, and the traffic light detection area calculation unit 15 calculates the self-position of the vehicle 51 at time t11.
- When a yaw rate is occurring, the corrected self-position of the vehicle 51, that is, the self-position coordinates (ex, ey) of the vehicle 51 after the time difference ΔT has elapsed, can be obtained by the following expressions (1) and (2):

ex = cos(ΔT*yawrate)*(cx-ox) - sin(ΔT*yawrate)*(cy-oy) + ox … (1)

ey = sin(ΔT*yawrate)*(cx-ox) + cos(ΔT*yawrate)*(cy-oy) + oy … (2)
- On the other hand, when no yaw rate is occurring, the corrected self-position coordinates (ex, ey) of the vehicle 51 can be obtained by the following expressions (3) and (4), where v is the traveling speed:

ex = cx + v*ΔT*cos(yaw) … (3)

ey = cy + v*ΔT*sin(yaw) … (4)

The corrected yaw angle eyaw equals yaw (eyaw = yaw).
- In step S12, by executing the above calculation based on the vehicle behavior information D07 output from the vehicle behavior estimation unit 14 and the self-position of the vehicle 51 obtained from the self-position information D05, the self-position of the vehicle 51 after the time difference ΔT has elapsed from time t1 can be calculated.
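A direct sketch of expressions (1) to (4) is given below. The reading of (ox, oy) as the center of the turn is an interpretation of the formulas, since the surrounding text does not spell out the symbols.

```python
import math

def corrected_self_position(cx, cy, yaw, v, yawrate, dt, ox=0.0, oy=0.0):
    """Self-position (ex, ey) after the time difference dt, per (1)-(4).

    (cx, cy, yaw): pose at the last self-position update (time t1).
    v: traveling speed; yawrate: yaw rate; dt: time difference dT.
    (ox, oy): rotation center used by (1)-(2), assumed to be the turn center.
    """
    if yawrate:  # expressions (1) and (2): rotate (cx, cy) about (ox, oy)
        a = dt * yawrate
        ex = math.cos(a) * (cx - ox) - math.sin(a) * (cy - oy) + ox
        ey = math.sin(a) * (cx - ox) + math.cos(a) * (cy - oy) + oy
    else:        # expressions (3) and (4): straight-line motion, yaw unchanged
        ex = cx + v * dt * math.cos(yaw)
        ey = cy + v * dt * math.sin(yaw)
    return ex, ey

# Driving straight east at 10 m/s for dT = 0.05 s moves the vehicle 0.5 m.
print(corrected_self_position(0.0, 0.0, 0.0, 10.0, 0.0, 0.05))  # (0.5, 0.0)
```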
- Next, the traffic light detection area calculation unit 15 predicts the traffic light position at the moment the image is captured. That is, since both the self-position at time t11 in FIG. 4A and the image captured by the camera 11 are available, the position of the traffic light in the image can be estimated from this self-position and the position information of the traffic light included in the map information.
- In step S14, the traffic light detection area calculation unit 15 sets the area of the image that contains the traffic light as the detection area, and in step S15 outputs this as the traffic light detection area information D08. Thereafter, the traffic light recognition unit 16 detects the traffic light from the set detection area and detects its lighting color.
- In this way, the detection area can be set in the image, and the lighting color of the traffic light present in the detection area can be recognized.
- FIG. 5 shows a case where the vehicle 51 travels on a curved road that curves to the right.
- Consider a case in which the vehicle 51 passes point Z1 on the curved road at time t1 (see FIG. 4) and passes point Z2 after the time difference ΔT has elapsed.
- In this case, the self-position is estimated when the vehicle 51 passes point Z1 in FIG. 5, while the image is acquired when the vehicle 51 passes point Z2 after the time difference ΔT has elapsed. If the detection area were set based on these as they are, the detection area R1 would be set to the right of the traffic light P1 present in the image, as shown in FIG. 6, and the traffic light could not be detected.
- In the present embodiment, by contrast, the detection area R1 is set based on the self-position of the vehicle 51 at the moment it passes point Z2 in FIG. 5 and on the image captured there. Therefore, as shown in FIG. 7, the detection area R1 is set at an appropriate position with respect to the traffic light P1 present in the image.
- The self-position detection unit 12, traffic light position estimation unit 13, vehicle behavior estimation unit 14, traffic light detection area calculation unit 15, and traffic light recognition unit 16 described above can be realized by a microcontroller including a CPU, memory, and an input/output unit. Specifically, the CPU constitutes the plurality of information processing units (12 to 16) of the microcontroller by executing a pre-installed computer program. Part of the memory of the microcontroller constitutes the map database that stores the map information D02. The microcontroller may also serve as an ECU used for other vehicle-related control (for example, automated driving control).
- In this way, the detection area R1 in the image is set based on the estimated position of the traffic light obtained from the map information D02 and on the amount of displacement of that estimated position caused by the vehicle behavior. Therefore, the detection area R1 for detecting the traffic light P1 can be set with high accuracy.
- Further, the traffic light detection area calculation unit 15 estimates the vehicle movement amount based on the vehicle behavior and a predetermined time set in advance, and sets the detection area R1 based on the estimated position of the traffic light on the image and this vehicle movement amount, so highly accurate recognition is possible.
- As a result, the detection area R1 can be set with high accuracy on the image captured by the camera 11, and the lighting state of the traffic light P1, that is, the red, green, or yellow lighting color, the lighting state of an arrow light, and so on, can be recognized with high accuracy.
- The time difference ΔT can also be changed according to the predetermined cycle at which the self-position estimation result of the vehicle 51 is updated, that is, the cycle T1 shown in FIG. 4; for example, when the cycle T1 increases, the time difference ΔT can be increased. With this setting, an appropriate detection area can be set even when the cycle at which the self-position estimation result is updated changes.
- Since the movement amount of the vehicle 51 during the time difference ΔT is calculated as the vehicle behavior using the traveling speed and yaw rate of the vehicle, the displacement of the estimated traffic light position caused by the vehicle behavior can be obtained with high accuracy, and consequently the detection area R1 can be set with high accuracy.
- Since the self-position of the vehicle 51 at point Z2 in FIG. 5 is estimated based on the vehicle behavior, the self-position estimation accuracy may decrease when the vehicle behavior information D07 is large. Therefore, when the value of the vehicle behavior information D07 is large, the decrease in estimation accuracy can be compensated for by enlarging the detection area R1 set by the traffic light detection area calculation unit 15.
- For example, the size of the detection area R1 is set to "1" as the normal size, and is increased as the vehicle behavior amount increases. A certain size, for example twice the normal size, is set as the upper limit, and once this size is reached the detection area R1 is not enlarged further.
- In this way, the larger the vehicle behavior amount, the larger the range of the detection area R1. Therefore, even when the accuracy of traffic light position estimation decreases because of an increase in the vehicle behavior amount, the detection area R1 can be set so that the traffic light P1 does not fall outside it, as in the sketch below.
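A sketch of this enlargement rule follows; only the baseline size "1" and the factor-of-two cap come from the text, while the linear gain k is an assumed tuning constant.

```python
def detection_area_scale(behavior_amount, k=0.5, cap=2.0):
    """Scale factor for detection area R1: 1.0 at rest, growing with the
    vehicle behavior amount and clamped at `cap` (twice the normal size)."""
    return min(1.0 + k * behavior_amount, cap)

def enlarge_region(region, scale):
    """Enlarge an (x, y, w, h) region about its center by `scale`."""
    x, y, w, h = region
    nw, nh = w * scale, h * scale
    return (x + (w - nw) / 2, y + (h - nh) / 2, nw, nh)

print(detection_area_scale(0.0))  # 1.0 (normal size)
print(detection_area_scale(1.0))  # 1.5
print(detection_area_scale(5.0))  # 2.0 (upper limit reached)
print(enlarge_region((100, 50, 40, 80), 2.0))  # (80.0, 10.0, 80.0, 160.0)
```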
- In the first embodiment described above, as shown in FIG. 4, the image capture cycle T2 is twice the self-position update cycle T1. However, when the cycles T1 and T2 are not in a multiple relationship, the time difference ΔT varies. This case is explained with reference to the timing chart of FIG. 9.
- FIG. 9A shows the timing at which images are captured, and FIG. 9B shows the timing at which the self-position estimation result is updated.
- The update of the self-position estimation result is executed at cycle T4, and image capture is executed at cycle T3.
- At each image capture timing, the detection area R1 is set using the latest self-position estimation result obtained before that time.
- Time differences are calculated at each timing, such as the time difference ΔT1 between time t22 and time t32 and the time difference ΔT2 between time t23 and time t33, and the movement amount of the vehicle 51 is calculated using each of the time differences ΔT1 to ΔT5.
- In this case as well, the movement amount of the vehicle 51 during each of the time differences ΔT1 to ΔT5 is calculated and the position of the vehicle 51 is corrected based on this movement amount, as in the first embodiment described above. Therefore, the detection area set in the image can be set with high accuracy.
- Next, a second embodiment of the present invention is described. The device configuration is the same as in FIGS. 1 and 2 of the first embodiment, so its description is omitted. The processing procedure of the traffic light recognition device 100 according to the second embodiment is described below with reference to the flowchart shown in FIG. 10. First, in step S31, the vehicle behavior estimation unit 14 determines whether the vehicle behavior is equal to or greater than a predetermined value, based on various information such as the traveling speed, yaw rate, and steered angle of the vehicle 51.
- If the vehicle behavior is determined to be less than the predetermined value, for example when the vehicle 51 is traveling at low speed or is stopped (NO in step S31), it can be judged that the amount of vehicle behavior during the time difference ΔT between the timing at which the self-position estimation result of the vehicle 51 is updated and the timing at which an image is captured is small enough to be ignored. Therefore, in step S32, the vehicle behavior estimation unit 14 regards the vehicle behavior information D07 as zero and advances the process to step S35.
- If the vehicle behavior is equal to or greater than the predetermined value (YES in step S31), steps S33 to S37 are executed.
- The processing in steps S33 to S37 is the same as the processing in steps S11 to S15 shown in FIG. 3.
- In this way, in the traffic light recognition device according to the second embodiment, the movement amount during the time difference ΔT is calculated and the position of the vehicle 51 is corrected based on it only when the behavior of the vehicle 51 is equal to or greater than the predetermined value. In other words, when the behavior of the vehicle 51 is small, it can be estimated that the relative positional relationship between the self-position estimation result of the vehicle 51 and the self-position of the vehicle 51 at the moment the image is captured does not change significantly during the time difference ΔT, and in particular that the traveling direction does not change significantly; in such a case, the movement amount of the vehicle 51 during the time difference ΔT is not calculated. As a result, unnecessary calculations can be omitted and the calculation load can be reduced.
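A compact sketch of this gating; the threshold and the scalar behavior measure are assumed placeholders.

```python
def vehicle_motion(speed, yawrate, dt, threshold=0.01):
    """Second-embodiment gating: treat the motion during dt as zero when the
    vehicle behavior is below a predetermined value (cf. steps S31-S32)."""
    behavior = abs(speed * dt) + abs(yawrate * dt)  # crude behavior amount
    if behavior < threshold:
        return 0.0, 0.0  # (distance, heading change): motion ignored
    return speed * dt, yawrate * dt

print(vehicle_motion(0.2, 0.0, 0.03))   # (0.0, 0.0): crawling, ignored
print(vehicle_motion(15.0, 0.1, 0.03))  # (0.45, 0.003): corrected normally
```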
- Next, a third embodiment of the present invention is described. FIG. 11 is a block diagram showing the configuration of the traffic light recognition device 101 according to the third embodiment.
- The traffic light recognition device 101 differs from the traffic light recognition device 100 shown in FIG. 2 in that it includes a travel route setting unit 37, and in the processing performed by the vehicle behavior estimation unit 34. The other components are the same as in FIG. 2, so they are given the same reference numerals and their description is omitted.
- The travel route setting unit 37 receives the map information D02 and the self-position information D05 as inputs and extracts the travel route on which the vehicle 51 is expected to travel in the future. The extracted travel route is output to the vehicle behavior estimation unit 34 as travel route information D40.
- The vehicle behavior estimation unit 34 calculates, based on the travel route information D40, the movement amount of the vehicle 51 during the time difference ΔT from the timing at which the self-position estimation result is updated to the timing at which an image is captured, and outputs it to the traffic light detection area calculation unit 15 as vehicle behavior information D37.
- In step S51, the travel route setting unit 37 determines the travel route of the vehicle 51 based on the self-position information D05 and the map information D02.
- Since the travel path contained in the map information D02 and the self-position information D05 make it possible to recognize the traveling position of the vehicle 51 on that path, the route on which the vehicle 51 will travel in the future can be determined.
- In step S52, the vehicle behavior estimation unit 34 acquires the cycle at which the self-position estimation result of the vehicle 51 computed by the self-position detection unit 12 is updated and the cycle at which images are captured by the camera 11, and calculates the time difference ΔT between the timing at which the self-position estimation result is updated and the timing at which an image is captured. Since this process is the same as step S11 shown in FIG. 3, its detailed description is omitted.
- In step S53, the vehicle behavior estimation unit 34 calculates the movement amount of the vehicle 51 during the time difference ΔT, relative to the self-position estimated at time t1 in FIG. 4B, based on the traveling speed of the vehicle and its travel route. This movement amount is output to the traffic light detection area calculation unit 15 as vehicle behavior information D37, and the traffic light detection area calculation unit 15 calculates the self-position of the vehicle 51 at time t11 based on this vehicle behavior information D37.
- In step S54, the traffic light detection area calculation unit 15 predicts the position of the traffic light at the moment the image is captured. That is, the position of the traffic light with respect to the vehicle 51 is estimated based on the self-position of the vehicle 51 at time t11 calculated in step S53 and on the captured image.
- In step S55, the traffic light detection area calculation unit 15 sets the area of the image that contains the traffic light as the detection area, and in step S56 outputs this as the traffic light detection area information D08. Thereafter, the traffic light recognition unit 16 detects the traffic light from the set detection area and detects its lighting color. In this way, the detection area can be set in the image, and the lighting color of the traffic light present in the detection area can be recognized.
- FIG. 13A is an explanatory diagram showing the behavior of the vehicle 51 when it changes lanes from the right lane X11 to the left lane X12 of a two-lane road.
- FIG. 13B is a characteristic diagram showing changes in yaw rate over time. The times when the vehicle 51 is traveling at the points Z3 and Z4 in FIG. 13A correspond to the times t51 and t52 shown in FIG. 13B, respectively.
- When the vehicle behavior in such a situation is estimated from the yaw rate, the self-position estimation accuracy decreases. That is, at point Z3, where the vehicle 51 has not yet started the lane change, no yaw rate has occurred, as shown at time t51 in FIG. 13B, so the accuracy of estimating the self-position of the vehicle 51 after the time difference ΔT has elapsed decreases. Even once the lane change has started, the rate of change of the yaw rate or the steered angle becomes large during the lane change (from time t51 to t52), and in such a case the accuracy of detecting the behavior of the vehicle 51 decreases.
- Therefore, in the third embodiment, the movement amount of the vehicle 51 is calculated using the travel route information D40 and the traveling speed of the vehicle. That is, since the moving distance L1 of the vehicle 51 is obtained as (traveling speed × ΔT), the position advanced by the moving distance L1 along the travel route from the self-position acquired at the self-position update immediately preceding the image capture (time t1 in FIG. 4) can be taken as the self-position of the vehicle 51 after the time difference ΔT has elapsed.
- Moreover, since the travel route is known, the position of the lane change can be recognized in advance. The orientation of the vehicle 51 at point Z4 in FIG. 13A can therefore be calculated, and with it the self-position of the vehicle 51 at the moment the image is captured. Thus, by estimating the vehicle behavior using the travel route set by the travel route setting unit 37, the movement amount of the vehicle 51 during the time difference ΔT can be calculated with higher accuracy, as sketched below.
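A minimal sketch of advancing the pose along the planned route by L1 = traveling speed × ΔT; representing the travel route information D40 as a polyline of waypoints is an assumption made for illustration.

```python
import math

def advance_along_route(route, dist):
    """Walk `dist` meters along a polyline route from its first waypoint.

    route: list of (x, y) waypoints standing in for travel route info D40.
    Returns the reached (x, y) position and the heading of that segment.
    """
    heading = 0.0
    x, y = route[0]
    for nx, ny in route[1:]:
        seg = math.hypot(nx - x, ny - y)
        heading = math.atan2(ny - y, nx - x)
        if dist <= seg:
            t = dist / seg
            return (x + t * (nx - x), y + t * (ny - y)), heading
        dist -= seg
        x, y = nx, ny
    return (x, y), heading  # route exhausted: stop at the last waypoint

# A lane change encoded in the route: the heading at the reached point already
# reflects the diagonal segment, unlike a prediction from the current yaw rate.
route = [(0.0, 0.0), (20.0, 0.0), (30.0, 3.5), (60.0, 3.5)]
print(advance_along_route(route, 25.0))  # ~((24.7, 1.65), 0.337 rad)
```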
- In this way, the traffic light recognition device 101 estimates the self-position of the vehicle 51 after the time difference ΔT has elapsed based on the traveling speed of the vehicle 51 and on the travel route information D40 indicating the route on which the vehicle 51 will travel in the future. Therefore, even in a situation such as a lane change, where no yaw rate occurs at first and the yaw rate then increases rapidly, the self-position of the vehicle 51 at the moment the image is captured can be estimated with high accuracy, and consequently the detection area set in the image can be set with high accuracy.
- The traffic light recognition device and the traffic light recognition method of the present invention have been described based on the illustrated embodiments, but the present invention is not limited to these, and the configuration of each unit can be replaced with any configuration having an equivalent function.
Description
12 Self-position detection unit
13 Traffic light position estimation unit
14, 34 Vehicle behavior estimation unit
15 Traffic light detection area calculation unit (traffic light detection area setting unit)
16 Traffic light recognition unit
17 Map information acquisition unit
18 Landmark information acquisition unit
37 Travel route setting unit
51 Vehicle
100, 101 Traffic light recognition device
D01 Landmark information
D02 Map information
D03 Camera information
D04 Lighting color information
D05 Self-position information
D06 Traffic light relative position information
D07 Vehicle behavior information
D08 Traffic light detection area information
D09 Image data
D37 Vehicle behavior information
D40 Travel route information
P1 Traffic light
R1 Detection area
X11 Right lane
X12 Left lane
Claims (10)
1. A traffic light recognition device comprising: an imaging unit that is mounted on a vehicle and captures an image around the vehicle; a map information acquisition unit that acquires map information around the vehicle; a self-position detection unit that detects a self-position of the vehicle on a map; a traffic light position estimation unit that estimates a position of a traffic light on the image based on the self-position and the map information; a vehicle behavior estimation unit that estimates a behavior of the vehicle; a traffic light detection area setting unit that sets a detection area for the traffic light on the image based on the estimated position of the traffic light on the image and an amount of displacement of the position of the traffic light on the image caused by the behavior of the vehicle; and a traffic light recognition unit that detects the traffic light from the detection area.
2. The traffic light recognition device according to claim 1, wherein the traffic light detection area setting unit sets the detection area based on the estimated position of the traffic light on the image and the amount of displacement of the position of the traffic light on the image caused by the vehicle behavior when the vehicle behavior is equal to or greater than a preset predetermined value, and sets the detection area based on the estimated position of the traffic light on the image when the vehicle behavior is less than the predetermined value.
3. The traffic light recognition device according to claim 1 or 2, wherein the traffic light detection area setting unit estimates a vehicle movement amount based on the vehicle behavior and a preset predetermined time, and sets the detection area based on the estimated position of the traffic light on the image and the vehicle movement amount.
4. The traffic light recognition device according to claim 3, wherein the traffic light detection area setting unit sets the predetermined time based on a time difference between a timing at which the imaging unit captures the image and a timing at which the self-position is detected.
5. The traffic light recognition device according to claim 4, wherein the self-position detection unit detects the self-position of the vehicle at a predetermined cycle, and the traffic light detection area setting unit sets the time difference according to the predetermined cycle.
6. The traffic light recognition device according to any one of claims 1 to 5, wherein the vehicle behavior estimation unit estimates the behavior of the vehicle using a yaw rate of the vehicle.
7. The traffic light recognition device according to any one of claims 1 to 5, wherein the vehicle behavior estimation unit estimates the behavior of the vehicle using a steered amount of the vehicle.
8. The traffic light recognition device according to any one of claims 1 to 7, further comprising a travel route setting unit that sets, based on the self-position and the map information, a travel route on which the vehicle will travel in the future, wherein the vehicle behavior estimation unit estimates the vehicle behavior based on the travel route on which the vehicle will travel in the future.
9. The traffic light recognition device according to any one of claims 1 to 8, wherein the traffic light detection area setting unit sets the detection area larger as the vehicle behavior amount is larger.
10. A traffic light recognition method comprising: capturing, with an imaging unit mounted on a vehicle, an image around the vehicle; acquiring map information around the vehicle; detecting a self-position of the vehicle on a map; estimating a position of a traffic light on the image based on the self-position and the map information; estimating a behavior of the vehicle; setting a detection area for the traffic light on the image based on the estimated position of the traffic light on the image and an amount of displacement of the position of the traffic light on the image caused by the behavior of the vehicle; and detecting the traffic light from the detection area.
Priority Applications (11)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020197023796A KR20190097326A (ko) | 2015-07-13 | 2015-07-13 | 신호기 인식 장치 및 신호기 인식 방법 |
CA2992405A CA2992405A1 (en) | 2015-07-13 | 2015-07-13 | Traffic light recognition device and traffic light recognition method |
BR112018000708-0A BR112018000708B1 (pt) | 2015-07-13 | 2015-07-13 | Dispositivo de reconhecimento de sinal luminoso de trânsito e método de reconhecimento de sinal luminoso de trânsito |
KR1020187002566A KR20180021159A (ko) | 2015-07-13 | 2015-07-13 | 신호기 인식 장치 및 신호기 인식 방법 |
MX2018000437A MX367068B (es) | 2015-07-13 | 2015-07-13 | Dispositivo de reconocimiento de semaforo y metodo de reconocimiento de semaforo. |
RU2018105103A RU2678527C1 (ru) | 2015-07-13 | 2015-07-13 | Устройство и способ распознавания светофора |
PCT/JP2015/070042 WO2017009934A1 (ja) | 2015-07-13 | 2015-07-13 | 信号機認識装置及び信号機認識方法 |
JP2017528038A JP6477883B2 (ja) | 2015-07-13 | 2015-07-13 | 信号機認識装置及び信号機認識方法 |
EP15898247.0A EP3324384B1 (en) | 2015-07-13 | 2015-07-13 | Traffic light recognition device and traffic light recognition method |
US15/743,905 US10789491B2 (en) | 2015-07-13 | 2015-07-13 | Traffic light recognition device and traffic light recognition method |
CN201580081658.2A CN107836017B (zh) | 2015-07-13 | 2015-07-13 | 信号机识别装置及信号机识别方法 |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2015/070042 WO2017009934A1 (ja) | 2015-07-13 | 2015-07-13 | 信号機認識装置及び信号機認識方法 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017009934A1 true WO2017009934A1 (ja) | 2017-01-19 |
Family
ID=57758258
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/070042 WO2017009934A1 (ja) | 2015-07-13 | 2015-07-13 | 信号機認識装置及び信号機認識方法 |
Country Status (10)
Country | Link |
---|---|
US (1) | US10789491B2 (ja) |
EP (1) | EP3324384B1 (ja) |
JP (1) | JP6477883B2 (ja) |
KR (2) | KR20180021159A (ja) |
CN (1) | CN107836017B (ja) |
BR (1) | BR112018000708B1 (ja) |
CA (1) | CA2992405A1 (ja) |
MX (1) | MX367068B (ja) |
RU (1) | RU2678527C1 (ja) |
WO (1) | WO2017009934A1 (ja) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2021006797A (ja) * | 2019-06-28 | 2021-01-21 | 日産自動車株式会社 | 自己位置推定方法及び自己位置推定装置 |
CN112489466A (zh) * | 2020-11-27 | 2021-03-12 | 恒大新能源汽车投资控股集团有限公司 | 交通信号灯识别方法和装置 |
JP2021077259A (ja) * | 2019-11-13 | 2021-05-20 | トヨタ自動車株式会社 | 運転支援装置 |
KR102368262B1 (ko) * | 2021-06-02 | 2022-03-03 | (주)에이아이매틱스 | 다중 관측정보를 이용한 신호등 배치정보 추정 방법 |
WO2022137819A1 (ja) * | 2020-12-24 | 2022-06-30 | 本田技研工業株式会社 | 車両 |
US20230098014A1 (en) * | 2021-09-24 | 2023-03-30 | Autonomous A2Z | Method for Predicting Traffic Light Information by Using Lidar and Server Using the Same |
JP7494809B2 (ja) | 2021-06-29 | 2024-06-04 | 株式会社デンソー | 支援装置、支援方法、支援プログラム |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6451853B2 (ja) * | 2015-07-28 | 2019-01-16 | 日産自動車株式会社 | 走行制御装置の制御方法および走行制御装置 |
JP7205695B2 (ja) * | 2019-02-18 | 2023-01-17 | トヨタ自動車株式会社 | 運転支援システム |
JP7140043B2 (ja) * | 2019-05-07 | 2022-09-21 | 株式会社デンソー | 情報処理装置 |
JP7268497B2 (ja) * | 2019-06-24 | 2023-05-08 | トヨタ自動車株式会社 | 信号認識システム |
WO2021134348A1 (zh) * | 2019-12-30 | 2021-07-08 | 深圳元戎启行科技有限公司 | 交通灯状态识别方法、装置、计算机设备和存储介质 |
US12112550B2 (en) * | 2020-01-07 | 2024-10-08 | Motional Ad Llc | Systems and methods for traffic light detection |
JP7318609B2 (ja) * | 2020-08-06 | 2023-08-01 | トヨタ自動車株式会社 | 車載検出装置 |
CN113177522A (zh) * | 2021-05-24 | 2021-07-27 | 的卢技术有限公司 | 一种用于自动驾驶场景下的红绿灯检测与识别方法 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005202761A (ja) * | 2004-01-16 | 2005-07-28 | Toyota Motor Corp | 車両周辺監視装置 |
JP2007219758A (ja) * | 2006-02-15 | 2007-08-30 | Fujitsu Ten Ltd | 車載情報装置および車載装置用情報処理方法 |
JP2007241469A (ja) * | 2006-03-06 | 2007-09-20 | Toyota Motor Corp | 画像処理システム |
JP2013218571A (ja) * | 2012-04-10 | 2013-10-24 | Toyota Motor Corp | 画像認識装置および運転支援装置 |
WO2014162797A1 (ja) * | 2013-04-04 | 2014-10-09 | 日産自動車株式会社 | 信号認識装置 |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4724043B2 (ja) * | 2006-05-17 | 2011-07-13 | トヨタ自動車株式会社 | 対象物認識装置 |
WO2014062797A1 (en) * | 2012-10-16 | 2014-04-24 | Numera | Methods for configuring biometric devices for transmitting health information |
RU144555U1 (ru) * | 2014-05-05 | 2014-08-27 | Павел Юрьевич Михайлов | Устройство для повышения безопасности движения транспортного средства |
JP6464783B2 (ja) | 2015-02-04 | 2019-02-06 | 株式会社デンソー | 物体検出装置 |
CN104952249A (zh) * | 2015-06-10 | 2015-09-30 | 浙江吉利汽车研究院有限公司 | 基于车联网的驾驶行为纠正方法及装置 |
2015
- 2015-07-13 CA CA2992405A patent/CA2992405A1/en not_active Abandoned
- 2015-07-13 US US15/743,905 patent/US10789491B2/en active Active
- 2015-07-13 WO PCT/JP2015/070042 patent/WO2017009934A1/ja active Application Filing
- 2015-07-13 KR KR1020187002566A patent/KR20180021159A/ko active Application Filing
- 2015-07-13 RU RU2018105103A patent/RU2678527C1/ru active
- 2015-07-13 KR KR1020197023796A patent/KR20190097326A/ko not_active Application Discontinuation
- 2015-07-13 CN CN201580081658.2A patent/CN107836017B/zh active Active
- 2015-07-13 BR BR112018000708-0A patent/BR112018000708B1/pt active IP Right Grant
- 2015-07-13 MX MX2018000437A patent/MX367068B/es active IP Right Grant
- 2015-07-13 EP EP15898247.0A patent/EP3324384B1/en active Active
- 2015-07-13 JP JP2017528038A patent/JP6477883B2/ja active Active
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005202761A (ja) * | 2004-01-16 | 2005-07-28 | Toyota Motor Corp | 車両周辺監視装置 |
JP2007219758A (ja) * | 2006-02-15 | 2007-08-30 | Fujitsu Ten Ltd | 車載情報装置および車載装置用情報処理方法 |
JP2007241469A (ja) * | 2006-03-06 | 2007-09-20 | Toyota Motor Corp | 画像処理システム |
JP2013218571A (ja) * | 2012-04-10 | 2013-10-24 | Toyota Motor Corp | 画像認識装置および運転支援装置 |
WO2014162797A1 (ja) * | 2013-04-04 | 2014-10-09 | 日産自動車株式会社 | 信号認識装置 |
Non-Patent Citations (1)
Title |
---|
See also references of EP3324384A4 * |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7274366B2 (ja) | 2019-06-28 | 2023-05-16 | 日産自動車株式会社 | 自己位置推定方法及び自己位置推定装置 |
JP2021006797A (ja) * | 2019-06-28 | 2021-01-21 | 日産自動車株式会社 | 自己位置推定方法及び自己位置推定装置 |
JP2021077259A (ja) * | 2019-11-13 | 2021-05-20 | トヨタ自動車株式会社 | 運転支援装置 |
JP7156252B2 (ja) | 2019-11-13 | 2022-10-19 | トヨタ自動車株式会社 | 運転支援装置 |
CN112489466A (zh) * | 2020-11-27 | 2021-03-12 | 恒大新能源汽车投资控股集团有限公司 | 交通信号灯识别方法和装置 |
WO2022137819A1 (ja) * | 2020-12-24 | 2022-06-30 | 本田技研工業株式会社 | 車両 |
JPWO2022137819A1 (ja) * | 2020-12-24 | 2022-06-30 | ||
JP7413572B2 (ja) | 2020-12-24 | 2024-01-15 | 本田技研工業株式会社 | 車両 |
KR102368262B1 (ko) * | 2021-06-02 | 2022-03-03 | (주)에이아이매틱스 | 다중 관측정보를 이용한 신호등 배치정보 추정 방법 |
WO2022255678A1 (ko) * | 2021-06-02 | 2022-12-08 | (주)에이아이매틱스 | 다중 관측정보를 이용한 신호등 배치정보 추정 방법 |
JP7494809B2 (ja) | 2021-06-29 | 2024-06-04 | 株式会社デンソー | 支援装置、支援方法、支援プログラム |
US11643093B2 (en) * | 2021-09-24 | 2023-05-09 | Autonmous A2Z | Method for predicting traffic light information by using lidar and server using the same |
US20230098014A1 (en) * | 2021-09-24 | 2023-03-30 | Autonomous A2Z | Method for Predicting Traffic Light Information by Using Lidar and Server Using the Same |
Also Published As
Publication number | Publication date |
---|---|
KR20190097326A (ko) | 2019-08-20 |
EP3324384B1 (en) | 2021-03-03 |
RU2678527C1 (ru) | 2019-01-29 |
CN107836017A (zh) | 2018-03-23 |
MX367068B (es) | 2019-08-05 |
US20180204077A1 (en) | 2018-07-19 |
BR112018000708B1 (pt) | 2023-03-14 |
US10789491B2 (en) | 2020-09-29 |
EP3324384A4 (en) | 2018-12-05 |
KR20180021159A (ko) | 2018-02-28 |
CN107836017B (zh) | 2019-03-26 |
BR112018000708A2 (ja) | 2018-09-18 |
JPWO2017009934A1 (ja) | 2018-06-14 |
CA2992405A1 (en) | 2017-01-19 |
JP6477883B2 (ja) | 2019-03-13 |
EP3324384A1 (en) | 2018-05-23 |
MX2018000437A (es) | 2018-05-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6477883B2 (ja) | 信号機認識装置及び信号機認識方法 | |
EP3607272B1 (en) | Automated image labeling for vehicle based on maps | |
JP6447729B2 (ja) | 信号機認識装置及び信号機認識方法 | |
JP6222353B2 (ja) | 物標検出装置及び物標検出方法 | |
JP5966747B2 (ja) | 車両走行制御装置及びその方法 | |
JP6708730B2 (ja) | 移動体 | |
RU2700646C2 (ru) | Устройство обнаружения светофора и способ обнаружения светофора | |
WO2019073772A1 (ja) | 移動体の位置推定装置及び位置推定方法 | |
JP2020109560A (ja) | 信号機認識方法及び信号機認識装置 | |
JP6141734B2 (ja) | ステレオ画像処理装置 | |
KR101424636B1 (ko) | 자동 후진 주차시스템 | |
JP6174884B2 (ja) | 車外環境認識装置および車外環境認識方法 | |
JP5330341B2 (ja) | 車載カメラを用いた測距装置 | |
JP7122394B2 (ja) | 撮像部制御装置 | |
JP2009210499A (ja) | タイヤ径情報補正装置 | |
JP6122365B6 (ja) | ステレオカメラの調整システム | |
KR20170031282A (ko) | 차량용 카메라 보정 장치 | |
JP2015049040A (ja) | ステレオカメラの調整システム |
Legal Events
Code | Title | Description |
---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 15898247; Country of ref document: EP; Kind code of ref document: A1 |
ENP | Entry into the national phase | Ref document number: 2017528038; Country of ref document: JP; Kind code of ref document: A |
WWE | Wipo information: entry into national phase | Ref document number: 15743905; Country of ref document: US. Ref document number: MX/A/2018/000437; Country of ref document: MX |
ENP | Entry into the national phase | Ref document number: 2992405; Country of ref document: CA |
NENP | Non-entry into the national phase | Ref country code: DE |
ENP | Entry into the national phase | Ref document number: 20187002566; Country of ref document: KR; Kind code of ref document: A |
WWE | Wipo information: entry into national phase | Ref document number: 2018105103; Country of ref document: RU. Ref document number: 2015898247; Country of ref document: EP |
REG | Reference to national code | Ref country code: BR; Ref legal event code: B01A; Ref document number: 112018000708; Country of ref document: BR |
ENP | Entry into the national phase | Ref document number: 112018000708; Country of ref document: BR; Kind code of ref document: A2; Effective date: 20180112 |