WO2017163367A1 - Travel path detection method and travel path detection device - Google Patents
Travel path detection method and travel path detection device
- Publication number
- WO2017163367A1 (PCT/JP2016/059397)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- road
- runway
- boundary point
- boundary
- shape
- Prior art date
Images
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/167—Driving aids for lane monitoring, lane changing, e.g. blind spot detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/588—Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/10—Path keeping
- B60W30/12—Lane keeping
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/23—Clustering techniques
- G06F18/232—Non-hierarchical techniques
- G06F18/2321—Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/13—Edge detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/60—Analysis of geometric attributes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/762—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using clustering, e.g. of similar faces in social networks
- G06V10/763—Non-hierarchical techniques, e.g. based on statistics of modelling distributions
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2218/00—Aspects of pattern recognition specially adapted for signal processing
- G06F2218/08—Feature extraction
- G06F2218/10—Feature extraction by analysing the shape of a waveform, e.g. extracting parameters relating to peaks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
- G06T2207/30256—Lane; Road marking
Definitions
- the present invention relates to a travel path detection method and a travel path detection device.
- conventionally, an apparatus for detecting a traveling lane from a road surface image is known (Patent Document 1).
- in Patent Document 1, first, a horizontal edge histogram is created for a plurality of edge points back-projected onto road surface coordinates. Then, the peak position of the edge histogram is obtained, and the edge group contributing to the peak is treated as one group, thereby detecting the lane marker.
- the present invention has been made in view of the above problem, and an object thereof is to provide a travel path detection method and a travel path detection device capable of stably detecting the shape of a travel path boundary.
- in an aspect of the present invention, a plurality of parallel travel path boundary point groups, extracted based on the continuity of a plurality of travel path feature points detected by a target detection sensor mounted on a vehicle, are superimposed,
- a travel path shape is estimated based on the travel path feature points included in the superimposed travel path boundary point groups, and a travel path boundary is determined based on the lateral positions of the plurality of parallel travel path boundary point groups and the travel path shape.
- according to the present invention, a travel path shape can be estimated while excluding travel path feature points belonging to a branch road that is not parallel to the other travel paths. Therefore, the shape of the travel path boundary can be detected stably.
- FIG. 1 is a block diagram showing a configuration of a travel path detection apparatus 1 according to the first embodiment.
- FIG. 2 is a flowchart showing an example of a road detection method using the road detection device 1 shown in FIG.
- FIG. 3A is an overhead view showing a state in which the vehicle 51 is traveling in the left lane of a two-lane road with a gentle right curve.
- FIG. 3B (a) is an overhead view showing an example of the second surrounding map generated from the first surrounding map of FIG. 3A, and FIG. 3B (b) is a graph showing an example of the histogram generated from the second surrounding map of FIG. 3B (a).
- FIG. 4A is an overhead view showing five road model functions (KK0, KK1, KK2, KK3, KK4) applied to a plurality of parallel travel path boundary point groups, and the travel path feature points FP included in each travel path boundary point group.
- FIG. 4B is an overhead view showing a state in which the runway feature points FP included in a plurality of parallel runway boundary point groups are overlapped.
- FIG. 5A is an overhead view showing a runway shape BC estimated based on a runway feature point FP included in a plurality of overlapped runway boundary point groups.
- FIG. 5B is an overhead view showing the travel path shapes (BC0 to BC4) obtained by moving the travel path shape BC in the y-axis direction by the offset amounts (lateral positions: d0 to d4) of the road model functions (KK0 to KK4).
- FIG. 6 is a flowchart illustrating an example of a runway detection method according to a modification of the first embodiment.
- FIG. 7 is a bird's-eye view showing the travel path boundaries (SK1, SK2, SK3, SK4) obtained in the previous processing cycle.
- FIG. 8 is a block diagram showing a configuration of the travel path detection apparatus 2 according to the second embodiment.
- FIG. 9A shows an example of an image 52 captured by the camera 34 ′ in FIG. 8.
- FIG. 9B is an overhead view showing the runway feature point FP converted to a position on the overhead view coordinates.
- FIG. 10 is a flowchart illustrating an example of a runway detection method according to the third embodiment.
- FIG. 11A is an overhead view showing a plurality of travel path feature points FP1 that support the travel path shapes (BC1 to BC3) of the main line 54 and a plurality of travel path feature points FP2 that belong to the branch road 55.
- FIG. 11B is an overhead view showing the state in which, by deleting the travel path feature points FP1 from FIG. 11A, only the plurality of travel path feature points FP2 supporting the travel path shapes (BC4, BC5) of the branch road 55 remain.
- the travel path detection device 1 detects the boundary of the travel path on which the vehicle travels from the travel path feature points on the road surface detected by the sensors mounted on the vehicle.
- the travel path detection device 1 includes a target detection sensor 11 mounted on the vehicle, a movement amount detection sensor 10 that detects the movement amount of the vehicle based on the movement speed and the yaw rate of the vehicle, and a travel path detection circuit 12 that detects a travel path boundary based on the plurality of travel path feature points detected by the target detection sensor 11 and the movement amount of the vehicle detected by the movement amount detection sensor 10.
- the target detection sensor 11 detects white lines (including lane markers) marked on the road surface around the vehicle.
- the target detection sensor 11 includes a camera 34 attached to the vehicle, and an image processing circuit 35 that detects a road marking including a white line from a digital image captured by the camera 34.
- the detected road marking is expressed as a feature point group including a plurality of road feature points indicating the position.
- the image processing circuit 35 may detect, for example, a location (luminance edge) where the brightness of the image changes sharply or discontinuously as the runway feature point.
- the movement amount detection sensor 10 includes a wheel speed sensor 31, a yaw rate sensor 32, and a movement amount detection circuit 33.
- the wheel speed sensor 31 detects the rotational speed of the wheel provided in the vehicle.
- the yaw rate sensor 32 detects the yaw rate of the vehicle.
- the movement amount detection circuit 33 detects the movement amount of the vehicle in a predetermined time from the rotational speed of the wheel and the yaw rate of the vehicle.
- the moving amount of the vehicle includes, for example, the moving direction and the moving distance of the vehicle.
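The movement amount computation described above can be sketched as a simple dead-reckoning step. The following is an illustrative unicycle-model sketch, not the patent's formulation; the function and parameter names are assumptions.

```python
import math

def integrate_motion(speed_mps, yaw_rate_rps, dt, pose=(0.0, 0.0, 0.0)):
    """Dead-reckon the vehicle pose (x, y, heading) over one time step
    from wheel-speed and yaw-rate readings (unicycle model sketch)."""
    x, y, heading = pose
    # Integrate position using the mid-point heading of the step,
    # which is slightly more accurate than the start-of-step heading.
    mid = heading + 0.5 * yaw_rate_rps * dt
    x += speed_mps * dt * math.cos(mid)
    y += speed_mps * dt * math.sin(mid)
    return (x, y, heading + yaw_rate_rps * dt)
```

The movement direction and movement distance over the predetermined time then follow from the difference between two consecutive poses.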
- the runway detection circuit 12 can be realized using a microcomputer including a CPU (Central Processing Unit), a memory, and an input / output unit.
- a computer program (travel path detection program) for causing the microcomputer to function as the travel path detection circuit 12 is installed in the microcomputer and executed.
- the microcomputer functions as the runway detection circuit 12.
- here, the travel path detection circuit 12 is realized by software. However, it is of course also possible to configure the travel path detection circuit 12 by preparing dedicated hardware for executing the information processing described below.
- the plurality of circuits (21, 22, 23) included in the travel path detection circuit 12 may be configured by individual hardware.
- the travel path detection circuit 12 may also be used as an electronic control unit (ECU) used for other control related to the vehicle.
- the track detection circuit 12 includes a surrounding map generation circuit 21, a track shape estimation circuit 22, and a track boundary evaluation circuit 23.
- the surrounding map generation circuit 21 connects the feature point group histories detected by the target detection sensor 11, based on the movement amount of the vehicle between the time points at which the feature point groups were detected, and thereby generates a map (first surrounding map).
- the surrounding map generation circuit 21 connects the runway feature points observed at different times in consideration of the amount of movement of the vehicle. Thereby, the detection history of the runway feature points is accumulated, and the first surrounding map is generated.
- the camera 34 images the road surface around the vehicle every predetermined time.
- the movement amount detection sensor 10 detects the movement direction and the movement distance of the vehicle in this predetermined time.
- the surrounding map generation circuit 21 moves the position of the runway feature point in the direction opposite to the moving direction of the vehicle by the moving distance of the vehicle.
- the surrounding map generation circuit 21 accumulates the detection history of the plurality of travel path feature points by repeating the above, connecting the travel path feature points observed at different times in consideration of the movement amount of the vehicle, and thereby generates the first surrounding map.
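The accumulation of the first surrounding map described above can be sketched as follows; a minimal illustration assuming 2-D points and a planar motion (dx, dy, dyaw) expressed in the previous vehicle frame, with hypothetical names.

```python
import math

def update_surrounding_map(map_points, motion, new_points):
    """Shift already-accumulated feature points by the inverse of the
    vehicle motion (dx, dy, dyaw), then append the newly detected
    points, so all points stay in the current vehicle frame."""
    dx, dy, dyaw = motion
    c, s = math.cos(dyaw), math.sin(dyaw)
    shifted = []
    for x, y in map_points:
        # Undo the translation, then rotate into the new vehicle frame.
        tx, ty = x - dx, y - dy
        shifted.append((c * tx + s * ty, -s * tx + c * ty))
    return shifted + list(new_points)
```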
- FIG. 3A shows three runway boundaries (SKa, SKb, SKc) that define the two-lane road.
- the first surrounding map generated by the surrounding map generation circuit 21 includes a feature point group (not shown) detected along three runway boundaries (SKa, SKb, SKc).
- plane coordinates are used in which the position of the vehicle 51 is the origin, the traveling direction of the vehicle 51 is the x axis, and the vehicle width direction of the vehicle 51 is the y axis.
- the travel path shape estimation circuit 22 extracts travel path boundary point groups based on the continuity of the plurality of travel path feature points included in the first surrounding map. Then, when a plurality of parallel travel path boundary point groups are extracted, the travel path shape estimation circuit 22 superimposes them and estimates the travel path shape based on the travel path feature points included in the superimposed travel path boundary point groups.
- the processing operation of the runway shape estimation circuit 22 will be described in detail.
- the travel path shape estimation circuit 22 determines the continuity of the plurality of travel path feature points from the frequency of their coordinates in the vehicle width direction (y-axis direction). For example, the travel path shape estimation circuit 22 creates a second surrounding map, without taking the movement amount of the vehicle 51 into account, in which the position of the vehicle 51 is the origin, the vehicle width direction of the vehicle 51 is the y axis, and the axis orthogonal to the y axis is the time axis (t axis). As shown in FIG. 3B (a), the travel path shape estimation circuit 22 plots the plurality of travel path feature points FP included in the first surrounding map shown in FIG. 3A on the second surrounding map, based on their detection time (t) and their position in the vehicle width direction (y coordinate).
- the travel path feature points FP are plotted along straight lines parallel to the t axis.
- the runway shape estimation circuit 22 votes the runway feature points FP on the second surrounding map to a one-dimensional histogram along the y-axis as shown in FIG. 3B (b).
- the track shape estimation circuit 22 can determine the continuity of a plurality of track feature points from the histogram.
- the runway shape estimation circuit 22 detects a peak (y coordinate) of the histogram, and extracts a runway boundary point group by grouping the runway feature points FP on the second surrounding map for each peak. By grouping the runway feature points FP on the second surrounding map, the runway feature points FP can be grouped more easily than when the runway feature points FP on the first surrounding map are grouped. The plurality of grouped runway feature points FP constitute one runway boundary point group. Further, by performing grouping using a histogram, the road shape estimation circuit 22 can simultaneously extract a plurality of parallel road boundary point groups.
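The histogram voting and peak-based grouping just described can be sketched as follows; the bin width and vote threshold are illustrative values, not from the patent, and adjacent-bin merging is omitted for brevity.

```python
import numpy as np

def group_by_y_histogram(points, bin_width=0.5, min_votes=3):
    """Group feature points into boundary point groups by voting their
    lateral (y) coordinates into a 1-D histogram and treating every
    sufficiently voted bin as one group (a simplified peak detector)."""
    pts = np.asarray(points, dtype=float)
    y = pts[:, 1]
    bins = np.arange(y.min() - bin_width, y.max() + 2 * bin_width, bin_width)
    votes, edges = np.histogram(y, bins=bins)
    groups = []
    for i in np.flatnonzero(votes >= min_votes):
        mask = (y >= edges[i]) & (y < edges[i + 1])
        groups.append(pts[mask])
    return groups
```

Because parallel boundaries produce separate histogram peaks, this extracts several parallel boundary point groups in one pass, as the text notes.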
- the runway shape estimation circuit 22 can fit a plurality of curves to the boundary point group by approximating the runway feature points FP with curves using a known method instead of using a histogram. Then, the runway shape estimation circuit 22 can determine whether or not the plurality of fitted curves are parallel lines.
- the runway shape estimation circuit 22 applies a curve expressed by a road model function to each runway boundary point group in the first surrounding map.
- for example, using a cubic function (y = ax^3 + bx^2 + cx + d) as the road model function, the travel path shape estimation circuit 22 calculates the coefficients a, b, c, and d.
- function fitting by the least square method may be used, but robust estimation such as RANSAC (Random sample consensus) may be used when more stability is desired.
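A least-squares cubic fit with a RANSAC-style robust wrapper, as mentioned above, might look like the following sketch; the trial count and inlier tolerance are illustrative values, and the function assumes at least four points with distinct x coordinates.

```python
import numpy as np

def fit_road_model(points, trials=100, inlier_tol=0.3, seed=0):
    """Fit y = a*x^3 + b*x^2 + c*x + d to one boundary point group,
    RANSAC-style: hypothesize from minimal samples, keep the hypothesis
    with the most inliers, then refit on its inliers by least squares."""
    pts = np.asarray(points, dtype=float)
    x, y = pts[:, 0], pts[:, 1]
    rng = np.random.default_rng(seed)
    best_coefs, best_inliers = None, -1
    for _ in range(trials):
        idx = rng.choice(len(pts), size=4, replace=False)  # minimal cubic sample
        coefs = np.polyfit(x[idx], y[idx], 3)
        inliers = int(np.sum(np.abs(np.polyval(coefs, x) - y) < inlier_tol))
        if inliers > best_inliers:
            best_coefs, best_inliers = coefs, inliers
    # Final least-squares refit on all inliers of the best hypothesis.
    mask = np.abs(np.polyval(best_coefs, x) - y) < inlier_tol
    return np.polyfit(x[mask], y[mask], 3)  # [a, b, c, d]
```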
- the travel path shape estimation circuit 22 then determines whether or not a plurality of parallel travel path boundary point groups have been extracted. Specifically, the travel path shape estimation circuit 22 may determine that travel path boundary point groups whose applied road model functions have substantially the same coefficients a, b, and c but different coefficients d are parallel to one another. Alternatively, it may determine whether or not two or more peaks have been detected in the histogram.
- the overhead view of FIG. 4A shows the five road model functions (KK0, KK1, KK2, KK3, KK4) applied to the plurality of parallel travel path boundary point groups, and the travel path feature points FP included in each travel path boundary point group.
- the five road model functions (KK0 to KK4) are as follows.
- KK0: y = ax^3 + bx^2 + cx + d0
- KK1: y = ax^3 + bx^2 + cx + d1
- KK2: y = ax^3 + bx^2 + cx + d2
- KK3: y = ax^3 + bx^2 + cx + d3
- KK4: y = ax^3 + bx^2 + cx + d4
- the constant term (d0 to d4) of each road model function corresponds to the offset amount (lateral position) of that function in the y-axis direction.
- that is, the offset amounts represent the lateral positions of the plurality of parallel travel path boundary point groups, in other words, their relative positional relationship.
- as shown in FIG. 4B, the travel path shape estimation circuit 22 moves the travel path feature points FP included in each of the plurality of parallel travel path boundary point groups by the same amount in the direction opposite to the y-axis offset amount (d0, d1, d2, d3, d4) of the road model function (KK0 to KK4) applied to that group.
- thereby, the travel path shape estimation circuit 22 can superimpose the travel path feature points FP included in the plurality of parallel travel path boundary point groups.
- the position at which the travel path feature points FP are superimposed may be the zero point of the y axis, or the other travel path boundary point groups may be moved onto any one of the plurality of parallel travel path boundary point groups and superimposed there.
- the travel path shape BC is expressed by a road model function.
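The superimposition and shape estimation above can be sketched as: subtract each group's own offset d_i from its y coordinates, pool the shifted points, and fit one shared cubic shape. An illustrative sketch with hypothetical names, not the patent's implementation.

```python
import numpy as np

def superimpose_and_fit(groups):
    """Superimpose parallel boundary point groups and fit one shape.

    For each group, the constant term of its own cubic fit serves as
    the lateral position d_i; the group is shifted by -d_i, all shifted
    points are pooled, and a single cubic is fitted to the pool."""
    shifted, offsets = [], []
    for g in groups:
        g = np.asarray(g, dtype=float)
        d = np.polyfit(g[:, 0], g[:, 1], 3)[-1]   # lateral position d_i
        offsets.append(d)
        shifted.append(np.column_stack([g[:, 0], g[:, 1] - d]))
    pooled = np.vstack(shifted)
    shape = np.polyfit(pooled[:, 0], pooled[:, 1], 3)  # shared [a, b, c, d~0]
    return shape, offsets
```

Because the pooled fit uses the feature points of every parallel group at once, the estimated shape is supported by more points than any single group's fit.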
- the travel path boundary evaluation circuit 23 determines travel path boundaries based on the lateral positions (offset amounts) of the plurality of parallel travel path boundary point groups and the travel path shape BC. Specifically, as shown in FIG. 5B, the travel path boundary evaluation circuit 23 moves the travel path shape BC in the y-axis direction by each of the offset amounts (lateral positions: d0 to d4) of the road model functions (KK0 to KK4). In other words, the position of each travel path boundary is restored from the travel path shape BC and the position (y coordinate) of each travel path boundary point group in the vehicle width direction.
- the travel path boundary evaluation circuit 23 determines travel path boundaries based on the degree of matching of the travel path feature points included in each travel path boundary point group with respect to the travel path shapes (BC0, BC1, BC2, BC3, BC4).
- as the degree of matching, the travel path boundary evaluation circuit 23 counts the number of travel path feature points whose distance from the travel path shape (BC0, BC1, BC2, BC3, BC4) is shorter than a reference value.
- the travel path boundary evaluation circuit 23 determines that the degree of matching is low if the counted number of travel path feature points is less than a predetermined value, and high if it is equal to or greater than the predetermined value.
- the travel path boundary evaluation circuit 23 determines the travel path shapes (BC1, BC2, BC3, BC4) with a high degree of matching to be travel path boundaries, and outputs travel path position information constructed from the determined travel path boundaries as the travel path detection result.
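Counting the feature points within the reference distance of a candidate shape, as the degree of matching, can be sketched as follows; the reference distance is an illustrative value.

```python
import numpy as np

def match_degree(shape_coefs, points, ref_dist=0.3):
    """Degree of matching: the number of feature points whose lateral
    distance to the candidate boundary shape is below `ref_dist`."""
    pts = np.asarray(points, dtype=float)
    resid = np.abs(np.polyval(shape_coefs, pts[:, 0]) - pts[:, 1])
    return int(np.sum(resid < ref_dist))
```

A candidate shape would then be accepted as a boundary when this count reaches the predetermined value, and rejected otherwise.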
- in step S01, as described with reference to FIG. 3A, FIG. 3B (a) and FIG. 3B (b), the travel path detection circuit 12 extracts travel path boundary point groups based on the continuity of the plurality of travel path feature points FP.
- the road shape estimation circuit 22 determines whether or not a plurality of parallel road boundary point groups are extracted in step S01. For example, if two or more peaks are simultaneously found in a one-dimensional histogram along the y-axis, it can be determined that a plurality of parallel road boundary point groups have been extracted. Alternatively, the road shape estimation circuit 22 may determine that a plurality of parallel road boundary point groups have been extracted if there is a road model function in which the coefficients a, b, and c substantially match and the coefficient d is different.
- if YES in step S02, the plurality of parallel travel path boundary point groups can be superimposed, so the process proceeds to step S03; if NO in step S02, the superimposing process is not possible, so the process proceeds to step S06.
- in step S03, as shown in FIG. 4B, the travel path shape estimation circuit 22 moves the travel path feature points FP included in the plurality of parallel travel path boundary point groups by the same amount in the direction opposite to the y-axis offset amounts (d0 to d4) of the applied road model functions (KK0 to KK4). Thereby, the travel path shape estimation circuit 22 can superimpose the plurality of parallel travel path boundary point groups.
- step S04 the road shape estimation circuit 22 applies a road model function to a plurality of overlapped road boundary point groups.
- the runway shape estimation circuit 22 can estimate the runway shape BC based on the runway feature points FP included in the overlapped multiple runway boundary point groups.
- in step S05, as shown in FIG. 5B, the travel path boundary evaluation circuit 23 moves the travel path shape BC in the y-axis direction by each of the offset amounts (lateral positions: d0 to d4) of the road model functions (KK0 to KK4).
- the road boundary evaluation circuit 23 can restore the position of each road boundary based on the position (y coordinate) in the vehicle width direction of each road boundary point group.
- in step S06, the travel path boundary evaluation circuit 23 determines travel path boundaries based on the degree of matching of the travel path feature points FP included in each travel path boundary point group with respect to the travel path shapes (BC0 to BC4). Specifically, the travel path boundary evaluation circuit 23 first calculates the degree of matching of the travel path feature points FP with respect to each travel path shape (BC0 to BC4). It then rejects the travel path shape (BC0) whose degree of matching is lower than a predetermined value, as a shape extracted from mis-detected travel path feature points (FPf1, FPf2).
- the travel path boundary evaluation circuit 23 determines the travel path shapes (BC1 to BC4) whose degree of matching is equal to or greater than the predetermined value to be travel path boundaries, and outputs travel path position information constructed from the determined travel path boundaries as the travel path detection result.
- in a modification, in the travel path boundary point group extraction process shown in step S01, the travel path detection device 1 shown in FIG. 1 may extract the travel path boundary point groups by constructing a prior lane group from the travel path boundaries obtained in the previous processing cycle.
- in this case, instead of step S01, step S10 and step S20 are executed.
- steps S03 to S06 are the same as those in FIG. 2, and a description thereof is omitted.
- in step S10, the travel path shape estimation circuit 22 constructs a prior lane group using the travel path boundaries obtained in the previous processing cycle. Specifically, as shown in FIG. 7, a case in which four travel path boundaries (SK1, SK2, SK3, SK4) were obtained in the previous processing cycle will be described.
- the travel path shape estimation circuit 22 calculates an average travel path width (w) from the four travel path boundaries (SK1 to SK4); that is, it obtains the distances (travel path widths) between adjacent travel path boundaries and takes their average value (w). Then, the travel path shape estimation circuit 22 adds a new travel path boundary (SK+w, SK-w) separated by the travel path width (w) on each outer side of the four travel path boundaries (SK1 to SK4). In total, the travel path shape estimation circuit 22 constructs six travel path boundaries (SK1 to SK4, SK+w, SK-w) as the prior lane group.
- alternatively, the prior lane group may be constructed from the lane group information contained in map information.
- in step S20, the travel path shape estimation circuit 22 extracts travel path boundary point groups by grouping the travel path feature points FP based on their degree of coincidence with the prior lane group.
- specifically, the travel path shape estimation circuit 22 calculates the distance between each travel path feature point FP and each travel path boundary (SK1 to SK4, SK+w, SK-w) constituting the prior lane group, and assigns each travel path feature point FP to the nearest travel path boundary. The travel path feature points assigned to the same travel path boundary are then grouped and extracted as one travel path boundary point group.
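The assignment of feature points to the nearest boundary of the prior lane group can be sketched as follows; the gating threshold `max_dist` and the lateral-distance approximation are assumptions for illustration.

```python
import numpy as np

def assign_to_prior_lanes(points, prior_coefs_list, max_dist=1.0):
    """Group feature points by the nearest boundary of the prior lane
    group (cubic coefficient lists from the previous cycle plus one
    lane width on each side); far-away points are left unassigned."""
    pts = np.asarray(points, dtype=float)
    groups = [[] for _ in prior_coefs_list]
    for p in pts:
        # Lateral distance of the point to each prior boundary curve.
        dists = [abs(np.polyval(c, p[0]) - p[1]) for c in prior_coefs_list]
        k = int(np.argmin(dists))
        if dists[k] <= max_dist:
            groups[k].append(p)
    return [np.array(g) for g in groups]
```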
- then, the travel path shape estimation circuit 22 applies a cubic function to each of the travel path feature point groups assigned to the same travel path boundary (SK1 to SK4, SK+w, SK-w), and obtains the offset amount of that travel path boundary with respect to the origin of the coordinate system from the constant term (d) of the fitted function.
- as in the first embodiment, the travel path shape estimation circuit 22 estimates the travel path shapes (BC0 to BC4) based on the travel path feature points FP included in the superimposed travel path boundary point groups, and the travel path boundary evaluation circuit 23 determines the travel path boundaries based on the lateral positions (d0 to d4) of the travel path boundary point groups and the travel path shapes (BC0 to BC4).
- as described above, a travel path shape can be estimated while excluding travel path feature points belonging to, for example, a branch road that is not parallel to the other travel paths. Therefore, the shape of the main travel path currently being observed can be estimated stably, and information such as the number of lanes and the lane width of the entire travel path can be obtained.
- the plurality of runway feature points included in the first surrounding map are runway feature points that are detected at different times and are connected in consideration of the amount of vehicle movement. Therefore, compared with the case where a runway shape is judged only from the runway feature point detected at one time, a runway shape can be estimated with higher accuracy.
- the track boundary evaluation circuit 23 determines a track boundary based on the degree of match (probability) of the track feature points included in each track boundary point group with respect to the track shape (BC 0 to BC 4 ). Thereby, the mis-detected runway feature points (FP f1 , FP f2 ) and the wrongly estimated runway shape can be rejected based on the degree of match.
- in the second embodiment, a case where the travel path shape and the travel path boundary are obtained only from the travel path feature points detected at one time will be described.
- the travel path detection device 2 does not include the movement amount detection sensor 10 of FIG. 1, because it does not need to generate the first surrounding map by connecting travel path feature points in consideration of the movement amount of the vehicle.
- the runway detection circuit 12 does not include the surrounding map generation circuit 21 of FIG.
- the camera 34 ′ is installed in the vehicle in a state where the imaging direction is directed to the road surface in the traveling direction of the vehicle. Other configurations are the same as those of the travel path detection apparatus 1.
- the camera 34 ′ is attached in front of the vehicle interior of the vehicle and images a road marking in front of the vehicle.
- FIG. 9A shows an example of an image 52 captured by the camera 34 ′.
- the image 52 includes a road marking (lane marker 56) indicating a road boundary.
- the image processing circuit 35 detects an edge portion of the lane marker 56 in which the brightness of the image 52 changes sharply or discontinuously as the runway feature point FP.
- the runway shape estimation circuit 22 converts the position of the detected runway feature point FP on the image 52 into a position on the overhead view as viewed from above the vehicle 51, as shown in FIG. 9B.
- the traveling road shape estimation circuit 22 executes this viewpoint conversion processing based on the attachment angle of the camera 34 ′ with respect to the road surface, that is, the angle formed by the imaging direction with respect to the road surface, and the distance from the road surface to the camera 34 ′.
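Under flat-ground and pinhole-camera assumptions, such a viewpoint conversion can be sketched as an inverse perspective mapping. This generic sketch is not the patent's exact conversion; all names and parameters (focal length `f`, principal point `cx`, `cy`, pitch angle, camera height) are illustrative.

```python
import math

def pixel_to_ground(u, v, cam_height, pitch_rad, f, cx, cy):
    """Project an image pixel onto the flat ground plane for a pinhole
    camera mounted at height `cam_height` and pitched down by
    `pitch_rad` below the horizon. Returns (x, y) in vehicle
    coordinates (x forward, y left), or None above the horizon."""
    # Ray direction of the pixel, expressed in vehicle coordinates.
    rx = -(v - cy) * math.sin(pitch_rad) + f * math.cos(pitch_rad)
    ry = -(u - cx)
    rz = -(v - cy) * math.cos(pitch_rad) - f * math.sin(pitch_rad)
    if rz >= 0.0:
        return None  # ray does not hit the ground
    t = -cam_height / rz  # scale at which the ray reaches z = 0
    return (t * rx, t * ry)
```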
- the travel path shape estimation circuit 22 groups the travel path feature points FP for each travel path boundary 56 on the overhead coordinates of FIG. 9B, and extracts travel path feature point groups. That is, the travel path shape estimation circuit 22 performs its processing on the travel path feature points FP shown in the overhead coordinates of FIG. 9B, instead of on the first and second surrounding maps shown in FIGS. 3A and 3B (a).
- the travel path feature points FP are detected within the angles of view 53a, 53b of the camera 34'. By taking the frequency of the vehicle-width-direction coordinates (y coordinates) of the travel path feature points FP, a histogram can be created for the feature points in one frame image, in the same manner as in FIG. 3B (b). Therefore, the travel path shape estimation circuit 22 may determine the continuity of the plurality of travel path feature points FP from this histogram.
- the road shape and the road boundary can be detected in a shorter time compared to the road feature points detected at different times and connected in consideration of the moving amount of the vehicle.
- the camera 34 ′ can detect a road boundary in front of the vehicle 51 that cannot be obtained from the detection history of the past road feature points by imaging a road marking marked on the road surface in front of the vehicle 51. it can.
- an example of the travel lane detection method according to the third embodiment will be described with reference to the flowchart of FIG. 10.
- the operation procedure of the travel lane detection circuit 12 in the travel lane detection device 1 will be described.
- the process shown in FIG. 10 is repeatedly executed at a predetermined cycle.
- Steps S01 to S06 are the same as those in the first embodiment, and a description thereof will be omitted.
- after step S06, the process proceeds to step S30, where the travel lane detection circuit 12 determines whether or not a road shape whose degree of match is equal to or greater than the reference value was found in step S06. That is, it is determined whether or not there is a road shape that was determined to be a road boundary because its degree of match is equal to or greater than the reference value.
- if a road boundary is detected (YES in S30), the process proceeds to step S31 in order to search for a road other than the main line 54, such as the branch road 55.
- in step S31, the travel lane detection circuit 12 extracts a road boundary point group in which the degree of match between the road feature points FP and the road shape is lower than a predetermined value. For example, as shown in FIG. 11A, the plurality of road feature points FP1 (the matching point group) that support the road shape (BC1 to BC3) of the main line 54 are deleted, and only the road feature points FP2 that do not support the road boundary of the main line 54 are left.
- then, returning to step S01, the remaining road feature points FP2 are grouped to extract road boundary point groups.
- the travel lane detection circuit 12 can thereby detect the road shape (BC4, BC5) shown in FIG. After that, the process proceeds to step S31 again.
- this time, only the feature point groups other than the road boundary point groups constituting the main line and the branch road are extracted. That is, since no road feature points that do not support the road shapes (BC1 to BC5) remain, no road boundary is detected in step S06, and NO is determined in step S30.
- in this case, the travel lane information constituted by the road boundaries of the main line 54 and the branch road 55 determined so far is output, and the processing cycle ends.
- as described above, a road boundary point group whose degree of match with the road shape (BC1 to BC3) is lower than a predetermined value is extracted, and another road shape (BC4, BC5) is estimated based on the road boundary point group whose degree of match is lower than the predetermined value. Thereby, not only the main road shape (the main line) but also other road shapes (branch roads and the like) can be estimated.
- the grouping of road feature points using a histogram was described in the first embodiment.
- also in the second embodiment, a histogram can be taken for the road feature points FP in the image 52 of a single frame.
- the processing circuit includes a programmed processing device, for example, a processing device including an electrical circuit.
- Processing devices also include devices such as application specific integrated circuits (ASICs) and conventional circuit components arranged to perform the functions described in the embodiments.
- in the above embodiments, the stand-alone travel lane detection device (1, 2) including the movement amount detection sensor 10 and the target detection sensor 11 has been exemplified; however, the travel lane detection device can also be realized as a client-server model using a computer network via a wireless communication network.
- in this case, the vehicle 51 (client) including the movement amount detection sensor 10 and the target detection sensor 11 is connected to the travel lane detection device (server) via the computer network.
- in other words, the server provided with the travel lane detection circuit 12 shown in FIG. 1 or FIG. 8 can be connected to the movement amount detection sensor 10 and the target detection sensor 11 via the computer network.
- in this case, the travel lane detection device is mainly constituted by the travel lane detection circuit 12 (server), and the movement amount detection sensor 10 and the target detection sensor 11 are not included in the travel lane detection device.
Abstract
Description
Next, embodiments will be described in detail with reference to the drawings.
KK1: y = ax^3 + bx^2 + cx + d1
KK2: y = ax^3 + bx^2 + cx + d2
KK3: y = ax^3 + bx^2 + cx + d3
KK4: y = ax^3 + bx^2 + cx + d4
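The curves KK1 to KK4 share the shape coefficients a, b, c and differ only in the lateral offset d. One way to estimate such parallel cubics jointly is a single least-squares problem with shared shape columns and one intercept per point group; this is a sketch (function name and interface assumed), not the embodiment's own procedure:

```python
import numpy as np

def fit_parallel_cubics(groups):
    """Fit y = a*x^3 + b*x^2 + c*x + d_i to several boundary point groups
    with shared coefficients a, b, c and a separate offset d_i per group.

    'groups' is a list of point lists, each point being (x, y).
    Returns (a, b, c, [d_1, ..., d_n])."""
    rows, rhs = [], []
    n = len(groups)
    for i, g in enumerate(groups):
        for x, y in g:
            onehot = [0.0] * n
            onehot[i] = 1.0                      # selects this group's offset d_i
            rows.append([x**3, x**2, x] + onehot)
            rhs.append(y)
    sol, *_ = np.linalg.lstsq(np.asarray(rows), np.asarray(rhs), rcond=None)
    return sol[0], sol[1], sol[2], list(sol[3:])
```

Superimposing the parallel point groups into one fit in this way uses all boundaries at once to constrain the common shape, which is more stable than fitting each sparse boundary separately.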
In the road boundary point group extraction process shown in step S01, the travel lane detection device 1 shown in FIG. 1 may construct a prior lane group from the road boundaries obtained in the previous processing cycle and extract the road boundary point groups.
SK2: y = a'x^3 + b'x^2 + c'x + d'2
SK3: y = a'x^3 + b'x^2 + c'x + d'3
SK4: y = a'x^3 + b'x^2 + c'x + d'4
In the second embodiment, an example will be described in which the road shape and the road boundary are obtained only from road feature points detected at a single time, instead of from road feature points that are detected at different times and joined together in consideration of the movement amount of the vehicle.
In the third embodiment, an example will be described in which a road different from the main travel lane (main line), for example a branch road, is detected by a re-search performed after the main travel lane has been detected. Here, an example in which the branch-road re-search process is added after the main lane detection process shown in the first embodiment (FIG. 2) will be described; however, the re-search may also be performed after the modification of the first embodiment (FIG. 6) or after the second embodiment.
10 movement amount detection sensor
11 target detection sensor
12 travel lane detection circuit
22 road shape estimation circuit
23 road boundary evaluation circuit
51 vehicle
BC0 to BC5 road shape
FP road feature point
Claims (5)
- A travel lane detection method using a travel lane detection circuit that detects a road boundary based on a plurality of road feature points detected by a target detection sensor mounted on a vehicle, wherein the travel lane detection circuit:
extracts road boundary point groups based on the continuity of the plurality of road feature points;
superimposes, when a plurality of parallel road boundary point groups are extracted, the plurality of parallel road boundary point groups;
estimates a road shape based on the road feature points included in the superimposed plurality of road boundary point groups; and
determines a road boundary based on the lateral positions of the plurality of parallel road boundary point groups and the road shape.
- The travel lane detection method according to claim 1, wherein the plurality of road feature points are road feature points that are detected at different times and joined together in consideration of the movement amount of the vehicle.
- The travel lane detection method according to claim 1 or 2, wherein the travel lane detection circuit determines the road boundary based on the degree of match of the road feature points included in each road boundary point group with respect to the road shape.
- The travel lane detection method according to any one of claims 1 to 3, wherein the travel lane detection circuit extracts a road boundary point group in which the degree of match of the included road feature points with respect to the road shape is lower than a predetermined value, and estimates another road shape based on the road boundary point group whose degree of match is lower than the predetermined value.
- A travel lane detection device comprising: a road shape estimation circuit that extracts road boundary point groups based on a plurality of road feature points detected by a target detection sensor mounted on a vehicle, superimposes a plurality of parallel road boundary point groups when the plurality of parallel road boundary point groups are extracted, and estimates a road shape based on the road feature points included in the superimposed plurality of road boundary point groups; and
a road boundary evaluation circuit that determines a road boundary based on the lateral positions of the plurality of parallel road boundary point groups and the road shape.
Priority Applications (10)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020187029663A KR20180122691A (ko) | 2016-03-24 | 2016-03-24 | 주로 검출 방법 및 주로 검출 장치 |
BR112018069472-0A BR112018069472B1 (pt) | 2016-03-24 | 2016-03-24 | Método de detecção de faixa de tráfego e dispositivo de detecção de faixa de tráfego |
RU2018137203A RU2725561C2 (ru) | 2016-03-24 | 2016-03-24 | Способ и устройство обнаружения полос движения |
US16/087,477 US10614321B2 (en) | 2016-03-24 | 2016-03-24 | Travel lane detection method and travel lane detection device |
PCT/JP2016/059397 WO2017163367A1 (ja) | 2016-03-24 | 2016-03-24 | 走路検出方法及び走路検出装置 |
EP16895403.0A EP3435352B1 (en) | 2016-03-24 | 2016-03-24 | Travel path detection method and travel path detection device |
CA3018661A CA3018661C (en) | 2016-03-24 | 2016-03-24 | Travel lane detection method and travel lane detection device |
JP2018506707A JP6617828B2 (ja) | 2016-03-24 | 2016-03-24 | 走路検出方法及び走路検出装置 |
CN201680083901.9A CN108885831B (zh) | 2016-03-24 | 2016-03-24 | 行进路检测方法及行进路检测装置 |
MX2018011509A MX2018011509A (es) | 2016-03-24 | 2016-03-24 | Metodo de deteccion de carril de circulacion y dispositivo de deteccion de carril de circulacion. |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2016/059397 WO2017163367A1 (ja) | 2016-03-24 | 2016-03-24 | 走路検出方法及び走路検出装置 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017163367A1 true WO2017163367A1 (ja) | 2017-09-28 |
Family
ID=59900032
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/059397 WO2017163367A1 (ja) | 2016-03-24 | 2016-03-24 | 走路検出方法及び走路検出装置 |
Country Status (10)
Country | Link |
---|---|
US (1) | US10614321B2 (ja) |
EP (1) | EP3435352B1 (ja) |
JP (1) | JP6617828B2 (ja) |
KR (1) | KR20180122691A (ja) |
CN (1) | CN108885831B (ja) |
BR (1) | BR112018069472B1 (ja) |
CA (1) | CA3018661C (ja) |
MX (1) | MX2018011509A (ja) |
RU (1) | RU2725561C2 (ja) |
WO (1) | WO2017163367A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2022535351A (ja) * | 2019-05-28 | 2022-08-08 | モービルアイ ビジョン テクノロジーズ リミテッド | 車両ナビゲーションのためのシステム及び方法 |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6740470B2 (ja) * | 2017-05-19 | 2020-08-12 | パイオニア株式会社 | 測定装置、測定方法およびプログラム |
CN109086650B (zh) * | 2017-06-14 | 2022-04-12 | 现代摩比斯株式会社 | 校准方法和校准设备 |
US11023746B2 (en) | 2019-01-02 | 2021-06-01 | Here Global B.V. | Lane count estimation |
US11932245B2 (en) * | 2020-09-01 | 2024-03-19 | Toyota Motor Engineering & Manufacturing North America, Inc. | Systems and methods for improving path selection for automated driving |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001236506A (ja) * | 2000-02-22 | 2001-08-31 | Nec Corp | 白線検出方法および白線検出装置 |
JP2005346383A (ja) * | 2004-06-02 | 2005-12-15 | Toyota Motor Corp | 境界線検出装置 |
JP2006350571A (ja) * | 2005-06-14 | 2006-12-28 | Toyota Motor Corp | 道路区画線検出装置 |
JP2012022574A (ja) * | 2010-07-15 | 2012-02-02 | Fuji Heavy Ind Ltd | 車両用白線認識装置 |
JP2015069340A (ja) * | 2013-09-27 | 2015-04-13 | 富士重工業株式会社 | 車両用白線認識装置 |
Family Cites Families (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5638116A (en) * | 1993-09-08 | 1997-06-10 | Sumitomo Electric Industries, Ltd. | Object recognition apparatus and method |
US5991427A (en) * | 1996-07-31 | 1999-11-23 | Aisin Seiki Kabushiki Kaisha | Method and apparatus for detecting a lane on a road |
US6819779B1 (en) * | 2000-11-22 | 2004-11-16 | Cognex Corporation | Lane detection system and apparatus |
JP4551018B2 (ja) * | 2001-04-05 | 2010-09-22 | 富士通株式会社 | 画像結合装置 |
JP3603836B2 (ja) * | 2001-11-20 | 2004-12-22 | 日産自動車株式会社 | 道路白線認識装置 |
JP2004200351A (ja) * | 2002-12-18 | 2004-07-15 | Hitachi Ltd | 露光装置及び露光方法 |
JP3956926B2 (ja) * | 2003-09-24 | 2007-08-08 | アイシン精機株式会社 | 路面走行レーン検出装置 |
JP3864945B2 (ja) * | 2003-09-24 | 2007-01-10 | アイシン精機株式会社 | 路面走行レーン検出装置 |
JP4616046B2 (ja) * | 2005-03-22 | 2011-01-19 | 本田技研工業株式会社 | 車両用画像処理システム、車両用画像処理方法、車両用画像処理プログラム、及び車両 |
JP4659631B2 (ja) * | 2005-04-26 | 2011-03-30 | 富士重工業株式会社 | 車線認識装置 |
JP4965991B2 (ja) * | 2006-01-10 | 2012-07-04 | 株式会社リコー | カラー画像形成装置 |
JP2007241468A (ja) * | 2006-03-06 | 2007-09-20 | Toyota Motor Corp | 車線変更検出装置 |
US7916935B2 (en) * | 2006-09-19 | 2011-03-29 | Wisconsin Alumni Research Foundation | Systems and methods for automatically determining 3-dimensional object information and for controlling a process based on automatically-determined 3-dimensional object information |
JP2009041972A (ja) * | 2007-08-07 | 2009-02-26 | Toshiba Corp | 画像処理装置及びその方法 |
JP5363921B2 (ja) * | 2009-08-31 | 2013-12-11 | 富士重工業株式会社 | 車両用白線認識装置 |
JP5035371B2 (ja) * | 2010-03-15 | 2012-09-26 | アイシン精機株式会社 | 横断歩道検出装置、横断歩道検出システム,横断歩道検出方法及びプログラム |
US20120022739A1 (en) * | 2010-07-20 | 2012-01-26 | Gm Global Technology Operations, Inc. | Robust vehicular lateral control with front and rear cameras |
JP5258859B2 (ja) * | 2010-09-24 | 2013-08-07 | 株式会社豊田中央研究所 | 走路推定装置及びプログラム |
JP5568029B2 (ja) * | 2011-02-09 | 2014-08-06 | 富士重工業株式会社 | 車両用白線認識装置 |
CN102184535B (zh) * | 2011-04-14 | 2013-08-14 | 西北工业大学 | 一种车辆所在车道边界检测方法 |
US9098751B2 (en) * | 2011-07-27 | 2015-08-04 | Gentex Corporation | System and method for periodic lane marker identification and tracking |
WO2013018673A1 (ja) * | 2011-08-02 | 2013-02-07 | 日産自動車株式会社 | 立体物検出装置及び立体物検出方法 |
JP5862670B2 (ja) * | 2011-08-02 | 2016-02-16 | 日産自動車株式会社 | 走行支援装置および走行支援方法 |
MX2014001500A (es) * | 2011-09-12 | 2014-05-12 | Nissan Motor | Dispositivo de deteccion de objeto tridimensional. |
RU2571871C2 (ru) * | 2012-12-06 | 2015-12-27 | Александр ГУРЕВИЧ | Способ определения границ дороги, формы и положения объектов, находящихся на дороге, и устройство для его выполнения |
KR101906951B1 (ko) * | 2013-12-11 | 2018-10-11 | 한화지상방산 주식회사 | 차선 검출 시스템 및 차선 검출 방법 |
- 2016
- 2016-03-24 EP EP16895403.0A patent/EP3435352B1/en active Active
- 2016-03-24 KR KR1020187029663A patent/KR20180122691A/ko active Search and Examination
- 2016-03-24 CA CA3018661A patent/CA3018661C/en active Active
- 2016-03-24 JP JP2018506707A patent/JP6617828B2/ja active Active
- 2016-03-24 BR BR112018069472-0A patent/BR112018069472B1/pt active IP Right Grant
- 2016-03-24 RU RU2018137203A patent/RU2725561C2/ru active
- 2016-03-24 US US16/087,477 patent/US10614321B2/en active Active
- 2016-03-24 MX MX2018011509A patent/MX2018011509A/es active IP Right Grant
- 2016-03-24 WO PCT/JP2016/059397 patent/WO2017163367A1/ja active Application Filing
- 2016-03-24 CN CN201680083901.9A patent/CN108885831B/zh active Active
Non-Patent Citations (1)
Title |
---|
See also references of EP3435352A4 * |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2022535351A (ja) * | 2019-05-28 | 2022-08-08 | モービルアイ ビジョン テクノロジーズ リミテッド | 車両ナビゲーションのためのシステム及び方法 |
Also Published As
Publication number | Publication date |
---|---|
CN108885831B (zh) | 2020-04-14 |
EP3435352A1 (en) | 2019-01-30 |
CA3018661C (en) | 2020-06-09 |
US20190095723A1 (en) | 2019-03-28 |
EP3435352B1 (en) | 2021-11-17 |
RU2018137203A3 (ja) | 2020-04-24 |
MX2018011509A (es) | 2019-01-10 |
KR20180122691A (ko) | 2018-11-13 |
RU2725561C2 (ru) | 2020-07-02 |
JP6617828B2 (ja) | 2019-12-18 |
BR112018069472A2 (pt) | 2019-02-12 |
RU2018137203A (ru) | 2020-04-24 |
EP3435352A4 (en) | 2019-03-27 |
BR112018069472B1 (pt) | 2023-12-19 |
JPWO2017163367A1 (ja) | 2019-02-14 |
CN108885831A (zh) | 2018-11-23 |
CA3018661A1 (en) | 2017-09-28 |
US10614321B2 (en) | 2020-04-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6617828B2 (ja) | 走路検出方法及び走路検出装置 | |
EP3082066B1 (en) | Road surface gradient detection device | |
US9797981B2 (en) | Moving-object position/attitude estimation apparatus and moving-object position/attitude estimation method | |
EP2372607A2 (en) | Scene matching reference data generation system and position measurement system | |
EP2372611A2 (en) | Scene matching reference data generation system and position measurement system | |
KR101977652B1 (ko) | 모바일 맵핑 시스템을 이용한 도로 면형 자동 취득 방법 | |
CN109074741B (zh) | 行进路检测方法及行进路检测装置 | |
CN110110678B (zh) | 道路边界的确定方法和装置、存储介质及电子装置 | |
JP2012159469A (ja) | 車両用画像認識装置 | |
JP2007316685A (ja) | 走路境界検出装置および走路境界検出方法 | |
JP2017174197A (ja) | 立体物検出方法及び立体物検出装置 | |
JP7024176B2 (ja) | 走路検出方法及び走路検出装置 | |
JP2012159470A (ja) | 車両用画像認識装置 | |
JP7058128B2 (ja) | 情報処理装置およびプログラム | |
JP6143176B2 (ja) | 停止線検出装置、移動体制御装置及び停止線検出用プログラム | |
JP2016151863A (ja) | 白線検出装置 | |
JP6451544B2 (ja) | 道路境界検出装置、自己位置推定装置及び道路境界検出方法 | |
JP2018096935A (ja) | 自車位置推定装置、プログラム、記録媒体、および自車位置推定方法 | |
JP2021018604A (ja) | 画像処理装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 2018506707 Country of ref document: JP |
|
ENP | Entry into the national phase |
Ref document number: 3018661 Country of ref document: CA |
|
WWE | Wipo information: entry into national phase |
Ref document number: MX/A/2018/011509 Country of ref document: MX |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 20187029663 Country of ref document: KR Kind code of ref document: A |
|
REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: 112018069472 Country of ref document: BR |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2016895403 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2016895403 Country of ref document: EP Effective date: 20181024 |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16895403 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 112018069472 Country of ref document: BR Kind code of ref document: A2 Effective date: 20180924 |