WO2020090250A1 - Image processing apparatus, image processing method and program - Google Patents

Image processing apparatus, image processing method and program

Info

Publication number
WO2020090250A1
Authority
WO
WIPO (PCT)
Prior art keywords
straight line
boundary
unit
coordinate position
line
Prior art date
Application number
PCT/JP2019/036083
Other languages
French (fr)
Japanese (ja)
Inventor
之寛 斉藤
Original Assignee
Sony Corporation (ソニー株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corporation (ソニー株式会社)
Publication of WO2020090250A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/60: Analysis of geometric attributes

Definitions

  • This technology relates to an image processing apparatus, an image processing method, and a program, and enables a boundary to be detected accurately using captured images.
  • Conventionally, straight lines have been detected from captured images and, for example, white lines on roads have been recognized based on the detection results.
  • In one known method, a straight line passing through an edge point group is calculated for each window, and white line candidate points selected from the edge point group on that straight line are filtered to calculate the shape of the road white line.
  • In another method, straight lines are detected using a method such as the Hough transform or RANSAC (Random Sample Consensus), and the plurality of detected straight lines are filtered to detect white lines.
  • Depending on how the straight lines are represented, however, the filter processing result may diverge.
  • In addition, the detection methods of Patent Document 1 and Patent Document 2 are vulnerable to noise on the road and lack robustness, so they cannot accurately detect, for example, the boundary between a white line and the portion other than the white line.
  • The purpose of this technology is therefore to provide an image processing apparatus, an image processing method, and a program that can accurately detect boundaries using captured images.
  • The first aspect of this technology is an image processing apparatus including a filter processing unit that selects, from detected straight lines, a straight line corresponding to the straight line indicated by a predetermined coordinate position, based on the distance between the coordinate position in a polar coordinate space indicating a straight line detected from a captured image and the predetermined coordinate position.
  • The straight line detection unit performs a Hough transform using an image of the image region near the imaging position in a captured image generated by an imaging device provided on a moving body, and detects the coordinate position in the ρ-θ space, a polar coordinate space, that indicates a straight line in the captured image.
  • The filter processing unit extracts, from the captured image, the straight line indicated by the predetermined coordinate position based on the distance between the coordinate position detected by the straight line detection unit and the predetermined coordinate position.
  • The predetermined coordinate position is a preset coordinate position, or a coordinate position indicating a straight line that the filter processing unit has selected from a captured image generated at a different time.
  • The predetermined coordinate position may also be the coordinate position indicating the straight line closest to the moving body among the straight lines detected by the straight line detection unit.
  • The filter processing unit sets a permissible range around the predetermined coordinate position and uses, among the coordinate positions detected by the straight line detection unit, those within the permissible range.
  • The permissible range is set according to the moving state of the moving body or the moving environment. Further, when a straight line corresponding to the straight line indicated by the predetermined coordinate position cannot be selected from the detected straight lines, the filter processing unit outputs lost determination information indicating that the corresponding straight line cannot be selected.
  • The expansion unit sets a search reference area of a predetermined size including the straight line selected by the filter processing unit, takes the portion of the selected straight line within the search reference area as a boundary seed, and sets a boundary search area in the direction of the line segment of the boundary seed. A straight line whose inclination with respect to the boundary seed is smaller than a threshold value and which is connected to the boundary seed is extracted from the boundary search area as a boundary extension line. Each time a boundary extension line is extracted, the expansion unit sets a new boundary search area adjacent to that boundary extension line in the direction of its line segment, and extracts from the new boundary search area, as a further boundary extension line, a straight line whose inclination with respect to the previous boundary extension line is smaller than the threshold value and which is connected to it.
  • The expansion unit sets the curvature range of the estimated boundary line formed by the boundary seed and the boundary extension lines based on the area size of the boundary search area and the threshold value.
  • The second aspect of this technology is an image processing method including selecting, by a filter processing unit, a straight line corresponding to the straight line indicated by a predetermined coordinate position from detected straight lines, based on the distance between the coordinate position in a polar coordinate space indicating a straight line detected from a captured image and the predetermined coordinate position.
  • The third aspect of this technology is a program that causes a computer to execute image processing of a captured image, the program causing the computer to execute a procedure for selecting, from detected straight lines, a straight line corresponding to the straight line indicated by a predetermined coordinate position, based on the distance between the coordinate position in a polar coordinate space indicating a straight line detected from the captured image and the predetermined coordinate position.
  • The program of the present technology is, for example, a program that can be provided to a general-purpose computer capable of executing various program codes, by a storage medium provided in a computer-readable format, such as an optical disk, a magnetic disk, or a semiconductor memory, or by a communication medium such as a network. By providing such a program in a computer-readable format, processing according to the program is realized on the computer.
  • FIG. 1 illustrates the configuration of the image processing apparatus.
  • the image processing device 20 has a straight line detection unit 21, a filter processing unit 22, and an expansion unit 23.
  • The straight line detection unit 21 detects, from the captured image generated by the imaging device 10, the coordinate positions in the polar coordinate space indicating straight lines, that is, the coordinate positions of straight lines that are candidates for the boundary.
  • The straight line detection unit 21 detects straight lines using, for example, the Hough transform, and outputs the coordinate positions in the ρ-θ space indicating the detected straight lines (also referred to as candidate straight lines) to the filter processing unit 22 as candidate straight line information. Further, the straight line detection unit 21 suppresses erroneous detection of candidate straight lines by setting the straight line detection area used for straight line detection to the neighborhood region and to the range where the boundary appears.
  • The filter processing unit 22 selects, from the detected straight lines, a straight line corresponding to the straight line indicated by the predetermined coordinate position, based on the distance between the coordinate position in the polar coordinate space indicating a straight line detected from the captured image by the straight line detection unit 21 and the predetermined coordinate position.
  • The filter processing unit 22 uses a time-series filter such as a Kalman filter, an extended Kalman filter, a sigma-point Kalman filter, or a particle filter.
  • The filter processing unit 22 filters the candidate straight lines indicated by the candidate straight line information supplied from the straight line detection unit 21, and selects the candidate straight line corresponding to the straight line indicating the boundary indicated by the predetermined coordinate position.
  • For example, the coordinate position in the polar coordinate space of a straight line indicating the boundary is set in advance as the predetermined coordinate position (also referred to as the reference position).
  • The filter processing unit 22 performs time-series filter processing and selects the straight line at the coordinate position whose distance to the reference position is the shortest among the coordinate positions of the candidate straight lines detected by the straight line detection unit 21, provided that distance is shorter than a preset threshold value.
  • The filter processing unit 22 then sets the coordinate position of the selected straight line as a new reference position and, in the same manner, selects the straight line at the coordinate position closest to that reference position from among the coordinate positions of the candidate straight lines detected by the straight line detection unit 21 from subsequent captured images.
  • The filter processing unit 22 outputs selected straight line information indicating the straight line selected from the candidate straight lines (also referred to as the selected straight line) to the expansion unit 23. Further, when a straight line corresponding to the straight line indicated by the reference position cannot be selected from the candidate straight lines, the filter processing unit 22 outputs lost determination information indicating that the corresponding straight line cannot be selected.
  • The expansion unit 23 sets a search reference area of a predetermined size including the selected straight line indicated by the selected straight line information from the filter processing unit 22. Taking the portion of the selected straight line included in the search reference area as a boundary seed, the expansion unit 23 sets a boundary search area in the direction of the line segment of the boundary seed, and extracts from the boundary search area, as a boundary extension line, a straight line whose inclination with respect to the boundary seed is smaller than the threshold value and which is connected to the boundary seed. Further, in response to the extraction of the boundary extension line, the expansion unit 23 newly sets a boundary search area adjacent to the boundary extension line in the direction of its line segment.
  • The expansion unit 23 extracts, from the newly set boundary search area, a straight line whose inclination with respect to the boundary extension line is smaller than the threshold value and which is connected to the boundary extension line, and sets it as a new boundary extension line. This process is repeated until no boundary extension line can be extracted or until boundary extension lines have been extracted a predetermined number of times in succession.
  • The expansion unit 23 takes the boundary seed based on the straight line selected by the filter processing unit 22 and the boundary extension lines connected to it as the estimated boundary line, and outputs information indicating the estimated boundary line to the outside. The expansion unit 23 may also output, as this information, a captured image on which an image indicating the estimated boundary line is superimposed.
  • the straight line detection unit 21 performs straight line detection using, for example, Hough transform, and converts a straight line passing through a point on the xy plane into a coordinate position in a ⁇ - ⁇ space which is a polar coordinate space.
  • FIG. 2 is a diagram for explaining the Hough transform.
  • FIG. 2A illustrates a straight line L on the xy plane. A straight line passing through the point (xi, yi) can be expressed by the relationship of equation (1), where the parameter ρ is the distance between the origin and the straight line and the parameter θ is the angle between the normal of the straight line and the x axis: ρ = xi·cos θ + yi·sin θ ... (1)
  • FIG. 3 shows an operation example of the Hough transform.
  • the straight line detection unit 21 performs a filter process for removing noise without blurring the contour of the captured image, for example, a filter process using a bilateral filter or the like.
  • the binarized edge image shown in FIG. 3B is generated from the captured image after the filtering process.
  • The straight line detection unit 21 projects the edge points of the binarized edge image into the ρ-θ space, and takes the parameters ρ and θ of a point where the characteristic curves of the respective edge points overlap as the parameters of the straight line passing through those edge points.
  • As shown in FIG. 3, the straight line detection unit 21 sets the straight line detection area AS in the neighborhood region, that is, near the imaging position, and in the range where the boundary appears.
  • The imaging device 10 is provided on a moving body (for example, a vehicle), and the straight line detection unit 21 uses a captured image obtained by imaging the area ahead of the moving body.
  • A boundary line extending ahead of the vehicle appears on the captured image such that a segment of a given physical length is longer in the near region than in the distant region. For this reason, when straight line detection uses a predetermined length (section length) to determine whether a straight line is the boundary line, the straight line may not be detectable with the accuracy of the section length if the image of the distant region is used.
  • By setting the straight line detection area AS in the near region, the straight line detection unit 21 can determine whether a straight line is the boundary line with the accuracy of the section length, and erroneous detection of candidate straight lines can be suppressed. Further, the calculation cost of the straight line detection can be reduced compared with the case where the straight line detection area AS is not set.
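  • As a reference, the following is a minimal sketch of the candidate straight line detection described above, using OpenCV's standard Hough transform. The filter settings, edge thresholds, and the band used as the straight line detection area AS are illustrative assumptions, not values from this disclosure.

```python
import cv2
import numpy as np

def detect_candidate_lines(captured_bgr):
    """Return candidate straight lines as (rho, theta) coordinate positions."""
    gray = cv2.cvtColor(captured_bgr, cv2.COLOR_BGR2GRAY)
    # Noise removal without blurring contours (bilateral filter), as above.
    smoothed = cv2.bilateralFilter(gray, d=9, sigmaColor=75, sigmaSpace=75)
    # Binarized edge image.
    edges = cv2.Canny(smoothed, 50, 150)
    # Straight line detection area AS: keep only the near region (lower band
    # of the frame), where a boundary of a given length appears longest.
    h, w = edges.shape
    mask = np.zeros_like(edges)
    mask[int(h * 0.6):, :] = 255  # assumed near-region band
    edges_as = cv2.bitwise_and(edges, mask)
    # Hough transform: each result is a coordinate position in the rho-theta space.
    lines = cv2.HoughLines(edges_as, rho=1, theta=np.pi / 180, threshold=100)
    return [] if lines is None else [tuple(l[0]) for l in lines]
```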
  • FIG. 4 is a diagram for explaining a straight line detection operation based on edge points.
  • Assume that the edge points (x1, y1), (x2, y2), (x3, y3) are detected on the xy plane as shown in FIG. 4A.
  • The characteristic curves for these edge points in the ρ-θ space are the curves CLa, CLb, CLc.
  • The straight line SL corresponding to the coordinate position (ρs, θs) where the characteristic curves overlap (indicated by a broken line in FIG. 4A) passes through the edge points (x1, y1), (x2, y2), (x3, y3). Therefore, the straight lines La1 to Lan in the straight line detection area AS can be detected, as shown in FIG. 3 (c), based on the edge points detected in the straight line detection area AS shown in FIG. 3 (b).
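  • The voting idea in FIG. 4 can be checked numerically: collinear edge points trace curves ρ(θ) = x·cos θ + y·sin θ that all pass through the same (ρs, θs). The three points below are an assumed example lying on the line x + y = 4, whose normal makes θ = π/4 with the x axis.

```python
import numpy as np

# Hypothetical edge points on the line x + y = 4.
points = [(1.0, 3.0), (2.0, 2.0), (3.0, 1.0)]
theta_s = np.pi / 4  # angle of the line's normal
rhos = [x * np.cos(theta_s) + y * np.sin(theta_s) for x, y in points]
print(rhos)  # each is ~2.828 (= 4 / sqrt(2)): the curves overlap at (rho_s, theta_s)
```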
  • The filter processing unit 22 performs time-series filter processing on the candidate straight line information in the ρ-θ space, and selects a candidate straight line close to a preset initial value.
  • If a straight line is instead represented on the xy plane by its slope a and intercept b, the candidate straight lines of the boundary do not have temporal continuity in the slope a and the intercept b, making it difficult to accurately select the candidate straight line indicating the boundary by time-series filter processing.
  • For a vertical line, the slope a diverges to infinity, which makes the filter calculation impossible.
  • In the ρ-θ space, by contrast, a line close to the boundary candidate line has the characteristic of being temporally continuous and not changing abruptly, so a boundary straight line can be selected from the candidate lines.
  • The filter processing unit 22 performs time-series filter processing using the candidate straight line information, and selects the straight line at the coordinate position whose distance from the reference position is the shortest among the coordinate positions of the candidate straight lines detected by the straight line detection unit 21 and is shorter than a preset threshold value.
  • As the distance, for example, the Euclidean distance in the ρ-θ space may be used, or the Mahalanobis distance using the covariance matrix of the Kalman filter may be used.
  • The filter processing unit 22 selects the straight line having the shortest calculated distance from the candidate straight lines.
  • The distance is calculated based on Expression (3).
  • The candidate straight line Lfdmin having the shortest dist (Lfi, Ltg) is selected.
  • The filter processing unit 22 sets the coordinate position indicating the selected straight line as a new reference position, and filters the candidate straight lines detected from subsequent captured images.
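  • As a reference, the following sketches the distance-based selection. Expression (3) is not reproduced in this text, so a Euclidean distance in the ρ-θ space is assumed here, as the description above permits; a Mahalanobis distance using the Kalman filter covariance could be substituted. The weights w_rho and w_theta are assumed scalings corresponding to the separate thresholds on ρ and θ.

```python
import math

def dist(line_a, line_b, w_rho=1.0, w_theta=1.0):
    """Assumed Euclidean form of the distance between two (rho, theta) positions."""
    return math.hypot((line_a[0] - line_b[0]) * w_rho,
                      (line_a[1] - line_b[1]) * w_theta)

def select_line(candidates, reference, threshold):
    """Return the candidate nearest to the reference position, or None when even
    the nearest candidate is not closer than the threshold (lost)."""
    if not candidates:
        return None
    nearest = min(candidates, key=lambda c: dist(c, reference))
    return nearest if dist(nearest, reference) < threshold else None
```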
  • FIG. 5 shows an operation example of the filter processing unit.
  • FIG. 5A illustrates the candidate straight lines detected from the captured image at time t0 in the ⁇ - ⁇ space.
  • FIG. 5B illustrates the candidate straight lines detected from the captured image at time t1 after time t0 in the ⁇ - ⁇ space.
  • The black circles represent candidate straight lines.
  • At time t0, with the preset coordinate position PSr as the reference position, the filter processing unit 22 sets as the selected straight line the straight line indicated by the coordinate position PSLt0, whose distance from the coordinate position PSr is the shortest and shorter than the preset threshold value.
  • At time t1, with the coordinate position PSLt0 as the new reference position, the filter processing unit 22 selects as the selected straight line the straight line indicated by the coordinate position PSLt1, whose distance from the coordinate position PSLt0 is the shortest and shorter than the preset threshold value.
  • In this way, the selected straight line is tracked in the time direction by sequentially selecting the straight line whose coordinate position is closest to the reference position and whose distance is shorter than the preset threshold value.
  • The filter processing unit 22 outputs the selected straight line information indicating the selected straight line to the expansion unit 23.
  • When the filter processing unit 22 cannot select a straight line corresponding to the straight line indicated by the reference position from the candidate straight lines, that is, when no candidate straight line lies at a coordinate position whose distance from the reference position is shorter than the preset threshold value, it outputs lost determination information indicating that the corresponding straight line cannot be selected. For example, if no candidate straight line whose distance is shorter than the preset threshold value is found continuously for a predetermined period, lost determination information indicating that the selected straight line (or boundary seed) is not detected is generated and output to the outside.
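  • A minimal sketch of tracking the selected straight line in the time direction with lost determination, using the select_line helper sketched above; the lost_limit of consecutive misses is an assumed value, not one specified in this disclosure.

```python
def track(frames_of_candidates, initial_reference, threshold, lost_limit=5):
    """frames_of_candidates: one list of (rho, theta) candidates per captured image."""
    reference = initial_reference
    misses = 0
    for candidates in frames_of_candidates:
        selected = select_line(candidates, reference, threshold)
        if selected is None:
            misses += 1
            if misses >= lost_limit:
                yield ("lost", None)  # lost determination information
        else:
            misses = 0
            reference = selected  # selected position becomes the new reference position
            yield ("selected", selected)
```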
  • The reference position need not be set in advance: the filter processing unit 22 may set, as the reference position, the coordinate position indicating a straight line selected from the candidate straight lines based on a predetermined rule, and then start the time-series filter processing.
  • the filter processing unit 22 may adjust the threshold value according to a user operation, a captured image acquisition status, or the like.
  • The threshold can be set separately for the parameter ρ and the parameter θ.
  • The filter processing unit 22 can set the permissible range of positional displacement of the boundary included in the captured image by adjusting the threshold value of the parameter ρ according to a user operation, the acquisition status of the captured image, or the like.
  • The filter processing unit 22 can set the permissible range of continuity (smoothness) of the boundary included in the captured image by adjusting the threshold value of the parameter θ according to a user operation, the acquisition status of the captured image, or the like.
  • The filter characteristic of the filter processing unit 22 can thus be adjusted by the parameter ρ and the parameter θ. Since the threshold value can be adjusted by these parameters, the user can easily adjust the filter characteristic to a desired characteristic using parameters that are easy to understand.
  • Since a straight line can be specified by the parameter ρ and the parameter θ, setting up the selection process is easier than selecting a straight line on the xy plane.
  • The expansion unit 23 performs viewpoint conversion of the captured image so that the viewpoint direction is vertical with respect to the plane on which the selected straight line indicated by the selected straight line information is located.
  • The viewpoint conversion may be performed using camera parameters acquired in advance for the imaging device 10 that generated the captured image. By performing the viewpoint conversion in this way, the boundary included in the captured image can be represented on a two-dimensional plane, and the boundary extension line can be detected easily.
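  • As a reference, a minimal sketch of the bird's-eye viewpoint conversion using a planar homography. The four source points marking the road plane and the output size are assumed values; in practice they would follow from the camera parameters of the imaging device 10 acquired in advance.

```python
import cv2
import numpy as np

def to_birds_eye(captured_bgr):
    h, w = captured_bgr.shape[:2]
    # Assumed trapezoid on the road plane in the captured image.
    src = np.float32([[w * 0.42, h * 0.62], [w * 0.58, h * 0.62],   # far edge
                      [w * 0.95, h * 0.95], [w * 0.05, h * 0.95]])  # near edge
    dst = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    homography = cv2.getPerspectiveTransform(src, dst)
    return cv2.warpPerspective(captured_bgr, homography, (w, h))
```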
  • The expansion unit 23 sets the search reference area so as to include the selected straight line, with the position of the selected straight line indicated by the selected straight line information from the filter processing unit 22 as a reference, and takes the straight line within the search reference area as a boundary seed. Further, the expansion unit 23 sets a boundary search area (search segment) in the direction of the line segment of the boundary seed, performs a Hough transform on the boundary search area, and extracts, as a boundary extension line, a straight line whose inclination with respect to the boundary seed is smaller than the threshold value and which is connected to the boundary seed.
  • the boundary search area and the threshold value are set in advance so that a boundary line having a curvature larger than the curvature specified by the manufacturer of the moving body or the user can be detected.
  • The length of the boundary search area in the direction of the line segment is the section length used for determining whether a straight line is the boundary line.
  • The length of the boundary search area in the direction orthogonal to the line segment direction is set so as to include a line segment whose inclination with respect to the line segment of the boundary seed or the boundary extension line equals the threshold value.
  • FIGS. 6 and 7 are diagrams for explaining the operation of the expansion unit; the imaging device 10 is provided on, for example, a moving body.
  • FIG. 6A illustrates the captured image at time tb1, and FIG. 7A illustrates the captured image at time tb2, which is later than time tb1.
  • FIG. 6B shows a part of the bird's-eye view of the captured image at time tb1, and FIG. 7B shows, for reference, a part of the bird's-eye view of the captured image at time tb2.
  • The straight line detection unit 21 detects a candidate straight line from the straight line detection area provided at a near position as described above, for example, a straight line detection area near the front of the moving body.
  • FIGS. 6C to 6G and FIGS. 7C to 7G illustrate bird's-eye views showing the area in front of the moving body OBM. At time tb1, the curve area EG-c of the boundary EG is far from the moving body OBM, and at time tb2 the curve area EG-c of the boundary EG is close to the moving body OBM.
  • FIG. 6D shows the search reference area AEs including the straight line detected from the captured image at time tb1 and selected by the filter processing unit 22.
  • The boundary seed LEs is the portion of the selected straight line included in the search reference area AEs.
  • The expansion unit 23 sets the boundary search area AEa adjacent to the search reference area AEs in the line segment direction of the boundary seed LEs, with the position of the boundary seed LEs as a reference, as shown in FIG. 6E.
  • The expansion unit 23 performs the Hough transform on the boundary search area AEa and, as shown in FIG. 6F, detects the boundary extension line LEa.
  • In response to the extraction of the boundary extension line LEa, the expansion unit 23 newly sets a boundary search area AEa in the line segment direction with the position of the boundary extension line LEa as a reference.
  • The expansion unit 23 extracts, from the newly set boundary search area AEa, a straight line whose inclination with respect to the boundary extension line extracted immediately before is smaller than the threshold value and which is connected to that boundary extension line, and sets it as a new boundary extension line LEa.
  • The expansion unit 23 repeats the above process until no boundary extension line can be extracted or until boundary extension lines have been extracted a predetermined number of times in succession.
  • As shown in FIG. 6G, the expansion unit 23 takes the boundary seed LEs and the sequentially extracted boundary extension lines LEa as the estimated boundary line LE, and outputs information indicating the estimated boundary line to the outside. The expansion unit 23 may also output, as this information, a captured image on which an image indicating the estimated boundary line is superimposed.
  • FIG. 7D shows the search reference area AEs including the straight line detected from the captured image at time tb2 and selected by the filter processing unit 22.
  • The boundary seed LEs is the portion of the selected straight line included in the search reference area AEs.
  • The expansion unit 23 sets the boundary search area AEa adjacent to the search reference area AEs in the line segment direction of the boundary seed LEs, with the position of the boundary seed LEs as a reference, as shown in FIG. 7E. Further, the expansion unit 23 performs the Hough transform on the boundary search area AEa and, as shown in FIG. 7F, extracts from the boundary search area AEa a straight line whose inclination with respect to the boundary seed LEs is smaller than the threshold value and which is connected to the boundary seed LEs, detecting it as the boundary extension line LEa. Therefore, even a straight line inclined with respect to the boundary seed LEs is extracted as a boundary extension line LEa, as long as the inclination is smaller than the threshold value.
  • In response to the extraction of the boundary extension line LEa, the expansion unit 23 newly sets a boundary search area AEa in the line segment direction with the position of the boundary extension line LEa as a reference.
  • The expansion unit 23 extracts, from the newly set boundary search area AEa, a straight line whose inclination with respect to the boundary extension line extracted immediately before is smaller than the threshold value and which is connected to that boundary extension line, and sets it as a new boundary extension line LEa. The expansion unit 23 likewise repeats this process until no boundary extension line can be extracted or until boundary extension lines have been extracted a predetermined number of times in succession. As shown in FIG. 7G, the expansion unit 23 takes the boundary seed LEs and the sequentially extracted boundary extension lines LEa as the estimated boundary line LE, and outputs information indicating the estimated boundary line to the outside.
  • In this way, the expansion unit 23 extracts, for each boundary search area, a straight line continuing from the boundary seed based on the straight line selected by the filter processing unit 22, connects the extracted straight lines as the estimated boundary line, and can generate estimated boundary line information indicating the estimated boundary line.
  • The expansion unit 23 may set the curvature range of the estimated boundary line according to the area size of the boundary search area and the threshold value. For example, the expansion unit 23 can limit the estimated boundary line to line segments of small curvature by increasing the size of the boundary search area in the direction of the boundary seed or boundary extension line, or by decreasing the threshold value. Conversely, the expansion unit 23 can allow line segments of large curvature to become the estimated boundary line by decreasing the size in that direction or increasing the threshold value. The expansion unit 23 may therefore preset the detectable curvature range of the boundary according to the area size and the threshold value.
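  • The expansion loop described above can be sketched as follows. The make_search_area and detect_line_in_area helpers are hypothetical: the first places a boundary search area ahead of a segment in its line segment direction, and the second runs a Hough transform restricted to that area and returns a segment or None. The angle threshold and step limit are assumed values; a smaller threshold or a longer search area restricts the estimated boundary line to smaller curvatures, as noted above.

```python
import math

def expand_boundary(seed, make_search_area, detect_line_in_area,
                    angle_threshold=math.radians(10), max_steps=20):
    """seed: ((x0, y0), (x1, y1)) boundary seed on the bird's-eye view."""
    boundary = [seed]
    prev = seed
    for _ in range(max_steps):
        segment = detect_line_in_area(make_search_area(prev))
        if segment is None:
            break  # no boundary extension line can be extracted
        d_prev = math.atan2(prev[1][1] - prev[0][1], prev[1][0] - prev[0][0])
        d_new = math.atan2(segment[1][1] - segment[0][1], segment[1][0] - segment[0][0])
        if abs(d_new - d_prev) >= angle_threshold:
            break  # inclination not smaller than the threshold value
        boundary.append(segment)  # accepted as a boundary extension line
        prev = segment
    return boundary  # boundary seed plus extension lines: the estimated boundary line
```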
  • The expansion unit 23 may output, as the information indicating the estimated boundary line, a captured image on which an image indicating the estimated boundary line is superimposed. Superimposing the image showing the estimated boundary line on the captured image in this way makes it easy to recognize which position in the captured image has been detected as the boundary line.
  • FIG. 8 is a flowchart illustrating the operation of the image processing apparatus.
  • In step ST1, the image processing apparatus starts acquisition of a captured image.
  • The image processing device 20 starts acquisition of the captured image generated by the imaging device 10 and proceeds to step ST2.
  • the image processing apparatus performs straight line detection processing in step ST2.
  • the straight line detection unit 21 of the image processing device 20 detects a straight line in the ⁇ - ⁇ space from the captured image acquired from the imaging device 10, and proceeds to step ST3.
  • In step ST3, the image processing apparatus performs filter processing.
  • the filter processing unit 22 of the image processing device 20 selects a straight line indicating a boundary from the straight lines detected in step ST2.
  • FIG. 9 is a flowchart showing the filtering process.
  • In step ST11, the filter processing unit detects the straight line that minimizes the distance.
  • the filter processing unit 22 detects, from the candidate straight lines detected by the straight line detection unit 21, the straight line having the minimum distance based on the equation (3).
  • the filter processing unit 22 determines the distance between the coordinate position of the straight line detected by the straight line detection unit in the ⁇ - ⁇ space and the preset predetermined coordinate position (reference position) for each candidate straight line. Then, the straight line with the minimum distance is detected. Further, when the predetermined coordinate position is updated in the setting update process of step ST13 described later, the updated coordinate position is used to detect the straight line having the smallest distance from the candidate straight lines detected after the update.
  • the filter processing unit 22 performs the time-series filter processing in this way, detects the straight line with the minimum distance, and proceeds to step ST12.
  • In step ST12, the filter processing unit determines whether the minimum distance is smaller than the threshold value. If the minimum distance dist (Lfdmin, Ltg), which is the distance between the coordinate position of the straight line detected in step ST11 and the predetermined coordinate position, is smaller than the threshold value, the filter processing unit 22 proceeds to step ST13; if the minimum distance dist (Lfdmin, Ltg) is greater than or equal to the threshold value, the process proceeds to step ST14.
  • In step ST13, the filter processing unit performs setting update processing.
  • the filter processing unit 22 sets the straight line detected in step ST11 as the selected straight line, and sets the coordinate position indicating this selected straight line as the predetermined coordinate position.
  • In step ST14, the filter processing unit outputs the lost determination information. Since the minimum distance dist (Lfdmin, Ltg) has been determined to be greater than or equal to the threshold value, the filter processing unit 22 outputs lost determination information indicating that a straight line close to the predetermined coordinate position, that is, a straight line that can be regarded as the boundary line, cannot be detected.
  • When the process proceeds from step ST3 to step ST4, the image processing apparatus performs expansion processing.
  • The expansion unit 23 of the image processing device 20 extends the line segment indicating the boundary by detecting straight lines following on from the straight line selected by the filter processing.
  • FIG. 10 is a flowchart showing the expansion process.
  • In step ST21, the expansion unit performs viewpoint conversion.
  • The expansion unit 23 performs viewpoint conversion of the captured image so that the viewpoint direction is vertical with respect to the plane on which the straight line selected by the filter processing is located, and proceeds to step ST22.
  • In step ST22, the expansion unit sets the boundary seed.
  • The expansion unit 23 sets the search reference area so as to include the selected straight line, with the position of the straight line selected by the filter processing unit 22 as a reference. Further, the expansion unit 23 takes the straight line within the search reference area as the boundary seed and proceeds to step ST23.
  • In step ST23, the expansion unit sets the boundary search area.
  • The expansion unit 23 sets the boundary search area adjacent to the search reference area in the direction of the line segment of the boundary seed, with the position of the boundary seed as a reference.
  • When a boundary extension line has been detected, the expansion unit 23 sets the boundary search area with the position of that boundary extension line as a reference, adjacent to the boundary search area in which the boundary extension line was detected as a straight line. In this way, the expansion unit 23 sets the boundary search area and proceeds to step ST24.
  • In step ST24, the expansion unit detects a straight line in the boundary search area.
  • The expansion unit 23 detects a straight line included in the boundary search area set in step ST23 and proceeds to step ST25.
  • In step ST25, the expansion unit determines whether a straight line with a close inclination has been detected.
  • The expansion unit 23 calculates the inclination of each straight line detected in step ST24 with respect to the boundary seed or the boundary extension line. If a straight line whose inclination is the smallest and smaller than the threshold value was detected in step ST24, the expansion unit 23 sets that straight line as a boundary extension line and returns to step ST23.
  • The expansion unit 23 ends the expansion process when no straight line whose inclination is smaller than the threshold value can be detected.
  • When the process proceeds from step ST4 to step ST5, the image processing apparatus performs boundary line information output processing.
  • the image processing device 20 outputs the estimated boundary line information indicating the estimated boundary line detected by the processes of step ST3 and step ST4 to the external device.
  • the image processing apparatus 20 outputs the lost determination information to the external device when a straight line whose minimum distance is smaller than the threshold value is not detected in the filter process of step ST3.
  • In this way, the boundary can be detected with higher accuracy than when the straight line indicating the boundary is detected on the xy plane.
  • The filter characteristic of the filter processing unit, that is, the tracking characteristic of the selected straight line, can be given a desired characteristic by, for example, adjusting the threshold value in the ρ-θ space.
  • the above-mentioned parameter ⁇ can specify the allowable range of the positional deviation of the straight line.
  • the allowable range of continuity (smoothness) of the straight line can be designated by the parameter ⁇ .
  • Since the boundary search area is set and the boundary seed is expanded, the processing is easier than estimating the boundary line by function approximation. Further, since only the straight lines in the boundary search area are candidates and the boundary line is estimated based on the continuity of straight lines, robustness against noise and the like can be improved. Furthermore, since the boundary line is estimated based on the continuity of straight lines on the bird's-eye view, it is possible to handle cases where the boundary line has a curvature, such as when turning right or left.
  • Alternatively, the straight line detection unit 21 may detect straight lines using RANSAC (Random Sample Consensus) or the least squares method, and the filter processing unit 22 may perform filter processing using the coordinate positions in the polar coordinate space indicating the detected straight lines.
  • the technology according to the present disclosure can be applied to various products.
  • For example, the technology according to the present disclosure may be realized as a device mounted on any type of moving body, such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, a carrier vehicle used in factories, a robot, a construction machine, or an agricultural machine (tractor).
  • FIG. 11 is a block diagram showing a schematic configuration example of a vehicle control system 100 which is an example of a mobile body control system to which the technology according to the present disclosure can be applied.
  • Hereinafter, when distinguishing the vehicle provided with the vehicle control system 100 from other vehicles, it is referred to as the own car or the own vehicle.
  • The vehicle control system 100 includes an input unit 101, a data acquisition unit 102, a communication unit 103, an in-vehicle device 104, an output control unit 105, an output unit 106, a drive system control unit 107, a drive system 108, a body system control unit 109, a body system 110, a storage unit 111, and an automatic driving control unit 112.
  • These units are connected to one another via a communication network 121. The communication network 121 is, for example, an in-vehicle communication network or bus compliant with an arbitrary standard such as CAN (Controller Area Network), LIN (Local Interconnect Network), LAN (Local Area Network), or FlexRay (registered trademark). The units of the vehicle control system 100 may also be directly connected without going through the communication network 121.
  • the input unit 101 includes a device used by the passenger to input various data and instructions.
  • For example, the input unit 101 includes operation devices such as a touch panel, buttons, a microphone, switches, and levers, as well as operation devices that allow input by a method other than manual operation, such as by voice or gesture.
  • the input unit 101 may be a remote control device that uses infrared rays or other radio waves, or an externally connected device such as a mobile device or a wearable device that corresponds to the operation of the vehicle control system 100.
  • the input unit 101 generates an input signal based on the data and instructions input by the passenger, and supplies the input signal to each unit of the vehicle control system 100.
  • the data acquisition unit 102 includes various sensors that acquire data used for processing of the vehicle control system 100, and supplies the acquired data to each unit of the vehicle control system 100.
  • the data acquisition unit 102 includes various sensors for detecting the state of the own vehicle and the like.
  • For example, the data acquisition unit 102 includes a gyro sensor, an acceleration sensor, an inertial measurement unit (IMU), and sensors for detecting the accelerator pedal operation amount, the brake pedal operation amount, the steering wheel steering angle, the engine speed, the motor rotation speed, the wheel rotation speed, and the like.
  • the data acquisition unit 102 includes various sensors for detecting information outside the vehicle.
  • the data acquisition unit 102 includes an imaging device such as a ToF (Time Of Flight) camera, a stereo camera, a monocular camera, an infrared camera, and other cameras.
  • Further, for example, the data acquisition unit 102 includes an environment sensor for detecting the weather or meteorological conditions, and an ambient information detection sensor for detecting objects around the own vehicle.
  • the environment sensor includes, for example, a raindrop sensor, a fog sensor, a sunshine sensor, a snow sensor, and the like.
  • the ambient information detection sensor includes, for example, an ultrasonic sensor, a radar, a LiDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging), a sonar, and the like.
  • the data acquisition unit 102 includes various sensors for detecting the current position of the vehicle.
  • For example, the data acquisition unit 102 includes a GNSS receiver that receives GNSS signals from GNSS (Global Navigation Satellite System) satellites.
  • the data acquisition unit 102 includes various sensors for detecting information inside the vehicle.
  • the data acquisition unit 102 includes an imaging device that images the driver, a biometric sensor that detects biometric information of the driver, a microphone that collects sound in the vehicle interior, and the like.
  • the biometric sensor is provided on, for example, a seat surface or a steering wheel, and detects biometric information of an occupant sitting on a seat or a driver who holds the steering wheel.
  • The communication unit 103 communicates with the in-vehicle device 104 and with various devices outside the vehicle, such as servers and base stations, transmitting data supplied from each unit of the vehicle control system 100 and supplying received data to each unit of the vehicle control system 100.
  • the communication protocol supported by the communication unit 103 is not particularly limited, and the communication unit 103 can also support a plurality of types of communication protocols.
  • For example, the communication unit 103 performs wireless communication with the in-vehicle device 104 by wireless LAN, Bluetooth (registered trademark), NFC (Near Field Communication), WUSB (Wireless USB), or the like.
  • Further, for example, the communication unit 103 performs wired communication with the in-vehicle device 104 via a connection terminal (and a cable, if necessary), not shown, using USB (Universal Serial Bus), HDMI (registered trademark) (High-Definition Multimedia Interface), MHL (Mobile High-definition Link), or the like.
  • the communication unit 103 communicates with a device (for example, an application server or a control server) existing on an external network (for example, the Internet, a cloud network, or a network unique to a business operator) via a base station or an access point. Communicate.
  • a device for example, an application server or a control server
  • an external network for example, the Internet, a cloud network, or a network unique to a business operator
  • Further, for example, the communication unit 103 uses P2P (Peer to Peer) technology to communicate with a terminal existing near the own vehicle (for example, a terminal of a pedestrian or a shop, or an MTC (Machine Type Communication) terminal).
  • Further, for example, the communication unit 103 performs vehicle-to-vehicle communication, vehicle-to-infrastructure communication, vehicle-to-home communication, and vehicle-to-pedestrian communication.
  • Further, for example, the communication unit 103 includes a beacon receiving unit, which receives radio waves or electromagnetic waves transmitted from wireless stations installed on the road and acquires information such as the current position, traffic congestion, traffic regulations, and required time.
  • the in-vehicle device 104 includes, for example, a mobile device or a wearable device that the passenger has, an information device that is carried in or attached to the vehicle, a navigation device that searches for a route to an arbitrary destination, and the like.
  • the output control unit 105 controls the output of various information to the passengers of the own vehicle or the outside of the vehicle.
  • For example, the output control unit 105 generates an output signal including at least one of visual information (for example, image data) and auditory information (for example, audio data), and supplies the output signal to the output unit 106, thereby controlling the output of visual information and auditory information from the output unit 106.
  • Specifically, for example, the output control unit 105 synthesizes image data captured by different imaging devices of the data acquisition unit 102 to generate a bird's-eye view image, a panoramic image, or the like, and supplies an output signal including the generated image to the output unit 106.
  • Further, for example, the output control unit 105 generates voice data including a warning sound or a warning message for a danger such as a collision, contact, or entry into a dangerous zone, and supplies an output signal including the generated voice data to the output unit 106.
  • the output unit 106 includes a device capable of outputting visual information or auditory information to a passenger of the vehicle or outside the vehicle.
  • the output unit 106 includes a display device, an instrument panel, an audio speaker, headphones, a wearable device such as a glasses-type display worn by a passenger, a projector, a lamp, and the like.
  • The display device included in the output unit 106 may be, besides a device having a normal display, a device that displays visual information within the driver's field of view, such as a head-up display, a transmissive display, or a device having an AR (Augmented Reality) display function.
  • the drive system control unit 107 controls the drive system system 108 by generating various control signals and supplying them to the drive system system 108. Further, the drive system control unit 107 supplies a control signal to each unit other than the drive system system 108 as necessary to notify the control state of the drive system system 108 and the like.
  • the drive system 108 includes various devices related to the drive system of the vehicle.
  • For example, the drive system 108 includes a driving force generation device for generating driving force, such as an internal combustion engine or a driving motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle, a braking device for generating braking force, an ABS (Antilock Brake System), an ESC (Electronic Stability Control), an electric power steering device, and the like.
  • the body system control unit 109 controls the body system 110 by generating various control signals and supplying them to the body system 110. Further, the body system control unit 109 supplies a control signal to each unit other than the body system system 110 as necessary to notify the control state of the body system system 110 and the like.
  • the body system 110 includes various body-related devices mounted on the vehicle body.
  • For example, the body system 110 includes a keyless entry system, a smart key system, a power window device, power seats, a steering wheel, an air conditioner, various lamps (for example, headlamps, back lamps, brake lamps, blinkers, and fog lamps), and the like.
  • The storage unit 111 includes, for example, a ROM (Read Only Memory), a RAM (Random Access Memory), a magnetic storage device such as an HDD (Hard Disc Drive), a semiconductor storage device, an optical storage device, a magneto-optical storage device, and the like.
  • the storage unit 111 stores various programs and data used by each unit of the vehicle control system 100.
  • For example, the storage unit 111 stores map data such as a three-dimensional high-precision map such as a dynamic map, a global map that is less accurate than the high-precision map but covers a wide area, and a local map including information around the own vehicle.
  • The automatic driving control unit 112 performs control related to automatic driving, such as autonomous driving or driving support. Specifically, for example, the automatic driving control unit 112 performs cooperative control for the purpose of realizing the functions of an ADAS (Advanced Driver Assistance System), including collision avoidance or impact mitigation of the own vehicle, follow-up traveling based on the inter-vehicle distance, vehicle-speed-maintaining traveling, collision warning of the own vehicle, and lane departure warning of the own vehicle. Further, for example, the automatic driving control unit 112 performs cooperative control for the purpose of automatic driving in which the vehicle travels autonomously without depending on the operation of the driver.
  • the automatic driving control unit 112 includes a detection unit 131, a self-position estimation unit 132, a situation analysis unit 133, a planning unit 134, and an operation control unit 135.
  • the detection unit 131 detects various kinds of information necessary for controlling automatic driving.
  • the detection unit 131 includes a vehicle exterior information detection unit 141, a vehicle interior information detection unit 142, and a vehicle state detection unit 143.
  • The vehicle exterior information detection unit 141 performs detection processing of information outside the own vehicle based on data or signals from each unit of the vehicle control system 100.
  • the vehicle exterior information detection unit 141 performs detection processing, recognition processing, and tracking processing of an object around the vehicle, and detection processing of a distance to the object.
  • Objects to be detected include vehicles, people, obstacles, structures, roads, traffic lights, traffic signs, road markings, and the like.
  • the vehicle exterior information detection unit 141 performs a process of detecting the environment around the vehicle.
  • the surrounding environment to be detected includes, for example, weather, temperature, humidity, brightness, and road surface condition.
  • The vehicle exterior information detection unit 141 supplies data indicating the results of the detection processing to the self-position estimation unit 132, to the map analysis unit 151, the traffic rule recognition unit 152, and the situation recognition unit 153 of the situation analysis unit 133, to the emergency avoidance unit 171 of the operation control unit 135, and the like.
  • the in-vehicle information detection unit 142 performs in-vehicle information detection processing based on data or signals from each unit of the vehicle control system 100.
  • the in-vehicle information detection unit 142 performs driver authentication processing and recognition processing, driver state detection processing, passenger detection processing, and in-vehicle environment detection processing.
  • the driver's state to be detected includes, for example, physical condition, arousal level, concentration level, fatigue level, line-of-sight direction, and the like.
  • the environment inside the vehicle to be detected includes, for example, temperature, humidity, brightness, odor, and the like.
  • the in-vehicle information detection unit 142 supplies the data indicating the result of the detection processing to the situation recognition unit 153 of the situation analysis unit 133, the emergency situation avoidance unit 171 of the operation control unit 135, and the like.
  • the vehicle state detection unit 143 performs detection processing of the state of the vehicle based on data or signals from each unit of the vehicle control system 100.
  • The state of the own vehicle to be detected includes, for example, speed, acceleration, steering angle, presence or absence and content of abnormality, content of driving operation, position and inclination of the power seat, state of the door lock, and states of other in-vehicle devices.
  • the vehicle state detection unit 143 supplies the data indicating the result of the detection process to the situation recognition unit 153 of the situation analysis unit 133, the emergency situation avoidance unit 171 of the operation control unit 135, and the like.
  • The self-position estimation unit 132 performs estimation processing of the position, attitude, and the like of the own vehicle based on data or signals from each unit of the vehicle control system 100, such as the vehicle exterior information detection unit 141 and the situation recognition unit 153 of the situation analysis unit 133.
  • the self-position estimation unit 132 also generates a local map (hereinafter, referred to as a self-position estimation map) used for estimating the self-position, if necessary.
  • the self-position estimation map is, for example, a high-precision map using a technology such as SLAM (Simultaneous Localization and Mapping).
  • the self-position estimation unit 132 supplies data indicating the result of the estimation process to the map analysis unit 151, the traffic rule recognition unit 152, the situation recognition unit 153, and the like of the situation analysis unit 133. Further, the self-position estimation unit 132 stores the self-position estimation map in the storage unit 111.
  • The situation analysis unit 133 analyzes the situation of the own vehicle and its surroundings.
  • the situation analysis unit 133 includes a map analysis unit 151, a traffic rule recognition unit 152, a situation recognition unit 153, and a situation prediction unit 154.
  • The map analysis unit 151 performs analysis processing of the various maps stored in the storage unit 111 while using, as necessary, data or signals from each unit of the vehicle control system 100, such as the self-position estimation unit 132 and the vehicle exterior information detection unit 141, and constructs a map containing information necessary for automatic driving processing.
  • The map analysis unit 151 supplies the constructed map to the traffic rule recognition unit 152, the situation recognition unit 153, and the situation prediction unit 154, as well as to the route planning unit 161, the action planning unit 162, and the operation planning unit 163 of the planning unit 134.
  • The traffic rule recognition unit 152 performs recognition processing of the traffic rules around the own vehicle based on data or signals from each unit of the vehicle control system 100, such as the self-position estimation unit 132, the vehicle exterior information detection unit 141, and the map analysis unit 151. By this recognition processing, for example, the positions and states of signals around the own vehicle, the content of traffic regulations around the own vehicle, the lanes in which the vehicle can travel, and the like are recognized.
  • the traffic rule recognition unit 152 supplies data indicating the result of the recognition process to the situation prediction unit 154 and the like.
  • the situation recognition unit 153 performs recognition processing of the situation regarding the own vehicle based on data or signals from each unit of the vehicle control system 100 such as the self-position estimation unit 132, the vehicle exterior information detection unit 141, the vehicle interior information detection unit 142, the vehicle state detection unit 143, and the map analysis unit 151. For example, the situation recognition unit 153 performs recognition processing of the situation of the own vehicle, the situation around the own vehicle, the situation of the driver of the own vehicle, and the like. The situation recognition unit 153 also generates a local map (hereinafter referred to as a situation recognition map) used for recognizing the situation around the own vehicle, as necessary.
  • the situation recognition map is, for example, an occupancy grid map (Occupancy Grid Map).
  • the situation of the subject vehicle to be recognized includes, for example, the position, posture, movement (for example, speed, acceleration, moving direction, etc.) of the subject vehicle, and the presence / absence and content of an abnormality.
  • the situation around the subject vehicle to be recognized includes, for example, the type and position of stationary objects in the surroundings, the type, position, and movement (for example, speed, acceleration, moving direction, etc.) of moving objects in the surroundings, the configuration of surrounding roads and the condition of the road surface, and the surrounding weather, temperature, humidity, and brightness.
  • the driver's state to be recognized includes, for example, physical condition, arousal level, concentration level, fatigue level, line-of-sight movement, and driving operation.
  • the situation recognition unit 153 supplies data indicating the result of the recognition process (including a situation recognition map, if necessary) to the self-position estimation unit 132, the situation prediction unit 154, and the like.
  • the situation recognition unit 153 also stores the situation recognition map in the storage unit 111.
  • the situation predicting unit 154 performs a process of predicting the situation regarding the own vehicle based on data or signals from each unit of the vehicle control system 100 such as the map analyzing unit 151, the traffic rule recognizing unit 152, and the situation recognizing unit 153.
  • the situation prediction unit 154 performs a prediction process of the situation of the own vehicle, the situation around the own vehicle, the situation of the driver, and the like.
  • the situation of the subject vehicle to be predicted includes, for example, the behavior of the subject vehicle, the occurrence of an abnormality, and the travelable distance.
  • the situation around the subject vehicle to be predicted includes, for example, the behavior of a moving object around the subject vehicle, a change in the signal state, and a change in the environment such as the weather.
  • the driver's situation to be predicted includes, for example, the driver's behavior and physical condition.
  • the situation prediction unit 154 supplies data indicating the result of the prediction process, together with the data from the traffic rule recognition unit 152 and the situation recognition unit 153, to the route planning unit 161, the action planning unit 162, the operation planning unit 163, and the like of the planning unit 134.
  • the route planning unit 161 plans a route to a destination based on data or signals from each part of the vehicle control system 100 such as the map analysis unit 151 and the situation prediction unit 154. For example, the route planning unit 161 sets a route from the current position to the designated destination based on the global map. Further, for example, the route planning unit 161 appropriately changes the route based on traffic jams, accidents, traffic restrictions, construction conditions, and the physical condition of the driver. The route planning unit 161 supplies data indicating the planned route to the action planning unit 162 and the like.
  • the action planning unit 162 plans the action of the own vehicle for safely traveling the route planned by the route planning unit 161 within the planned time, based on data or signals from each unit of the vehicle control system 100 such as the map analysis unit 151 and the situation prediction unit 154. For example, the action planning unit 162 makes a plan regarding start, stop, traveling direction (for example, forward, backward, left turn, right turn, direction change, etc.), traveling lane, traveling speed, overtaking, and the like. The action planning unit 162 supplies data indicating the planned action of the own vehicle to the operation planning unit 163 and the like.
  • the operation planning unit 163 plans the operation of the own vehicle for realizing the action planned by the action planning unit 162, based on data or signals from each unit of the vehicle control system 100 such as the map analysis unit 151 and the situation prediction unit 154.
  • for example, the operation planning unit 163 plans acceleration, deceleration, a traveling trajectory, and the like.
  • the operation planning unit 163 supplies data indicating the planned operation of the own vehicle to the acceleration / deceleration control unit 172, the direction control unit 173, and the like of the operation control unit 135.
  • the operation control unit 135 controls the operation of the own vehicle.
  • the operation control unit 135 includes an emergency avoidance unit 171, an acceleration / deceleration control unit 172, and a direction control unit 173.
  • the emergency avoidance unit 171 performs detection processing for emergencies such as collision, contact, entry into a dangerous zone, abnormality of the driver, and abnormality of the vehicle, based on the detection results of the vehicle exterior information detection unit 141, the vehicle interior information detection unit 142, and the vehicle state detection unit 143.
  • when the occurrence of an emergency is detected, the emergency avoidance unit 171 plans an operation of the own vehicle, such as a sudden stop or a sharp turn, for avoiding the emergency.
  • the emergency avoidance unit 171 supplies data indicating the planned operation of the own vehicle to the acceleration / deceleration control unit 172, the direction control unit 173, and the like.
  • the acceleration / deceleration control unit 172 performs acceleration / deceleration control for realizing the operation of the own vehicle planned by the operation planning unit 163 or the emergency situation avoidance unit 171. For example, the acceleration / deceleration control unit 172 calculates the control target value of the driving force generation device or the braking device for realizing the planned acceleration, deceleration, or sudden stop, and supplies a control command indicating the calculated control target value to the drive system control unit 107.
  • the direction control unit 173 performs direction control for realizing the operation of the own vehicle planned by the operation planning unit 163 or the emergency avoidance unit 171. For example, the direction control unit 173 calculates the control target value of the steering mechanism for realizing the traveling trajectory or sharp turn planned by the operation planning unit 163 or the emergency situation avoidance unit 171, and supplies a control command indicating the calculated control target value to the drive system control unit 107.
  • the image processing device 20 is provided in, for example, the vehicle exterior information detection unit 141, and detects, using for example a captured image of the area in front of the vehicle acquired by the data acquisition unit 102, a boundary line between the traveling lane and another area, for example a boundary with a road shoulder, or a white line portion indicating a division from another traveling lane.
  • the vehicle exterior information detection unit 141 outputs estimated boundary line information indicating the detected estimated boundary line to the situation recognition unit 153 and the emergency situation avoidance unit 171.
  • the initial value used in the filter processing unit 22 is not limited to a preset reference position; the coordinate position indicating a straight line selected from the candidate straight lines based on a predetermined rule may be set as the reference position to start the filter processing.
  • for example, the coordinate position indicating the straight line closest to the side surface of the vehicle, among the straight lines detected by the straight line detection unit at a predetermined timing, may be set as the reference position to start the filter processing.
  • the situation recognition unit 153 performs a situation recognition process for the vehicle based on the estimated boundary line information. For example, the situation recognition unit 153 determines the relative positional relationship between the own vehicle and the white line or the road shoulder, and supplies the determination result to the direction control unit 173 of the operation control unit 135, so that the direction is controlled so that the own vehicle does not deviate from the traveling lane.
  • the emergency avoidance unit 171 also performs a detection process for emergencies such as deviation from the driving lane based on the estimated boundary line information. When the occurrence of an emergency is detected, the emergency avoidance unit 171 plans an operation of the own vehicle, such as a sudden stop or a sharp turn, for avoiding the emergency.
  • the emergency avoidance unit 171 supplies data indicating the planned operation of the own vehicle to the direction control unit 173 and the like, and performs emergency avoidance control.
  • the filter processing unit 22 of the image processing device 20 may change the filter characteristic according to the moving state or moving environment of the vehicle. Specifically, the filter characteristic of the filter processing unit 22 is changed according to the route planned by the route planning unit 161. For example, on expressways, white lines and the like are better maintained than on ordinary roads, and their continuity is ensured. Further, it is desirable to be able to accurately detect boundaries such as road shoulders and white lines when traveling at high speed.
  • the filter processing unit 22 makes the allowable ranges of the parameter ρ and the parameter θ stricter when traveling on a highway or at high speed than when traveling on a general road, so that the boundary can be detected correctly.
  • the filter processing unit 22 may also make the allowable range of the parameter ρ or the parameter θ more lenient when traveling on an urban road or at low speed than when traveling on a highway or the like, so that a boundary can be detected easily.
  • the allowable ranges of the parameter ⁇ and the parameter ⁇ may be set according to the motion model of the own vehicle. For example, the position of the road shoulder or the white line that moves according to the movement of the vehicle may be estimated, and the allowable range of the parameter ⁇ or the parameter ⁇ may be set based on the estimated position. By setting the allowable range in this way, the estimated boundary line information can be generated in consideration of the movement of the own vehicle.
  • by switching the operation control based on the lost determination information output from the image processing device, driving assistance or automatic driving can be continued using information from other sensors even when the boundary is not detected. It is also possible to notify the driver of the lost determination information to call attention, or to instruct the driver to switch from automatic operation to manual operation.
  • if the extension unit 23 sets the length of the search reference area and the boundary search area (the length in the line segment direction of the boundary seed) to several tens of centimeters to 1 meter (preferably about 80 cm), and the width (the length in the direction orthogonal to the line segment direction of the boundary seed) to about several tens of centimeters (preferably about 50 cm), the boundary extension line indicating the road shoulder or the white line can be detected accurately.
  • the series of processes described in the specification can be executed by hardware, software, or a composite configuration of both.
  • the program recording the processing sequence is installed in a memory in a computer incorporated in dedicated hardware and executed.
  • the program can be installed and executed in a general-purpose computer that can execute various processes.
  • the program can be recorded in advance in a hard disk, SSD (Solid State Drive), or ROM (Read Only Memory) as a recording medium.
  • the program can be temporarily (or permanently) stored (recorded) in a removable recording medium such as a flexible disk, a CD-ROM (Compact Disc Read Only Memory), an MO (Magneto Optical) disc, a DVD (Digital Versatile Disc), a BD (Blu-ray Disc (registered trademark)), a magnetic disc, or a semiconductor memory card.
  • such a removable recording medium can be provided as so-called package software.
  • the program may be transferred from a download site to the computer wirelessly or by wire via a network such as a LAN (Local Area Network) or the Internet.
  • the program thus transferred can be received and installed in a recording medium such as a built-in hard disk.
  • the image processing device of the present technology can also have the following configurations.
  • (1) An image processing apparatus including a filter processing unit that selects, from detected straight lines, a straight line corresponding to the straight line indicated by a predetermined coordinate position, based on the distance between a coordinate position in a polar coordinate space indicating a straight line detected from a captured image and the predetermined coordinate position.
  • (2) The image processing device according to (1), wherein the predetermined coordinate position is a preset coordinate position, or a coordinate position indicating a straight line detected and selected by the filter processing unit from a captured image generated at a time different from that of the captured image.
  • (3) The image processing device according to (2), wherein an imaging device that acquires the captured image is provided on a moving body, and the predetermined coordinate position is a coordinate position indicating the straight line closest to the moving body.
  • (4) The image processing device according to any one of (1) to (3), wherein the filter processing unit sets an allowable range for the predetermined coordinate position and uses a coordinate position within the allowable range among the coordinate positions indicating straight lines detected from the captured image.
  • (5) The image processing device according to (4), wherein an imaging device that acquires the captured image is provided on a moving body, and the filter processing unit sets the allowable range according to a moving state or a moving environment of the moving body.
  • (6) The image processing device according to any one of (1) to (5), wherein the filter processing unit outputs lost determination information indicating that the corresponding straight line cannot be selected when a straight line corresponding to the straight line indicated by the predetermined coordinate position cannot be selected from the detected straight lines.
  • (7) The image processing device according to any one of (1) to (6), further including a straight line detection unit that detects a coordinate position in the polar coordinate space indicating a straight line from the captured image.
  • (8) The image processing device according to (7), wherein the straight line detection unit detects the coordinate position indicating a straight line from an image region near the imaging position in the captured image.
  • (9) The image processing device according to (7) or (8), wherein the straight line detection unit performs Hough transform using the captured image to detect a coordinate position indicating a straight line in the ρ-θ space that is the polar coordinate space.
  • (10) The image processing device according to any one of (1) to (9), further including an extension unit that sets a search reference area having a predetermined size including the straight line selected by the filter processing unit, sets a boundary search area in the line segment direction of a boundary seed that is the selected straight line in the search reference area, and detects, from the boundary search area as a boundary extension line, a straight line whose inclination with respect to the boundary seed is smaller than a threshold value and which is connected to the boundary seed.
  • (11) The image processing device according to (10), wherein, when the extension unit detects the boundary extension line from the boundary search area, the extension unit newly sets a boundary search area adjacent in the line segment direction of the detected boundary extension line, and detects, from the new boundary search area as a boundary extension line, a straight line whose inclination with respect to the detected boundary extension line is smaller than the threshold value and which is connected to the boundary extension line.
  • (12) The image processing device according to (10) or (11), wherein the extension unit sets a curvature range of an estimated boundary line formed by the boundary seed and the boundary extension line based on the area size of the boundary search area and the threshold value.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Geometry (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

A straight line detection unit 21 uses an image of the vicinity of an imaging position in a captured image generated by an imaging device 10 to carry out a Hough transform, and detects a coordinate position in a ρ-θ space, which is a coordinate position that is in a polar coordinate space and indicates a straight line in the captured image. On the basis of the distance between the coordinate position detected by the straight line detection unit 21 and a prescribed coordinate position, a filter processing unit 22 extracts from the captured image a straight line indicated by the prescribed coordinate position. An extension unit 23 sets a search reference region that has a prescribed size and includes a straight line selected by the filter processing unit 22, sets a boundary search region in a line segment direction of a boundary seed, which is the straight line selected within the search reference region, and detects from the boundary search region, as a boundary extension line, a straight line which is connected to the boundary seed and for which the inclination with respect to the boundary seed is smaller than a threshold value. An estimated boundary line is formed from a line segment comprising the boundary extension line and the boundary seed. Thus, it is possible to detect a boundary with good precision from a captured image.

Description

Image processing apparatus, image processing method, and program
 This technology relates to an image processing device, an image processing method, and a program, and detects a boundary accurately using a captured image.
 Conventionally, straight lines are detected and, for example, white lines on roads are recognized based on the detection results. For example, in Patent Document 1, a straight line penetrating an edge point group is calculated for each window, white line candidate points selected from the edge point groups penetrated by those straight lines are filtered, and the shape of the road white line is calculated. Further, in Patent Document 2, straight lines are detected using a method such as the Hough transform or RANSAC (Random Sample Consensus), and the plurality of detected straight lines are filtered to detect white lines.
Japanese Patent Laid-Open No. 2004-199341; Japanese Patent Laid-Open No. 2018-041315
 Since the filter processing of Patent Document 1 and Patent Document 2 uses a time-series filter and the time-series filtering is performed on the XY plane, the filter processing result may diverge. Further, the detection methods of Patent Document 1 and Patent Document 2 are vulnerable to noise on the road and lack robustness, so that they cannot accurately detect, for example, a boundary between a white line and a portion other than the white line.
 Therefore, the purpose of this technology is to provide an image processing device, an image processing method, and a program that can accurately detect a boundary using a captured image.
 The first aspect of this technology is an image processing apparatus including a filter processing unit that selects, from detected straight lines, a straight line corresponding to the straight line indicated by a predetermined coordinate position, based on the distance between a coordinate position in a polar coordinate space indicating a straight line detected from a captured image and the predetermined coordinate position.
 In this technology, the straight line detection unit performs the Hough transform using, for example, an image of an image region near the imaging position in a captured image generated by an imaging device provided on a moving body, and detects a coordinate position in the ρ-θ space, which is the coordinate position in the polar coordinate space indicating a straight line in the captured image.
 The filter processing unit extracts the straight line indicated by the predetermined coordinate position from the captured image based on the distance between the coordinate position detected by the straight line detection unit and the predetermined coordinate position. The predetermined coordinate position is a preset coordinate position, or a coordinate position indicating a straight line detected by the filter processing unit from a captured image generated at a time different from that of the captured image. The predetermined coordinate position may also be a coordinate position indicating the straight line closest to the moving body among the straight lines detected by the straight line detection unit. The filter processing unit sets an allowable range for the predetermined coordinate position and uses the coordinate positions within the allowable range among the coordinate positions detected by the straight line detection unit. The allowable range is set according to the moving state or the moving environment of the moving body. Further, when a straight line corresponding to the straight line indicated by the predetermined coordinate position cannot be selected from the detected straight lines, the filter processing unit outputs lost determination information indicating that the corresponding straight line cannot be selected.
 The extension unit sets a search reference area of a predetermined size including the straight line selected by the filter processing unit, sets a boundary search area in the line segment direction of the boundary seed, which is the selected straight line within the search reference area, and extracts, from the boundary search area as a boundary extension line, a straight line whose inclination with respect to the boundary seed is smaller than a threshold value and which is connected to the boundary seed. When a boundary extension line is extracted from the boundary search area, the extension unit newly sets an adjacent boundary search area in the line segment direction of the extracted boundary extension line, and extracts, from the new boundary search area as a boundary extension line, a straight line whose inclination with respect to the extracted boundary extension line is smaller than the threshold value and which is connected to the boundary extension line. Further, the extension unit sets the curvature range of the estimated boundary line formed by the boundary seed and the boundary extension lines based on the area size of the boundary search area and the threshold value.
 The second aspect of this technology is an image processing method including selecting, by a filter processing unit, from detected straight lines, a straight line corresponding to the straight line indicated by a predetermined coordinate position, based on the distance between a coordinate position in a polar coordinate space indicating a straight line detected from a captured image and the predetermined coordinate position.
 The third aspect of this technology is a program that causes a computer to perform image processing of a captured image, the program causing the computer to execute a procedure for selecting, from detected straight lines, a straight line corresponding to the straight line indicated by a predetermined coordinate position, based on the distance between a coordinate position in a polar coordinate space indicating a straight line detected from the captured image and the predetermined coordinate position.
 Note that the program of the present technology can be provided, for example, by a storage medium or a communication medium that provides the program in a computer-readable format to a general-purpose computer capable of executing various program codes, for example, a storage medium such as an optical disk, a magnetic disk, or a semiconductor memory, or a communication medium such as a network. By providing such a program in a computer-readable format, processing according to the program is realized on the computer.
Fig. 1 is a diagram illustrating the configuration of the image processing device.
Fig. 2 is a diagram for explaining the Hough transform.
Fig. 3 is a diagram showing an operation example of the Hough transform.
Fig. 4 is a diagram for explaining a straight line detection operation based on edge points.
Fig. 5 is a diagram showing an operation example of the filter processing unit.
Figs. 6 and 7 are diagrams for explaining the operation of the extension unit.
Fig. 8 is a flowchart illustrating the operation of the image processing device.
Fig. 9 is a flowchart showing the filter processing.
Fig. 10 is a flowchart showing the extension processing.
Fig. 11 is a block diagram showing a schematic configuration example of a vehicle control system 100.
 Hereinafter, modes for carrying out the present technology will be described. The description will be given in the following order.
 1. Configuration of the image processing device
 2. Operation of the image processing device
 3. Application example
 <1. Configuration of the image processing device>
 Fig. 1 illustrates the configuration of the image processing device. The image processing device 20 has a straight line detection unit 21, a filter processing unit 22, and an extension unit 23.
 The straight line detection unit 21 detects coordinate positions in the polar coordinate space indicating straight lines from the captured image generated by the imaging device 10, thereby detecting the coordinate positions of straight lines that are candidates for the boundary. The straight line detection unit 21 detects straight lines using, for example, the Hough transform, and outputs the coordinate positions in the ρ-θ space indicating the detected straight lines (also referred to as candidate straight lines) to the filter processing unit 22 as candidate straight line information. In detecting straight lines, the straight line detection unit 21 suppresses erroneous detection of candidate straight lines by setting the straight line detection area, in which straight lines are detected, to a nearby range in which the boundary appears.
 The filter processing unit 22 selects, from the straight lines detected by the straight line detection unit 21, a straight line corresponding to the straight line indicated by a predetermined coordinate position, based on the distance between the coordinate position in the polar coordinate space indicating each detected straight line and the predetermined coordinate position. The filter processing unit 22 uses a time-series filter, for example a Kalman filter, an extended Kalman filter, a sigma-point Kalman filter, or a particle filter. The filter processing unit 22 filters the candidate straight lines indicated by the candidate straight line information supplied from the straight line detection unit 21, and selects the candidate straight line corresponding to the straight line indicating the boundary indicated by the predetermined coordinate position. In the filter processing unit 22, the coordinate position in the polar coordinate space of the straight line indicating the boundary is set in advance as the predetermined coordinate position (also referred to as the reference position). The filter processing unit 22 performs time-series filter processing and selects the straight line at the coordinate position whose distance from the reference position is the shortest among the coordinate positions of the candidate straight lines detected by the straight line detection unit 21 and is shorter than a preset threshold value. Further, the filter processing unit 22 sets the coordinate position of the selected straight line as a new reference position and, from the coordinate positions indicating the candidate straight lines detected by the straight line detection unit 21 from subsequent captured images, selects the straight line at the coordinate position closest to the reference position as described above. The filter processing unit 22 outputs selected straight line information indicating the straight line selected from the candidate straight lines (also referred to as the selected straight line) to the extension unit 23. Further, when a straight line corresponding to the straight line indicated by the reference position cannot be selected from the candidate straight lines, the filter processing unit 22 outputs lost determination information indicating that the straight line corresponding to the straight line indicated by the reference position cannot be selected.
 The extension unit 23 sets a search reference area of a predetermined size including the selected straight line indicated by the selected straight line information from the filter processing unit 22. The extension unit 23 takes the portion of the selected straight line included in the search reference area as a boundary seed, sets a boundary search area in the line segment direction of the boundary seed, and extracts, from the boundary search area as a boundary extension line, a straight line whose inclination with respect to the boundary seed is smaller than a threshold value and which is connected to the boundary seed. Further, when a boundary extension line is extracted, the extension unit 23 newly sets an adjacent boundary search area in the line segment direction of this boundary extension line. The extension unit 23 extracts, from the newly set boundary search area, a straight line whose inclination with respect to the boundary extension line is smaller than the threshold value and which is connected to the boundary extension line, and takes it as a new boundary extension line. The above processing is similarly repeated until no boundary extension line can be extracted, or until boundary extension lines have been extracted consecutively a predetermined number of times. The extension unit 23 takes the boundary seed extracted using the filter processing unit 22 and the boundary extension lines connected to the boundary seed as an estimated boundary line, and outputs information indicating the estimated boundary line to the outside. The extension unit 23 may also output, as the output of the information indicating the estimated boundary line, a captured image on which an image indicating the estimated boundary line is superimposed.
 <2. Operation of the image processing device>
 Next, the operation of the image processing device will be described. The straight line detection unit 21 performs straight line detection using, for example, the Hough transform, and converts a straight line passing through a point on the xy plane into a coordinate position in the ρ-θ space, which is a polar coordinate space. Fig. 2 is a diagram for explaining the Hough transform. Fig. 2(a) illustrates a straight line L on the xy plane. A straight line passing through the point (xi, yi) can be expressed by the relationship of Equation (1), where the parameter ρ is the distance from the origin to the straight line and the parameter θ is the angle formed by the normal of the straight line and the x axis. The relationship between the parameter ρ and the parameter θ for the straight lines passing through the point (xi, yi) has the characteristic shown in Fig. 2(b).
  ρ = xi・cosθ + yi・sinθ ・・・ (1)
 Fig. 3 shows an operation example of the Hough transform. For example, when the captured image shown in Fig. 3(a) is input, the straight line detection unit 21 performs filter processing that removes noise without blurring contours, for example using a bilateral filter, and generates the binarized edge image shown in Fig. 3(b) from the filtered captured image. Further, the straight line detection unit 21 projects the edge points of the binarized edge image into the ρ-θ space, and takes the parameters ρ and θ of a point where the characteristic curves of the edge points overlap as the parameters ρ and θ of the straight line passing through the edge points corresponding to the overlapping characteristic curves. In addition, as shown in Fig. 3(b), the straight line detection unit 21 sets the straight line detection area AS to a nearby range in which the boundary appears. For example, the imaging device 10 is provided on a moving body (for example, a vehicle), and the straight line detection unit 21 uses a captured image obtained by imaging the area in front with the imaging device. In this case, a boundary line of a given length extending in front of the vehicle appears as a longer straight line on the captured image in the nearby region than in the distant region. Therefore, when determining over a predetermined length (section length) whether a straight line is a boundary line, there is a risk that straight lines cannot be detected with the accuracy of the section length if an image of the distant region is used. By setting the straight line detection area AS in the nearby region, the straight line detection unit 21 can determine with the accuracy of the section length whether a straight line is a boundary line, and can suppress erroneous detection of candidate straight lines. Further, the calculation cost of straight line detection can be reduced compared to the case where the straight line detection area AS is not set.
 Fig. 4 is a diagram for explaining the straight line detection operation based on edge points. For example, assume that the edge points (x1, y1), (x2, y2), and (x3, y3) are detected on the xy plane as shown in Fig. 4(a), and that the characteristic curves for these edge points in the ρ-θ space are the curves CLa, CLb, and CLc as shown in Fig. 4(b). In this case, since the curves CLa, CLb, and CLc overlap at the coordinate position (ρs, θs), the straight line corresponding to the coordinate position (ρs, θs) is the straight line SL passing through the edge points (x1, y1), (x2, y2), and (x3, y3), as indicated by the broken line in Fig. 4(a). Therefore, based on the edge points detected in the straight line detection area AS shown in Fig. 3(b), the straight lines La1 to Lan in the straight line detection area AS can be detected as shown in Fig. 3(c).
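 As a minimal sketch of this detection stage, the following Python fragment assumes OpenCV is available and that the straight line detection area AS is given as a rectangular region of interest; the region coordinates, filter settings, and vote threshold are illustrative assumptions, not values from this disclosure.

    import cv2
    import numpy as np

    def detect_candidate_lines(image_bgr, roi):
        # roi = (x, y, w, h): straight line detection area AS near the imaging position
        x, y, w, h = roi
        gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
        # edge-preserving noise removal (bilateral filter), then binarized edges
        smoothed = cv2.bilateralFilter(gray, d=9, sigmaColor=75, sigmaSpace=75)
        edges = cv2.Canny(smoothed, 50, 150)
        edges_roi = edges[y:y + h, x:x + w]
        # Hough transform: each returned (rho, theta) pair is a coordinate
        # position in the rho-theta space representing one candidate straight
        # line (rho is measured from the ROI origin in this sketch)
        lines = cv2.HoughLines(edges_roi, rho=1, theta=np.pi / 180, threshold=80)
        return [] if lines is None else [tuple(l[0]) for l in lines]

 Restricting the Hough transform to the region of interest corresponds to setting the detection area AS only in the nearby region, which is also what keeps the calculation cost down.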
 The filter processing unit 22 performs time-series filter processing of the candidate straight line information in the ρ-θ space, and selects the candidate straight line close to a predetermined, preset initial value.
 For example, on the xy plane, a straight line is defined using the slope a and the intercept b as shown in Equation (2).
  y = ax + b ・・・ (2)
 When a straight line is defined by the slope a and the intercept b in this way, the slope a and the intercept b of the candidate straight lines of the boundary have no continuity over time, and it is difficult to accurately select the candidate straight line indicating the boundary by time-series filter processing. Moreover, when a candidate straight line becomes parallel to the y axis, the slope a diverges to infinity, making the filter calculation impossible. In the ρ-θ space, however, candidate straight lines close to the boundary have continuity over time and do not change abruptly, so the straight line indicating the boundary can be selected from the candidate straight lines.
 The filter processing unit 22 performs time-series filter processing using the candidate straight line information, and selects the straight line at the coordinate position whose distance from the reference position is the shortest among the coordinate positions of the candidate straight lines detected by the straight line detection unit 21 and is shorter than a preset threshold value. As the distance, for example, the Euclidean distance in the ρ-θ space may be used, or the Mahalanobis distance using the covariance matrix of a Kalman filter may be used. The filter processing unit 22 selects, from the candidate straight lines, the straight line with the shortest calculated distance.
 Here, when the set of candidate straight lines detected by the straight line detection unit 21 is LF = {Lf1, Lf2, ..., Lfn} and the straight line corresponding to the reference position is the straight line Ltg, the candidate straight line Lfdmin with the shortest distance dist(Lfi, Ltg) is selected based on Equation (3).
  Lfdmin s.t. min(dist(Lfi, Ltg)), Lfi ∈ LF ・・・ (3)
 Further, the filter processing unit 22 sets the coordinate position indicating the selected straight line as a new reference position, and performs filter processing of the candidate straight lines detected from subsequent captured images.
 Fig. 5 shows an operation example of the filter processing unit. Fig. 5(a) illustrates, in the ρ-θ space, the candidate straight lines detected from the captured image at time t0. Fig. 5(b) illustrates, in the ρ-θ space, the candidate straight lines detected from the captured image at time t1, which is later than time t0. The black circles indicate candidate straight lines.
 Here, as shown in Fig. 5(a), when the reference position is set in advance to the coordinate position PSr (white circle), the filter processing unit 22 takes as the selected straight line the straight line indicated by the coordinate position PSLt0, whose distance from the coordinate position PSr is the shortest and shorter than the preset threshold value. Then, with the coordinate position PSLt0 as the reference position at time t1, the straight line indicated by the coordinate position PSLt1, whose distance from the coordinate position PSLt0 is the shortest and shorter than the preset threshold value, is taken as the selected straight line, as shown in Fig. 5(b). Similarly thereafter, by sequentially selecting the straight line whose coordinate position is closest to the reference position and within the preset threshold distance, the selected straight line is tracked in the time direction. The filter processing unit 22 outputs selected straight line information indicating the selected straight line to the extension unit 23.
 Further, when the filter processing unit 22 cannot select a straight line corresponding to the straight line indicated by the reference position from the candidate straight lines, that is, when it cannot select a candidate straight line at a coordinate position whose distance from the reference position is shorter than the preset threshold value, it outputs lost determination information indicating that the straight line corresponding to the straight line indicated by the reference position cannot be selected. For example, when no candidate straight line whose distance is shorter than the preset threshold value is found continuously for a predetermined period, lost determination information indicating that no selected straight line (or boundary seed) is detected is generated and output to the outside.
 The filter processing unit 22 is not limited to the case where the reference position is set in advance; it may set, as the reference position, the coordinate position indicating a straight line selected from the candidate straight lines based on a predetermined rule, and start the time-series filter processing.
 The filter processing unit 22 may also adjust the threshold values according to a user operation, the acquisition status of the captured image, and the like. In this case, since the filter processing is performed in the ρ-θ space, the threshold values can be set with the parameter ρ and the parameter θ. For example, by adjusting the threshold value of the parameter ρ according to a user operation or the acquisition status of the captured image, the filter processing unit 22 can set the allowable range for positional deviation of the boundary included in the captured image. Also, by adjusting the threshold value of the parameter θ, the filter processing unit 22 can set the allowable range for the continuity (smoothness) of the boundary included in the captured image. In this way, the filter characteristic of the filter processing unit 22 can be adjusted with the parameter ρ and the parameter θ. Moreover, since the threshold values can be adjusted with the parameter ρ and the parameter θ, the user can easily adjust the filter characteristic to a desired characteristic with parameters that are easy to understand. Furthermore, since a straight line can be specified by the parameter ρ and the parameter θ, setting up the selection processing is easier than when selecting a straight line on the xy plane.
 The extension unit 23 performs viewpoint conversion of the captured image. In the viewpoint conversion, the viewpoint direction is set to the direction perpendicular to the plane on which the selected straight line indicated by the selected straight line information is located. The viewpoint conversion may be performed using camera parameters acquired in advance for the imaging device 10 that generated the captured image. By performing viewpoint conversion in this way, the boundary included in the captured image can be represented on a two-dimensional plane, and the boundary extension line can be detected easily.
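 As a minimal sketch of such a viewpoint conversion, the following fragment assumes OpenCV and a prior calibration in which four image points on the road plane and their bird's-eye destinations are known; the point coordinates are placeholders, not calibration values from this disclosure.

    import cv2
    import numpy as np

    def to_birds_eye(image, src_pts, dst_pts, out_size):
        # src_pts: four image points on the road plane (from camera parameters)
        # dst_pts: where those points should land in the bird's-eye view
        M = cv2.getPerspectiveTransform(np.float32(src_pts), np.float32(dst_pts))
        return cv2.warpPerspective(image, M, out_size)

    # placeholder correspondences for illustration only
    src = [(420, 400), (860, 400), (1180, 700), (100, 700)]
    dst = [(300, 0), (980, 0), (980, 720), (300, 720)]
    # birds_eye = to_birds_eye(captured_image, src, dst, (1280, 720))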
 The extension unit 23 sets the search reference area so as to include the selected straight line, using the position of the selected straight line indicated by the selected straight line information from the filter processing unit 22 as a reference, and takes the straight line within the search reference area as the boundary seed. The extension unit 23 also sets a boundary search area (search segment) in the line segment direction of the boundary seed, performs the Hough transform on the boundary search area, and extracts, from the boundary search area as a boundary extension line, a straight line whose inclination with respect to the boundary seed extracted using the filter processing unit 22 is smaller than a threshold value and which is connected to the boundary seed. The boundary search area and the threshold value are set in advance so that a boundary line having a curvature larger than the curvature specified by the manufacturer of the moving body or the user can be detected. For example, in the search reference area and the boundary search area, the length in the line segment direction of the straight line is the section length used to determine whether a straight line is a boundary line. In the boundary search area, the length in the direction orthogonal to the line segment direction is set so as to include line segments whose inclination with respect to the line segment of the boundary seed or boundary extension line is up to the threshold value.
 Figs. 6 and 7 are diagrams for explaining the operation of the extension unit; the imaging device 10 is provided, for example, on a moving body. Fig. 6(a) illustrates the captured image at time tb1, and Fig. 7(a) illustrates the captured image at time tb2, which is later than time tb1. Fig. 6(b) shows part of a bird's-eye view of the captured image at time tb1, and Fig. 7(b) shows part of a bird's-eye view of the captured image at time tb2, for reference.
 When the imaging device 10 is provided on a moving body, the straight line detection unit 21 detects candidate straight lines from a straight line detection area set at a nearby position as described above, for example a nearby straight line detection area in front of the moving body. Figs. 6(c) to 6(g) are bird's-eye views showing the moving body OBM and the area ahead in the captured image at time tb1, and Figs. 7(c) to 7(g) are bird's-eye views showing the moving body OBM and the area ahead in the captured image at time tb2. At time tb1 the curve area EG-c of the boundary EG is far from the moving body OBM, and at time tb2 the curve area EG-c of the boundary EG is close to the moving body OBM.
 Fig. 6(d) shows the search reference area AEs including the selected straight line detected from the captured image at time tb1 and selected by the filter processing unit 22. The boundary seed LEs is the selected straight line portion included in the search reference area AEs. The extension unit 23 sets the boundary search area AEa adjacent to the search reference area AEs in the line segment direction of the boundary seed LEs, with the position of the boundary seed LEs as a reference, as shown in Fig. 6(e). The extension unit 23 then performs the Hough transform on the boundary search area AEa and, as shown in Fig. 6(f), detects from the boundary search area AEa a straight line whose inclination with respect to the boundary seed LEs is smaller than the threshold value and which is connected to the boundary seed LEs, taking it as the boundary extension line LEa. Further, in response to the boundary extension line LEa being extracted, the extension unit 23 newly sets a boundary search area AEa in the line segment direction with the position of this boundary extension line LEa as a reference. The extension unit 23 extracts, from the newly set boundary search area AEa, a straight line whose inclination with respect to the boundary extension line extracted immediately before is smaller than the threshold value and which is connected to that boundary extension line, taking it as a new boundary extension line LEa. The extension unit 23 similarly repeats the above processing until no boundary extension line can be extracted, or until boundary extension lines have been extracted consecutively a predetermined number of times. As shown in Fig. 6(g), the extension unit 23 takes the boundary seed LEs extracted using the filter processing unit 22 and the sequentially extracted boundary extension lines LEa as the estimated boundary line LE, and outputs information indicating the estimated boundary line to the outside. The extension unit 23 may also output a captured image on which an image indicating the estimated boundary line is superimposed.
(d) of FIG. 7 shows the search reference area AEs including the selected straight line detected from the captured image at time tb2 and selected by the filter processing unit 22. The boundary seed LEs is the selected straight line portion included in the search reference area AEs. Using the position of the boundary seed LEs as a reference, the extension unit 23 sets a boundary search area AEa adjacent to the search reference area AEs in the line segment direction of the boundary seed LEs, as shown in (e) of FIG. 7. The extension unit 23 then performs a Hough transform on the boundary search area AEa and, as shown in (f) of FIG. 7, detects from the boundary search area AEa a straight line whose inclination with respect to the boundary seed LEs is smaller than the threshold and which connects to the boundary seed LEs, and takes it as the boundary extension line LEa. Accordingly, a straight line that is inclined with respect to the boundary seed LEs, as long as its inclination is smaller than the threshold, is also extracted as a boundary extension line LEa. Further, when the boundary extension line LEa has been extracted, the extension unit 23 newly sets a boundary search area AEa in the line segment direction using the position of this boundary extension line LEa as a reference. From the newly set boundary search area AEa, the extension unit 23 extracts a straight line whose inclination with respect to the boundary extension line extracted immediately before is smaller than the threshold and which connects to that boundary extension line, and takes it as a new boundary extension line LEa. The extension unit 23 repeats this process until no boundary extension line can be extracted, or until boundary extension lines have been extracted a predetermined number of times in succession. As shown in (g) of FIG. 7, the extension unit 23 takes the boundary seed LEs extracted by the filter processing unit 22 and the sequentially extracted boundary extension lines LEa as the estimated boundary line LE, and outputs estimated boundary line information indicating the estimated boundary line to the outside.
In this way, the extension unit 23 can generate estimated boundary line information indicating the estimated boundary line, taking as the estimated boundary line the line segments obtained by extracting, for each boundary search area, a straight line extending from the boundary seed selected by the filter processing unit 22 and connecting them.
The extension unit 23 may also set the curvature range of the estimated boundary line according to the area size of the boundary search area and the threshold. For example, by increasing the size of the boundary search area in the direction of the boundary seed or boundary extension line, or by decreasing the threshold, the extension unit 23 can limit the estimated boundary line to line segments with small curvature. Conversely, by decreasing the size in the direction of the boundary extension line or increasing the threshold, line segments with large curvature can also be adopted as the estimated boundary line. The extension unit 23 may therefore preset the detectable curvature range of the boundary by means of the area size and the threshold.
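As a rough geometric aside (our approximation, not a formula stated in this document): if each boundary search area has length s along the line and the inclination between consecutive segments is limited to an angle threshold, the piecewise-linear boundary can follow a curve of curvature at most roughly threshold / s. A minimal sketch of that relation:

```python
def max_curvature(area_length_m: float, theta_threshold_rad: float) -> float:
    """Approximate upper bound on the boundary curvature (1/m) representable
    by the extension process: the direction may change by at most
    theta_threshold_rad over each segment of length area_length_m."""
    return theta_threshold_rad / area_length_m
```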
Furthermore, the extension unit 23 may output, as the output of the information indicating the estimated boundary line, a captured image on which an image showing the estimated boundary line is superimposed. Superimposing the image showing the estimated boundary line on the captured image in this way makes it easy to recognize which position in the captured image has been detected as the boundary line.
FIG. 8 is a flowchart illustrating the operation of the image processing device. In step ST1, the image processing device starts acquiring captured images. The image processing device 20 starts acquiring the captured images generated by the imaging device 10 and proceeds to step ST2.
In step ST2, the image processing device performs straight line detection processing. The straight line detection unit 21 of the image processing device 20 detects straight lines in the ρ-θ space from the captured image acquired from the imaging device 10 and proceeds to step ST3.
In step ST3, the image processing device performs filter processing. The filter processing unit 22 of the image processing device 20 selects the straight line indicating the boundary from the straight lines detected in step ST2. FIG. 9 is a flowchart showing the filter processing.
In step ST11, the filter processing unit detects the straight line with the minimum distance. From the candidate straight lines detected by the straight line detection unit 21, the filter processing unit 22 detects the straight line whose distance is minimum based on equation (3). When performing the filter processing for the first time, the filter processing unit 22 calculates, for each candidate straight line, the distance between the coordinate position in the ρ-θ space of the straight line detected by the straight line detection unit and a preset predetermined coordinate position (reference position), and detects the straight line with the minimum distance. When the predetermined coordinate position has been updated by the setting update processing of step ST13 described later, the updated coordinate position is used to detect the straight line with the minimum distance from the candidate straight lines detected after the update. The distance between the coordinate position of the detected candidate straight line Lfdmin and the coordinate position of the straight line Ltg corresponding to the reference position is denoted the minimum distance dist(Lfdmin, Ltg). Performing time-series filter processing in this way, the filter processing unit 22 detects the straight line with the minimum distance and proceeds to step ST12.
In step ST12, the filter processing unit determines whether the minimum distance is smaller than a threshold. If the minimum distance dist(Lfdmin, Ltg), which is the distance between the coordinate position of the straight line detected in step ST11 and the predetermined coordinate position, is smaller than the threshold, the filter processing unit 22 proceeds to step ST13; if the minimum distance dist(Lfdmin, Ltg) is equal to or larger than the threshold, it proceeds to step ST14.
In step ST13, the filter processing unit performs setting update processing. The filter processing unit 22 takes the straight line detected in step ST11 as the selected straight line and sets the coordinate position indicating this selected straight line as the new predetermined coordinate position.
In step ST14, the filter processing unit outputs lost judgment information. Since the minimum distance dist(Lfdmin, Ltg) has been determined to be equal to or larger than the threshold, the filter processing unit 22 outputs lost judgment information indicating that no straight line close to the predetermined coordinate position, that is, no straight line that can be regarded as the boundary line, could be detected.
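For illustration, the filter processing of steps ST11 to ST14 might be sketched as follows. This is a minimal sketch, not the implementation of the present technology; in particular, equation (3) is defined elsewhere in the document, so the weighted Euclidean distance with angle wrapping used below is an assumed stand-in, and all names are illustrative.

```python
import math

def rho_theta_filter(candidates, reference, dist_threshold,
                     w_rho=1.0, w_theta=1.0):
    """Time-series filter in rho-theta space (steps ST11-ST14).

    candidates: list of (rho, theta) coordinates of detected candidate lines.
    reference:  (rho, theta) of the predetermined coordinate position.
    Returns (selected_line, new_reference, lost), where lost is True when
    no candidate is close enough to the reference position.
    """
    def dist(a, b):
        # Assumed stand-in for equation (3): weighted distance in rho-theta
        # space, with the angular difference wrapped to [0, pi].
        d_theta = abs(a[1] - b[1])
        d_theta = min(d_theta, 2.0 * math.pi - d_theta)
        return math.hypot(w_rho * (a[0] - b[0]), w_theta * d_theta)

    if not candidates:
        return None, reference, True                    # ST14: lost

    # ST11: candidate with minimum distance to the reference position.
    best = min(candidates, key=lambda c: dist(c, reference))

    # ST12: compare the minimum distance with the threshold.
    if dist(best, reference) < dist_threshold:
        # ST13: adopt as the selected line and update the reference position.
        return best, best, False
    return None, reference, True                        # ST14: lost
```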
Returning to FIG. 8, when the process proceeds from step ST3 to step ST4, the image processing device performs extension processing. The extension unit 23 of the image processing device 20 extends the line segment indicating the boundary by detecting straight lines that follow the straight line selected by the filter processing. FIG. 10 is a flowchart showing the extension processing.
In step ST21, the extension unit performs viewpoint conversion. The extension unit 23 performs viewpoint conversion of the captured image so that the viewpoint direction is vertical with respect to the plane on which the straight line selected by the filter processing is located, and proceeds to step ST22.
In step ST22, the extension unit sets the boundary seed. Using the position of the straight line selected by the filter processing unit 22 as a reference, the extension unit 23 sets the search reference area so as to include the selected straight line. The extension unit 23 then takes the straight line within the search reference area as the boundary seed and proceeds to step ST23.
In step ST23, the extension unit sets the boundary search area. The extension unit 23 sets the boundary search area adjacent to the search reference area in the line segment direction of the boundary seed, using the position of the boundary seed as a reference. When a boundary extension line has been detected immediately before, the extension unit 23 sets the boundary search area adjacent to the boundary search area in which that boundary extension line was detected, using the position of the boundary extension line as a reference. Having set the boundary search area in this way, the extension unit 23 proceeds to step ST24.
In step ST24, the extension unit performs straight line detection in the boundary search area. The extension unit 23 detects straight lines included in the boundary search area set in step ST23 and proceeds to step ST25.
In step ST25, the extension unit determines whether a straight line with a close inclination has been detected. The extension unit 23 calculates the inclination of each straight line detected in step ST24 with respect to the boundary seed or the boundary extension line; if a straight line whose inclination is the smallest and smaller than the threshold has been detected in step ST24, the extension unit 23 takes that straight line as the boundary extension line and returns to step ST23. If no straight line whose inclination is smaller than the threshold can be detected, the extension unit 23 ends the extension processing.
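The extension loop of steps ST23 to ST25 could look roughly like the following sketch. The helpers next_search_area, hough_lines_in, connects_to, and inclination are hypothetical placeholders for processing the document describes only at the block-diagram level; the loop structure itself follows the flowchart.

```python
def extend_boundary(seed, bird_eye_image, angle_threshold, max_steps=None):
    """Grow the estimated boundary line from the boundary seed
    (steps ST23-ST25). Returns the list of segments forming it."""
    boundary = [seed]
    current = seed
    steps = 0
    while max_steps is None or steps < max_steps:
        # ST23: set a boundary search area adjacent to the previous area,
        # in the line segment direction of the current segment.
        area = next_search_area(current)                   # hypothetical helper
        # ST24: detect straight lines inside the search area (e.g. by Hough).
        lines = hough_lines_in(bird_eye_image, area)       # hypothetical helper
        # ST25: among lines connected to the current segment, keep the one
        # with the smallest inclination, provided it is below the threshold.
        connected = [l for l in lines if connects_to(l, current)]  # hypothetical
        if not connected:
            break
        best = min(connected, key=lambda l: abs(inclination(l, current)))
        if abs(inclination(best, current)) >= angle_threshold:
            break
        boundary.append(best)              # new boundary extension line LEa
        current = best
        steps += 1
    return boundary
```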
Returning to FIG. 8, when the process proceeds from step ST4 to step ST5, the image processing device performs boundary line information output processing. The image processing device 20 outputs estimated boundary line information indicating the estimated boundary line detected by the processing of steps ST3 and ST4 to an external device. When no straight line whose minimum distance is smaller than the threshold has been detected in the filter processing of step ST3, the image processing device 20 outputs the lost judgment information to the external device.
With the present technology as described above, the straight line indicating the boundary is selected by time-series filter processing in the ρ-θ space, so the boundary can be detected with higher accuracy than when detecting the straight line indicating the boundary on the xy plane.
Further, since a straight line is specified in the ρ-θ space, the filter characteristics of the filter processing unit, that is, the tracking characteristics of the selected straight line, can be given desired properties by, for example, adjusting the thresholds in the ρ-θ space. For example, the parameter ρ described above can specify the allowable range of positional deviation of the straight line, and the parameter θ can specify the allowable range of continuity (smoothness) of the straight line.
Furthermore, since the boundary seed is extended by setting boundary search areas, the processing is simpler than estimating the boundary line by function approximation. In addition, since only the straight lines within the boundary search area are treated as candidates and the boundary line is estimated based on the continuity of the straight lines, robustness against noise and the like can be improved. Moreover, since the boundary line is estimated based on the continuity of straight lines on the bird's-eye view, the method can cope with a boundary line that acquires curvature during right or left turns.
In the image processing device described above, the case where straight lines are detected using the Hough transform has been illustrated, but straight lines may be detected by other methods. For example, the straight line detection unit 21 may detect straight lines using RANSAC (Random Sample Consensus), the least squares method, or the like, and the filter processing unit 22 may perform the filter processing using the coordinate positions in the polar coordinate space indicating the detected straight lines.
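To make the last point concrete, a line fitted in the image plane (for example by RANSAC or least squares) can be converted to its (ρ, θ) polar representation so that the same ρ-θ filter applies. A minimal sketch of that conversion for a line through two points, under the usual Hough convention ρ = x·cosθ + y·sinθ:

```python
import math

def line_to_rho_theta(p0, p1):
    """Convert the line through p0=(x0, y0) and p1=(x1, y1) to (rho, theta):
    theta is the direction of the line's normal, rho the signed distance of
    the line from the origin along that normal."""
    (x0, y0), (x1, y1) = p0, p1
    dx, dy = x1 - x0, y1 - y0
    theta = math.atan2(dx, -dy)          # angle of the normal (-dy, dx)
    rho = x0 * math.cos(theta) + y0 * math.sin(theta)
    if rho < 0:                          # normalize so that rho >= 0
        rho, theta = -rho, theta + math.pi
    theta %= 2.0 * math.pi
    return rho, theta
```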
<3. Application example>
The technology according to the present disclosure can be applied to various products. For example, the technology according to the present disclosure may be realized as a device mounted on any type of moving body, such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, personal mobility, a carrier vehicle used in factories or the like, a robot, a construction machine, or an agricultural machine (tractor).
FIG. 11 is a block diagram showing a schematic configuration example of a vehicle control system 100, which is an example of a mobile body control system to which the technology according to the present disclosure can be applied.
In the following, when the vehicle provided with the vehicle control system 100 is to be distinguished from other vehicles, it is referred to as the own vehicle.
The vehicle control system 100 includes an input unit 101, a data acquisition unit 102, a communication unit 103, in-vehicle devices 104, an output control unit 105, an output unit 106, a drive system control unit 107, a drive system 108, a body system control unit 109, a body system 110, a storage unit 111, and an automatic driving control unit 112. The input unit 101, the data acquisition unit 102, the communication unit 103, the output control unit 105, the drive system control unit 107, the body system control unit 109, the storage unit 111, and the automatic driving control unit 112 are connected to one another via a communication network 121. The communication network 121 consists of, for example, an in-vehicle communication network or bus conforming to an arbitrary standard such as CAN (Controller Area Network), LIN (Local Interconnect Network), LAN (Local Area Network), or FlexRay (registered trademark). Each unit of the vehicle control system 100 may also be directly connected without going through the communication network 121.
In the following, when the units of the vehicle control system 100 communicate via the communication network 121, the mention of the communication network 121 is omitted. For example, when the input unit 101 and the automatic driving control unit 112 communicate via the communication network 121, it is simply stated that the input unit 101 and the automatic driving control unit 112 communicate.
The input unit 101 includes devices used by a passenger to input various data, instructions, and the like. For example, the input unit 101 includes operation devices such as a touch panel, buttons, a microphone, switches, and levers, as well as operation devices that allow input by methods other than manual operation, such as voice or gestures. Further, for example, the input unit 101 may be a remote control device using infrared rays or other radio waves, or an externally connected device such as a mobile device or wearable device compatible with the operation of the vehicle control system 100. The input unit 101 generates an input signal based on the data, instructions, and the like input by the passenger, and supplies it to each unit of the vehicle control system 100.
The data acquisition unit 102 includes various sensors and the like that acquire data used for the processing of the vehicle control system 100, and supplies the acquired data to each unit of the vehicle control system 100.
For example, the data acquisition unit 102 includes various sensors for detecting the state of the own vehicle and the like. Specifically, for example, the data acquisition unit 102 includes a gyro sensor, an acceleration sensor, an inertial measurement unit (IMU), and sensors for detecting the accelerator pedal operation amount, the brake pedal operation amount, the steering wheel steering angle, the engine speed, the motor speed, the rotation speed of the wheels, and the like.
Further, for example, the data acquisition unit 102 includes various sensors for detecting information outside the own vehicle. Specifically, for example, the data acquisition unit 102 includes imaging devices such as a ToF (Time Of Flight) camera, a stereo camera, a monocular camera, an infrared camera, and other cameras. In addition, for example, the data acquisition unit 102 includes an environment sensor for detecting weather, meteorological conditions, and the like, and an ambient information detection sensor for detecting objects around the own vehicle. The environment sensor consists of, for example, a raindrop sensor, a fog sensor, a sunshine sensor, a snow sensor, and the like. The ambient information detection sensor consists of, for example, an ultrasonic sensor, a radar, LiDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging), a sonar, and the like.
Further, for example, the data acquisition unit 102 includes various sensors for detecting the current position of the own vehicle. Specifically, for example, the data acquisition unit 102 includes a GNSS receiver that receives GNSS signals from GNSS (Global Navigation Satellite System) satellites, and the like.
Further, for example, the data acquisition unit 102 includes various sensors for detecting information inside the vehicle. Specifically, for example, the data acquisition unit 102 includes an imaging device that images the driver, a biometric sensor that detects biometric information of the driver, a microphone that collects sound in the vehicle interior, and the like. The biometric sensor is provided, for example, on a seat surface, the steering wheel, or the like, and detects biometric information of a passenger sitting in a seat or of the driver gripping the steering wheel.
The communication unit 103 communicates with the in-vehicle devices 104 as well as with various devices outside the vehicle, servers, base stations, and the like; it transmits data supplied from each unit of the vehicle control system 100 and supplies received data to each unit of the vehicle control system 100. The communication protocol supported by the communication unit 103 is not particularly limited, and the communication unit 103 can also support a plurality of types of communication protocols. For example, the communication unit 103 performs wireless communication with the in-vehicle devices 104 by wireless LAN, Bluetooth (registered trademark), NFC (Near Field Communication), WUSB (Wireless USB), or the like. Further, for example, the communication unit 103 performs wired communication with the in-vehicle devices 104 via a connection terminal (and, if necessary, a cable) not shown, by USB (Universal Serial Bus), HDMI (registered trademark) (High-Definition Multimedia Interface), MHL (Mobile High-definition Link), or the like.
Further, for example, the communication unit 103 communicates with devices (for example, application servers or control servers) existing on an external network (for example, the Internet, a cloud network, or an operator-specific network) via a base station or an access point. In addition, for example, the communication unit 103 uses P2P (Peer To Peer) technology to communicate with terminals existing near the own vehicle (for example, terminals of pedestrians or stores, or MTC (Machine Type Communication) terminals). Furthermore, for example, the communication unit 103 performs V2X communication such as vehicle-to-vehicle communication, vehicle-to-infrastructure communication, vehicle-to-home communication, and vehicle-to-pedestrian communication. Also, for example, the communication unit 103 includes a beacon receiving unit, receives radio waves or electromagnetic waves transmitted from wireless stations or the like installed on the road, and acquires information such as the current position, traffic congestion, traffic regulations, and required time.
The in-vehicle devices 104 include, for example, mobile devices or wearable devices possessed by passengers, information devices carried into or attached to the own vehicle, a navigation device that searches for a route to an arbitrary destination, and the like.
The output control unit 105 controls the output of various information to the passengers of the own vehicle or to the outside of the vehicle. For example, the output control unit 105 generates an output signal including at least one of visual information (for example, image data) and auditory information (for example, audio data) and supplies it to the output unit 106, thereby controlling the output of visual and auditory information from the output unit 106. Specifically, for example, the output control unit 105 synthesizes image data captured by different imaging devices of the data acquisition unit 102 to generate a bird's-eye image, a panoramic image, or the like, and supplies an output signal including the generated image to the output unit 106. Further, for example, the output control unit 105 generates audio data including a warning sound, a warning message, or the like for dangers such as collision, contact, or entry into a danger zone, and supplies an output signal including the generated audio data to the output unit 106.
The output unit 106 includes devices capable of outputting visual or auditory information to the passengers of the own vehicle or to the outside of the vehicle. For example, the output unit 106 includes a display device, an instrument panel, audio speakers, headphones, wearable devices such as a glasses-type display worn by a passenger, a projector, lamps, and the like. Besides a device having an ordinary display, the display device included in the output unit 106 may be a device that displays visual information within the driver's field of view, such as a head-up display, a transmissive display, or a device with an AR (Augmented Reality) display function.
The drive system control unit 107 controls the drive system 108 by generating various control signals and supplying them to the drive system 108. The drive system control unit 107 also supplies control signals to units other than the drive system 108 as necessary, for example to notify them of the control state of the drive system 108.
The drive system 108 includes various devices related to the drive system of the own vehicle. For example, the drive system 108 includes a driving force generation device for generating driving force, such as an internal combustion engine or a drive motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle, a braking device for generating braking force, an ABS (Antilock Brake System), ESC (Electronic Stability Control), an electric power steering device, and the like.
The body system control unit 109 controls the body system 110 by generating various control signals and supplying them to the body system 110. The body system control unit 109 also supplies control signals to units other than the body system 110 as necessary, for example to notify them of the control state of the body system 110.
The body system 110 includes various body-related devices mounted on the vehicle body. For example, the body system 110 includes a keyless entry system, a smart key system, power window devices, power seats, the steering wheel, an air conditioner, various lamps (for example, headlamps, back lamps, brake lamps, turn signals, and fog lamps), and the like.
The storage unit 111 includes, for example, magnetic storage devices such as a ROM (Read Only Memory), a RAM (Random Access Memory), and an HDD (Hard Disc Drive), semiconductor storage devices, optical storage devices, magneto-optical storage devices, and the like. The storage unit 111 stores various programs, data, and the like used by each unit of the vehicle control system 100. For example, the storage unit 111 stores map data such as a three-dimensional high-precision map such as a dynamic map, a global map that is less accurate than the high-precision map but covers a wide area, and a local map including information around the own vehicle.
The automatic driving control unit 112 performs control related to automatic driving, such as autonomous traveling or driving assistance. Specifically, for example, the automatic driving control unit 112 performs cooperative control intended to realize ADAS (Advanced Driver Assistance System) functions, including collision avoidance or impact mitigation of the own vehicle, following travel based on inter-vehicle distance, vehicle speed maintaining travel, collision warning of the own vehicle, and lane departure warning of the own vehicle. Further, for example, the automatic driving control unit 112 performs cooperative control intended for automatic driving in which the vehicle travels autonomously without depending on the driver's operation. The automatic driving control unit 112 includes a detection unit 131, a self-position estimation unit 132, a situation analysis unit 133, a planning unit 134, and an operation control unit 135.
The detection unit 131 detects various kinds of information necessary for controlling automatic driving. The detection unit 131 includes a vehicle exterior information detection unit 141, a vehicle interior information detection unit 142, and a vehicle state detection unit 143.
The vehicle exterior information detection unit 141 performs detection processing of information outside the own vehicle based on data or signals from each unit of the vehicle control system 100. For example, the vehicle exterior information detection unit 141 performs detection processing, recognition processing, and tracking processing of objects around the own vehicle, as well as detection processing of distances to objects. Objects to be detected include, for example, vehicles, people, obstacles, structures, roads, traffic lights, traffic signs, and road markings. Further, for example, the vehicle exterior information detection unit 141 performs detection processing of the environment around the own vehicle. The surrounding environment to be detected includes, for example, weather, temperature, humidity, brightness, and road surface conditions. The vehicle exterior information detection unit 141 supplies data indicating the results of the detection processing to the self-position estimation unit 132; the map analysis unit 151, the traffic rule recognition unit 152, and the situation recognition unit 153 of the situation analysis unit 133; the emergency avoidance unit 171 of the operation control unit 135; and the like.
The vehicle interior information detection unit 142 performs detection processing of information inside the vehicle based on data or signals from each unit of the vehicle control system 100. For example, the vehicle interior information detection unit 142 performs driver authentication and recognition processing, driver state detection processing, passenger detection processing, in-vehicle environment detection processing, and the like. The driver's state to be detected includes, for example, physical condition, wakefulness, concentration, fatigue, and gaze direction. The in-vehicle environment to be detected includes, for example, temperature, humidity, brightness, and odor. The vehicle interior information detection unit 142 supplies data indicating the results of the detection processing to the situation recognition unit 153 of the situation analysis unit 133, the emergency avoidance unit 171 of the operation control unit 135, and the like.
The vehicle state detection unit 143 performs detection processing of the state of the own vehicle based on data or signals from each unit of the vehicle control system 100. The state of the own vehicle to be detected includes, for example, speed, acceleration, steering angle, the presence and content of abnormalities, the state of driving operations, the position and inclination of power seats, the state of door locks, and the states of other in-vehicle devices. The vehicle state detection unit 143 supplies data indicating the results of the detection processing to the situation recognition unit 153 of the situation analysis unit 133, the emergency avoidance unit 171 of the operation control unit 135, and the like.
The self-position estimation unit 132 performs estimation processing of the position, attitude, and the like of the own vehicle based on data or signals from each unit of the vehicle control system 100, such as the vehicle exterior information detection unit 141 and the situation recognition unit 153 of the situation analysis unit 133. The self-position estimation unit 132 also generates, as necessary, a local map used for estimating the self-position (hereinafter referred to as the self-position estimation map). The self-position estimation map is, for example, a high-precision map using a technology such as SLAM (Simultaneous Localization and Mapping). The self-position estimation unit 132 supplies data indicating the results of the estimation processing to the map analysis unit 151, the traffic rule recognition unit 152, the situation recognition unit 153, and the like of the situation analysis unit 133. The self-position estimation unit 132 also stores the self-position estimation map in the storage unit 111.
The situation analysis unit 133 performs analysis processing of the situation of the own vehicle and its surroundings. The situation analysis unit 133 includes a map analysis unit 151, a traffic rule recognition unit 152, a situation recognition unit 153, and a situation prediction unit 154.
The map analysis unit 151 performs analysis processing of the various maps stored in the storage unit 111, using data or signals from each unit of the vehicle control system 100, such as the self-position estimation unit 132 and the vehicle exterior information detection unit 141, as necessary, and builds a map containing the information necessary for automatic driving processing. The map analysis unit 151 supplies the constructed map to the traffic rule recognition unit 152, the situation recognition unit 153, and the situation prediction unit 154, as well as to the route planning unit 161, the action planning unit 162, the operation planning unit 163, and the like of the planning unit 134.
The traffic rule recognition unit 152 performs recognition processing of the traffic rules around the own vehicle based on data or signals from each unit of the vehicle control system 100, such as the self-position estimation unit 132, the vehicle exterior information detection unit 141, and the map analysis unit 151. Through this recognition processing, for example, the positions and states of traffic signals around the own vehicle, the content of traffic regulations around the own vehicle, the lanes in which travel is possible, and the like are recognized. The traffic rule recognition unit 152 supplies data indicating the results of the recognition processing to the situation prediction unit 154 and the like.
The situation recognition unit 153 performs recognition processing of situations related to the own vehicle based on data or signals from each unit of the vehicle control system 100, such as the self-position estimation unit 132, the vehicle exterior information detection unit 141, the vehicle interior information detection unit 142, the vehicle state detection unit 143, and the map analysis unit 151. For example, the situation recognition unit 153 performs recognition processing of the situation of the own vehicle, the situation around the own vehicle, the situation of the driver of the own vehicle, and the like. The situation recognition unit 153 also generates, as necessary, a local map used for recognizing the situation around the own vehicle (hereinafter referred to as the situation recognition map). The situation recognition map is, for example, an occupancy grid map.
The situation of the own vehicle to be recognized includes, for example, the position, attitude, and movement (for example, speed, acceleration, and moving direction) of the own vehicle, as well as the presence and content of abnormalities. The situation around the own vehicle to be recognized includes, for example, the types and positions of surrounding stationary objects; the types, positions, and movements (for example, speed, acceleration, and moving direction) of surrounding moving objects; the configuration of the surrounding roads and the road surface conditions; and the surrounding weather, temperature, humidity, and brightness. The driver's state to be recognized includes, for example, physical condition, wakefulness, concentration, fatigue, gaze movement, and driving operations.
The situation recognition unit 153 supplies data indicating the results of the recognition processing (including the situation recognition map, if necessary) to the self-position estimation unit 132, the situation prediction unit 154, and the like. The situation recognition unit 153 also stores the situation recognition map in the storage unit 111.
The situation prediction unit 154 performs prediction processing of situations related to the own vehicle based on data or signals from each unit of the vehicle control system 100, such as the map analysis unit 151, the traffic rule recognition unit 152, and the situation recognition unit 153. For example, the situation prediction unit 154 performs prediction processing of the situation of the own vehicle, the situation around the own vehicle, the situation of the driver, and the like.
The situation of the own vehicle to be predicted includes, for example, the behavior of the own vehicle, the occurrence of abnormalities, and the travelable distance. The situation around the own vehicle to be predicted includes, for example, the behavior of moving objects around the own vehicle, changes in signal states, and changes in the environment such as the weather. The driver's situation to be predicted includes, for example, the driver's behavior and physical condition.
The situation prediction unit 154 supplies data indicating the results of the prediction processing, together with the data from the traffic rule recognition unit 152 and the situation recognition unit 153, to the route planning unit 161, the action planning unit 162, the operation planning unit 163, and the like of the planning unit 134.
The route planning unit 161 plans the route to the destination based on data or signals from each unit of the vehicle control system 100, such as the map analysis unit 151 and the situation prediction unit 154. For example, the route planning unit 161 sets the route from the current position to the designated destination based on the global map. Further, for example, the route planning unit 161 changes the route appropriately based on conditions such as traffic congestion, accidents, traffic regulations, and construction work, as well as the driver's physical condition. The route planning unit 161 supplies data indicating the planned route to the action planning unit 162 and the like.
The action planning unit 162 plans actions of the own vehicle for safely traveling the route planned by the route planning unit 161 within the planned time, based on data or signals from each unit of the vehicle control system 100, such as the map analysis unit 151 and the situation prediction unit 154. For example, the action planning unit 162 makes plans covering starting, stopping, traveling direction (for example, forward, backward, left turn, right turn, and direction change), traveling lane, traveling speed, overtaking, and the like. The action planning unit 162 supplies data indicating the planned actions of the own vehicle to the operation planning unit 163 and the like. The operation planning unit 163 plans operations of the own vehicle for realizing the actions planned by the action planning unit 162, based on data or signals from each unit of the vehicle control system 100, such as the map analysis unit 151 and the situation prediction unit 154. For example, the operation planning unit 163 makes plans covering acceleration, deceleration, traveling trajectory, and the like. The operation planning unit 163 supplies data indicating the planned operations of the own vehicle to the acceleration/deceleration control unit 172, the direction control unit 173, and the like of the operation control unit 135.
The operation control unit 135 controls the operation of the own vehicle. The operation control unit 135 includes an emergency avoidance unit 171, an acceleration/deceleration control unit 172, and a direction control unit 173.
The emergency avoidance unit 171 performs detection processing of emergencies such as collision, contact, entry into a danger zone, driver abnormality, and vehicle abnormality, based on the detection results of the vehicle exterior information detection unit 141, the vehicle interior information detection unit 142, and the vehicle state detection unit 143. When the occurrence of an emergency is detected, the emergency avoidance unit 171 plans an operation of the own vehicle for avoiding the emergency, such as a sudden stop or a sharp turn. The emergency avoidance unit 171 supplies data indicating the planned operation of the own vehicle to the acceleration/deceleration control unit 172, the direction control unit 173, and the like.
The acceleration/deceleration control unit 172 performs acceleration/deceleration control for realizing the operation of the own vehicle planned by the operation planning unit 163 or the emergency avoidance unit 171. For example, the acceleration/deceleration control unit 172 calculates a control target value for the driving force generation device or the braking device for realizing the planned acceleration, deceleration, or sudden stop, and supplies a control command indicating the calculated control target value to the drive system control unit 107.
The direction control unit 173 performs direction control for realizing the operation of the own vehicle planned by the operation planning unit 163 or the emergency avoidance unit 171. For example, the direction control unit 173 calculates a control target value for the steering mechanism for realizing the traveling trajectory or sharp turn planned by the operation planning unit 163 or the emergency avoidance unit 171, and supplies a control command indicating the calculated control target value to the drive system control unit 107.
In the vehicle control system 100 configured in this way, the image processing device 20 of the present technology is provided, for example, in the vehicle exterior information detection unit 141 and, using a captured image of, for example, the area ahead of the vehicle acquired by the data acquisition unit 102, detects the boundary line between the traveling lane and other areas, for example the boundary with the road shoulder or the white line portion indicating the division from another traveling lane. The vehicle exterior information detection unit 141 outputs estimated boundary line information indicating the detected estimated boundary line to the situation recognition unit 153 and the emergency avoidance unit 171.
In this case, the initial value used by the filter processing unit 22 is not limited to a preset reference position; the filter processing may be started by setting, as the reference position, the coordinate position indicating a straight line selected from the candidate straight lines based on a predetermined rule. For example, the filter processing may be started by setting, as the reference position, the coordinate position indicating the straight line closest to the side of the vehicle among the straight lines detected by the straight line detection unit at a predetermined timing (for example, the timing when traveling starts or when the traveling position reaches a predetermined position).
The situation recognition unit 153 performs recognition processing of the situation related to the own vehicle based on the estimated boundary line information. For example, the situation recognition unit 153 determines the relative positional relationship between the own vehicle and the white lines or road shoulder, and supplies the determination result to the direction control unit 173 of the operation control unit 135, so that direction control is performed to keep the own vehicle from deviating from the traveling lane.
The emergency avoidance unit 171 also performs detection processing of emergencies, such as departure from the traveling lane, based on the estimated boundary line information. When the occurrence of an emergency is detected, the emergency avoidance unit 171 plans an operation of the own vehicle for avoiding the emergency, such as a sudden stop or a sharp turn. The emergency avoidance unit 171 supplies data indicating the planned operation of the own vehicle to the direction control unit 173 and the like, and performs emergency avoidance control.
Applying the present technology to the vehicle control system 100 in this way allows boundaries to be detected with high accuracy, enabling safer travel. The filter processing unit 22 of the image processing device 20 may also change the filter characteristics according to the moving state or moving environment of the vehicle. Specifically, the filter characteristics of the filter processing unit 22 are changed according to the route planned by the route planning unit 161. For example, on expressways, white lines and the like are better maintained than on ordinary roads and their continuity is ensured; moreover, during high-speed travel it is desirable to detect boundaries such as road shoulders and white lines accurately. The filter processing unit 22 therefore makes the allowable ranges of the parameters ρ and θ stricter when traveling on an expressway or at high speed than when traveling on ordinary roads, so that boundaries are detected correctly. On urban roads, by contrast, there are many places without white lines, and the lines are often discontinuous; moreover, during low-speed travel, driving appropriate to the situation is easy even when boundaries such as road shoulders and white lines are detected with low accuracy. The filter processing unit 22 may therefore make the allowable ranges of the parameters ρ and θ looser when traveling on urban roads or at low speed than when traveling on an expressway or the like, so that boundaries are easier to detect. The allowable ranges of the parameters ρ and θ may also be set according to the motion model of the own vehicle. For example, the positions of the road shoulder or white lines, which move according to the movement of the own vehicle, may be estimated, and the allowable ranges of the parameters ρ and θ may be set with the estimated positions as references. Setting the allowable ranges in this way makes it possible to generate estimated boundary line information that takes the movement of the own vehicle into account.
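One way such environment-dependent filter characteristics might be expressed is sketched below. The driving context categories and the numeric tolerances are illustrative assumptions, not values given in this document; the point is only that the ρ and θ tolerances tighten on expressways or at high speed and loosen on urban roads or at low speed.

```python
from dataclasses import dataclass

@dataclass
class FilterTolerance:
    rho_tol: float    # allowable positional deviation of the line
    theta_tol: float  # allowable change in line direction (radians)

def tolerance_for(context: str, speed_kmh: float) -> FilterTolerance:
    """Pick rho/theta tolerances from the driving context.

    All thresholds below are illustrative placeholders: stricter on
    expressways / at high speed, looser on urban roads / at low speed.
    """
    if context == "expressway" or speed_kmh > 80.0:
        return FilterTolerance(rho_tol=5.0, theta_tol=0.02)
    if context == "urban" or speed_kmh < 30.0:
        return FilterTolerance(rho_tol=20.0, theta_tol=0.10)
    return FilterTolerance(rho_tol=10.0, theta_tol=0.05)
```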
Further, if driving control is switched based on the lost determination information output from the image processing device, driving assistance, automatic driving, and the like can be performed using information from other sensors even when the boundary is not detected. It is also possible to notify the driver of the lost determination information to call attention, or to instruct a switch from automatic driving to manual driving.
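A minimal sketch of acting on the lost determination information might look as follows; the callback interface is hypothetical and is introduced only to illustrate the control flow described above.

```python
def handle_boundary_result(selected_line, use_fallback_sensors, notify_driver):
    """Switch the driving control when the filter processing unit could
    not select a line corresponding to the reference (lost determination)."""
    if selected_line is None:
        # Boundary lost: continue assistance using other sensors
        # (e.g. radar or LiDAR) and warn the driver, optionally
        # requesting a switch from automatic to manual driving.
        use_fallback_sensors()
        notify_driver("Lane boundary lost: please take over manual driving")
    else:
        # Boundary available: normal direction control based on the
        # estimated boundary line information continues.
        pass
```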
When the present technology is applied to the vehicle control system 100, if the extension unit 23 sets the length of the search reference area and the boundary search area (the length in the line segment direction of the boundary seed) to several tens of centimeters to about 1 meter (preferably about 80 centimeters) and sets the width (the length in the direction orthogonal to the line segment direction of the boundary seed) to several tens of centimeters (preferably about 50 centimeters), the boundary extension line indicating the road shoulder or the white line can be detected with high accuracy.
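To make these physical dimensions concrete, the following sketch converts them into pixel sizes for a projected road image; the pixel-per-meter calibration is an assumed value, not one from the disclosure.

```python
def search_region_px(length_m=0.8, width_m=0.5, px_per_meter=40.0):
    """Convert the physical search-region dimensions suggested above
    (about 80 cm along the boundary seed, about 50 cm across it) into
    pixel sizes; px_per_meter is an assumed calibration of the
    bird's-eye-view projection."""
    return round(length_m * px_per_meter), round(width_m * px_per_meter)

# e.g. at 40 px/m the search reference area spans 32 x 20 pixels
length_px, width_px = search_region_px()
```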
The series of processes described in the specification can be executed by hardware, by software, or by a composite configuration of both. When the processing is executed by software, a program recording the processing sequence is installed in a memory in a computer incorporated in dedicated hardware and executed. Alternatively, the program can be installed and executed in a general-purpose computer capable of executing various processes.
For example, the program can be recorded in advance on a hard disk, an SSD (Solid State Drive), or a ROM (Read Only Memory) as a recording medium. Alternatively, the program can be stored (recorded) temporarily or permanently on a removable recording medium such as a flexible disk, a CD-ROM (Compact Disc Read Only Memory), an MO (Magneto Optical) disc, a DVD (Digital Versatile Disc), a BD (Blu-Ray Disc (registered trademark)), a magnetic disk, or a semiconductor memory card. Such a removable recording medium can be provided as so-called package software.
In addition to being installed in a computer from a removable recording medium, the program may be transferred from a download site to the computer wirelessly or by wire via a network such as a LAN (Local Area Network) or the Internet. The computer can receive the program transferred in this way and install it on a recording medium such as a built-in hard disk.
Note that the effects described in this specification are merely examples and are not limiting, and there may be additional effects not described. Further, the present technology should not be construed as being limited to the above-described embodiments. The embodiments of this technology disclose the present technology in the form of exemplification, and it is obvious that those skilled in the art can modify or substitute the embodiments without departing from the gist of the present technology. That is, the claims should be taken into consideration in order to judge the gist of the present technology.
Further, the image processing device of the present technology can also have the following configurations.
(1) An image processing apparatus including a filter processing unit that selects, from straight lines detected from a captured image, a straight line corresponding to a straight line indicated by a predetermined coordinate position, based on the distance between a coordinate position in a polar coordinate space indicating a detected straight line and the predetermined coordinate position.
(2) The image processing apparatus according to (1), in which the predetermined coordinate position is a preset coordinate position or a coordinate position indicating a straight line detected and selected by the filter processing unit from a captured image generated at a time different from the captured image.
(3) The image processing apparatus according to (2), in which the imaging device that acquires the captured image is provided on a moving body, and the predetermined coordinate position is a coordinate position indicating the straight line closest to the moving body.
(4) The image processing apparatus according to any one of (1) to (3), in which the filter processing unit sets an allowable range for the predetermined coordinate position and uses a coordinate position within the allowable range indicating a straight line detected from the captured image.
(5) The image processing apparatus according to (4), in which the imaging device that acquires the captured image is provided on a moving body, and the filter processing unit sets the allowable range according to the moving state or the moving environment of the moving body.
(6) The image processing apparatus according to any one of (1) to (5), in which, when a straight line corresponding to the straight line indicated by the predetermined coordinate position cannot be selected from the detected straight lines, the filter processing unit outputs lost determination information indicating that the corresponding straight line cannot be selected.
(7) The image processing apparatus according to any one of (1) to (6), further including a straight line detection unit that detects a coordinate position in the polar coordinate space indicating a straight line from the captured image.
(8) The image processing apparatus according to (7), in which the straight line detection unit detects the coordinate position indicating a straight line from an image region near the imaging position in the captured image.
(9) The image processing apparatus according to (7) or (8), in which the straight line detection unit performs a Hough transform using the captured image and detects a coordinate position indicating a straight line in the ρ-θ space that is the polar coordinate space.
(10) The image processing apparatus according to any one of (1) to (9), further including an extension unit that sets a search reference area of a predetermined size including the straight line selected by the filter processing unit, sets a boundary search area in the line segment direction of a boundary seed that is the selected straight line in the search reference area, and detects, as a boundary extension line, a straight line whose inclination with respect to the boundary seed is smaller than a threshold value and that is connected to the boundary seed, from the boundary search area.
(11) The image processing apparatus according to (10), in which, when the extension unit detects the boundary extension line from the boundary search area, the extension unit newly sets an adjacent boundary search area in the line segment direction of the detected boundary extension line and detects, from the new boundary search area, a straight line whose inclination with respect to the detected boundary extension line is smaller than a threshold value and that is connected to the boundary extension line, as a boundary extension line.
(12) The image processing apparatus according to (10) or (11), in which the extension unit sets a curvature range of an estimated boundary line constituted by the boundary seed and the boundary extension line, according to the area size of the boundary search area and the threshold value.
10 ... Imaging device
20 ... Image processing device
21 ... Straight line detection unit
22 ... Filter processing unit
23 ... Extension unit

Claims (14)

1. An image processing apparatus including a filter processing unit that selects, from straight lines detected from a captured image, a straight line corresponding to a straight line indicated by a predetermined coordinate position, based on the distance between a coordinate position in a polar coordinate space indicating a detected straight line and the predetermined coordinate position.
2. The image processing apparatus according to claim 1, in which the predetermined coordinate position is a preset coordinate position or a coordinate position indicating a straight line detected and selected by the filter processing unit from a captured image generated at a time different from the captured image.
3. The image processing apparatus according to claim 2, in which the imaging device that acquires the captured image is provided on a moving body, and the predetermined coordinate position is a coordinate position indicating the straight line closest to the moving body.
4. The image processing apparatus according to claim 1, in which the filter processing unit sets an allowable range for the predetermined coordinate position and uses a coordinate position within the allowable range indicating a straight line detected from the captured image.
5. The image processing apparatus according to claim 4, in which the imaging device that acquires the captured image is provided on a moving body, and the filter processing unit sets the allowable range according to the moving state or the moving environment of the moving body.
6. The image processing apparatus according to claim 1, in which, when a straight line corresponding to the straight line indicated by the predetermined coordinate position cannot be selected from the detected straight lines, the filter processing unit outputs lost determination information indicating that the corresponding straight line cannot be selected.
7. The image processing apparatus according to claim 1, further including a straight line detection unit that detects a coordinate position in the polar coordinate space indicating a straight line from the captured image.
8. The image processing apparatus according to claim 7, in which the straight line detection unit detects the coordinate position indicating a straight line from an image region near the imaging position in the captured image.
9. The image processing apparatus according to claim 7, in which the straight line detection unit performs a Hough transform using the captured image and detects a coordinate position indicating a straight line in the ρ-θ space that is the polar coordinate space.
10. The image processing apparatus according to claim 1, further including an extension unit that sets a search reference area of a predetermined size including the straight line selected by the filter processing unit, sets a boundary search area in the line segment direction of a boundary seed that is the selected straight line in the search reference area, and extracts, as a boundary extension line, a straight line whose inclination with respect to the boundary seed is smaller than a threshold value and that is connected to the boundary seed, from the boundary search area.
11. The image processing apparatus according to claim 10, in which, in response to the boundary extension line being extracted from the boundary search area, the extension unit newly sets an adjacent boundary search area in the line segment direction of the extracted boundary extension line and extracts, from the new boundary search area, a straight line whose inclination with respect to the extracted boundary extension line is smaller than a threshold value and that is connected to the boundary extension line, as a boundary extension line.
12. The image processing apparatus according to claim 10, in which the extension unit sets a curvature range of an estimated boundary line constituted by the boundary seed and the boundary extension line, according to the area size of the boundary search area and the threshold value.
13. An image processing method including selecting, by a filter processing unit, from straight lines detected from a captured image, a straight line corresponding to a straight line indicated by a predetermined coordinate position, based on the distance between a coordinate position in a polar coordinate space indicating a detected straight line and the predetermined coordinate position.
14. A program that causes a computer to execute image processing of a captured image, the program causing the computer to execute a procedure of selecting, from straight lines detected from the captured image, a straight line corresponding to a straight line indicated by a predetermined coordinate position, based on the distance between a coordinate position in a polar coordinate space indicating a detected straight line and the predetermined coordinate position.
PCT/JP2019/036083 2018-10-31 2019-09-13 Image processing apparatus, image processing method and program WO2020090250A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018204987 2018-10-31
JP2018-204987 2018-10-31

Publications (1)

Publication Number Publication Date
WO2020090250A1 true WO2020090250A1 (en) 2020-05-07

Family

ID=70463033

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/036083 WO2020090250A1 (en) 2018-10-31 2019-09-13 Image processing apparatus, image processing method and program

Country Status (1)

Country Link
WO (1) WO2020090250A1 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004164479A (en) * 2002-11-15 2004-06-10 Nippon Hoso Kyokai <Nhk> Device, method, and program for generating image deformation information
JP2007323381A (en) * 2006-06-01 2007-12-13 Fuji Xerox Co Ltd Image inclination detector, image processor, image inclination detection program and image processing program
JP2014067406A (en) * 2012-09-24 2014-04-17 Ricoh Co Ltd Method and apparatus for detecting continuous road partition
JP2015125695A (en) * 2013-12-27 2015-07-06 株式会社メガチップス Traffic lane identification device and traffic lane identification method

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112560610A (en) * 2020-12-03 2021-03-26 西南交通大学 Video monitoring object analysis method, device, equipment and readable storage medium
CN112560610B (en) * 2020-12-03 2021-09-28 西南交通大学 Video monitoring object analysis method, device, equipment and readable storage medium

Similar Documents

Publication Publication Date Title
JP7136106B2 (en) VEHICLE DRIVING CONTROL DEVICE, VEHICLE DRIVING CONTROL METHOD, AND PROGRAM
US11531354B2 (en) Image processing apparatus and image processing method
US11468574B2 (en) Image processing apparatus and image processing method
US20200241549A1 (en) Information processing apparatus, moving apparatus, and method, and program
US20220169245A1 (en) Information processing apparatus, information processing method, computer program, and mobile body device
US11501461B2 (en) Controller, control method, and program
WO2019181284A1 (en) Information processing device, movement device, method, and program
US11014494B2 (en) Information processing apparatus, information processing method, and mobile body
US11200795B2 (en) Information processing apparatus, information processing method, moving object, and vehicle
WO2020116195A1 (en) Information processing device, information processing method, program, mobile body control device, and mobile body
US11377101B2 (en) Information processing apparatus, information processing method, and vehicle
US20240054793A1 (en) Information processing device, information processing method, and program
US20210300401A1 (en) Information processing device, moving body, information processing method, and program
US11615628B2 (en) Information processing apparatus, information processing method, and mobile object
WO2021241189A1 (en) Information processing device, information processing method, and program
US20220253065A1 (en) Information processing apparatus, information processing method, and information processing program
WO2019073795A1 (en) Information processing device, own-position estimating method, program, and mobile body
JP7192771B2 (en) Information processing device, information processing method, program, and vehicle
US20220277556A1 (en) Information processing device, information processing method, and program
WO2020090250A1 (en) Image processing apparatus, image processing method and program
JPWO2020036043A1 (en) Information processing equipment, information processing methods and programs
JP2020101960A (en) Information processing apparatus, information processing method, and program
WO2020158489A1 (en) Visible light communication device, visible light communication method, and visible light communication program
JP2019100942A (en) Mobile object, positioning system, positioning program and positioning method
WO2020036044A1 (en) Image processing device, image processing method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 19880616; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 19880616; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: JP)