WO2016027408A1 - Image processing apparatus, and failure diagnosis method for image processing apparatus - Google Patents


Info

Publication number
WO2016027408A1
WO2016027408A1 (PCT/JP2015/003554)
Authority
WO
WIPO (PCT)
Prior art keywords
image
correction unit
camera
corrected
image processing
Prior art date
Application number
PCT/JP2015/003554
Other languages
French (fr)
Japanese (ja)
Inventor
後藤 宏明
丙辰 王
Original Assignee
株式会社デンソー
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社デンソー
Publication of WO2016027408A1 publication Critical patent/WO2016027408A1/en

Links

Images

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00 - General purpose image data processing
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 - Geometric image transformation in the plane of the image
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00 - Traffic control systems for road vehicles
    • G08G1/097 - Supervising of traffic control systems, e.g. by giving an alarm if two crossing streets have green light simultaneously
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00 - Traffic control systems for road vehicles
    • G08G1/16 - Anti-collision systems
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 - Television systems
    • H04N7/18 - Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • the present disclosure relates to an image processing device and a failure diagnosis method for the image processing device.
  • in abnormality diagnosis based on the luminance distribution as described above, it is difficult to distinguish whether an abnormality of the camera image is caused by an external environmental factor (for example, adhesion of foreign matter to the camera) or by a malfunction of the device (for example, a malfunction of hardware or software).
  • An object of the present disclosure is to provide a technique capable of diagnosing that the image processing apparatus is out of order.
  • the image processing apparatus includes an image correction unit and a failure diagnosis unit.
  • the image correction unit corrects distortion of at least one camera image captured by the camera, and outputs a corrected image that is an image in which the distortion is corrected and has at least one vanishing point.
  • the failure diagnosis unit determines whether or not the image correction unit has failed based on whether or not the vanishing point in the corrected image output from the image correction unit is abnormal.
  • when the image correction unit has failed and distortion correction cannot be performed correctly, the vanishing point cannot be correctly obtained from the corrected image produced by such an image correction unit; this fact is used to determine that the image correction unit is out of order.
  • a failure diagnosis method for an image processing apparatus targets an image processing apparatus having an image correction unit that corrects distortion of a camera image captured by a camera and outputs a corrected image, that is, an image in which the distortion is corrected and which has at least one vanishing point, and determines a failure of the image correction unit.
  • the failure diagnosis method includes a procedure for determining whether or not the image correction unit has failed based on whether or not the vanishing point in the corrected image output from the image correction unit is abnormal.
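The diagnosis rule summarized above can be sketched in a few lines. This is an illustrative reconstruction, not the patent's implementation: the expected vanishing-point position (image center in normalized coordinates), the tolerance, and the persistence count are all assumed values.

```python
def vanishing_point_abnormal(vp, expected=(0.5, 0.5), tol=0.1):
    """Return True if the estimated vanishing point `vp` (normalized image
    coordinates) deviates from the expected position by more than `tol`,
    or if no vanishing point could be estimated at all (vp is None)."""
    if vp is None:
        return True
    dx = vp[0] - expected[0]
    dy = vp[1] - expected[1]
    return (dx * dx + dy * dy) ** 0.5 > tol

def image_corrector_failed(vp_history, min_abnormal=3):
    """Declare a failure only when the abnormality persists over several
    consecutive corrected images, mirroring the idea in the description that
    transient abnormalities (e.g. road bumps) should not trigger a verdict."""
    recent = vp_history[-min_abnormal:]
    return len(recent) == min_abnormal and all(
        vanishing_point_abnormal(vp) for vp in recent)
```

A single missing vanishing point is thus tolerated; only a sustained abnormality counts as a failure, matching the "continues for a certain time or more" criterion of S60.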
  • FIG. 1 is a block diagram showing the configuration of an in-vehicle image processing system.
  • FIG. 2 is a flowchart of the real-time failure detection process.
  • FIG. 3 is a flowchart of the first image abnormality determination process.
  • FIG. 4 is a flowchart of the second image abnormality determination process.
  • FIG. 5 is a flowchart of the start-up failure detection process.
  • FIG. 6 is an explanatory diagram showing an example in which the corrected image is divided into a plurality of segments.
  • the in-vehicle image processing system described below is configured mainly around an electronic control unit 1 (Electronic Control Unit; hereinafter abbreviated as ECU 1) that functions as the above-described image processing device.
  • in addition to the ECU 1, the system includes a camera 3, an external factor determination unit 5, a display 7, a diagnosis processor 9, a vehicle speed sensor 11, a steering angle sensor 13, and an attitude sensor 15.
  • a vehicle equipped with an in-vehicle image processing system is also referred to as a host vehicle.
  • the camera 3 is mounted on the vehicle and is configured to be able to photograph the front, rear, or side of the vehicle.
  • the camera 3 includes a wide angle lens. Therefore, each image photographed by the camera 3 (hereinafter also referred to as a camera image) has a distortion that causes the degree of reduction of the subject to increase in the peripheral portion compared to the central portion of the photographing range.
  • One camera image can be provided as a still image.
  • a plurality of camera images can be provided as moving images.
  • a still image corresponds to one frame.
  • a moving image corresponds to a plurality of frames provided at a predetermined frame rate.
  • the external factor determination device 5 is configured by hardware and software that can input and output images and information other than images from the camera 3.
  • the external factor determination unit 5 is configured to be able to determine whether or not the camera 3 is functioning normally as hardware; when the camera 3 is functioning normally, the unit outputs the camera image input from the camera 3 to the ECU 1.
  • the display 7 is configured to be able to display various types of information to a vehicle driver or the like, for example, using a liquid crystal display device or the like.
  • the diagnosis processor 9, the vehicle speed sensor 11, the rudder angle sensor 13, and the attitude sensor 15 are configured to be able to communicate with various devices including the ECU 1 via a CAN (Controller Area Network) 17.
  • the diagnosis processor 9 executes a process for storing such information as diagnosis information. Further, the diagnosis processor 9 performs processing for alerting the driver of the vehicle and the like, such as turning on a warning lamp as necessary.
  • the vehicle speed sensor 11 detects the speed of the vehicle and transmits the detected speed to the ECU 1 via the CAN 17.
  • the steering angle sensor 13 detects the steering angle of the steering wheel, and transmits the detected steering angle to the ECU 1 via the CAN 17.
  • the attitude sensor 15 includes a gyro, a yaw rate sensor, a gradient sensor, and the like.
  • the attitude sensor 15 detects vehicle attitude information (for example, the rotation angle (yaw) about an axis perpendicular to the traveling surface of the vehicle, the rotation angle (roll) about an axis along the traveling direction of the vehicle, and the rotation angle (pitch) about an axis along the vehicle width direction), and transmits the detected attitude information to the ECU 1 via the CAN 17.
  • the ECU 1 includes an image correction unit 21, a recognition unit 23, a failure diagnosis unit 25, and the like. Portions 21, 23, 25 are also referred to as devices 21, 23, 25 or modules 21, 23, 25, respectively.
  • a camera image input from the camera 3 to the ECU 1 via the external factor determination unit 5 is subjected to distortion correction processing in the image correction unit 21.
  • an image subjected to the distortion correction processing in the image correction unit 21 is also referred to as a corrected image.
  • the corrected image output from the image correction unit 21 is input to each of the recognition unit 23 and the failure diagnosis unit 25.
  • the ECU 1 can be configured as a microcomputer including a memory such as a ROM and a RAM, a CPU (Central Processing Unit), and an input/output interface.
  • the memory stores programs for the various processes executed by components such as the image correction unit 21, the recognition unit 23, and the failure diagnosis unit 25.
  • in the ECU 1, part or all of each component may be configured as hardware without using a program.
  • the memory is also referred to as a storage device, or a non-transitory tangible storage medium.
  • the recognition unit 23 executes processing for recognizing various targets based on the corrected image input from the image correction unit 21. For example, based on a corrected image of the area ahead of the vehicle, it executes white line recognition processing for recognizing lane markings on a road and sign recognition processing for recognizing signs installed along the road. Alternatively, based on a corrected image of the area behind the vehicle, it executes rear recognition processing for recognizing targets behind the vehicle, and based on a corrected image of the area beside the vehicle, it executes side recognition processing for recognizing targets to the side of the vehicle. If these processes predict, for example, a lane departure or contact with an obstacle, warning information to that effect is output to the display 7 to alert the driver of the vehicle.
  • the failure diagnosis unit 25 executes a failure diagnosis process for the image correction unit 21 based on the corrected image input from the image correction unit 21. Details of this failure diagnosis processing will be described later.
  • when the external factor determination unit 5 determines that the camera 3 is not functioning normally, information to that effect is transmitted from the external factor determination unit 5 to the failure diagnosis unit 25.
  • when any failure is detected in at least one of the information transmitted from the external factor determination unit 5 and the result of the failure diagnosis process performed by the failure diagnosis unit 25, information to that effect is transmitted to the diagnosis processor 9 via the CAN 17.
  • the failure diagnosis process described below can include a real-time failure detection process shown in FIGS. 2 to 4 and a startup failure detection process shown in FIG.
  • the real-time failure detection process is a process for detecting a failure of the image correction unit 21 in real time while the vehicle is traveling.
  • the start-up failure detection process is a process for detecting a failure in the image correction unit 21 at the time of vehicle start-up or factory shipment.
  • the real-time failure detection process is a process that starts with the start of the vehicle and is repeatedly executed thereafter until the vehicle stops.
  • a still image, or a moving image consisting of a plurality of camera images, provided by the camera 3 is input (S10).
  • in S10, a moving image may be input at a predetermined frame rate, or still images may be input regularly or irregularly, according to the failure determination method described later.
  • in the following, the description continues on the assumption that a moving image is input at a predetermined frame rate, and, where necessary, the individual images constituting the moving image are also used as still images.
  • the external factor determination unit 5 executes external factor determination (S20).
  • in S20, abnormal states of the camera 3 itself are determined. For example, when the camera 3 is not functioning normally, such as due to a power supply short circuit or a signal abnormality, the external factor determination unit 5 determines that such an abnormality exists. This makes it possible to distinguish whether an abnormality has occurred in the image correction unit 21 or in the camera 3 at the stage preceding the image correction unit 21.
  • the image correction unit 21 performs image correction (S30).
  • the image correction unit 21 corrects distortion caused by the above-described wide-angle lens.
  • the image corrected by the image correction unit 21 is an image having at least one vanishing point.
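As a rough illustration of the kind of correction the image correction unit 21 performs on the wide-angle camera image, the single-coefficient division model below maps a distorted normalized coordinate back toward its undistorted position. The model choice and the coefficient are assumptions for illustration; the patent does not specify the lens model, and a production ECU would use calibrated lens parameters.

```python
def undistort_point(xd, yd, k=-0.2):
    """Map a distorted normalized coordinate (xd, yd) to its undistorted
    position with the division model x = xd / (1 + k * r^2).  A negative k
    models barrel distortion: points near the periphery (large r) are pushed
    outward, compensating the shrinkage of the subject at the image edge."""
    r2 = xd * xd + yd * yd
    s = 1.0 + k * r2
    return (xd / s, yd / s)
```

The center of the image (r = 0) is unchanged, while a point at r = 1 is expanded by a factor 1/(1 - 0.2) = 1.25 under the assumed coefficient.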
  • a pixel group estimated to be an image of a moving photographic subject is extracted from the corrected image (hereinafter, this pixel group is also referred to as a moving element).
  • the extracted edge is a pixel group extending linearly toward the vanishing point in the corrected image (hereinafter, this pixel group is also referred to as a linear element).
  • the failure diagnosis unit 25 executes an image abnormality determination process based on the corrected image input from the image correction unit 21 (S40).
  • the failure diagnosis unit 25 extracts the moving elements and the linear elements as described above, and determines whether or not the vanishing point is correctly obtained.
  • the first image abnormality determination process illustrated in FIG. 3 is executed for the moving element as described above.
  • a feature point estimated as an image of a moving object is calculated by comparing frames before and after among a plurality of images constituting a moving image (S110).
  • the number of feature points calculated in S110 may be any number commensurate with the computing performance of the ECU 1, but several or more points are preferable in order to improve reliability.
  • the moving direction from the previous frame is calculated for each feature point calculated in S110 (S120), and the convergence point of the movement vectors of the feature points is calculated as the vanishing point (S130).
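Steps S120 to S130 amount to finding the point where the motion vectors of the tracked feature points converge. A minimal least-squares version is sketched below; this is an illustrative reconstruction, not the claimed implementation, and it treats each motion vector as a line whose perpendicular distance to the convergence point is minimized.

```python
def vanishing_point_from_motion(tracks):
    """Estimate the convergence point of feature-point motion vectors.
    `tracks` is a list of ((x_prev, y_prev), (x_cur, y_cur)) pairs; each
    defines a line through the current position along the motion direction.
    Solves the 2x2 normal equations for the point minimizing the sum of
    squared perpendicular distances to all motion lines."""
    a11 = a12 = a22 = b1 = b2 = 0.0
    for (x0, y0), (x1, y1) in tracks:
        dx, dy = x1 - x0, y1 - y0
        n = (dx * dx + dy * dy) ** 0.5
        if n == 0.0:
            continue  # a stationary feature point carries no direction
        nx, ny = -dy / n, dx / n          # unit normal to the motion line
        c = nx * x1 + ny * y1             # line equation: nx*x + ny*y = c
        a11 += nx * nx; a12 += nx * ny; a22 += ny * ny
        b1 += nx * c; b2 += ny * c
    det = a11 * a22 - a12 * a12
    if abs(det) < 1e-12:
        return None  # lines (near-)parallel: no unique convergence point
    return ((a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det)
```

Returning `None` when the system is degenerate corresponds to the case where no vanishing point can be obtained, which the failure diagnosis unit treats as an abnormality candidate.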
  • the direction in which the vanishing point is estimated to exist is estimated for each moving element by the above-described method, and it is determined whether or not the estimated vanishing point directions converge to a predetermined position. In this determination, in consideration of noise and errors during image processing, it may be determined that the image correction unit 21 is functioning normally if the vanishing point converges within a predetermined range.
  • the entire corrected image may be divided into a plurality of segments (nine in FIG. 6), and any convergence point that deviates from the segment at the center of the corrected image (the effective segment shown in FIG. 6) may be excluded from use. For example, when the vehicle is traveling on a curved road, the convergence point may not converge near the center of the corrected image.
  • a moving element that is expected not to converge at the center of the corrected image may be known in advance to appear in a particular area of the corrected image (for example, a segment other than the effective segment shown in FIG. 6); such moving elements may be excluded from the determination targets.
  • alternatively, the system may be configured such that the vanishing point calculation is temporarily suspended in such cases.
  • based on the above, an image abnormality of the corrected image output by the image correction unit 21 is diagnosed (S140), and the abnormality determination result is notified to the diagnosis processor 9 (S150).
  • the second image abnormality determination process illustrated in FIG. 4 can be executed for the linear element as described above.
  • image processing is performed on each of a plurality of images constituting a moving image to extract edges (linear elements) (S210).
  • next, edges that can be used for vanishing point calculation are selected (S220).
  • in S220, an edge is selected as usable for vanishing point calculation if it has a length sufficient to specify its extending direction and if its position and orientation are substantially within a predetermined allowable range in relation to the vanishing point.
  • the convergence point is calculated as the vanishing point from the edges determined to be usable in S220 (S230).
  • the direction in which the vanishing point is estimated to exist is specified based on the direction in which each extracted edge extends.
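The S220 selection criterion can be expressed as a small predicate. The length and angle thresholds below are illustrative assumptions; the patent only requires that the edge be long enough to fix a direction and roughly consistent with the vanishing point.

```python
import math

def edge_usable(seg, expected_vp, min_len=20.0, max_angle_deg=10.0):
    """Decide whether an extracted edge segment can be used for vanishing
    point calculation: it must be long enough to fix a direction, and its
    orientation must roughly point at the expected vanishing point."""
    (x0, y0), (x1, y1) = seg
    dx, dy = x1 - x0, y1 - y0
    length = math.hypot(dx, dy)
    if length < min_len:
        return False  # too short to specify an extending direction
    # direction from the segment midpoint toward the expected vanishing point
    mx, my = (x0 + x1) / 2.0, (y0 + y1) / 2.0
    vx, vy = expected_vp[0] - mx, expected_vp[1] - my
    vlen = math.hypot(vx, vy)
    if vlen == 0.0:
        return True  # the segment sits on the vanishing point itself
    # angle between the edge direction and the direction to the vanishing point
    cosang = abs(dx * vx + dy * vy) / (length * vlen)
    ang = math.degrees(math.acos(max(-1.0, min(1.0, cosang))))
    return ang <= max_angle_deg
```

Edges that pass this filter would then be intersected (as in S230) to obtain the vanishing point, in the same least-squares spirit as the motion-vector case.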
  • in S210 to S230, when a disturbance such as sudden vibration or impact is detected, it is preferable to exclude the images captured during that interval from the failure determination targets.
  • an image abnormality of the corrected image output by the image correction unit 21 is diagnosed (S240), and the abnormality determination result is notified to the diagnosis processor 9 (S250).
  • the failure diagnosis unit 25 subsequently performs external factor determination (S50).
  • in S50, a temporal abnormality, such as a secular change in the mounting position of the camera 3 or road surface unevenness, is determined.
  • when such an abnormality is detected, a failure of the image correction unit 21 is not immediately determined.
  • when the vanishing point cannot be calculated only between specific frames but can be calculated thereafter, the cause may be a temporary abnormality such as road surface unevenness, so it is not determined that the image correction unit 21 has failed.
  • when the vanishing point is calculated at an unexpected position but that state continues steadily, the mounting position of the camera 3 may have changed over time, so it is not determined that the image correction unit 21 has failed.
  • the failure diagnosis unit 25 then determines a hardware abnormality in the image correction unit 21 (S60). In S60, even after exceptional cases have been excluded by the determination in S50 and the like, if the state in which the abnormality is detected continues for a certain time or longer, it is determined that the image correction unit 21 is out of order. In that case, the failure diagnosis result is notified to the diagnosis processor 9 (S70).
  • the start-up failure detection process is executed only once, as an initial process, when the vehicle is started.
  • alternatively, the start-up failure detection process may be executed as a special process at the time of vehicle shipment or maintenance inspection.
  • the image correction unit 21 inputs an evaluation image stored in the memory 25A (S310). That is, whereas in the real-time failure detection process described above each camera image is input from the camera 3 in S10, in S310 an evaluation image stored in the memory 25A is input instead.
  • the evaluation image is an image having the same distortion as an image taken by the camera 3.
  • the image correction unit 21 performs image correction (S320), and the failure diagnosis unit 25 determines a hardware abnormality in the image correction unit 21 based on the corrected image (S330).
  • in the memory 25A, in addition to the evaluation image described above, the corrected image that should be obtained when the image correction unit 21 is normal is stored in advance as a reference image.
  • in S330, the match between the corrected image obtained in S320 and the reference image stored in the memory is determined; if the images are not equivalent, it is determined that the image correction unit 21 has a hardware abnormality.
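The matching check of S330 can be sketched as a per-pixel comparison against the stored reference image. The image representation (2-D lists of pixel values) and the small tolerance, which absorbs rounding in the correction arithmetic, are assumptions for illustration.

```python
def images_match(corrected, reference, tol=2):
    """Compare the diagnostic corrected image with the reference image
    stored in memory: same dimensions, and every pixel within `tol` of
    its counterpart.  Returns False on any shape or value mismatch."""
    if len(corrected) != len(reference):
        return False
    for row_c, row_r in zip(corrected, reference):
        if len(row_c) != len(row_r):
            return False
        for pc, pr in zip(row_c, row_r):
            if abs(pc - pr) > tol:
                return False
    return True
```

A `False` result here corresponds to the S330 verdict that the image correction unit 21 has a hardware abnormality, to be notified in S340.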
  • a failure diagnosis result is notified to the diagnosis processor 9 (S340).
  • as described above, the failure diagnosis unit 25 determines that the image correction unit 21 is malfunctioning based on the fact that the vanishing point cannot be correctly obtained from the corrected image corrected by the image correction unit 21.
  • in S110 to S150, the failure diagnosis unit 25 can determine whether or not the image correction unit 21 is out of order based on whether or not the vanishing point can be estimated from the moving state of a moving element estimated to be an image of a moving photographic subject. Therefore, the vanishing point can be calculated from the movement of feature points between frames using a moving image, and a failure of the image correction unit 21 can be determined.
  • in S210 to S250, the failure diagnosis unit 25 can determine whether or not the image correction unit 21 has failed based on whether or not the vanishing point can be estimated from the direction in which a linear element extends, the linear element being presumed to be an image of an imaging target extending linearly in the depth direction of the imaging region of the camera 3. Therefore, the vanishing point can be calculated by extracting edges from a still image, and a failure of the image correction unit 21 can be determined.
  • the failure diagnosis unit 25 selects a part of the range included in the corrected image as an effective segment, and can determine whether or not the image correction unit 21 is out of order based on whether or not the vanishing point in the corrected image is abnormal within the selected range. Therefore, even when the vehicle is traveling on a curve, erroneous recognition that there is a vanishing point at an unexpected location can be suppressed.
  • the in-vehicle image processing system is mounted on the vehicle together with the camera 3; the image correction unit 21 is configured to be able to correct the distortion of camera images taken by the camera 3 while the vehicle is moving and to output the corrected images, and the failure diagnosis unit 25 is configured to be able to determine, while the vehicle is moving, whether or not the image correction unit 21 has failed based on whether or not the vanishing point in the corrected image output from the image correction unit 21 is abnormal. Therefore, a failure of the image correction unit 21 can be detected in real time while the vehicle is traveling.
  • when a disturbance such as sudden vibration or impact is detected from the vehicle state information and suspension load obtained from the vehicle speed sensor 11, the steering angle sensor 13, and the attitude sensor 15 (which correspond to an example of a detector) mounted on the host vehicle, the ECU 1 (that is, the failure diagnosis unit 25) excludes the images captured in the meantime from the failure determination targets. Therefore, when the image is greatly shaken by vibration of the vehicle, that image can be removed from the determination targets, improving the accuracy of failure diagnosis.
  • the in-vehicle image processing system includes a memory 25A (corresponding to an example of a storage device) that stores an evaluation image prepared in advance for failure diagnosis.
  • the image correction unit 21 is configured to be able to correct the evaluation image stored in the memory 25A and to output a diagnostic corrected image in which the distortion is corrected, and the failure diagnosis unit 25 can determine a failure of the image correction unit 21 based on whether or not the diagnostic corrected image output from the image correction unit 21 is abnormal. Therefore, at the time of factory inspection or start-up, failure diagnosis can be carried out using the evaluation image prepared in advance; in addition to the real-time failure detection process, influences such as secular change of the camera 3 and the running state of the vehicle can be eliminated, and more accurate failure diagnosis can be performed.
  • in the above embodiment, the corrected image is divided into nine segments at predetermined positions.
  • the division position and the number of divisions can be appropriately set, and are not limited to those illustrated.
  • for convenience, the memory 25A is illustrated as being provided in the failure diagnosis unit 25.
  • however, any memory that can be accessed by the image correction unit 21 and the failure diagnosis unit 25 may be used.
  • for example, the memory may be provided in the image correction unit 21, or may be a memory in the ECU 1 that is independent of both the image correction unit 21 and the failure diagnosis unit 25.
  • alternatively, a memory outside the ECU 1 may be used.
  • although an in-vehicle image processing system is exemplified, an equivalent system may be mounted on something other than a vehicle.
  • the present disclosure can also be realized in various other forms, for example: a system including the image processing apparatus as a constituent element; a program (also referred to as a program product) including instructions for causing a computer to function as the image processing apparatus; a program (also referred to as a program product) including instructions for providing the failure diagnosis method; and a computer-readable ROM, RAM, or other non-transitory tangible medium storing each of these programs.

Abstract

An image processing apparatus (1) comprises: an image adjusting unit (21) which adjusts distortion of each camera image captured by a camera (3), and outputs an adjusted image, which is an image in which said distortion has been adjusted, and which has at least one vanishing point; and a failure diagnosis unit (25) which determines whether or not the image adjusting unit has failed, on the basis of whether or not there is an abnormality in the vanishing point in the adjusted image output from the image adjusting unit (21).

Description

Image processing apparatus and failure diagnosis method for image processing apparatus

Cross-reference of related applications

This application is based on Japanese Patent Application No. 2014-169555 filed on August 22, 2014, the contents of which are incorporated herein by reference.
The present disclosure relates to an image processing device and a failure diagnosis method for the image processing device.

A method is known that detects an abnormality, such as adhesion of foreign matter to an in-vehicle camera, based on the luminance distribution of a camera image captured of the periphery of a vehicle (see, for example, Patent Document 1).
JP 2003-312408 A
In abnormality diagnosis based on the luminance distribution as described above, it is difficult to distinguish whether an abnormality of the camera image is caused by an external environmental factor (for example, adhesion of foreign matter to the camera) or by a malfunction of the device (for example, a malfunction of hardware or software).

For example, when an image processing apparatus that corrects distortion of a camera image taken by a camera fails, correcting the camera image with such an apparatus may cause an abnormal luminance distribution in the corrected image as well. However, when such an abnormality occurs, it is difficult with the conventional technology described above to determine that the image processing apparatus is out of order.

An object of the present disclosure is to provide a technique capable of diagnosing that an image processing apparatus is out of order.
According to one aspect of the present disclosure, the image processing apparatus includes an image correction unit and a failure diagnosis unit. The image correction unit corrects distortion of at least one camera image captured by a camera, and outputs a corrected image, that is, an image in which the distortion is corrected and which has at least one vanishing point. The failure diagnosis unit determines whether or not the image correction unit has failed based on whether or not the vanishing point in the corrected image output from the image correction unit is abnormal.

According to this configuration, when the image correction unit has failed and distortion correction cannot be performed correctly, the vanishing point cannot be correctly obtained from the corrected image produced by such an image correction unit; this fact is used to determine that the image correction unit is out of order.

Therefore, unlike techniques that do not focus on whether the vanishing point is correctly obtained from the corrected image, it is possible to identify that some kind of failure has occurred at the stage where the image correction unit generates the corrected image. Thus, for example, it is possible to notify the user that there is an abnormality in the image correction unit that generates such corrected images, or to take measures such as stopping control based on such an abnormal corrected image.
According to another aspect of the present disclosure, a failure diagnosis method for an image processing apparatus targets an image processing apparatus having an image correction unit that corrects distortion of a camera image captured by a camera and outputs a corrected image, that is, an image in which the distortion is corrected and which has at least one vanishing point, and determines a failure of the image correction unit. The failure diagnosis method includes a procedure for determining whether or not the image correction unit has failed based on whether or not the vanishing point in the corrected image output from the image correction unit is abnormal.

Therefore, according to such a failure diagnosis method, the operations and effects described above for the image processing apparatus are achieved, and it is possible to identify that some kind of failure has occurred at the stage where the image correction unit generates the corrected image.
The above and other objects, features, and advantages of the present disclosure will become more apparent from the following detailed description with reference to the accompanying drawings.

FIG. 1 is a block diagram showing the configuration of an in-vehicle image processing system. FIG. 2 is a flowchart of the real-time failure detection process. FIG. 3 is a flowchart of the first image abnormality determination process. FIG. 4 is a flowchart of the second image abnormality determination process. FIG. 5 is a flowchart of the start-up failure detection process. FIG. 6 is an explanatory diagram showing an example in which the corrected image is divided into a plurality of segments.
 Next, the above-described image processing apparatus and the failure diagnosis method for the image processing apparatus will be described with reference to exemplary embodiments.
 [Configuration of Image Processing Apparatus]
 As shown in FIG. 1, the in-vehicle image processing system mounted on a vehicle described below is configured around an electronic control unit 1 (Electronic Control Unit; hereinafter abbreviated as ECU 1) that functions as the above-described image processing apparatus. In addition to the ECU 1, the system includes a camera 3, an external factor determination device 5, a display 7, a diagnosis processor 9, a vehicle speed sensor 11, a steering angle sensor 13, an attitude sensor 15, and the like. The vehicle equipped with the in-vehicle image processing system is also referred to as the host vehicle.
 The camera 3 is mounted on the vehicle and is configured to be able to photograph the area in front of, behind, or to the side of the vehicle. The camera 3 includes a wide-angle lens. Therefore, each image photographed by the camera 3 (hereinafter also referred to as a camera image) has distortion such that the degree of reduction of the subject increases toward the periphery of the photographing range compared with its center. A single camera image can be provided as a still image, while a plurality of camera images can be provided as a moving image. It can further be said that a still image corresponds to one frame, whereas a moving image corresponds to a plurality of frames provided at a predetermined frame rate.
 The external factor determination device 5 is configured by hardware and software capable of inputting and outputting images and non-image information from the camera 3. The external factor determination device 5 is configured to be able to determine whether the camera 3 is functioning normally as hardware, and, when the camera 3 is functioning normally, outputs the camera image input from the camera 3 to the ECU 1.
 The display 7 is configured to be able to present various kinds of information to the driver of the vehicle and others, for example by means of a liquid crystal display device. The diagnosis processor 9, the vehicle speed sensor 11, the steering angle sensor 13, and the attitude sensor 15 are configured to be able to communicate with various devices including the ECU 1 via a CAN (Controller Area Network) 17.
 Various kinds of diagnostic information are transmitted from the devices including the ECU 1 to the diagnosis processor 9. The diagnosis processor 9 executes a process of accumulating this information as diagnostic information. The diagnosis processor 9 also executes processes for alerting the driver of the vehicle and others, such as turning on a warning lamp as necessary.
 The vehicle speed sensor 11 detects the speed of the vehicle and transmits the detected speed to the ECU 1 via the CAN 17. The steering angle sensor 13 detects the steering angle of the steering wheel and transmits the detected steering angle to the ECU 1 via the CAN 17. The attitude sensor 15 includes a gyro, a yaw rate sensor, a gradient sensor, and the like; it detects vehicle attitude information (for example, a rotation angle about an axis perpendicular to the vehicle's running surface (yaw), a rotation angle about the vehicle's traveling direction (roll), and a rotation angle about the vehicle's width direction (pitch)) and transmits the detected attitude information to the ECU 1 via the CAN 17.
 The ECU 1 includes an image correction unit 21, a recognition unit 23, a failure diagnosis unit 25, and the like. The units 21, 23, and 25 are also referred to as devices 21, 23, and 25 or modules 21, 23, and 25, respectively. A camera image input from the camera 3 to the ECU 1 via the external factor determination device 5 is subjected to distortion correction processing in the image correction unit 21. In the following description, an image that has undergone the distortion correction processing in the image correction unit 21 is also referred to as a corrected image. The corrected image output from the image correction unit 21 is input to both the recognition unit 23 and the failure diagnosis unit 25. As one example, the ECU 1 can be configured as a microcomputer including memory such as ROM and RAM, a CPU (Central Processing Unit), and an input/output interface. In this case, the memory stores programs for the various processes executed by components such as the image correction unit 21, the recognition unit 23, and the failure diagnosis unit 25. Alternatively, the ECU 1 may implement some or all of these components as hardware without using programs. The memory is also referred to as a storage device or, further, as a non-transitory tangible medium.
 Based on the corrected image input from the image correction unit 21, the recognition unit 23 executes processes for recognizing various targets. For example, based on a corrected image of the area in front of the vehicle, it executes a white line recognition process for recognizing lane markings on the road, a sign recognition process for recognizing signs installed along the road, and the like. Alternatively, for example, it executes a rear recognition process for recognizing targets behind the vehicle based on a corrected image of the area behind the vehicle, or a side recognition process for recognizing targets to the side of the vehicle based on a corrected image of the area beside the vehicle. When these processes predict, for example, a lane departure or contact with an obstacle, warning information to that effect is output to the display 7 to alert the driver of the vehicle and others.
 Based on the corrected image input from the image correction unit 21, the failure diagnosis unit 25 executes failure diagnosis processing for the image correction unit 21. Details of this failure diagnosis processing will be described later. When the external factor determination device 5 determines that the camera 3 is not functioning normally, information to that effect is transmitted from the external factor determination device 5 to the failure diagnosis unit 25.
 When some failure is detected in at least one of the information transmitted from the external factor determination device 5 and the result of the failure diagnosis processing performed by the failure diagnosis unit 25, the failure diagnosis unit 25 transmits information to that effect to the diagnosis processor 9 via the CAN 17.
 [Failure Diagnosis Processing]
 Next, the failure diagnosis processing executed by the above-described system will be described based on the flowcharts shown in FIGS. 2 to 5. The failure diagnosis processing described below can include the real-time failure detection process shown in FIGS. 2 to 4 and the start-up failure detection process shown in FIG. 5. The real-time failure detection process detects a failure of the image correction unit 21 in real time while the vehicle is traveling. The start-up failure detection process detects a failure of the image correction unit 21 at vehicle start-up, at factory shipment, and the like.
 First, the real-time failure detection process will be described. The real-time failure detection process starts when the vehicle is started and is thereafter executed repeatedly until the vehicle stops. When the real-time failure detection process starts, first, a still image as a single camera image or a moving image as a plurality of camera images provided by the camera 3 is input (S10). In S10, depending on the failure determination method described later, a moving image may be input at a predetermined frame rate, or still images may be input regularly or irregularly. In the present embodiment, however, the description continues on the assumption that a moving image is input at a predetermined frame rate and that, when necessary, each of the images constituting the moving image is also used as a still image.
 When a still image or moving image has been input in S10, the external factor determination device 5 subsequently executes external factor determination (S20). In S20, an abnormal state of the camera 3 itself is determined. For example, when the camera 3 is not functioning normally, such as due to a power supply short circuit or a signal abnormality, the external factor determination device 5 determines that such an abnormality exists. This makes it possible to distinguish whether an abnormality has occurred in the image correction unit 21 or in the camera 3, which is upstream of the image correction unit 21.
 When it is determined in S20 that the camera 3 is functioning normally, the image correction unit 21 executes image correction (S30). In S30, the image correction unit 21 corrects the distortion caused by the above-described wide-angle lens. When the image correction unit 21 is functioning normally, the image corrected by the image correction unit 21 is an image having at least one vanishing point.
 Therefore, when there is a photographed object approaching the host vehicle (for example, a target on the road shoulder ahead of the vehicle), that object appears in the corrected image as a group of pixels that moves over time from the vanishing point toward the periphery of the image (hereinafter, this pixel group is also referred to as a moving element). Likewise, when there is a photographed object receding from the host vehicle (for example, a target on the road shoulder behind the vehicle), that object appears in the corrected image as a group of pixels that moves over time from the image periphery toward the vanishing point (this pixel group is likewise referred to as a moving element).
 Furthermore, when there is a photographed object extending linearly in the depth direction of the camera's photographing region (for example, a white line on the road surface extending straight in the traveling direction of the host vehicle, or a guardrail on the road shoulder), extracting the edge of that object by image processing yields a group of pixels extending linearly toward the vanishing point in the corrected image (hereinafter, this pixel group is also referred to as a linear element).
 Accordingly, the failure diagnosis unit 25 executes image abnormality determination processing based on the corrected image input from the image correction unit 21 (S40). In S40, the failure diagnosis unit 25 extracts moving elements and linear elements as described above and determines whether the vanishing point can be correctly obtained.
 For example, in S40, the first image abnormality determination process illustrated in FIG. 3 is executed for the moving elements described above. In this image abnormality determination process, first, preceding and succeeding frames among the plurality of images constituting the moving image are compared, and feature points presumed to be images of moving objects are calculated (S110). The number of feature points calculated in S110 may be any number commensurate with the computing performance of the ECU 1, but to improve reliability it is preferable that there be at least several. Next, the direction of movement from the previous frame is calculated for the feature points calculated in S110 (S120), and the convergence point of the feature points' movement vectors is calculated as the vanishing point (S130).
 When several moving elements have been extracted from the corrected image, the direction in which the vanishing point is presumed to exist can be identified for each moving element based on its direction of movement. In S130, the direction in which the vanishing point exists is estimated for each moving element by the above method, and it is determined whether the estimated vanishing point directions converge to a predetermined position. In this determination, taking noise and errors in image processing into account, it suffices to judge that the image correction unit 21 is functioning normally if the vanishing point converges within a predetermined range.
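The convergence check of S110 to S130 can be sketched as follows. This is an illustrative sketch only and not part of the original disclosure; the least-squares formulation, function names, and tolerance value are assumptions. Each feature point moving along its motion vector defines a line that should pass through the vanishing point, so the convergence point is taken as the least-squares intersection of those lines.

```python
import numpy as np

def estimate_vanishing_point(points, vectors):
    """Estimate the convergence point of feature-point motion vectors.

    Each feature point p moving along direction v lies on the line
    p + t*v through the vanishing point; the estimate is the
    least-squares intersection of all such lines.
    points, vectors: arrays of shape (N, 2).
    """
    # Normal of each line is perpendicular to its motion vector.
    n = np.stack([-vectors[:, 1], vectors[:, 0]], axis=1)
    n = n / np.linalg.norm(n, axis=1, keepdims=True)
    # Each line satisfies n . x = n . p; solve the stacked system.
    b = np.sum(n * points, axis=1)
    vp, *_ = np.linalg.lstsq(n, b, rcond=None)
    return vp

def vanishing_point_ok(vp, expected, tol):
    """S130-style decision: converged if vp lies within tol of the
    expected position (absorbing image-processing noise and error)."""
    return bool(np.linalg.norm(vp - expected) <= tol)
```

A usage example: track feature points between two frames, pass their positions and displacement vectors to `estimate_vanishing_point`, and feed the result to `vanishing_point_ok`; a failed check over a sustained period would correspond to the abnormality diagnosed in S140.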
 In S110 to S130, as shown in FIG. 6, the entire corrected image may be divided into a plurality of segments (nine in FIG. 6), and elements whose convergence point deviates from the segment at the center of the corrected image (the valid segment shown in FIG. 6) may be excluded from use. For example, when the vehicle is traveling on a curved road, the convergence point may no longer converge near the center of the corrected image.
 Therefore, areas where moving elements that are not expected to converge at the center of the corrected image on a curve may appear (for example, segments other than the valid segment shown in FIG. 6) may be defined on the corrected image in advance, and such moving elements may be excluded from the determination targets. In addition, when it is estimated, based on information obtained from the vehicle speed sensor 11, the steering angle sensor 13, the attitude sensor 15, and the like, that the vehicle is traveling on a curved road, the system may be configured to temporarily suspend the calculation of the vanishing point.
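The valid-segment filtering of FIG. 6 can be sketched as follows. This is an illustrative sketch only; the 3x3 grid size matches FIG. 6, but the function names and the choice of the center cell as the valid segment are assumptions drawn from the figure description.

```python
def segment_of(x, y, width, height, cols=3, rows=3):
    """Return the (col, row) of the grid segment containing pixel
    (x, y) when the corrected image is split into a cols x rows grid."""
    col = min(int(x) * cols // width, cols - 1)
    row = min(int(y) * rows // height, rows - 1)
    return col, row

def in_valid_segment(x, y, width, height):
    """Keep a convergence-point candidate only if it falls inside the
    centre segment (the valid segment of FIG. 6); candidates landing
    in peripheral segments, e.g. while cornering, are discarded."""
    return segment_of(x, y, width, height) == (1, 1)
```

For example, a convergence point computed near the image center of a 640 x 480 corrected image would be accepted, while one displaced toward a corner by road curvature would be excluded from the failure determination.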
 Furthermore, when a disturbance such as a sudden vibration or shock is detected based on vehicle state information obtained from the vehicle speed sensor 11, the steering angle sensor 13, the attitude sensor 15, and the like, or on suspension load information, the images captured during that period are excluded from the failure determination targets.
 If, as a result of obtaining the convergence point of the feature points by the above procedure, the vanishing point does not converge within the predetermined range, an image abnormality of the corrected image produced by the image correction unit 21 is diagnosed (S140), and the abnormality determination result is notified to the diagnosis processor 9 (S150).
 In S40 described above, the second image abnormality determination process illustrated in FIG. 4 can also be executed, for example for the linear elements described above. In this image abnormality determination process, first, image processing is applied to each of the plurality of images constituting the moving image to extract edges (linear elements) (S210). Then, among the extracted edges, those usable for vanishing point calculation are determined (S220). In S220, edges are selected as usable for vanishing point calculation when they have a length sufficient to identify the direction in which they extend and when their position and orientation lie substantially within a predetermined allowable range in relation to the vanishing point.
 Next, the convergence point is calculated as the vanishing point from the edges determined to be usable in S220 (S230). In S230, the direction in which the vanishing point is presumed to exist is identified based on the direction in which each extracted edge extends. In S210 to S230 as well, it is preferable to exclude elements that are not expected to converge at the center of the screen on a curve, such as white lines on the road surface, using the valid-segment technique described above. It is likewise preferable, when a disturbance such as a sudden vibration or shock is detected, to exclude the images captured during that period from the failure determination targets.
 If, as a result of obtaining the convergence point, the vanishing point does not converge within the predetermined range, an image abnormality of the corrected image produced by the image correction unit 21 is diagnosed (S240), and the abnormality determination result is notified to the diagnosis processor 9 (S250).
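The edge selection of S220 can be sketched as follows. This is an illustrative sketch only; the function name and the threshold values (minimum length, allowable offset) are assumptions, and the disclosure does not specify concrete numbers. An edge is kept only when it is long enough to define a direction and its extension passes near the expected vanishing point.

```python
import math

def usable_for_vp(edge, expected_vp, min_length=30.0, max_offset=20.0):
    """S220-style filter for an edge segment (x1, y1, x2, y2).

    The edge is usable for vanishing-point calculation if its length
    can fix a direction (>= min_length pixels) and the infinite line
    through it passes within max_offset pixels of expected_vp.
    """
    x1, y1, x2, y2 = edge
    dx, dy = x2 - x1, y2 - y1
    length = math.hypot(dx, dy)
    if length < min_length:
        return False
    # Perpendicular distance from expected_vp to the edge's line.
    ex, ey = expected_vp
    dist = abs(dy * (ex - x1) - dx * (ey - y1)) / length
    return dist <= max_offset
```

Edges that survive this filter would then be intersected in S230 to obtain the convergence point, in the same manner as the motion-vector convergence of the first process.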
 When the image abnormality determination processing described above (the processes shown in FIGS. 3 and 4) has been executed in S40 of FIG. 2, the failure diagnosis unit 25 subsequently executes external factor determination (S50). In S50, secular change of the mounting position of the camera 3, temporary abnormalities such as road surface unevenness, and the like are determined. As a result, even when the vanishing point cannot be calculated in S40, if some abnormality is detected in S50, the system can avoid immediately judging the image correction unit 21 to have failed. For example, if the vanishing point cannot be obtained only between specific frames but can be obtained thereafter, this may be due to a temporary abnormality such as road surface unevenness, and the image correction unit 21 is therefore not judged to have failed. Alternatively, if the vanishing point is calculated at an unexpected position but that state continues steadily, this may be due to secular change of the mounting position of the camera 3, and the image correction unit 21 is likewise not judged to have failed.
 After S50, the failure diagnosis unit 25 determines a hardware abnormality in the image correction unit 21 (S60). In S60, if the state in which an abnormality is detected continues for at least a fixed time even after exceptional cases have been excluded by the determination in S50 and the like, the image correction unit 21 is judged to have failed. In that case, the failure diagnosis result is notified to the diagnosis processor 9 (S70).
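The persistence condition of S60 can be sketched as a simple latch. This is an illustrative sketch only; the class name and the use of a time-based threshold (rather than a frame count) are assumptions, and the concrete hold time is not specified in the disclosure.

```python
class PersistenceFaultLatch:
    """S60-style decision: report a hardware fault of the image
    correction unit only when the abnormal state persists for at
    least hold_time seconds, so transient disturbances excluded in
    S50 do not trip the latch."""

    def __init__(self, hold_time):
        self.hold_time = hold_time
        self.abnormal_since = None  # time the current abnormal run began

    def update(self, abnormal, now):
        """Feed one diagnosis result; return True when a fault should
        be reported to the diagnosis processor (S70)."""
        if not abnormal:
            self.abnormal_since = None
            return False
        if self.abnormal_since is None:
            self.abnormal_since = now
        return now - self.abnormal_since >= self.hold_time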
 Next, the start-up failure detection process will be described. The start-up failure detection process is executed only once, as initial processing, when the vehicle is started. Alternatively, when the real-time failure detection process described above is employed, the start-up failure detection process may be configured to be executed as a special process at factory shipment of the vehicle or at maintenance inspections.
 When the start-up failure detection process starts, first, the image correction unit 21 inputs an evaluation image stored in the memory 25A (S310). That is, whereas each camera image was input from the camera 3 in S10 of the real-time failure detection process described above, in S310 an evaluation image stored in the memory 25A is input instead. The evaluation image is an image having the same distortion as an image photographed by the camera 3.
 After S310, the image correction unit 21 executes image correction (S320), and the failure diagnosis unit 25 determines a hardware abnormality in the image correction unit 21 based on the resulting corrected image (S330). In addition to the above evaluation image, the memory 25A stores in advance, as a reference image, the corrected image that should be obtained when the image correction unit 21 is normal. In S330, the corrected image obtained in S320 is checked against the reference image stored in the memory; if the two are not equivalent images, a hardware abnormality in the image correction unit 21 is determined. When a hardware abnormality in the image correction unit 21 is detected, the failure diagnosis result is notified to the diagnosis processor 9 (S340).
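The reference comparison of S330 can be sketched as follows. This is an illustrative sketch only; the function name and the tolerated mismatch ratio are assumptions (the disclosure says only that the images must be "equivalent"), and the images are modeled as flat pixel sequences.

```python
def matches_reference(corrected, reference, max_mismatch_ratio=0.0):
    """S330-style check: the corrected evaluation image must match
    the stored reference image; otherwise a hardware abnormality of
    the image correction unit is determined.

    corrected, reference: flat sequences of pixel values.
    max_mismatch_ratio: fraction of pixels allowed to differ (0.0
    demands exact equivalence).
    """
    if len(corrected) != len(reference):
        return False  # differing geometry can never be equivalent
    mismatches = sum(1 for a, b in zip(corrected, reference) if a != b)
    return mismatches <= max_mismatch_ratio * len(reference)
```

With a strict ratio of 0.0 this reduces to exact pixel equality; a small nonzero ratio would tolerate rounding differences in the correction arithmetic while still catching a genuinely faulty correction stage.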
 [Effects]
 As described above, according to the in-vehicle image processing system, when the image correction unit 21 has failed and fallen into a state where distortion correction cannot be performed correctly, the failure diagnosis unit 25 determines that the image correction unit 21 has failed, exploiting the fact that the vanishing point cannot be correctly obtained from the corrected image produced by the image correction unit 21.
 Therefore, unlike techniques that do not examine whether the vanishing point can be correctly obtained from the corrected image, it is possible to identify that some failure has occurred at the stage where the image correction unit 21 generates the corrected image (that is, at a stage distinct from an abnormality of the camera 3). Consequently, for example, the user can be notified that the image correction unit 21 producing such a corrected image is abnormal, or measures such as suspending control based on the abnormal corrected image can be taken.
 Furthermore, through S110 to S150, when a moving element presumed to be the image of a moving photographed object is included, the failure diagnosis unit 25 is configured to be able to determine whether the image correction unit 21 has failed based on whether the vanishing point can be estimated from the movement state of that moving element. Therefore, using a moving image, the vanishing point can be computed from the movement of feature points between frames, and a failure of the image correction unit 21 can be determined.
 Furthermore, through S210 to S250, when a linear element presumed to be the image of a photographed object extending linearly in the depth direction of the photographing region of the camera 3 is included, the failure diagnosis unit 25 is configured to be able to determine whether the image correction unit 21 has failed based on whether the vanishing point can be estimated from the direction in which the linear element extends. Therefore, the vanishing point can be computed by extracting edges from a still image, and a failure of the image correction unit 21 can be determined.
 As shown in FIG. 6, the failure diagnosis unit 25 is configured to select a partial range included in the corrected image as the valid segment and to determine whether the image correction unit 21 has failed based on whether the vanishing point in the corrected image is abnormal within the selected range. Therefore, even while the vehicle is traveling on a curve, erroneously recognizing a vanishing point at an unexpected location can be suppressed.
 The in-vehicle image processing system is mounted on the vehicle together with the camera 3. The image correction unit 21 is configured to be able to correct, while the vehicle is moving, the distortion of a camera image photographed by the camera 3 and output a corrected image in which the distortion has been corrected, and the failure diagnosis unit 25 is configured to be able to determine, while the vehicle is moving, whether the image correction unit 21 has failed based on whether the vanishing point in the corrected image output from the image correction unit 21 is abnormal. Therefore, a failure of the image correction unit 21 can be detected in real time while the vehicle is traveling.
 Furthermore, in the in-vehicle image processing system, when the ECU 1 (that is, the failure diagnosis unit 25) determines, based on vehicle state information obtained from the vehicle speed sensor 11, the steering angle sensor 13, and the attitude sensor 15 mounted on the host vehicle (these correspond to examples of detectors), on suspension load information, and the like, that a disturbance, namely a force applied to the host vehicle or the camera 3 such as a sudden vibration or shock, has been detected, the images captured while that disturbance is applied are excluded from the failure determination targets. Therefore, when an image shakes greatly due to vehicle vibration, for example, that image can be removed from the determination targets, thereby improving the accuracy of the failure diagnosis.
 The in-vehicle image processing system also has a memory 25A (corresponding to an example of a storage device) that stores an evaluation image prepared in advance for failure diagnosis. Through S310-S340, the image correction unit 21 is configured to correct the distortion of the evaluation image stored in the memory 25A and to output a diagnostic corrected image in which that distortion has been corrected, and the failure diagnosis unit 25 is configured to determine a failure of the image correction unit 21 based on the abnormal state of the diagnostic corrected image output from the image correction unit 21. Failure diagnosis can therefore be performed at factory inspection or at start-up using the evaluation image prepared in advance and, separately from the real-time failure detection process, influences such as aging of the camera 3 and the running state of the vehicle can be eliminated, enabling more accurate failure diagnosis.
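The start-up check described above reduces to running a stored evaluation image through the correction step and comparing the result against an expected reference. The sketch below uses an identity "correction" and nested pixel lists purely as placeholders for the real distortion correction and image data; the `tolerance` parameter is an assumption, not part of the patent.

```python
def correct(image):
    # placeholder for the image correction unit's distortion correction
    return [row[:] for row in image]

def diagnose(evaluation_image, expected, tolerance=0):
    """Return True (healthy) when every corrected pixel matches the
    expected reference within `tolerance`, False (failure) otherwise."""
    corrected = correct(evaluation_image)
    return all(
        abs(c - e) <= tolerance
        for c_row, e_row in zip(corrected, expected)
        for c, e in zip(c_row, e_row)
    )
```

Because the evaluation image and its expected corrected form are fixed in advance, this comparison is independent of camera aging and driving conditions.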
 [Other Embodiments]
 The image processing apparatus and the failure diagnosis method for the image processing apparatus have been described above through exemplary embodiments; however, the present disclosure is not limited to those exemplary embodiments and can take various forms.
 For example, although the above embodiment employs both the real-time failure detection process and the start-up failure detection process, only one of them may be employed.
 Although the above embodiment divides the corrected image into nine segments at predetermined positions, the division positions and the number of segments can be set as appropriate and are not limited to those illustrated.
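The configurability noted above can be made concrete with a small helper that divides a corrected image of a given size into a rows-by-cols grid of rectangular segments. This is an illustrative sketch; the 3x3 default merely mirrors the nine-segment example, and both counts are free parameters.

```python
def segment_bounds(width, height, rows=3, cols=3):
    """Return (x_min, y_min, x_max, y_max) for each grid cell,
    row by row from the top-left."""
    segs = []
    for r in range(rows):
        for c in range(cols):
            segs.append((
                width * c // cols,
                height * r // rows,
                width * (c + 1) // cols,
                height * (r + 1) // rows,
            ))
    return segs
```

For a 640x480 corrected image, the default call yields the nine segments of the embodiment; passing other values changes the division without touching the diagnosis logic.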
 In the above embodiment, the memory 25A is depicted as part of the failure diagnosis unit 25 for convenience of illustration; however, any memory accessible by both the image correction unit 21 and the failure diagnosis unit 25 will suffice. It may be a memory provided in the image correction unit 21, or a memory in the ECU 1 that is independent of both the image correction unit 21 and the failure diagnosis unit 25. Alternatively, provided the ECU 1 is configured to access it, a memory external to the ECU 1 may be used.
 Although the above embodiment illustrates an in-vehicle image processing system, an equivalent system may be mounted on something other than a vehicle.
 Besides the image processing apparatus and the failure diagnosis method for the image processing apparatus described above, the present disclosure can also be realized in various other forms: a system including the image processing apparatus as a component, a program (also referred to as a program product) containing instructions for causing a computer to function as the image processing apparatus, a program (also referred to as a program product) containing instructions implementing the failure diagnosis method, and non-transitory tangible media, such as computer-readable ROM and RAM, storing each of these programs.
 Although the present disclosure has been described with reference to embodiments, it is understood that the disclosure is not limited to those embodiments or structures. The present disclosure encompasses various modifications and variations within an equivalent range. In addition, various combinations and forms, as well as other combinations and forms including one, more, or fewer elements, fall within the scope and spirit of the present disclosure.

Claims (8)

  1.  An image processing apparatus comprising:
     an image correction unit (21, S30) that corrects distortion of at least one camera image captured by a camera (3) and outputs a corrected image, the corrected image being an image in which the distortion has been corrected and having at least one vanishing point; and
     a failure diagnosis unit (25, S40-S60) that determines whether the image correction unit has failed based on whether the vanishing point in the corrected image output from the image correction unit is abnormal.
  2.  The image processing apparatus according to claim 1, wherein:
     the image correction unit is configured to output a plurality of the corrected images by correcting the distortion of each of a plurality of the camera images contained in a moving image captured by the camera; and
     the failure diagnosis unit is configured so that, when the plurality of corrected images contain, as a feature element, a moving element presumed to be an image of a moving photographic subject, it can determine whether the image correction unit has failed based on whether the vanishing point can be estimated from the movement state of the moving element (S110-S140).
  3.  The image processing apparatus according to claim 1 or 2, wherein
     the failure diagnosis unit is configured so that, when the corrected image contains, as a feature element, a linear element presumed to be an image of a photographic subject extending linearly in the depth direction of the camera's imaging region, it can determine whether the image correction unit has failed based on whether the vanishing point can be estimated from the direction in which the linear element extends (S210-S240).
  4.  The image processing apparatus according to any one of claims 1 to 3, wherein
     the failure diagnosis unit is configured to select a partial range of the corrected image and to determine whether the image correction unit has failed based on whether the vanishing point in the corrected image is abnormal within the selected range (S110-S130, S210-S230).
  5.  The image processing apparatus according to any one of claims 1 to 4, wherein:
     the image processing apparatus is mounted, together with the camera, on a moving body including at least a vehicle;
     the image correction unit is configured to correct, while the moving body is moving, the distortion of each camera image captured by the camera and to output corrected images in which the distortion has been corrected; and
     the failure diagnosis unit is configured to determine, while the moving body is moving, whether the image correction unit has failed based on whether the vanishing point in the corrected images output from the image correction unit is abnormal (S10-S70).
  6.  The image processing apparatus according to any one of claims 1 to 5, wherein:
     the image processing apparatus cooperates with a detector capable of detecting a disturbance affecting imaging by the camera; and
     the failure diagnosis unit is configured to exclude from the determination targets the camera images captured by the camera while it determines that the detector has detected the disturbance (S110-S130, S210-S230).
  7.  The image processing apparatus according to any one of claims 1 to 6, comprising:
     a storage device (25A) that stores an evaluation image prepared in advance for failure diagnosis, wherein:
     the image correction unit is configured to correct distortion of the evaluation image stored in the storage device and to output a diagnostic corrected image in which the distortion has been corrected (S320); and
     the failure diagnosis unit is configured to determine a failure of the image correction unit based on an abnormal state of the diagnostic corrected image output from the image correction unit (S330).
  8.  A failure diagnosis method for determining a failure of an image correction unit (21) in an image processing apparatus (1), the image correction unit correcting distortion of at least one camera image captured by a camera and outputting a corrected image, the corrected image being an image in which the distortion has been corrected and having at least one vanishing point, the method comprising:
     determining whether the image correction unit has failed based on whether the vanishing point in the corrected image output from the image correction unit is abnormal.

PCT/JP2015/003554 2014-08-22 2015-07-14 Image processing apparatus, and failure diagnosis method for image processing apparatus WO2016027408A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014-169555 2014-08-22
JP2014169555A JP6350113B2 (en) 2014-08-22 2014-08-22 Image processing apparatus and fault diagnosis method for image processing apparatus

Publications (1)

Publication Number Publication Date
WO2016027408A1 true WO2016027408A1 (en) 2016-02-25

Family

ID=55350381

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/003554 WO2016027408A1 (en) 2014-08-22 2015-07-14 Image processing apparatus, and failure diagnosis method for image processing apparatus

Country Status (2)

Country Link
JP (1) JP6350113B2 (en)
WO (1) WO2016027408A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017208585A1 (en) * 2016-05-30 2017-12-07 アイシン精機株式会社 Image display system
CN107862883A (en) * 2017-12-21 2018-03-30 天津市中环系统工程有限责任公司 The fault detect and alarm of traffic lights and operation management system and implementation method
JP2018079839A (en) * 2016-11-17 2018-05-24 株式会社デンソー Vehicular display system and freeze detection device

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6635056B2 (en) * 2017-01-17 2020-01-22 トヨタ自動車株式会社 Driving support device for vehicles
JP6698779B2 (en) * 2018-10-10 2020-05-27 三菱電機株式会社 Image processing device
CN115136088B (en) * 2020-02-21 2023-04-18 三菱电机株式会社 Programmable display, control system and analysis method

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012156672A (en) * 2011-01-25 2012-08-16 Clarion Co Ltd Vehicle periphery monitoring device
WO2014045344A1 (en) * 2012-09-18 2014-03-27 トヨタ自動車株式会社 Foe setting device and foe setting method for on-vehicle camera


Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017208585A1 (en) * 2016-05-30 2017-12-07 アイシン精機株式会社 Image display system
JP2017216517A (en) * 2016-05-30 2017-12-07 アイシン精機株式会社 Image display system
JP2018079839A (en) * 2016-11-17 2018-05-24 株式会社デンソー Vehicular display system and freeze detection device
CN107862883A (en) * 2017-12-21 2018-03-30 天津市中环系统工程有限责任公司 The fault detect and alarm of traffic lights and operation management system and implementation method
CN107862883B (en) * 2017-12-21 2023-06-30 天津市中环系统工程有限责任公司 Fault detection, alarm and operation management system of traffic signal lamp and implementation method

Also Published As

Publication number Publication date
JP6350113B2 (en) 2018-07-04
JP2016045716A (en) 2016-04-04

Similar Documents

Publication Publication Date Title
WO2016027408A1 (en) Image processing apparatus, and failure diagnosis method for image processing apparatus
US9769469B2 (en) Failure detection apparatus and failure detection program
US6310546B1 (en) Stereo type vehicle monitoring apparatus with a fail-safe function
US10235772B2 (en) Installation error display control apparatus and installation error detection apparatus of onboard camera
US10165265B2 (en) Online sensor calibration verification system
JP6259132B2 (en) In-vehicle camera device
JP2019105872A5 (en)
JP2008131250A (en) Correcting device for on-board camera and production method for vehicle using same correcting device
JP2010239409A (en) Calibrating apparatus for on-board camera of vehicle
US10723347B2 (en) Vehicle control device and vehicle control method
JP2008131177A (en) Correcting device for on-board camera, correcting method, and production method for vehicle using same correcting method
JP6458579B2 (en) Image processing device
JP7270499B2 (en) Abnormality detection device, abnormality detection method, posture estimation device, and mobile body control system
US9852502B2 (en) Image processing apparatus
JP2019191806A (en) Abnormality detection device and abnormality detection method
JP2011155687A (en) Device for calibration of onboard camera
US20220254064A1 (en) External parameter calibration method, device and system for image acquisition apparatus
US20200090347A1 (en) Apparatus for estimating movement information
JP2019219719A (en) Abnormality detection device and abnormality detection method
JP6798926B2 (en) In-vehicle camera calibration device
JP4539427B2 (en) Image processing device
JP2010067058A (en) Face direction detection device
JP7183729B2 (en) Imaging abnormality diagnosis device
JP2019191808A (en) Abnormality detection device and abnormality detection method
JP6986962B2 (en) Camera misalignment detection device and camera misalignment detection method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15833513

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15833513

Country of ref document: EP

Kind code of ref document: A1