WO2014203658A1 - Distance measurement device and distance measurement method - Google Patents

Distance measurement device and distance measurement method

Info

Publication number
WO2014203658A1
Authority
WO
WIPO (PCT)
Prior art keywords
imaging
unit
state
image
captured image
Prior art date
Application number
PCT/JP2014/062897
Other languages
French (fr)
Japanese (ja)
Inventor
自広 山谷
基広 浅野
Original Assignee
Konica Minolta, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Konica Minolta, Inc.
Priority to JP2015522667A (JPWO2014203658A1)
Publication of WO2014203658A1

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00: Measuring distances in line of sight; Optical rangefinders
    • G01C3/02: Details
    • G01C3/06: Use of electric means to obtain final indication
    • G01C3/08: Use of electric radiation detectors
    • G01C3/085: Use of electric radiation detectors with electronic parallax measurement

Definitions

  • The present invention relates to a distance measuring device and a distance measuring method for measuring, from an image, the distance to a predetermined object.
  • In the forward vehicle tracking system disclosed in Patent Document 1, the distance to another vehicle is obtained as follows. First, another vehicle is detected from an image captured by a camera mounted on the host vehicle, and the image of the detected other vehicle is defined and stored as a reference template. Next, the correlation between an input image newly captured by the camera and the reference template is computed while enlarging or reducing the input image or the reference template, whereby the new position and size of the other vehicle in the input image are detected. Then, the distance between the host vehicle and the other vehicle is obtained based on the detected position and size of the other vehicle.
  • the present invention has been made in view of the above circumstances, and an object thereof is to provide a distance measuring device and a distance measuring method capable of improving detection accuracy even when the imaging state changes.
  • In the distance measuring device and the distance measuring method according to the present invention, the distance to a predetermined object included in the latest captured image, which is the most recent of the time-series captured images captured by the imaging unit, is obtained based on that latest captured image and the reference image stored in the reference information storage unit. Here, the imaging state of the imaging unit is acquired, it is determined whether or not the acquired imaging state satisfies a predetermined condition, and it is determined, based on the determination result, whether or not to discard the reference image stored in the reference information storage unit. For this reason, such a distance measuring device and distance measuring method can improve detection accuracy even when the imaging state changes.
  • FIG. 1 is a block diagram showing the configuration of the distance measuring device in the embodiment. FIG. 2 is a diagram for explaining the distance calculation method in the distance measuring device of the embodiment. FIG. 3 is a flowchart showing the operation of the distance measuring device in the embodiment.
  • FIG. 1 is a block diagram showing a configuration of a distance measuring device in the embodiment.
  • FIG. 2 is a diagram for explaining a distance calculation method in the distance measuring apparatus according to the embodiment.
  • FIG. 2A is a diagram for describing the reference image extracted from a captured image, and FIG. 2B is a diagram for describing the region in the latest captured image that matches the reference image, found by pattern matching processing (correlation processing).
  • the distance measuring device is a device that measures a distance from the moving body to a predetermined object included in the captured image based on a captured image captured by an imaging unit mounted on the moving body.
  • the moving body is, for example, a vehicle such as an automobile or a train, a robot having a moving function, a movement support device for a visually impaired person, or the like.
  • The moving body may be self-propelled.
  • Such a distance measuring device DM of the present embodiment includes, for example, an imaging unit 1, a processing unit 2, a storage unit 3, and a notification unit 4 as shown in FIG.
  • the imaging unit 1 is a device that is mounted on the mobile body (not shown) and continuously images a subject.
  • the imaging unit 1 is disposed on the moving body such that the imaging direction (optical axis direction) coincides with the moving direction of the moving body.
  • the imaging unit 1 is mounted on a vehicle, the imaging unit 1 is disposed on a dashboard, for example, with the imaging direction (optical axis direction) facing forward.
  • the imaging unit 1 includes, for example, an imaging optical system, a color or monochrome image sensor, and an image processing unit, and an optical image of a subject is formed on the imaging surface (light receiving surface) of the image sensor by the imaging optical system.
  • the optical image of the subject is photoelectrically converted by the image sensor, and the captured image of the subject is generated from the signal obtained by the photoelectric conversion by the image processing of the image processing unit.
  • the imaging unit 1 continuously images the subject at predetermined time intervals. For example, the imaging unit 1 continuously images the subject in time so as to obtain a predetermined frame rate such as 15 frames / second, 24 frames / second, and 30 frames / second.
  • the imaging unit 1 is connected to the processing unit 2 and sequentially outputs the generated captured images to the processing unit 2.
  • The storage unit 3 is connected to the processing unit 2 and includes, for example, a nonvolatile memory element such as a ROM (Read Only Memory) or an EEPROM (Electrically Erasable Programmable Read Only Memory) that stores in advance the various programs executed by the processing unit 2 and the data necessary for their execution, a volatile memory element such as a RAM (Random Access Memory) serving as the so-called working memory of the processing unit 2, and their peripheral circuits.
  • the storage unit 3 is functionally configured with a reference information storage unit 31 that stores a later-described reference image.
  • the processing unit 2 controls each unit of the distance measuring device DM according to the function of each unit in order to measure the distance from the moving body to a predetermined object included in the captured image.
  • the processing unit 2 includes, for example, a CPU (Central Processing Unit) and its peripheral circuits.
  • By executing a ranging program for measuring the distance from the moving body to a predetermined object included in the captured image, the processing unit 2 is functionally configured with a control unit 21, a distance calculation unit 22, a state acquisition unit 23, a discard determination unit 24, and a notification unit 25.
  • the control unit 21 controls the entire distance measuring device DM.
  • The distance calculation unit 22 stores one of the time-series captured images captured continuously in time by the imaging unit 1 in the reference information storage unit 31 as a reference image, and obtains the distance to a predetermined object included in the latest captured image based on the latest of the time-series captured images captured by the imaging unit 1 and the reference image stored in the reference information storage unit 31.
  • The reference image may be the entire captured image, with the image of the predetermined object being extracted from it when the distance is obtained; in the present embodiment, however, the reference image is an image of the predetermined object extracted from a captured image, that is, a part of the captured image.
  • At a predetermined timing set in advance (for example, when distance measurement is started or when no reference image is stored in the reference information storage unit 31), the distance calculation unit 22 of the present embodiment extracts the image of the predetermined object (object image) from a captured image and stores the extracted object image in the reference information storage unit 31 as the reference image. More specifically, as shown in FIG. 2A, the distance calculation unit 22 detects a predetermined object Ob set in advance from a captured image Pp captured temporally before the latest captured image, extracts a rectangular image (object image) surrounding the detected predetermined object Ob from the captured image Pp, and stores the extracted object image in the reference information storage unit 31 as the reference image TP. If the predetermined object is not present in the image, the process returns to the beginning without extracting a reference image TP.
  • The predetermined object Ob is, for example, an obstacle to the movement of the moving body, such as a vehicle (an automobile or a train), a two-wheeled vehicle (a bicycle or a motorcycle), a person, an animal, or an artificial structure.
  • When the predetermined object Ob is an automobile, well-known conventional means (conventional methods) for vehicle detection by image processing are used, as disclosed in Patent Document 1, for example: (1) a first horizontal edge (lateral edge) lying between the white lines drawn on the road is detected, searching from the bottom of the image toward the top; (2) a second horizontal edge above the first horizontal edge is detected; and (3) a pair of first and second vertical edges is detected in the region sandwiched between the first horizontal edge and the second horizontal edge. That is, for vehicle detection by image processing, known conventional means are used, such as detecting, from a histogram (columnar diagram) of the edges in the region sandwiched between the white lines, horizontal and vertical edges that form top-and-bottom or left-and-right pairs.
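  • As an illustration only, the following sketch shows how such an edge-based search could be organized with Sobel gradients. The function names, the thresholds, and the assumption of a pre-cropped grayscale region of interest are placeholders for the example, not the patent's literal implementation.

```python
# Rough sketch of edge-based vehicle candidate detection (illustrative only).
import cv2
import numpy as np

def find_horizontal_edge_rows(gray_roi, strength_threshold=1000.0):
    """Return row indices, scanned from the bottom up, whose summed horizontal-edge
    response exceeds a threshold: candidates for the first/second lateral edges."""
    edge_y = np.abs(cv2.Sobel(gray_roi, cv2.CV_32F, 0, 1, ksize=3))  # responds to horizontal edges
    row_strength = edge_y.sum(axis=1)
    rows_bottom_up = np.arange(gray_roi.shape[0])[::-1]
    return [r for r in rows_bottom_up if row_strength[r] > strength_threshold]

def find_vertical_edge_pair(gray_roi, row_low, row_high, strength_threshold=500.0):
    """Within the band between the two lateral edges (row_high is the upper edge,
    i.e. the smaller row index), look for a left/right pair of vertical edges
    from the column-wise edge histogram."""
    band = gray_roi[row_high:row_low, :]
    edge_x = np.abs(cv2.Sobel(band, cv2.CV_32F, 1, 0, ksize=3))      # responds to vertical edges
    col_strength = edge_x.sum(axis=0)
    cols = np.where(col_strength > strength_threshold)[0]
    if cols.size >= 2:
        return int(cols[0]), int(cols[-1])   # leftmost and rightmost strong columns
    return None
```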
  • the distance calculation unit 22 detects the predetermined object Ob from the latest captured image Pt by template matching using the reference image TP as a template while enlarging or reducing the reference image TP or the latest captured image Pt. Thereby, the distance calculation unit 22 detects a new position and size of the predetermined object Ob on the latest captured image Pt. Then, the distance calculation unit 22 obtains a distance between the moving body and the predetermined object Ob based on the detected new position and size.
  • The direction θ and the distance z are given by the following equations (1) and (2), as disclosed in Patent Document 1:

    θ = atan(f / xc)   (1)
    z = f × w / wc   (2)

    where θ is the direction in which the object Ob exists, xc is the detected position of the center of the object Ob in the width direction on the image, f is the focal length of the imaging unit 1, w is the actual width of the object Ob, and wc is the width of the object Ob on the image.
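  • As an illustration only, the following sketch combines the scaled template matching and equations (1) and (2) above, assuming OpenCV and NumPy. The constant values, the scale set, and the function names are assumptions for the example; in practice the known width w and the focal length f in pixels would come from calibration.

```python
# Minimal sketch of template matching over scales plus equations (1) and (2).
import cv2
import numpy as np

FOCAL_LENGTH_PX = 1200.0   # f: focal length of imaging unit 1, in pixels (assumed value)
REAL_WIDTH_M = 1.7         # w: assumed real-world width of the object Ob, in metres

def match_reference(latest_image, reference_image, scales=(0.8, 0.9, 1.0, 1.1, 1.2)):
    """Search the latest captured image Pt for the reference image TP over several scales."""
    best = None
    for s in scales:
        templ = cv2.resize(reference_image, None, fx=s, fy=s)
        if templ.shape[0] > latest_image.shape[0] or templ.shape[1] > latest_image.shape[1]:
            continue
        result = cv2.matchTemplate(latest_image, templ, cv2.TM_CCOEFF_NORMED)
        _, score, _, top_left = cv2.minMaxLoc(result)
        if best is None or score > best[0]:
            best = (score, top_left, templ.shape[1])   # keep score, position, and width wc in pixels
    return best   # (correlation score, (x, y), wc) or None if no scale fits

def estimate_distance(wc_px):
    """Equation (2): z = f * w / wc."""
    return FOCAL_LENGTH_PX * REAL_WIDTH_M / wc_px

def estimate_direction(xc_px):
    """Equation (1): direction of the object from the detected centre position xc (xc != 0)."""
    return np.arctan(FOCAL_LENGTH_PX / xc_px)
```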
  • the state acquisition unit 23 acquires the imaging state of the imaging unit 1. More specifically, the state acquisition unit 23 acquires the imaging state of the imaging unit 1 based on a predetermined feature amount in the latest captured image Pt captured by the imaging unit 1. For example, preferably, the state acquisition unit 23 acquires a statistical value (for example, an average value or a median value) of a pixel value or a luminance value as the predetermined feature amount. Further, for example, preferably, the state acquisition unit 23 acquires a frequency characteristic in the latest captured image as the predetermined feature amount. When calculating the predetermined feature amount, the state acquisition unit 23 may use a part of the latest captured image.
  • the state acquisition unit 23 may acquire the imaging direction of the imaging unit 1 as the imaging state of the imaging unit 1.
  • The discard determination unit 24 determines whether or not the imaging state acquired by the state acquisition unit 23 satisfies a predetermined condition, and determines, based on the determination result, whether or not to discard the reference image TP stored in the reference information storage unit 31.
  • the discard determination unit 24 discards the reference image TP stored in the reference information storage unit 31 when it is determined that the imaging state acquired by the state acquisition unit 23 satisfies a predetermined condition. That is, the discard determination unit 24 discards the reference image TP in the reference information storage unit 31.
  • the discard determination unit 24 functionally includes a state determination unit 241, a holdability determination unit 242, and a discard control unit 243.
  • the state determination unit 241 determines whether the imaging state acquired by the state acquisition unit 23 satisfies a predetermined condition.
  • the holdability determination unit 242 determines whether or not to discard the reference image TP stored in the reference information storage unit 31 based on the determination result determined by the state determination unit 241.
  • the discard control unit 243 discards the reference image TP in the reference information storage unit 31 when it is determined that the imaging state acquired by the state acquisition unit 23 satisfies a predetermined condition.
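  • As an illustration only, the decomposition into the state determination unit 241, the holdability determination unit 242, and the discard control unit 243 could be organized as in the following minimal sketch; the class and method names are assumptions, and the condition check and reference store are supplied by the caller.

```python
# Minimal sketch of the discard determination unit 24 and its sub-units (illustrative only);
# `condition_check` plays the role of the state determination unit 241, and
# `reference_store` is any object offering has_reference() and discard().
class DiscardDeterminationUnit:
    def __init__(self, condition_check):
        self.condition_check = condition_check            # state determination unit 241

    def process(self, imaging_state, reference_store):
        satisfied = self.condition_check(imaging_state)   # does the imaging state satisfy the condition?
        if satisfied and reference_store.has_reference(): # holdability determination unit 242
            reference_store.discard()                     # discard control unit 243
        return satisfied
```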
  • When the continuous determination count n, which counts the consecutive determinations by the discard determination unit 24 that the imaging state acquired by the state acquisition unit 23 satisfies the predetermined condition, becomes equal to or greater than a threshold value Thn, the notification unit 25 causes the notification unit 4 to notify the outside that the imaging state acquired by the state acquisition unit 23 satisfies the predetermined condition.
  • The notification unit 4 is a device that is connected to the processing unit 2 and, in accordance with the control of the notification unit 25, notifies the outside that the imaging state acquired by the state acquisition unit 23 satisfies the predetermined condition.
  • the notification unit 4 includes, for example, a light emitting diode that emits light, and turns on (including the case of blinking) in accordance with the control of the notification unit 25 to notify the outside.
  • the notification unit 4 is configured to include a buzzer or a speaker that emits a sound, and notifies the outside by issuing a warning sound according to the control of the notification unit 25.
  • the notification unit 4 is configured to include a display device such as a CRT display or a liquid crystal display, for example, and notifies the outside by displaying a message indicating the above in accordance with the control of the notification unit 25.
  • FIG. 3 is a flowchart showing the operation of the distance measuring apparatus in the embodiment.
  • When distance measurement is started by, for example, an input from an unillustrated start switch or a distance-measurement start instruction switch, the imaging unit 1 captures images of the subject at a predetermined frame rate under the control of the control unit 21 of the processing unit 2, and sequentially outputs the time-series captured images to the processing unit 2.
  • In FIG. 3, the distance calculation unit 22 of the processing unit 2 first determines whether or not a reference image TP is stored in the reference information storage unit 31 of the storage unit 3 (S1). When, as a result of this determination, no reference image TP is stored in the reference information storage unit 31 (N), the distance calculation unit 22 executes the next process S3. On the other hand, when the reference image TP is stored in the reference information storage unit 31 (Y), the distance calculation unit 22 acquires the reference image TP for template matching against the captured image (input image) Pt (S2), and then executes process S3.
  • In process S3, the distance calculation unit 22 detects the predetermined object Ob. More specifically, when it is determined in process S1 that there is no reference image TP, the distance calculation unit 22 newly detects the predetermined object Ob in the captured image (input image) Pt by, for example, the above-described known technique. On the other hand, when it is determined in process S1 that the reference image TP exists, the distance calculation unit 22 detects the predetermined object Ob by performing template matching on the captured image (input image) Pt using the reference image TP acquired in process S2 as a template, and detects the new position and size of the predetermined object Ob on the captured image (input image) Pt.
  • the state acquisition unit 23 of the processing unit 2 acquires the imaging state of the imaging unit 1, and notifies the state determination unit 241 of the discard determination unit 24 of the acquired imaging state of the imaging unit 1 (S4).
  • Next, the state determination unit 241 of the discard determination unit 24 determines whether or not the imaging state of the imaging unit 1 satisfies a predetermined condition and notifies the holdability determination unit 242 of the determination result (S5). The acquisition of the imaging state in process S4 and the determination of the imaging state in process S5 will be described later.
  • When the imaging state of the imaging unit 1 does not satisfy the predetermined condition (N in S5), the state determination unit 241 notifies the distance calculation unit 22 accordingly, and the distance calculation unit 22 stores the image including the predetermined object Ob detected in process S3 in the reference information storage unit 31 as a new reference image TP (S6).
  • Next, the distance calculation unit 22 obtains the distance between the moving body and the predetermined object Ob based on the new position and size of the predetermined object Ob on the captured image (input image) Pt obtained in process S3. Then, the distance calculation unit 22 returns the process to process S1 in order to process the next captured image (next input image) sequentially input in time series.
  • On the other hand, when the imaging state of the imaging unit 1 satisfies the predetermined condition as a result of the determination in process S5 (Y), the state determination unit 241 notifies the holdability determination unit 242 of the discard determination unit 24 to that effect. Accordingly, the holdability determination unit 242 determines, based on the determination result of the state determination unit 241, whether or not to discard the reference image TP stored in the reference information storage unit 31. That is, the holdability determination unit 242 determines whether or not a reference image TP is stored in the reference information storage unit 31 (S7).
  • When the reference image TP is stored in the reference information storage unit 31 (Y in S7), the holdability determination unit 242 notifies the discard control unit 243 of the discard determination unit 24 to that effect. Accordingly, the discard control unit 243 discards the reference image TP stored in the reference information storage unit 31 (S9), and the process returns to process S1 in order to process the next captured image (next input image).
  • The reference image TP may be discarded by deleting the data of the reference image TP from the reference information storage unit 31, or by releasing the storage area in which the data of the reference image TP is stored.
  • On the other hand, when no reference image TP is stored in the reference information storage unit 31 (N in S7), the holdability determination unit 242 notifies the notification unit 25 that the imaging state of the imaging unit 1 satisfies the predetermined condition. When the notification unit 25 is continuously notified that the imaging state of the imaging unit 1 satisfies the predetermined condition, it counts the number of notifications as the continuous determination count n, and determines whether or not the counted continuous determination count n is equal to or greater than the threshold value Thn (S8).
  • When, as a result of the determination in process S8, the continuous determination count n is less than the threshold value Thn (N), the notification unit 25 returns the process to process S1 in order to process the next captured image (next input image) sequentially input in time series.
  • On the other hand, when the result of the determination in process S8 is that the continuous determination count n is equal to or greater than the threshold value Thn (Y), the notification unit 25 causes the notification unit 4 to notify the outside that the imaging state acquired by the state acquisition unit 23 satisfies the predetermined condition (S10).
  • the notification unit 25 returns the process to step S1 in order to process the next captured image (next input image) sequentially input in time series.
  • The processes S1 to S10 shown in FIG. 3 are executed for each captured image (input image) Pt sequentially input in time series as described above. Therefore, the case where the notification unit 25 is continuously notified in process S8 that the imaging state of the imaging unit 1 satisfies the predetermined condition is the case where the sequentially input captured images (input images) Pt are captured one after another in an imaging state that satisfies the predetermined condition.
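  • As an illustration only, the per-frame flow S1 to S10 of FIG. 3 could be condensed into the following sketch. The helper callables (detect, match, acquire_state, condition, distance_from, notify) and the assumption that a detection result is a dictionary carrying an "object_image" entry are placeholders standing in for the units described above, not the patent's literal implementation; resetting the consecutive count when the condition stops holding is implied by the word "continuously".

```python
# Condensed sketch of the per-frame flow S1-S10 in FIG. 3 (illustrative only).
from dataclasses import dataclass
from typing import Any, Callable, Optional

@dataclass
class ReferenceStore:                # stands in for the reference information storage unit 31
    reference: Optional[Any] = None
    def has_reference(self) -> bool: return self.reference is not None
    def save(self, image) -> None: self.reference = image
    def discard(self) -> None: self.reference = None

@dataclass
class RunCounter:                    # continuous determination count n
    value: int = 0
    def increment(self) -> None: self.value += 1
    def reset(self) -> None: self.value = 0

def process_frame(frame, store: ReferenceStore, counter: RunCounter, thn: int,
                  detect: Callable, match: Callable, acquire_state: Callable,
                  condition: Callable, distance_from: Callable, notify: Callable):
    # S1-S3: detect the object Ob, by template matching when a reference TP exists
    found = match(frame, store.reference) if store.has_reference() else detect(frame)
    state = acquire_state(frame)                      # S4: imaging state of the imaging unit 1
    if not condition(state):                          # S5: condition not satisfied
        if found is not None:
            store.save(found["object_image"])         # S6: keep the detected image as new TP
            print("distance:", distance_from(found))  # distance from the new position and size
        counter.reset()                               # the run of condition-satisfying frames ends
        return
    if store.has_reference():                         # S7
        store.discard()                               # S9: discard the reference image TP
    else:
        counter.increment()                           # S8: count consecutive determinations n
        if counter.value >= thn:
            notify("imaging state keeps satisfying the discard condition")  # S10
```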
  • FIG. 4 is a diagram for explaining the method of determining the imaging state when the median luminance value of the captured image is used as the imaging state of the imaging unit in the distance measuring apparatus of the embodiment, in a case where a vehicle equipped with the imaging unit enters a tunnel. FIG. 5 is a diagram for explaining the same determination method in a case where the vehicle equipped with the imaging unit exits the tunnel.
  • FIGS. 4A and 5A are diagrams showing the temporal change of the median luminance value in the time-series captured images (each frame); the horizontal axis is the frame (time) and the vertical axis is the luminance value. FIGS. 4B and 5B are diagrams showing the temporal change of the amount of change of the median value; the horizontal axis is the frame (time) and the vertical axis is the amount of change of the luminance value.
  • FIG. 6 is a diagram schematically illustrating captured images captured by the imaging unit before and after water droplets adhere to the imaging unit in the distance measuring apparatus of the embodiment. FIG. 6A schematically shows a captured image captured before water droplets adhere to the imaging unit, and FIG. 6B schematically shows a captured image captured after water droplets adhere to the imaging unit.
  • FIG. 7 is a diagram for describing a method for determining the imaging state when the frequency characteristics of the image are used as the imaging state of the imaging unit in the distance measuring apparatus according to the embodiment.
  • FIG. 7A shows each frequency characteristic of each captured image before and after water droplets adhere to the imaging unit
  • FIG. 7B illustrates a determination method according to the first aspect for determining whether or not water droplets have adhered to the imaging unit.
  • FIG. 7C is a diagram for explaining a determination method of the second mode for determining whether or not water droplets have adhered to the imaging unit.
  • FIG. 8 is a diagram for explaining an image region used by the state acquisition unit in the distance measuring apparatus according to the embodiment.
  • FIG. 8A is a diagram for explaining an image area of the first mode used by the state acquisition unit
  • FIG. 8B is a diagram for explaining an image region of the second mode used by the state acquisition unit.
  • FIG. 9 is a diagram for explaining a method for determining the imaging state when the steering angle of a vehicle equipped with the imaging unit is used as the imaging state of the imaging unit in the distance measuring apparatus according to the embodiment.
  • FIG. 9A is a diagram showing the time change of the steering angle, the horizontal axis is time, and the vertical axis is the steering angle.
  • FIG. 9B is a diagram showing the change over time in the change amount of the steering angle, the horizontal axis is time, and the vertical axis is the change amount of the steering angle.
  • The state acquisition unit 23 described above may be a state acquisition unit 23a that acquires the imaging state of the imaging unit 1 based on a predetermined feature amount in the latest captured image Pt, the most recent of the time-series captured images captured by the imaging unit 1.
  • Such a distance measuring device DM uses the latest captured image (input image) Pt imaged by the imaging unit 1, and therefore does not require a separate device for acquiring the imaging state of the imaging unit 1.
  • The predetermined feature amount is, for example, a predetermined statistic of the pixels in the latest captured image Pt, such as the average of the pixel values of the pixels in the latest captured image Pt, the median of those pixel values, the average of the luminance values of the pixels in the latest captured image Pt, or the median of those luminance values. Among these, the predetermined feature amount is preferably the median. More specifically, the temporal change of the median of the pixel values (luminance values) of the latest captured image Pt behaves as shown in FIGS. 4 and 5, for example when a vehicle equipped with the imaging unit 1 enters or exits a tunnel.
  • For example, as shown in FIGS. 4B and 5B, the discard determination unit 24 can determine whether or not the imaging state acquired by the state acquisition unit 23a satisfies the predetermined condition by determining whether or not the amount of change of the median value of the latest captured image Pt acquired by the state acquisition unit 23a is equal to or greater than a predetermined threshold Th1. That is, when the discard determination unit 24 determines that the amount of change of the median value of the latest captured image Pt is equal to or greater than the predetermined threshold Th1, it determines that the imaging state acquired by the state acquisition unit 23a satisfies the predetermined condition; when it determines that the amount of change is less than the predetermined threshold Th1, it determines that the imaging state does not satisfy the predetermined condition.
  • the predetermined threshold Th1 is statistically appropriately set by measuring a plurality of samples, for example.
  • The change in the median value shown in FIGS. 4 and 5 may occur not only when entering or exiting a tunnel, but also when passing into or out of a relatively long underpass, when the lighting changes, for example from front-lit to backlit or from backlit to front-lit, or when the imaging unit 1 is irradiated with the headlights of an oncoming vehicle.
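  • As an illustration only, the median-based check could look like the following sketch; the threshold value and the function names are assumptions, and in practice Th1 would be set statistically from a plurality of samples as described above.

```python
# Sketch of the median-luminance check of FIGS. 4 and 5 (illustrative only).
import numpy as np

TH1 = 40.0   # predetermined threshold Th1 (assumed value; set statistically in practice)

def median_luminance(gray_frame):
    """Predetermined feature amount: median of the luminance values of the frame."""
    return float(np.median(gray_frame))

def condition_satisfied(prev_median, curr_median, th1=TH1):
    """Discard-determination test: |change of the median value| >= Th1."""
    return abs(curr_median - prev_median) >= th1
```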
  • Because the imaging state may be unstable immediately after such a change, the distance measuring apparatus DM may be configured to further include a reference information update unit that stores, as a new reference image TP in the reference information storage unit 31, a captured image captured by the imaging unit 1 after a predetermined number of images have been captured following the latest captured image. By including such a reference information update unit, storing a captured image captured in this unstable state in the reference information storage unit 31 is avoided, and the distance measuring device DM can improve the detection accuracy.
  • Further, for example, the predetermined feature amount may be a frequency characteristic of the latest captured image Pt.
  • the frequency characteristic of the captured image P is a frequency spectrum that is a distribution of frequency components (intensities) included in the captured image P, and is an intensity with respect to the frequency.
  • Before a water droplet Dp adheres to the imaging unit 1, the captured image P1 is relatively clear as a whole, as schematically shown in FIG. 6A; after the water droplet Dp adheres, the captured image P2 is blurred in the region of the water droplet Dp, as schematically shown in FIG. 6B.
  • For this reason, as shown in FIG. 7A, the frequency characteristic of the captured image P1 before the water droplet Dp adheres differs from the frequency characteristic of the captured image P2 after the water droplet Dp adheres: compared with the frequency characteristic of the captured image P1 before adhesion, the frequency characteristic of the captured image P2 after adhesion has a profile whose intensity is higher on the low-frequency side (for example, the frequency range at or below a frequency fL) and lower on the high-frequency side (for example, the frequency range at or above a frequency fH).
  • Since the frequency characteristic of the captured image thus changes before and after the adhesion of a foreign substance, the discard determination unit 24 can determine whether or not the imaging state acquired by the state acquisition unit 23a satisfies the predetermined condition by, for example, determining whether or not the amount of change at the low frequency fL in the frequency characteristic of the latest captured image Pt acquired by the state acquisition unit 23a is equal to or greater than a predetermined threshold Th21, as shown in FIG. 7B. That is, when the discard determination unit 24 determines that the amount of change at the low frequency fL in the frequency characteristic of the latest captured image Pt is equal to or greater than the predetermined threshold Th21, it determines that the imaging state acquired by the state acquisition unit 23a satisfies the predetermined condition; when it determines that the amount of change is less than the predetermined threshold Th21, it determines that the imaging state does not satisfy the predetermined condition. Further, for example, as illustrated in FIG. 7B, the discard determination unit 24 can make the same determination based on whether or not the amount of change at the high frequency fH in the frequency characteristic of the latest captured image Pt acquired by the state acquisition unit 23a is equal to or greater than a predetermined threshold Th22: when the amount of change at the high frequency fH is equal to or greater than the predetermined threshold Th22, it determines that the imaging state satisfies the predetermined condition, and when the amount of change is less than the predetermined threshold Th22, it determines that the imaging state does not satisfy the predetermined condition. Further, for example, the discard determination unit 24 can make the determination based on whether or not the amount of change at the low frequency fL is equal to or greater than the predetermined threshold Th21 and the amount of change at the high frequency fH is equal to or greater than the predetermined threshold Th22: when both conditions hold, it determines that the imaging state acquired by the state acquisition unit 23a satisfies the predetermined condition, and otherwise it determines that the imaging state does not satisfy the predetermined condition.
  • Alternatively, as shown in FIG. 7C, the discard determination unit 24 can determine whether or not the imaging state acquired by the state acquisition unit 23a satisfies the predetermined condition by determining whether or not the intensity at the low frequency fL in the frequency characteristic of the latest captured image Pt acquired by the state acquisition unit 23a is equal to or greater than a predetermined threshold Th31: when the intensity at the low frequency fL is equal to or greater than the predetermined threshold Th31, it determines that the imaging state satisfies the predetermined condition, and when the intensity is less than the predetermined threshold Th31, it determines that the imaging state does not satisfy the predetermined condition. Further, for example, as illustrated in FIG. 7C, the discard determination unit 24 can make the same determination based on whether or not the intensity at the high frequency fH in the frequency characteristic of the latest captured image Pt acquired by the state acquisition unit 23a is equal to or greater than a predetermined threshold Th32: when the intensity at the high frequency fH is equal to or greater than the predetermined threshold Th32, it determines that the imaging state satisfies the predetermined condition, and when the intensity is less than the predetermined threshold Th32, it determines that the imaging state does not satisfy the predetermined condition. Further, for example, the discard determination unit 24 can make the determination based on whether or not the intensity at the low frequency fL is equal to or greater than the predetermined threshold Th31 and the intensity at the high frequency fH is equal to or greater than the predetermined threshold Th32: when both conditions hold, it determines that the imaging state acquired by the state acquisition unit 23a satisfies the predetermined condition, and otherwise it determines that the imaging state does not satisfy the predetermined condition.
  • the predetermined threshold values Th21, Th22, Th31, and Th32 are statistically appropriately set, for example, by measuring a plurality of samples.
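  • As an illustration only, one way to realize such frequency-based checks is sketched below using a two-dimensional FFT; the normalized band limits, the threshold values, and the decision that combines the two change amounts are assumptions for the example.

```python
# Sketch of the frequency-characteristic checks of FIG. 7 (illustrative only).
import numpy as np

def band_intensities(gray_frame, f_low=0.05, f_high=0.25):
    """Return (low-band intensity, high-band intensity) of the frame's spectrum,
    with frequencies normalised to the Nyquist range [0, 0.5]."""
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(gray_frame.astype(float))))
    h, w = gray_frame.shape
    fy = np.fft.fftshift(np.fft.fftfreq(h))[:, None]
    fx = np.fft.fftshift(np.fft.fftfreq(w))[None, :]
    radius = np.sqrt(fx ** 2 + fy ** 2)
    low = spectrum[radius <= f_low].sum()     # intensity at or below fL
    high = spectrum[radius >= f_high].sum()   # intensity at or above fH
    return low, high

def droplet_suspected(prev_bands, curr_bands, th21=1.0e6, th22=1.0e5):
    """One variant of FIG. 7B: change of the low-frequency intensity >= Th21
    and change of the high-frequency intensity >= Th22."""
    d_low = abs(curr_bands[0] - prev_bands[0])
    d_high = abs(curr_bands[1] - prev_bands[1])
    return d_low >= th21 and d_high >= th22
```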
  • In the above description, the state acquisition unit 23 acquires one feature amount and the discard determination unit 24 determines the imaging state based on that one feature amount; however, the state acquisition unit 23 may acquire a plurality of feature amounts, and the discard determination unit 24 may determine the imaging state based on the plurality of feature amounts acquired by the state acquisition unit 23.
  • When calculating the predetermined feature amount, the state acquisition unit 23a may use a part of the latest captured image Pt. The part of the latest captured image Pt may be, for example, a rectangular region Pt11 extending from the lower end closest to the moving body up to the vanishing point in the latest captured image Pt1, as shown in FIG. 8A.
  • Since the area above the vanishing point of the road surface is mostly sky, it can be excluded from the area used for determining an obstacle to the movement of the moving body. By using such a region, the distance measuring device DM can appropriately set the image area from which the feature amount is obtained from the captured image P. Further, since what matters is whether an obstacle to the movement of the moving body is present in the traveling area of the moving body, the part of the latest captured image Pt may be, for example, the traveling area Pt21 of the moving body, as shown in FIG. 8B.
  • the traveling area Pt21 of the moving object may be, for example, a triangular area (virtual traveling area) surrounded by both ends of the lower end closest to the moving object and the vanishing point in the latest captured image Pt2.
  • the travel area Pt21 may be a triangular area (actual travel area) surrounded by a lower end closest to the moving body in the latest captured image Pt2 and a pair of white lines drawn on both ends of the road surface.
  • a pair of white lines drawn at both ends of the road surface intersect at a vanishing point.
  • the vanishing point is a point where parallel lines that are actually parallel intersect in the image, and occurs because the captured image is formed using the perspective method.
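  • As an illustration only, restricting the feature-amount computation to such regions could be done as in the following sketch; the vanishing-point coordinates are assumed to be supplied by the caller, and the function names are placeholders.

```python
# Sketch of the region restrictions of FIG. 8 (illustrative only).
import numpy as np

def lower_rectangle(frame, vanishing_row):
    """FIG. 8A-style region: everything from the vanishing point down to the
    lower end closest to the moving body."""
    return frame[vanishing_row:, :]

def travel_triangle_mask(frame_shape, vanishing_point):
    """FIG. 8B-style region: triangle joining the vanishing point and the two
    bottom corners of the image (virtual travel region)."""
    h, w = frame_shape[:2]
    vx, vy = vanishing_point
    mask = np.zeros((h, w), dtype=bool)
    for y in range(vy, h):
        t = (y - vy) / max(h - 1 - vy, 1)      # 0 at the vanishing point, 1 at the bottom row
        left = int(round(vx * (1 - t)))        # left edge sweeps out to the bottom-left corner
        right = int(round(vx + (w - 1 - vx) * t))  # right edge sweeps out to the bottom-right corner
        mask[y, left:right + 1] = True
    return mask
```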
  • The state acquisition unit 23 described above may instead be a state acquisition unit 23b that acquires the imaging direction of the imaging unit 1 as the imaging state of the imaging unit 1. Since such a distance measuring device DM uses the imaging direction of the imaging unit 1 as the imaging state of the imaging unit 1, the moving direction of the moving body can be used in its place, and it is possible to cope with a change in the moving direction of the moving body, such as a right or left turn.
  • Since the imaging unit 1 mounted on the moving body is normally arranged so that its imaging direction coincides with the moving direction of the moving body, as described above, the imaging direction of the imaging unit 1 can be obtained by acquiring the moving direction of the moving body. More specifically, for example, the distance measuring device DM described above further includes a gyro sensor that is mounted on the moving body and detects the moving direction of the moving body, and the state acquisition unit 23b acquires the moving direction of the moving body, that is, the imaging direction of the imaging unit 1, based on the detection result of the gyro sensor.
  • Alternatively, the state acquisition unit 23b acquires steering angle information from the vehicle information transmitted over a so-called CAN (Controller Area Network), and acquires the moving direction of the moving body, that is, the imaging direction of the imaging unit 1, based on the acquired steering angle information.
  • The vehicle information also includes the lateral acceleration, the yaw rate, operation information of the skid prevention device, and the like, and the state acquisition unit 23b may acquire the moving direction of the moving body, that is, the imaging direction of the imaging unit 1, taking these pieces of information into account.
  • The discard determination unit 24 can determine whether or not the imaging state satisfies the predetermined condition by determining whether or not the imaging direction of the imaging unit 1 acquired by the state acquisition unit 23b satisfies a predetermined condition. For example, when the state acquisition unit 23b uses the steering angle as the imaging direction of the imaging unit 1, the steering angle behaves as shown in FIG. 9A: it starts to increase as the driver starts turning the steering wheel, is maintained while the moving body changes direction (because the steering wheel is held), and starts to decrease as the steering wheel is returned once the moving body faces the direction intended by the driver.
  • Therefore, as shown in FIG. 9A, the discard determination unit 24 can determine whether or not the imaging state acquired by the state acquisition unit 23b satisfies the predetermined condition by determining whether or not the steering angle acquired by the state acquisition unit 23b is equal to or greater than a predetermined threshold Th41. That is, when the discard determination unit 24 determines that the steering angle acquired by the state acquisition unit 23b is equal to or greater than the predetermined threshold Th41, it determines that the imaging state acquired by the state acquisition unit 23b satisfies the predetermined condition; when it determines that the steering angle is less than the predetermined threshold Th41, it determines that the imaging state does not satisfy the predetermined condition.
  • Further, for example, as shown in FIG. 9B, the discard determination unit 24 can determine whether or not the imaging state acquired by the state acquisition unit 23b satisfies the predetermined condition by determining whether or not the amount of change of the steering angle acquired by the state acquisition unit 23b is equal to or greater than a predetermined threshold Th42. That is, when the discard determination unit 24 determines that the amount of change (absolute value) of the steering angle acquired by the state acquisition unit 23b is equal to or greater than the predetermined threshold Th42, it determines that the imaging state acquired by the state acquisition unit 23b satisfies the predetermined condition; when it determines that the amount of change (absolute value) of the steering angle is less than the predetermined threshold Th42, it determines that the imaging state does not satisfy the predetermined condition.
  • These predetermined threshold values Th41 and Th42 are statistically appropriately set by measuring, for example, a plurality of samples.
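  • As an illustration only, the steering-angle checks could be combined as in the following sketch; the threshold values and the choice to trigger on either test are assumptions for the example.

```python
# Sketch of the steering-angle checks of FIG. 9 (illustrative only).
TH41_DEG = 15.0   # assumed threshold Th41 on the steering angle itself
TH42_DEG = 5.0    # assumed threshold Th42 on the change of the steering angle

def steering_condition(prev_angle_deg, curr_angle_deg, th41=TH41_DEG, th42=TH42_DEG):
    angle_large = abs(curr_angle_deg) >= th41                     # FIG. 9A test
    change_large = abs(curr_angle_deg - prev_angle_deg) >= th42   # FIG. 9B test
    return angle_large or change_large
```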
  • As described above, in the distance measuring device DM and the distance measuring method implemented in it according to the present embodiment, it is determined whether or not the imaging state satisfies a predetermined condition, and whether or not to discard the reference image TP stored in the reference information storage unit 31 is determined based on the determination result. For this reason, since the distance measuring device DM and the distance measuring method in this embodiment can decide, based on that determination, whether the reference image TP stored in the reference information storage unit 31 should be discarded, they can reduce the above-described erroneous detection and non-detection and improve the detection accuracy even when the imaging state changes.
  • Further, since the distance measuring device DM and the distance measuring method in the present embodiment discard the reference image TP stored in the reference information storage unit 31 when it is determined that the imaging state satisfies the predetermined condition, the above-described erroneous detection and non-detection can be reliably prevented.
  • In addition, by the notification of the notification unit 4 under the control of the notification unit 25, the user can recognize that the imaging state has been continuously determined to satisfy the predetermined condition. When the imaging state continuously satisfies the predetermined condition, a non-temporary abnormality, such as a foreign substance stuck to the imaging unit 1 or a failure of the distance measuring device DM, may have occurred. Since the distance measuring device DM and the distance measuring method in the present embodiment include the notification unit 25, they can therefore prompt the user to take appropriate measures, such as inspecting the imaging unit 1. From this viewpoint, the threshold value Thn is set to an appropriate number of times according to the frame rate, for example 15 or 30.
  • A distance measuring apparatus according to one aspect includes: an imaging unit that is mounted on a moving body and images a subject continuously in time; a reference information storage unit that stores, as a reference image, one of the time-series captured images captured by the imaging unit; a distance calculation unit that obtains the distance to a predetermined object included in the latest captured image based on the latest of the time-series captured images captured by the imaging unit and the reference image stored in the reference information storage unit; a state acquisition unit that acquires the imaging state of the imaging unit; and a discard determination unit that determines whether or not the imaging state acquired by the state acquisition unit satisfies a predetermined condition and determines, based on the determination result, whether or not to discard the reference image stored in the reference information storage unit.
  • A distance measuring method according to another aspect includes: an imaging step of imaging a subject continuously in time with an imaging unit mounted on a moving body; a reference information storing step of storing, as a reference image in a reference information storage unit, one of the time-series captured images captured in the imaging step; a distance calculation step of obtaining the distance to a predetermined object included in the latest captured image based on the latest of the time-series captured images captured in the imaging step and the reference image stored in the reference information storage unit; a state acquisition step of acquiring the imaging state of the imaging unit; and a discard determination step of determining whether or not the imaging state acquired in the state acquisition step satisfies a predetermined condition and determining, based on the determination result, whether or not to discard the reference image stored in the reference information storage unit.
  • In such a distance measuring apparatus and distance measuring method, it is determined whether or not the imaging state satisfies a predetermined condition, and whether or not to discard the reference image stored in the reference information storage unit is determined based on the determination result. For this reason, since the distance measuring apparatus and the distance measuring method can decide, based on that determination, whether the reference image stored in the reference information storage unit should be discarded, the detection accuracy can be improved even when the imaging state changes.
  • In another aspect, when the discard determination unit determines that the imaging state acquired by the state acquisition unit satisfies the predetermined condition, the discard determination unit discards the reference image stored in the reference information storage unit. The distance measuring device further includes a reference information update unit that stores, as a new reference image in the reference information storage unit, a captured image captured by the imaging unit after a predetermined number of images have been captured following the latest captured image.
  • Since such a distance measuring device discards the reference image stored in the reference information storage unit when it is determined that the imaging state satisfies the predetermined condition, the above-described erroneous detection and non-detection can be reliably prevented.
  • In another aspect, the state acquisition unit acquires the imaging state of the imaging unit based on a predetermined feature amount in the latest captured image captured by the imaging unit.
  • the state acquisition unit acquires a statistic (for example, an average value or a median value) of a pixel value or a luminance value as the predetermined feature amount.
  • the state acquisition unit acquires a frequency characteristic in the latest captured image as the predetermined feature amount.
  • Since such a distance measuring device acquires the imaging state of the imaging unit based on a predetermined feature amount in the latest captured image captured by the imaging unit, a separate apparatus for acquiring the imaging state of the imaging unit is not required.
  • the state acquisition unit uses a part of the latest captured image.
  • Such a distance measuring device acquires the imaging state of the imaging unit by using a part of the latest captured image, so that the load of information processing can be reduced.
  • the state acquisition unit acquires an imaging direction of the imaging unit as an imaging state of the imaging unit.
  • Normally, the imaging unit is arranged on the moving body so that its imaging direction coincides with the moving direction of the moving body. Therefore, since such a distance measuring device uses the imaging direction of the imaging unit as the imaging state of the imaging unit, the moving direction of the moving body can be used in its place, and it is possible to cope with a change in the moving direction of the moving body, such as a right or left turn.
  • In another aspect, the distance measuring device further includes a notification unit that notifies the outside that the imaging state acquired by the state acquisition unit satisfies the predetermined condition when the number of consecutive determinations by the discard determination unit that the imaging state acquired by the state acquisition unit satisfies the predetermined condition is equal to or greater than a threshold value.
  • the user can recognize that the imaging state is continuously determined to satisfy a predetermined condition by the notification of the notification unit. Therefore, since such a distance measuring device includes the notification unit, it is possible to prompt the user to take appropriate measures such as checking the imaging unit.
  • According to the present invention, a distance measuring device and a distance measuring method can be provided.

Abstract

In the distance measurement device and distance measurement method according to the present invention, the distance to a prescribed object included in a latest photographed image that is the most recent photographed image from among chronological photographed images photographed by a photography unit is determined on the basis of the latest photographed image and a reference image stored in a reference information storage unit. The photography state of the photography unit is obtained, a determination is made about whether the obtained photography state meets a prescribed condition, and on the basis of the determination result, a determination is made about whether to discard the reference image stored in the reference information storage unit.

Description

Distance measurement device and distance measurement method
The present invention relates to a distance measuring device and a distance measuring method for measuring, from an image, the distance to a predetermined object.
In recent years, as part of efforts to improve safety, systems have been researched and developed in which a camera mounted on a moving body such as a vehicle or a robot measures the distance to a predetermined object existing in the moving direction of the moving body, and the moving body is controlled according to the distance to the object. As a distance measuring device used in such a system, there is, for example, the device disclosed in Patent Document 1.
In the forward vehicle tracking system disclosed in Patent Document 1, the distance to another vehicle is obtained as follows. First, another vehicle is detected from an image captured by a camera mounted on the host vehicle, and the image of the detected other vehicle is defined and stored as a reference template. Next, the correlation between an input image newly captured by the camera and the reference template is computed while enlarging or reducing the input image or the reference template, whereby the new position and size of the other vehicle in the input image are detected. Then, the distance between the host vehicle and the other vehicle is obtained based on the detected position and size of the other vehicle.
In the distance measuring method disclosed in Patent Document 1, the distance is obtained using an input image newly captured by the camera and the reference template obtained from an image captured by the camera before that input image. For this reason, when the imaging state, such as the imaging direction of the camera or the exposure conditions, changes relatively greatly, the input image and the reference template no longer match, and as a result the new position and size of the other vehicle in the input image may be erroneously detected or not detected at all.
Patent Document 1: JP 2004-112144 A
The present invention has been made in view of the above circumstances, and an object thereof is to provide a distance measuring device and a distance measuring method capable of improving detection accuracy even when the imaging state changes.
In the distance measuring device and the distance measuring method according to the present invention, the distance to a predetermined object included in the latest captured image, which is the most recent of the time-series captured images captured by the imaging unit, is obtained based on that latest captured image and the reference image stored in the reference information storage unit. Here, the imaging state of the imaging unit is acquired, it is determined whether or not the acquired imaging state satisfies a predetermined condition, and it is determined, based on the determination result, whether or not to discard the reference image stored in the reference information storage unit. For this reason, such a distance measuring device and distance measuring method can improve detection accuracy even when the imaging state changes.
The above and other objects, features and advantages of the present invention will become apparent from the following detailed description and the accompanying drawings.
FIG. 1 is a block diagram showing the configuration of the distance measuring device in the embodiment.
FIG. 2 is a diagram for explaining the distance calculation method in the distance measuring device of the embodiment.
FIG. 3 is a flowchart showing the operation of the distance measuring device in the embodiment.
FIG. 4 is a diagram for explaining the method of determining the imaging state when the median luminance value of the captured image is used as the imaging state of the imaging unit in the distance measuring device of the embodiment, in a case where a vehicle equipped with the imaging unit enters a tunnel.
FIG. 5 is a diagram for explaining the method of determining the imaging state when the median luminance value of the captured image is used as the imaging state of the imaging unit in the distance measuring device of the embodiment, in a case where a vehicle equipped with the imaging unit exits a tunnel.
FIG. 6 is a diagram schematically showing captured images captured by the imaging unit before and after a water droplet adheres to the imaging unit in the distance measuring device of the embodiment.
FIG. 7 is a diagram for explaining the method of determining the imaging state when the frequency characteristic of the captured image is used as the imaging state of the imaging unit in the distance measuring device of the embodiment.
FIG. 8 is a diagram for explaining the image region used by the state acquisition unit in the distance measuring device of the embodiment.
FIG. 9 is a diagram for explaining the method of determining the imaging state when the steering angle of a vehicle equipped with the imaging unit is used as the imaging state of the imaging unit in the distance measuring device of the embodiment.
Hereinafter, an embodiment of the present invention will be described with reference to the drawings. Components given the same reference numerals in the figures are identical, and their description is omitted where appropriate. In this specification, a reference numeral without a suffix denotes a component generically, and a reference numeral with a suffix denotes an individual component.
FIG. 1 is a block diagram showing the configuration of the distance measuring device according to the embodiment. FIG. 2 is a diagram for explaining the distance calculation method in the distance measuring device of the embodiment. FIG. 2A illustrates the reference image extracted from a captured image, and FIG. 2B illustrates the region in the latest captured image that matches the reference image as found by pattern matching (correlation processing).
The distance measuring device according to the embodiment measures, based on a captured image taken by an imaging unit mounted on a moving body, the distance from the moving body to a predetermined object included in the captured image. The moving body is, for example, a vehicle such as an automobile or a train, a robot having a locomotion function, or a movement support device for a visually impaired person. The moving body may be self-propelled.
As shown in FIG. 1, the distance measuring device DM of the present embodiment includes, for example, an imaging unit 1, a processing unit 2, a storage unit 3, and a notification unit 4.
The imaging unit 1 is a device that is mounted on the moving body (not shown) and captures images of a subject continuously in time. Preferably, the imaging unit 1 is arranged on the moving body so that its imaging direction (optical axis direction) coincides with the moving direction of the moving body. For example, when mounted on a vehicle, the imaging unit 1 is arranged, for example, on the dashboard with its imaging direction (optical axis direction) facing forward. The imaging unit 1 includes, for example, an imaging optical system, a color or monochrome image sensor, and an image processing unit: the imaging optical system forms an optical image of the subject on the imaging surface (light receiving surface) of the image sensor, the image sensor photoelectrically converts the optical image of the subject, and the image processing unit generates a captured image of the subject from the signal obtained by this photoelectric conversion. The imaging unit 1 captures the subject continuously at predetermined time intervals, for example at a predetermined frame rate such as 15 frames/second, 24 frames/second, or 30 frames/second. The imaging unit 1 is connected to the processing unit 2 and sequentially outputs the generated captured images to the processing unit 2.
The storage unit 3 is connected to the processing unit 2 and includes nonvolatile storage elements such as a ROM (Read Only Memory) or an EEPROM (Electrically Erasable Programmable Read Only Memory) that store in advance the various programs executed by the processing unit 2 and the data required for their execution, a volatile storage element such as a RAM (Random Access Memory) serving as a so-called working memory of the processing unit 2, and their peripheral circuits. The storage unit 3 functionally includes a reference information storage unit 31 that stores a reference image described later.
The processing unit 2 controls each unit of the distance measuring device DM according to the function of that unit in order to measure the distance from the moving body to a predetermined object included in the captured image. The processing unit 2 includes, for example, a CPU (Central Processing Unit) and its peripheral circuits. By executing a ranging program for measuring the distance from the moving body to a predetermined object included in the captured image, the processing unit 2 functionally implements a control unit 21, a distance calculation unit 22, a state acquisition unit 23, a discard determination unit 24, and a notification unit 25. The control unit 21 controls the entire distance measuring device DM.
The distance calculation unit 22 stores, as a reference image, one of the time-series captured images taken continuously in time by the imaging unit 1 in the reference information storage unit 31, and obtains the distance to a predetermined object included in the latest captured image based on the latest of the time-series captured images taken by the imaging unit 1 and on the reference image stored in the reference information storage unit 31.
The reference image may be the entire captured image, from which an image of a preset predetermined object is extracted when the distance is obtained; in the present embodiment, however, the reference image is an image of the preset predetermined object extracted from a captured image, that is, a part of the captured image. Therefore, at a predetermined timing (for example, at the start of distance measurement, or when no reference image is stored in the reference information storage unit 31), the distance calculation unit 22 of the present embodiment extracts the image of the preset predetermined object (object image) from a captured image and stores the extracted object image in the reference information storage unit 31 as the reference image. More specifically, as shown in FIG. 2A, the distance calculation unit 22 detects the preset predetermined object Ob in a captured image Pp taken temporally before the latest captured image, cuts out the rectangular region surrounding the whole of the detected predetermined object Ob (object image) from the captured image Pp, and stores the cut-out object image in the reference information storage unit 31 as the reference image TP. When the predetermined object is not present in the image, the processing returns to the beginning without extracting a reference image TP. The predetermined object Ob is, for example, a vehicle (automobile, train, etc.), a two-wheeled vehicle (bicycle, motorcycle, etc.), a person, an animal, an artificial structure, or any other object that can be an obstacle to the movement of the moving body. When the predetermined object Ob is an automobile, known conventional means of vehicle detection by image processing, as disclosed in Patent Document 1, can be used: for example, (1) a first horizontal edge (lateral edge) lying between the white lines drawn on the road is detected from the bottom of the image toward the top, (2) a second horizontal edge above the first horizontal edge is detected, and (3) a pair of first and second vertical edges in the region between the first and second horizontal edges is detected. That is, known conventional means such as detecting, from a histogram of the edges lying between the white lines, the first and second horizontal edges forming an upper and lower pair, or the first and second vertical edges forming a left and right pair, are used for vehicle detection by image processing.
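As a minimal illustrative sketch (not part of the original disclosure), the extraction and storage of the reference image TP could look like the following Python fragment; the bounding box of the detected object Ob is assumed to be supplied by a separate detection step, and the variable and function names are hypothetical.

```python
import numpy as np

def store_reference_image(captured_image: np.ndarray, bbox, reference_store: dict) -> None:
    """Cut out the rectangle surrounding the detected object Ob and keep it as reference image TP.

    bbox = (x, y, w, h) is assumed to come from a separate detection step,
    e.g. the horizontal/vertical edge search described above.
    """
    x, y, w, h = bbox
    reference_store["TP"] = captured_image[y:y + h, x:x + w].copy()

def reference_available(reference_store: dict) -> bool:
    """True when a reference image TP is currently stored (cf. process S1)."""
    return "TP" in reference_store
```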
The distance calculation unit 22 then detects the predetermined object Ob in the latest captured image Pt by template matching using the reference image TP as the template, while enlarging or reducing the reference image TP or the latest captured image Pt. The distance calculation unit 22 thereby detects the new position and size of the predetermined object Ob in the latest captured image Pt, and obtains the distance between the moving body and the predetermined object Ob based on the detected new position and size. As disclosed in Patent Document 1, the distance z is given by the following equations (1) and (2):

θ = atan(f / xc)   ... (1)
z = f · w / wc   ... (2)

Here, θ is the bearing of the object Ob detected at the position xc, the center of the object Ob in the width direction on the image; f is the focal length of the imaging unit 1; w is the actual width of the object Ob; and wc is the width of the object Ob on the image.
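A minimal sketch of how the multi-scale template matching and equations (1) and (2) could be realized, assuming OpenCV and NumPy are available; the scale range, the normalized-correlation score, and the helper names are illustrative assumptions rather than part of the original disclosure.

```python
import cv2
import numpy as np

def match_reference(latest_image, reference_tp, scales=np.linspace(0.7, 1.4, 15)):
    """Multi-scale template matching: returns (score, xc, wc) of the best match,
    where xc is the horizontal center and wc the width of the matched region."""
    best_score, best_xc, best_wc = -1.0, None, None
    for s in scales:
        templ = cv2.resize(reference_tp, None, fx=s, fy=s)
        th, tw = templ.shape[:2]
        if th > latest_image.shape[0] or tw > latest_image.shape[1]:
            continue  # skip scales at which the template no longer fits in the frame
        result = cv2.matchTemplate(latest_image, templ, cv2.TM_CCOEFF_NORMED)
        _, max_val, _, max_loc = cv2.minMaxLoc(result)
        if max_val > best_score:
            best_score, best_xc, best_wc = max_val, max_loc[0] + tw / 2.0, tw
    return best_score, best_xc, best_wc

def bearing_and_distance(f, xc, w, wc):
    """Equations (1) and (2) as written in the text: theta = atan(f / xc), z = f * w / wc."""
    theta = np.arctan(f / xc)
    z = f * w / wc
    return theta, z
```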
The state acquisition unit 23 acquires the imaging state of the imaging unit 1. More specifically, the state acquisition unit 23 acquires the imaging state of the imaging unit 1 based on a predetermined feature amount of the latest captured image Pt taken by the imaging unit 1. For example, the state acquisition unit 23 preferably acquires, as the predetermined feature amount, a statistic of the pixel values or luminance values (for example, the average or the median). As another example, the state acquisition unit 23 preferably acquires, as the predetermined feature amount, a frequency characteristic of the latest captured image. When calculating the predetermined feature amount, the state acquisition unit 23 may use only a part of the latest captured image.
The state acquisition unit 23 may instead acquire the imaging direction of the imaging unit 1 as the imaging state of the imaging unit 1.
The discard determination unit 24 determines whether or not the imaging state acquired by the state acquisition unit 23 satisfies a predetermined condition, and determines, based on the result of that determination, whether or not to discard the reference image TP stored in the reference information storage unit 31. When the discard determination unit 24 determines that the imaging state acquired by the state acquisition unit 23 satisfies the predetermined condition, it discards the reference image TP stored in the reference information storage unit 31; that is, the discard determination unit 24 causes the reference information storage unit 31 to discard the reference image TP.
More specifically, the discard determination unit 24 functionally includes a state determination unit 241, a holdability determination unit 242, and a discard control unit 243. The state determination unit 241 determines whether or not the imaging state acquired by the state acquisition unit 23 satisfies the predetermined condition. The holdability determination unit 242 determines, based on the result of the determination by the state determination unit 241, whether or not to discard the reference image TP stored in the reference information storage unit 31. The discard control unit 243 causes the reference information storage unit 31 to discard the reference image TP when it is determined that the imaging state acquired by the state acquisition unit 23 satisfies the predetermined condition.
When the number of consecutive determinations n, counted while the discard determination unit 24 continuously determines that the imaging state acquired by the state acquisition unit 23 satisfies the predetermined condition, is equal to or greater than a threshold Thn, the notification unit 25 causes the notification unit 4 to notify the outside that the imaging state acquired by the state acquisition unit 23 satisfies the predetermined condition.
The notification unit 4 is a device that is connected to the processing unit 2 and, under the control of the notification unit 25, notifies the outside that the imaging state acquired by the state acquisition unit 23 satisfies the predetermined condition. For example, the notification unit 4 includes a light emitting diode and lights up (or blinks) under the control of the notification unit 25 to inform the outside of this fact. As another example, the notification unit 4 includes a buzzer or a speaker and emits a warning sound under the control of the notification unit 25. As yet another example, the notification unit 4 includes a display device such as a CRT display or a liquid crystal display and displays a message to that effect under the control of the notification unit 25.
Next, the operation of the distance measuring device DM in the present embodiment is described below, focusing on the retention or discarding of the reference image in accordance with the imaging conditions. FIG. 3 is a flowchart showing the operation of the distance measuring device in the embodiment.
In the distance measuring device DM of the present embodiment, when distance measurement is started by, for example, an input from a start switch or a distance-measurement start instruction switch (not shown), the imaging unit 1 captures images of the subject at a predetermined frame rate under the control of the control unit 21 of the processing unit 2 and sequentially outputs the time-series captured images to the processing unit 2.
When a captured image (input image) Pt is input, the distance calculation unit 22 of the processing unit 2 first determines, as shown in FIG. 3, whether or not a reference image TP is stored in (exists in) the reference information storage unit 31 of the storage unit 3 (S1). If the result of this determination is that no reference image TP is stored in the reference information storage unit 31 (N), the distance calculation unit 22 executes the next process S3. On the other hand, if the result of the determination in process S1 is that a reference image TP is stored in the reference information storage unit 31 (Y), the distance calculation unit 22 acquires the reference image TP in order to perform template matching on the captured image (input image) Pt (S2), and then executes process S3.
In process S3, the distance calculation unit 22 detects the predetermined object Ob. More specifically, when it is determined in process S1 that there is no reference image TP, the distance calculation unit 22 newly detects the predetermined object Ob in the captured image (input image) Pt by, for example, the known technique described above. On the other hand, when it is determined in process S1 that a reference image TP exists, the distance calculation unit 22 detects the predetermined object Ob in the captured image (input image) by template matching using the reference image TP acquired in process S2 as the template, and detects the new position and size of the predetermined object Ob in the captured image (input image) Pt.
Next, the state acquisition unit 23 of the processing unit 2 acquires the imaging state of the imaging unit 1 and notifies the state determination unit 241 of the discard determination unit 24 of the acquired imaging state (S4). Upon receiving the notification of the imaging state of the imaging unit 1, the state determination unit 241 determines whether or not the imaging state of the imaging unit 1 satisfies the predetermined condition and notifies the holdability determination unit 242 of the determination result (S5). The acquisition of the imaging state in process S4 and the determination of the imaging state in process S5 are described later.
If the result of the determination in process S5 is that the imaging state of the imaging unit 1 does not satisfy the predetermined condition (N), the state determination unit 241 notifies the distance calculation unit 22 to that effect, whereupon the distance calculation unit 22 stores the image including the predetermined object Ob detected in process S3 in the reference information storage unit 31 as a new reference image TP (S6). When it was determined in process S1 that a reference image TP was stored in the reference information storage unit 31, the distance calculation unit 22 obtains the distance between the moving body and the predetermined object Ob based on the new position and size of the predetermined object Ob in the captured image (input image) Pt obtained in process S3. The distance calculation unit 22 then returns the processing to process S1 in order to process the next captured image (next input image) input sequentially in time series.
On the other hand, if the result of the determination in process S5 is that the imaging state of the imaging unit 1 satisfies the predetermined condition (Y), the state determination unit 241 notifies the holdability determination unit 242 of the discard determination unit 24 to that effect, whereupon the holdability determination unit 242 determines, based on the result of the determination by the state determination unit 241, whether or not to discard the reference image TP stored in the reference information storage unit 31. That is, the holdability determination unit 242 determines whether or not a reference image TP is stored in (exists in) the reference information storage unit 31 (S7).
If the result of the determination in process S7 is that a reference image TP is stored in the reference information storage unit 31 (Y), the holdability determination unit 242 notifies the discard control unit 243 of the discard determination unit 24 to that effect, the discard control unit 243 causes the reference information storage unit 31 to discard the stored reference image TP (S9), and the processing returns to process S1 in order to process the next captured image (next input image). In process S9, the reference image TP may be discarded, for example, by erasing the data of the reference image TP itself from the reference information storage unit 31, or by releasing the storage area in which the data of the reference image TP was stored.
On the other hand, if the result of the determination in process S7 is that no reference image TP is stored in the reference information storage unit 31 (N), the holdability determination unit 242 notifies the notification unit 25 that the imaging state of the imaging unit 1 satisfies the predetermined condition. When it is continuously notified that the imaging state of the imaging unit 1 satisfies the predetermined condition, the notification unit 25 counts the number of such notifications as the number of consecutive determinations n and determines whether or not the counted number of consecutive determinations n is equal to or greater than the threshold Thn (S8).
If the result of the determination in process S8 is that the number of consecutive determinations n is less than the threshold Thn (N), the notification unit 25 returns the processing to process S1 in order to process the next captured image (next input image) input sequentially in time series.
On the other hand, if the result of the determination in process S8 is that the number of consecutive determinations n is equal to or greater than the threshold Thn (Y), the notification unit 25 causes the notification unit 4 to notify the outside that the imaging state acquired by the state acquisition unit 23 satisfies the predetermined condition (S10). The notification unit 25 then returns the processing to process S1 in order to process the next captured image (next input image) input sequentially in time series.
As described above, processes S1 to S10 shown in FIG. 3 are executed for each captured image (input image) Pt input sequentially in time series. Accordingly, the case in which it is continuously notified in process S8 that the imaging state of the imaging unit 1 satisfies the predetermined condition is the case in which these time-series captured images (input images) Pt are captured consecutively in an imaging state that satisfies the predetermined condition.
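The flow of processes S1 to S10 can be summarized in a compact Python sketch; all helper functions (detect_object, detect_by_template, acquire_imaging_state, condition_satisfied, crop, compute_distance, notify_outside) and the state dictionary are hypothetical placeholders introduced here only to mirror the flowchart, not elements of the original disclosure.

```python
def process_frame(frame, store, state, thn):
    """One pass of S1 to S10 for a single captured image (input image) Pt."""
    tp = store.get("TP")                                  # S1: is a reference image stored?
    if tp is not None:
        obj = detect_by_template(frame, tp)               # S2/S3: template matching with TP
    else:
        obj = detect_object(frame)                        # S3: detect the object anew
    imaging_state = acquire_imaging_state(frame)          # S4
    if not condition_satisfied(imaging_state):            # S5: condition not satisfied
        state["n"] = 0                                    # reset the consecutive count
        if obj is not None:
            store["TP"] = crop(frame, obj)                # S6: save a new reference image
        return compute_distance(obj) if tp is not None and obj is not None else None
    if tp is not None:                                    # S7: a reference image exists
        del store["TP"]                                   # S9: discard the reference image
        return None
    state["n"] = state.get("n", 0) + 1                    # count consecutive determinations
    if state["n"] >= thn:                                 # S8
        notify_outside()                                  # S10: notify via notification unit 4
    return None
```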
Here, the acquisition of the imaging state in process S4 and the determination of the imaging state in process S5 are described below. FIG. 4 is a diagram for explaining the method of determining the imaging state when the median luminance value of the image is used as the imaging state of the imaging unit in the distance measuring device of the embodiment, for the case where a vehicle equipped with the imaging unit enters a tunnel. FIG. 5 is the corresponding diagram for the case where the vehicle exits a tunnel. FIGS. 4A and 5A show the temporal change of the median luminance value of the time-series captured images (frames); the horizontal axis is the frame (time) and the vertical axis is the median luminance value. FIGS. 4B and 5B show the temporal change of the amount of change of that median; the horizontal axis is the frame (time) and the vertical axis is the amount of change of the luminance value. FIG. 6 schematically shows the captured images taken by the imaging unit before and after a water droplet adheres to the imaging unit in the distance measuring device of the embodiment: FIG. 6A schematically shows the captured image taken before a water droplet adheres, and FIG. 6B schematically shows the captured image taken after a water droplet has adhered. FIG. 7 is a diagram for explaining the method of determining the imaging state when the frequency characteristic of the image is used as the imaging state of the imaging unit in the distance measuring device of the embodiment: FIG. 7A shows the frequency characteristics of the captured images before and after a water droplet adheres to the imaging unit, FIG. 7B explains a first determination method for determining whether or not a water droplet has adhered to the imaging unit, and FIG. 7C explains a second determination method for the same purpose. In FIGS. 7A, 7B, and 7C, the horizontal axis represents frequency and the vertical axis represents intensity. FIG. 8 is a diagram for explaining the image regions used by the state acquisition unit in the distance measuring device of the embodiment: FIG. 8A explains an image region of a first mode used by the state acquisition unit, and FIG. 8B explains an image region of a second mode. FIG. 9 is a diagram for explaining the method of determining the imaging state when the steering angle of the vehicle equipped with the imaging unit is used as the imaging state of the imaging unit in the distance measuring device of the embodiment: FIG. 9A shows the temporal change of the steering angle, with time on the horizontal axis and the steering angle on the vertical axis, and FIG. 9B shows the temporal change of the amount of change of the steering angle, with time on the horizontal axis and the amount of change of the steering angle on the vertical axis.
The state acquisition unit 23 described above may be a state acquisition unit 23a that acquires the imaging state of the imaging unit 1 based on a predetermined feature amount of the latest of the time-series captured images Pt taken by the imaging unit 1. Such a distance measuring device DM uses the latest captured image (input image) Pt taken by the imaging unit 1, and therefore requires no separate device for acquiring the imaging state of the imaging unit 1.
Various attribute values of the latest captured image Pt can be used as the predetermined feature amount, for example a predetermined statistic of the pixels of the latest captured image Pt, such as the average of the pixel values, the median of the pixel values, the average of the luminance values, or the median of the luminance values. To reduce the influence of disturbances, the predetermined feature amount is preferably a median. More specifically, when the moving body enters a tunnel, the median of the pixel values of the latest captured image Pt changes over the time-series captured images P (frames) as shown in FIG. 4A: it remains at a relatively large value before entering the tunnel (outside the tunnel), gradually decreases as the tunnel entrance is approached, and then remains at a relatively small value after entering the tunnel (inside the tunnel). When the moving body exits a tunnel, the median changes as shown in FIG. 5A: it remains at a relatively small value before exiting (inside the tunnel), gradually increases as the exit is approached, and then remains at a relatively large value after exiting (outside the tunnel). Because the median of the pixel values of the latest captured image Pt changes in this way, its amount of change (the difference between the median of the latest captured image Pt and the median of the immediately preceding captured image P) takes a relatively large value only during the transition into or out of the tunnel, as shown in FIG. 4B for the case of FIG. 4A and in FIG. 5B for the case of FIG. 5A. Since the amount of change of the median varies in this way, the discard determination unit 24 can determine whether or not the imaging state acquired by the state acquisition unit 23a satisfies the predetermined condition by determining whether or not the amount of change of the median of the latest captured image Pt, acquired by the state acquisition unit 23a, is equal to or greater than a predetermined threshold Th1. That is, when the discard determination unit 24 determines that the amount of change of the median of the latest captured image Pt is equal to or greater than the predetermined threshold Th1, it determines that the imaging state acquired by the state acquisition unit 23a satisfies the predetermined condition; when it determines that the amount of change is less than the predetermined threshold Th1, it determines that the imaging state does not satisfy the predetermined condition. The predetermined threshold Th1 is statistically set as appropriate, for example by measuring a plurality of samples.
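A minimal sketch of this median-based check, assuming a single-channel (grayscale) frame as a NumPy array; the variable names are illustrative only. On each frame, the returned median would be carried over as previous_median for the comparison on the next frame.

```python
import numpy as np

def median_condition_satisfied(latest_frame, previous_median, th1):
    """Condition of the text: the frame-to-frame change of the median is >= Th1."""
    median_now = float(np.median(latest_frame))        # median pixel (luminance) value of the latest frame
    satisfied = abs(median_now - previous_median) >= th1
    return satisfied, median_now                       # carry median_now over to the next frame
```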
Note that the changes of the median and of its amount of change shown in FIGS. 4 and 5 can occur not only when entering or exiting a tunnel, but whenever the brightness changes, for example when passing under a relatively long overpass, when the lighting changes from front-lit to backlit or from backlit to front-lit, or when the imaging unit 1 is illuminated by the headlights of an oncoming vehicle.
When the imaging state changes in this way, as shown in FIGS. 4 and 5, a certain time is usually required until the imaging unit 1 adapts to the changed imaging state and the captured images stabilize, for example until the AE (Automatic Exposure) of the imaging unit 1 takes effect (the unstable AE period Tu). For this reason, the distance measuring device DM may be configured to further include a reference information updating unit, indicated by the broken line in FIG. 1, which, when the reference image TP stored in the reference information storage unit 31 has been discarded, stores a captured image P taken by the imaging unit 1 in the reference information storage unit 31 as a new reference image TP only after a predetermined number of images have been captured (after a predetermined time has elapsed) following the captured image (input image, the above latest captured image) Pt used in the processes S1 to S10 in which the discard was determined. By providing such a reference information updating unit, storing a captured image taken in this unstable state in the reference information storage unit 31 is reduced, and the distance measuring device DM can improve detection accuracy.
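One possible way to defer re-registration of the reference image during the unstable AE period Tu is sketched below; the number of frames to skip is an assumed placeholder, since the text only states that a predetermined number of images (a predetermined time) should elapse.

```python
class ReferenceInfoUpdater:
    """Blocks storing a new reference image TP until a fixed number of frames
    has passed after a discard, so frames taken while AE is still settling are skipped."""

    def __init__(self, settle_frames=10):      # 10 frames is an illustrative placeholder
        self.settle_frames = settle_frames
        self.frames_since_discard = None       # None: no discard pending

    def on_discard(self):
        self.frames_since_discard = 0

    def on_new_frame(self):
        if self.frames_since_discard is not None:
            self.frames_since_discard += 1

    def may_store_reference(self):
        return (self.frames_since_discard is None
                or self.frames_since_discard >= self.settle_frames)
```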
As another example, the predetermined feature amount is a frequency characteristic of the latest captured image Pt. For example, when a foreign object such as an insect or a water droplet adheres to the imaging unit 1, part of the forward view becomes unclear, and the frequency characteristic of the latest captured image Pt changes between before and after the adhesion of the foreign object. The frequency characteristic of a captured image P is the frequency spectrum, that is, the distribution of the frequency components (intensities) contained in the captured image P, expressed as intensity versus frequency. For example, when a water droplet Dp adheres to the imaging unit 1, the captured image P1 before adhesion is relatively sharp over its entire area, as schematically shown in FIG. 6A, whereas the captured image P2 after adhesion is partially blurred, as schematically shown in FIG. 6B. Consequently, the captured image P2 after adhesion of the water droplet Dp contains fewer high-frequency components than the captured image P1 before adhesion. Therefore, as shown in FIG. 7A, the frequency characteristic α of the captured image P1 before adhesion of the water droplet Dp and the frequency characteristic β of the captured image P2 after adhesion differ: compared with the frequency characteristic α, the frequency characteristic β has a profile with larger intensity on the low-frequency side (for example, in the frequency range at or below the frequency fL) and smaller intensity on the high-frequency side (for example, in the frequency range at or above the frequency fH).
Because the frequency characteristic of the captured image changes in this way before and after the adhesion of a foreign object, the discard determination unit 24 can determine whether or not the imaging state acquired by the state acquisition unit 23a satisfies the predetermined condition by determining, for example as shown in FIG. 7B, whether or not the amount of change of the intensity at the low frequency fL in the frequency characteristic of the latest captured image Pt acquired by the state acquisition unit 23a is equal to or greater than a predetermined threshold Th21. That is, when the discard determination unit 24 determines that the amount of change at the low frequency fL is equal to or greater than the predetermined threshold Th21, it determines that the imaging state acquired by the state acquisition unit 23a satisfies the predetermined condition; when it determines that the amount of change is less than the predetermined threshold Th21, it determines that the imaging state does not satisfy the predetermined condition. As another example, as shown in FIG. 7B, the discard determination unit 24 can make the determination based on whether or not the amount of change of the intensity at the high frequency fH in the frequency characteristic of the latest captured image Pt is equal to or greater than a predetermined threshold Th22: when the amount of change is equal to or greater than Th22, the imaging state is determined to satisfy the predetermined condition, and when it is less than Th22, it is determined not to satisfy it. As yet another example, the discard determination unit 24 can determine whether or not the imaging state satisfies the predetermined condition by determining whether the amount of change at the low frequency fL is equal to or greater than the predetermined threshold Th21 and the amount of change at the high frequency fH is equal to or greater than the predetermined threshold Th22: when both conditions hold, the imaging state acquired by the state acquisition unit 23a is determined to satisfy the predetermined condition, and otherwise it is determined not to satisfy it.
Alternatively, as shown in FIG. 7C, the discard determination unit 24 can determine whether or not the imaging state acquired by the state acquisition unit 23a satisfies the predetermined condition by determining whether or not the intensity at the low frequency fL in the frequency characteristic of the latest captured image Pt is equal to or greater than a predetermined threshold Th31: when the intensity is equal to or greater than Th31, the imaging state is determined to satisfy the predetermined condition, and when it is less than Th31, it is determined not to satisfy it. As another example, as shown in FIG. 7C, the discard determination unit 24 can make the determination based on whether or not the intensity at the high frequency fH in the frequency characteristic of the latest captured image Pt is equal to or greater than a predetermined threshold Th32: when the intensity is equal to or greater than Th32, the imaging state is determined to satisfy the predetermined condition, and when it is less than Th32, it is determined not to satisfy it. As yet another example, the discard determination unit 24 can determine whether or not the imaging state satisfies the predetermined condition by determining whether the amount of change at the low frequency fL is equal to or greater than the predetermined threshold Th21 and the amount of change at the high frequency fH is equal to or greater than the predetermined threshold Th22: when both conditions hold, the imaging state acquired by the state acquisition unit 23a is determined to satisfy the predetermined condition, and otherwise it is determined not to satisfy it.
These predetermined thresholds Th21, Th22, Th31, and Th32 are statistically set as appropriate, for example by measuring a plurality of samples.
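A sketch of how the low-frequency and high-frequency intensities of a captured image could be obtained with a 2-D FFT and compared against the thresholds; the band edges f_low and f_high (in cycles per pixel) are illustrative assumptions, since the text does not specify fL and fH numerically, and a single-channel (grayscale) image is assumed.

```python
import numpy as np

def band_intensities(gray_image, f_low=0.05, f_high=0.25):
    """Mean spectral magnitude below f_low and above f_high (cycles per pixel)."""
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(gray_image.astype(np.float64))))
    h, w = gray_image.shape
    fy = np.fft.fftshift(np.fft.fftfreq(h))[:, None]
    fx = np.fft.fftshift(np.fft.fftfreq(w))[None, :]
    radius = np.sqrt(fx ** 2 + fy ** 2)                 # radial spatial frequency of each bin
    return spectrum[radius <= f_low].mean(), spectrum[radius >= f_high].mean()

def frequency_condition_satisfied(previous_gray, latest_gray, th21, th22):
    """Combined condition: change at the low frequency >= Th21 and change at the high frequency >= Th22."""
    low_prev, high_prev = band_intensities(previous_gray)
    low_now, high_now = band_intensities(latest_gray)
    return abs(low_now - low_prev) >= th21 and abs(high_now - high_prev) >= th22
```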
In the description above, the state acquisition unit 23 acquires one feature amount and the discard determination unit 24 determines the imaging state based on that one feature amount; however, the state acquisition unit 23 may acquire a plurality of feature amounts, and the discard determination unit 24 may determine the imaging state based on the plurality of feature amounts acquired by the state acquisition unit 23.
When detecting a feature amount from the captured image P in this way, the state acquisition unit 23a preferably uses only a part of the latest captured image Pt. Because the imaging state of the imaging unit 1 is then acquired from only a part of the latest captured image (input image) Pt, the information processing load is reduced. The part of the latest captured image Pt may be, for example, as shown in FIG. 8A, the rectangular region Pt11 of the latest captured image Pt1 extending from the lower edge closest to the moving body up to the vanishing point. In general, the region of a captured image P above the vanishing point of the road surface is mostly sky and can be excluded from the region used to judge obstacles to the movement of the moving body. Therefore, by setting the part of the latest captured image Pt to the region below the vanishing point, the distance measuring device DM can appropriately set, from the viewpoint of judging obstacles to the movement of the moving body, the image region from which the feature amount is detected. Furthermore, because the obstacles that matter are those present within the travel region of the moving body, the part of the latest captured image Pt may also be, as shown in FIG. 8B, the travel region Pt21 of the moving body. The travel region Pt21 may be, for example, the triangular region (virtual travel region) of the latest captured image Pt2 bounded by the two ends of the lower edge closest to the moving body and the vanishing point, or the triangular region (actual travel region) bounded by the lower edge closest to the moving body and the pair of white lines drawn on both sides of the road surface. In the example shown in FIG. 8B, the pair of white lines drawn on both sides of the road surface intersect at the vanishing point. The vanishing point is the point at which lines that are actually parallel intersect in the image, and it arises because the captured image is formed in accordance with perspective. By setting the part of the latest captured image Pt to the travel region Pt21 of the moving body in this way, its area becomes smaller than the area of the rectangular region Pt11 shown in FIG. 8A, so the information processing load is further reduced.
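A sketch of how the two candidate regions (the rectangular region Pt11 below the vanishing point and the triangular travel region Pt21) could be selected from a frame; the vanishing point is assumed to be given, and the linear interpolation used for the triangle edges is an illustrative choice.

```python
import numpy as np

def region_below_vanishing_point(image, vanishing_row):
    """Rectangular region Pt11: from the row of the vanishing point down to the bottom edge."""
    return image[vanishing_row:, :]

def travel_region_mask(height, width, vanishing_point):
    """Boolean mask of the triangular (virtual) travel region Pt21:
    the two bottom corners of the frame joined to the vanishing point (vx, vy)."""
    vx, vy = vanishing_point
    ys, xs = np.mgrid[0:height, 0:width]
    t = np.clip((ys - vy) / max(height - 1 - vy, 1), 0.0, 1.0)   # 0 at the vanishing row, 1 at the bottom row
    left = vx * (1.0 - t)                                        # left edge: from column vx down to column 0
    right = vx + (width - 1 - vx) * t                            # right edge: from column vx down to column width-1
    return (ys >= vy) & (xs >= left) & (xs <= right)
```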
The state acquisition unit 23 described above may also be a state acquisition unit 23b that acquires the imaging direction of the imaging unit 1 as the imaging state of the imaging unit 1. Because such a distance measuring device DM uses the imaging direction of the imaging unit 1 as the imaging state, the moving direction of the moving body can be reused as the imaging state of the imaging unit 1, and changes in the moving direction of the moving body, such as left and right turns, can be handled.
The imaging direction of the imaging unit 1 can be acquired by acquiring the moving direction of the moving body, because the imaging unit 1 mounted on the moving body is normally arranged, as described above, so that its imaging direction matches the moving direction of the moving body. More specifically, for example, the distance measuring device DM described above is mounted on the moving body and further includes a gyro sensor that detects the moving direction of the moving body, and the state acquisition unit 23b acquires the moving direction of the moving body, that is, the imaging direction of the imaging unit 1, based on the detection result of the gyro sensor. As another example, when the moving body is a vehicle, the state acquisition unit 23b acquires steering angle information from the vehicle information carried over a so-called CAN (Controller Area Network), and acquires the moving direction of the moving body, that is, the imaging direction of the imaging unit 1, based on the acquired steering angle information. Besides the steering angle information, the vehicle information also includes the lateral acceleration, the yaw rate, operation information of the electronic stability control, and the like, and the state acquisition unit 23b may acquire the moving direction of the moving body, that is, the imaging direction of the imaging unit 1, based on these items of information or by additionally taking them into account.
When the state acquisition unit 23b acquires the imaging direction of the imaging unit 1 in this way, the discard determination unit 24 can determine whether the imaging state satisfies the predetermined condition by determining whether the imaging direction of the imaging unit 1 acquired by the state acquisition unit 23b satisfies a predetermined condition. For example, when the state acquisition unit 23b uses the steering angle as the imaging direction of the imaging unit 1, the steering angle follows a profile such as that shown in FIG. 9A as the driver steers: it starts to increase when steering begins (when the steering wheel starts to be turned), stays roughly constant while the moving body is changing direction because the steering input is held (the steering wheel is kept fixed), and starts to decrease once the moving body has largely turned to the direction the driver intends and the steering is returned (the steering wheel is turned back). Accordingly, as shown in FIG. 9A, the discard determination unit 24 can determine whether the imaging state acquired by the state acquisition unit 23b satisfies the predetermined condition by determining whether the steering angle acquired by the state acquisition unit 23b is equal to or greater than a predetermined threshold Th41. That is, when the discard determination unit 24 determines that the steering angle acquired by the state acquisition unit 23b is equal to or greater than the predetermined threshold Th41, it determines that the imaging state acquired by the state acquisition unit 23b satisfies the predetermined condition; when it determines that the steering angle is less than the predetermined threshold Th41, it determines that the imaging state does not satisfy the predetermined condition. Furthermore, because the steering angle changes as described above, its amount of change (the difference between the steering angle when the latest captured image Pt was captured and the steering angle when the immediately preceding captured image P was captured) takes a relatively large value only while steering is actually being performed (while the steering wheel is being turned), as shown in FIG. 9B. Since the amount of change in the steering angle varies in this way, the discard determination unit 24 can also, as shown in FIG. 9B, determine whether the imaging state acquired by the state acquisition unit 23b satisfies the predetermined condition by determining whether the amount of change in the steering angle acquired by the state acquisition unit 23b is equal to or greater than a predetermined threshold Th42. That is, when the discard determination unit 24 determines that the amount of change (absolute value) in the steering angle acquired by the state acquisition unit 23b is equal to or greater than the predetermined threshold Th42, it determines that the imaging state acquired by the state acquisition unit 23b satisfies the predetermined condition; when it determines that the amount of change (absolute value) in the steering angle is less than the predetermined threshold Th42, it determines that the imaging state does not satisfy the predetermined condition. These predetermined thresholds Th41 and Th42 are set appropriately and statistically, for example by measuring a plurality of samples.
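To make the steering-angle criterion above concrete, the following is a minimal Python sketch of a discard determination of this kind, assuming the steering angle in degrees is supplied once per captured frame, for example decoded from the CAN vehicle information. The class name, the helper interface, and the numeric stand-ins for Th41 and Th42 are illustrative assumptions only; as noted above, the actual thresholds would be set statistically from measured samples.

```python
# Minimal sketch of a steering-angle-based discard determination.
# All names and the numeric stand-ins for Th41/Th42 are assumptions;
# in practice the thresholds would be tuned statistically from samples.

TH41_DEG = 15.0   # assumed threshold on the steering angle itself (FIG. 9A)
TH42_DEG = 3.0    # assumed threshold on its per-frame change (FIG. 9B)


class SteeringStateChecker:
    """Per-frame check of whether the imaging state (here: steering angle)
    satisfies the condition that triggers discarding the reference image."""

    def __init__(self, th_angle=TH41_DEG, th_delta=TH42_DEG):
        self.th_angle = th_angle
        self.th_delta = th_delta
        self.prev_angle = None   # steering angle at the previous frame

    def condition_met(self, angle_deg):
        # Criterion of FIG. 9A: the steering angle itself is large.
        large_angle = abs(angle_deg) >= self.th_angle
        # Criterion of FIG. 9B: the angle changed a lot since the last frame.
        if self.prev_angle is None:
            large_delta = False
        else:
            large_delta = abs(angle_deg - self.prev_angle) >= self.th_delta
        self.prev_angle = angle_deg
        # Either criterion may be used on its own, as in the description;
        # they are OR-ed here only to keep the sketch compact.
        return large_angle or large_delta
```

A checker of this kind would be fed by whatever supplies the direction information, such as a gyro sensor reading or a CAN frame decoder, which is outside the scope of the sketch.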
As described above, in the distance measuring device DM of this embodiment and the distance measuring method implemented in it, it is determined whether the imaging state satisfies the predetermined condition, and based on the result of this determination, it is determined whether to discard the reference image TP stored in the reference information storage unit 31. Because the distance measuring device DM and distance measuring method of this embodiment thus know, from the result of the discard determination, whether the reference image TP stored in the reference information storage unit 31 should be discarded, they can reduce the erroneous detection and missed detection described above and improve detection accuracy even when the imaging state changes.
In addition, the distance measuring device DM and distance measuring method of this embodiment discard the reference image TP stored in the reference information storage unit 31 when the imaging state is determined to satisfy the predetermined condition, so the erroneous detection and missed detection described above can be reliably prevented.
Furthermore, in the distance measuring device DM and distance measuring method of this embodiment, the notification that the notification unit 25 outputs through the reporting unit 4 lets the user recognize that the imaging state has been consecutively determined to satisfy the predetermined condition. When the imaging state is consecutively determined to satisfy the predetermined condition, a non-temporary abnormality is presumed to have occurred, such as a foreign object stuck to the imaging unit 1 or a failure of the distance measuring device DM. Because the distance measuring device DM and distance measuring method of this embodiment include the notification unit 25, they can therefore prompt the user to take an appropriate measure, for example inspecting the imaging unit 1. From this viewpoint, the threshold Thn is set to an appropriate number of determinations according to the frame rate, for example 15 or 30.
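As one way of picturing the consecutive-determination count against the threshold Thn, here is a small hedged sketch; the class and callable names are assumptions, and Thn = 15 is merely the example value mentioned above for a typical frame rate.

```python
# Sketch of the consecutive-determination counter driving the notification.
# Names are assumptions; Thn = 15 is just the example value for a typical
# frame rate mentioned in the description.

class ConsecutiveConditionNotifier:
    def __init__(self, notify, thn=15):
        self.notify = notify   # callable standing in for the reporting device
        self.thn = thn         # threshold Thn on consecutive determinations
        self.count = 0

    def record(self, condition_met):
        """Call once per frame with the result of the discard determination."""
        if condition_met:
            self.count += 1
            if self.count >= self.thn:
                self.notify("imaging state has satisfied the discard "
                            f"condition for {self.count} consecutive frames")
        else:
            self.count = 0


# Example wiring: print stands in for a real alert output.
notifier = ConsecutiveConditionNotifier(notify=print, thn=15)
```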
This specification discloses techniques in various aspects as described above, the main ones of which are summarized below.
A distance measuring device according to one aspect includes: an imaging unit that is mounted on a moving body and images a subject continuously in time; a reference information storage unit that stores, as a reference image, one of the time-series captured images captured by the imaging unit; a distance calculation unit that obtains a distance to a predetermined object included in the latest captured image among the time-series captured images captured by the imaging unit, based on the latest captured image and the reference image stored in the reference information storage unit; a state acquisition unit that acquires an imaging state of the imaging unit; and a discard determination unit that determines whether the imaging state acquired by the state acquisition unit satisfies a predetermined condition and determines, based on the result of the determination, whether to discard the reference image stored in the reference information storage unit. A distance measuring method according to another aspect includes: an imaging step of imaging a subject continuously in time with an imaging unit mounted on a moving body; a storage step of storing, as a reference image in a reference information storage unit, one of the time-series captured images captured in the imaging step; a distance calculation step of obtaining a distance to a predetermined object included in the latest captured image among the time-series captured images captured in the imaging step, based on the latest captured image and the reference image stored in the reference information storage unit; a state acquisition step of acquiring an imaging state of the imaging unit; and a discard determination step of determining whether the imaging state acquired in the state acquisition step satisfies a predetermined condition and determining, based on the result of the determination, whether to discard the reference image stored in the reference information storage unit.
In such a distance measuring device and distance measuring method, it is determined whether the imaging state satisfies a predetermined condition, and based on the result of the determination, it is determined whether to discard the reference image stored in the reference information storage unit. Because such a distance measuring device and distance measuring method thus know, from the result of the discard determination, whether the reference image stored in the reference information storage unit should be discarded, they can improve detection accuracy even when the imaging state changes.
In another aspect of the distance measuring device described above, the discard determination unit discards the reference image stored in the reference information storage unit when it determines that the imaging state acquired by the state acquisition unit satisfies the predetermined condition. Preferably, this distance measuring device further includes a reference information update unit that, when the reference image stored in the reference information storage unit has been discarded, stores in the reference information storage unit, as a new reference image, a captured image captured by the imaging unit after a predetermined number of images have been captured following the latest captured image.
Because such a distance measuring device discards the reference image stored in the reference information storage unit when the imaging state is determined to satisfy the predetermined condition, the erroneous detection and missed detection described above can be reliably prevented.
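The discard-then-reregister behaviour described in this aspect can be sketched as follows; the class name, the method names, and the skip count of 5 frames are assumptions made only for illustration, not values taken from the embodiment.

```python
# Sketch of the reference-image bookkeeping: discard on a positive
# determination, then register a new reference only after a predetermined
# number of further frames. The skip count of 5 is an assumption.

class ReferenceManager:
    def __init__(self, skip_frames=5):
        self.reference = None
        self.skip_frames = skip_frames
        self._wait = 0

    def on_discard_decision(self, discard):
        """Apply this frame's discard determination."""
        if discard:
            self.reference = None
            self._wait = self.skip_frames

    def maybe_register(self, latest_image):
        """Register a new reference image once the waiting period has passed."""
        if self.reference is not None:
            return
        if self._wait > 0:
            self._wait -= 1
        else:
            self.reference = latest_image


# Conceptual per-frame usage, together with a checker like the one sketched
# earlier and a hypothetical matching/ranging routine (not shown):
#     ref.on_discard_decision(checker.condition_met(state))
#     ref.maybe_register(latest_image)
#     if ref.reference is not None:
#         distance = match_and_range(latest_image, ref.reference)
```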
In another aspect of these distance measuring devices, the state acquisition unit acquires the imaging state of the imaging unit based on a predetermined feature amount in the latest captured image captured by the imaging unit. Preferably, in this distance measuring device, the state acquisition unit acquires, as the predetermined feature amount, a statistic of pixel values or luminance values (for example, a mean value or a median value). Also preferably, in this distance measuring device, the state acquisition unit acquires, as the predetermined feature amount, a frequency characteristic of the latest captured image.
Because such a distance measuring device acquires the imaging state of the imaging unit based on a predetermined feature amount in the latest captured image captured by the imaging unit, no separate device is needed to acquire the imaging state of the imaging unit.
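For the feature-amount-based imaging state, the sketch below shows two plausible measures computed on a grayscale frame held as a NumPy array: a luminance statistic (mean or median) and a rough frequency characteristic. The function names and the cutoff value are assumptions; the aspect only calls for some statistic or frequency measure, not these particular formulas.

```python
import numpy as np


def luminance_statistic(gray, use_median=False):
    """Statistic of the luminance values of the latest captured image."""
    return float(np.median(gray) if use_median else np.mean(gray))


def high_frequency_ratio(gray, cutoff=0.25):
    """Rough frequency characteristic: fraction of spectral power beyond a
    normalized radius. A sudden drop can indicate defocus, fog or dirt."""
    spectrum = np.fft.fftshift(np.fft.fft2(gray.astype(np.float64)))
    power = np.abs(spectrum) ** 2
    h, w = gray.shape
    yy, xx = np.mgrid[0:h, 0:w]
    radius = np.hypot((yy - h / 2.0) / (h / 2.0), (xx - w / 2.0) / (w / 2.0))
    return float(power[radius > cutoff].sum() / power.sum())
```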
In another aspect of the distance measuring device described above, the state acquisition unit uses a part of the latest captured image.
Because such a distance measuring device acquires the imaging state of the imaging unit using only a part of the latest captured image, the information-processing load can be reduced.
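One simple way to use only a part of the latest captured image is to crop a central window before computing the feature amount; the following sketch assumes a grayscale NumPy array, and the window fraction of 0.5 is an arbitrary illustrative choice.

```python
def central_roi(gray, frac=0.5):
    """Keep only a central window of the latest image before computing the
    feature amount, to cut the processing load (frac=0.5 is an assumption)."""
    h, w = gray.shape
    dh, dw = max(1, int(h * frac)), max(1, int(w * frac))
    y0, x0 = (h - dh) // 2, (w - dw) // 2
    return gray[y0:y0 + dh, x0:x0 + dw]


# e.g. luminance_statistic(central_roi(gray)) instead of the full frame
```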
In another aspect of these distance measuring devices, the state acquisition unit acquires the imaging direction of the imaging unit as the imaging state of the imaging unit.
Normally, the imaging unit is arranged on the moving body so that its imaging direction coincides with the moving direction of the moving body. Because such a distance measuring device therefore uses the imaging direction of the imaging unit as the imaging state of the imaging unit, the moving direction of the moving body can be reused as the imaging state of the imaging unit, and the device can cope with a change in the moving direction of the moving body, for example a left or right turn.
In another aspect, these distance measuring devices further include a notification unit that notifies the outside that the imaging state acquired by the state acquisition unit satisfies the predetermined condition when the number of consecutive determinations, by the discard determination unit, that the imaging state acquired by the state acquisition unit satisfies the predetermined condition is equal to or greater than a threshold value.
In such a distance measuring device, the notification by the notification unit lets the user recognize that the imaging state has been consecutively determined to satisfy the predetermined condition. Because such a distance measuring device includes the notification unit, it can therefore prompt the user to take an appropriate measure, for example inspecting the imaging unit.
This application is based on Japanese Patent Application No. 2013-130709 filed on June 21, 2013, the contents of which are incorporated herein.
In order to describe the present invention, the invention has been appropriately and sufficiently explained above through embodiments with reference to the drawings; however, it should be recognized that a person skilled in the art can easily modify and/or improve the embodiments described above. Accordingly, unless a modification or improvement carried out by a person skilled in the art departs from the scope of the appended claims, that modification or improvement is to be construed as being encompassed by the scope of those claims.
According to the present invention, a distance measuring device and a distance measuring method can be provided.

Claims (7)

1.  A distance measuring device comprising:
     an imaging unit that is mounted on a moving body and images a subject continuously in time;
     a reference information storage unit that stores, as a reference image, one of time-series captured images captured by the imaging unit;
     a distance calculation unit that obtains a distance to a predetermined object included in the latest captured image among the time-series captured images captured by the imaging unit, based on the latest captured image and the reference image stored in the reference information storage unit;
     a state acquisition unit that acquires an imaging state of the imaging unit; and
     a discard determination unit that determines whether the imaging state acquired by the state acquisition unit satisfies a predetermined condition, and determines, based on a result of the determination, whether to discard the reference image stored in the reference information storage unit.
2.  The distance measuring device according to claim 1, wherein the discard determination unit discards the reference image stored in the reference information storage unit when it determines that the imaging state acquired by the state acquisition unit satisfies the predetermined condition.
3.  The distance measuring device according to claim 1 or 2, wherein the state acquisition unit acquires the imaging state of the imaging unit based on a predetermined feature amount in the latest captured image captured by the imaging unit.
4.  The distance measuring device according to claim 3, wherein the state acquisition unit uses a part of the latest captured image.
5.  The distance measuring device according to claim 1 or 2, wherein the state acquisition unit acquires an imaging direction of the imaging unit as the imaging state of the imaging unit.
6.  The distance measuring device according to any one of claims 1 to 5, further comprising a notification unit that notifies the outside that the imaging state acquired by the state acquisition unit satisfies the predetermined condition when the number of consecutive determinations, by the discard determination unit, that the imaging state acquired by the state acquisition unit satisfies the predetermined condition is equal to or greater than a threshold value.
7.  A distance measuring method comprising:
     an imaging step of imaging a subject continuously in time with an imaging unit mounted on a moving body;
     a storage step of storing, as a reference image in a reference information storage unit, one of time-series captured images captured in the imaging step;
     a distance calculation step of obtaining a distance to a predetermined object included in the latest captured image among the time-series captured images captured in the imaging step, based on the latest captured image and the reference image stored in the reference information storage unit;
     a state acquisition step of acquiring an imaging state of the imaging unit; and
     a discard determination step of determining whether the imaging state acquired in the state acquisition step satisfies a predetermined condition, and determining, based on a result of the determination, whether to discard the reference image stored in the reference information storage unit.