WO2018020936A1 - Vehicle periphery monitoring device, and method for monitoring periphery of vehicle - Google Patents

Vehicle periphery monitoring device, and method for monitoring periphery of vehicle

Info

Publication number
WO2018020936A1
Authority
WO
WIPO (PCT)
Prior art keywords
unit
image
flicker
vehicle
frequency
Prior art date
Application number
PCT/JP2017/023559
Other languages
French (fr)
Japanese (ja)
Inventor
近藤 大輔
恒 山口
Original Assignee
Calsonic Kansei Corporation (カルソニックカンセイ株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Calsonic Kansei Corporation (カルソニックカンセイ株式会社)
Priority to DE112017003775.1T priority Critical patent/DE112017003775T5/en
Priority to CN201780037399.2A priority patent/CN109479122A/en
Publication of WO2018020936A1 publication Critical patent/WO2018020936A1/en

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00 Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20 Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/22 Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
    • B60R1/23 Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
    • B60R1/26 Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view to the rear of the vehicle
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/30 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing

Definitions

  • The present invention relates to a vehicle periphery monitoring device and a vehicle periphery monitoring method.
  • Such a vehicle periphery monitoring device is, for example, a side mirror monitor.
  • In the device of Patent Document 1, boundary information indicating the boundary between the host vehicle and the surrounding scenery is used to perform a process for clarifying that boundary.
  • The main object of the present invention is to solve the above-mentioned problem.
  • The device includes: an imaging unit capable of capturing an image of the surroundings of the host vehicle; an image processing unit that performs image processing on the image captured by the imaging unit; and a display unit that displays the image processed by the image processing unit.
  • The image processing unit includes a flicker correction unit capable of correcting the image so that flicker of the image displayed on the display unit is reduced when the image captured by the imaging unit contains flicker.
  • The flicker correction unit determines that a displacement, relative to the host vehicle, of regularly arranged objects in the image captured by the imaging unit is flicker. The present invention is also characterized by a vehicle periphery monitoring method using this vehicle periphery monitoring device.
  • FIG. 1 is an overall plan view of a vehicle on which a vehicle periphery monitoring device according to an embodiment is mounted. FIG. 2 is a block diagram of the vehicle periphery monitoring device of FIG. 1. FIG. 3 is a block diagram of the flicker correction unit of FIG. 1. FIG. 4 is a diagram showing an example of an image captured by the imaging unit.
  • FIG. 5 is a diagram showing a state in which the image of FIG. 4 has been converted so that objects can be detected by the object detection unit. FIG. 6 is a diagram showing a state in which the data of FIG. 5 has been frequency-converted by the frequency conversion unit. FIG. 7 is a diagram showing an example of an image in which flicker has been reduced. FIG. 8 is a diagram showing the processing (for each grid cell) in the object detection unit.
  • A vehicle periphery monitoring device is provided that captures an image of the surroundings of the host vehicle with a camera mounted on the vehicle and displays the image on a display unit.
  • This vehicle periphery monitoring device is configured as, for example, a side mirror monitor.
  • As shown in FIGS. 1 and 2, the vehicle periphery monitoring device 1 includes: an imaging unit 4 capable of capturing an image 3 of the surroundings of the host vehicle 2; an image processing unit 5 that performs image processing on the image 3 captured by the imaging unit 4; and a display unit 7 that displays the image 6 processed by the image processing unit 5.
  • The imaging unit 4 is, for example, a pair of monitor cameras provided on the left and right sides of the host vehicle 2.
  • One imaging unit 4 is installed at the front edge of each front door 11 of the host vehicle 2, near the boundary between the door body 12 and the front window glass 13, facing substantially toward the rear of the vehicle. The image 3 captured by the imaging unit 4 therefore shows the area behind the vehicle as seen from the left and right sides of the host vehicle 2.
  • The image processing unit 5 is an arithmetic control unit, such as a computer, having an image processing function.
  • The display unit 7 is a pair of left and right display panels, such as monitors (liquid crystal or organic EL panels), corresponding to the pair of left and right monitor cameras.
  • The pair of left and right display units 7 are installed in part of the instrument panel provided in front of the driver's seat (for example, at a position directly in front of the driver's seat). The image 6 of the area behind the vehicle shown on the display unit 7 is therefore viewed at a position close to the center of the field of view of a driver facing forward; it is far more noticeable than the mirror image of an existing side mirror, which lies outside that field of view, and depending on the situation it may even become a distraction.
  • However, the number and positions of the display units 7 are not limited to the above.
  • In addition to the basic configuration above, this embodiment has the following configuration.
  • (1) The image processing unit 5 includes a flicker correction unit 22 capable of correcting the image so that flicker of the image 6 displayed on the display unit 7 is reduced when the image 3 captured by the imaging unit 4 contains flicker.
  • The flicker correction unit 22 determines that a displacement, relative to the host vehicle 2, of regularly arranged objects 24 in the image 3 (see FIG. 4) captured by the imaging unit 4 is flicker.
  • Here, "flicker" refers, for example, to portions of the image 3 captured by the imaging unit 4 that appear with light and dark alternately repeating.
  • The main target is flicker caused by objects 24, in particular the repetition of light and dark caused by regularly arranged objects 24.
  • When the difference between light and dark is sufficiently small, even regularly arranged objects 24 may not be perceived as flickering.
  • Examples of regularly arranged objects 24 include the framework 24a of an iron bridge and plantings placed at roughly equal intervals on both sides of an expressway. Other examples are described later.
  • Regularly arranged objects 24 tend to flicker more strongly as the spacing between the objects 24 becomes narrower.
  • The relative displacement includes, for example, cases where the object 24 is stationary and the host vehicle 2 is moving, where the object 24 is moving and the host vehicle 2 is stationary, and where both the object 24 and the host vehicle 2 are moving. The faster the relative displacement, the stronger the flicker tends to be.
  • (2) The flicker correction unit 22 blurs the flickering portions 31 and 32 in the image 3 of FIG. 4 (through the processing shown in FIGS. 5 and 6, as shown in FIG. 7), thereby reducing the flicker displayed on the display unit 7 (blurred portions 35 and 36).
  • The flickering portions 31 and 32 mainly cover flicker caused directly by objects 24 in the image 3 (flickering portion 31), but depending on the situation they can also include indirect flicker, such as the reflection of an object 24 in the front window glass 13 (flickering portion 32) or in the door body 12. Whether indirect flicker is included can be decided based on the magnitude of the light/dark difference.
  • "Blurring" can be achieved, for example, by applying effects (blurring processes) to the image 3 such as motion blur (a process that turns motion into an afterimage), transparency, or contrast reduction.
  • However, the blurring process is not limited to the above.
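The effects named above can be sketched in a few lines. This is an illustrative approximation only (the patent does not specify an algorithm): motion blur is approximated as a temporal average of recent frames, and contrast reduction as compression toward mid-gray. All function names, weights, and factor values are assumptions.

```python
import numpy as np

# Illustrative stand-ins for the blurring effects named in the text.
def motion_blur(frames):
    """Average recent frames so that movement leaves an afterimage."""
    return np.mean(np.stack(frames).astype(float), axis=0)

def lower_contrast(image, factor=0.5, mid=128.0):
    """Pull pixel values toward mid-gray to soften light/dark repetition."""
    return mid + factor * (np.asarray(image, dtype=float) - mid)

# A harsh light/dark cycle: three 2x2 frames alternating black and white.
frames = [np.full((2, 2), v) for v in (0.0, 255.0, 0.0)]
print(motion_blur(frames)[0, 0])        # → 85.0 (flicker averaged out)
print(lower_contrast(frames[1])[0, 0])  # → 191.5 (255 pulled toward 128)
```

Either effect weakens the light/dark alternation that the text identifies as the source of flicker, without removing the underlying scene content.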
  • (3) The flicker correction unit 22 includes: a sampling unit 41 that samples the image 3 captured by the imaging unit 4; an object detection unit 42 (see FIGS. 5, 8, and 9) that detects, from the image 3 sampled by the sampling unit 41, an object 24 that is displaced relative to the host vehicle 2; a frequency conversion unit 43 (see FIG. 10) that frequency-converts the data 42a used by the object detection unit 42 to detect the relatively displaced object 24; and a flicker determination unit 45 that determines that flicker is present when the frequency components perceived as unpleasant in the data 43a frequency-converted by the frequency conversion unit 43 exceed a preset threshold value 44 (see FIGS. 6 and 10).
  • The sampling unit 41, the object detection unit 42, the frequency conversion unit 43, and the flicker determination unit 45 may be configured as software functional blocks within the image processing unit 5.
  • The sampling unit 41 samples the image 3 over a period of several seconds. For example, with an imaging unit 4 capable of capturing 15 frames per second, 45 frames of the image 3 are sampled in 3 seconds.
  • However, the frame rate of the imaging unit 4 and the sampling time and number of frames of the image 3 are not limited to the above.
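The 15 frames/s, 3 s, 45-frame example above can be sketched as a rolling sample buffer. A minimal sketch, assuming a deque-based ring buffer (the patent does not prescribe any data structure):

```python
from collections import deque

FPS = 15                    # frames per second captured by the imaging unit
WINDOW_S = 3                # sampling window in seconds
N_FRAMES = FPS * WINDOW_S   # 45 frames, matching the example in the text

buffer = deque(maxlen=N_FRAMES)  # oldest frames drop out automatically

def sample(frame):
    """Store one captured frame; return True once a full window is held."""
    buffer.append(frame)
    return len(buffer) == N_FRAMES

for i in range(60):              # simulate 4 s of capture at 15 fps
    ready = sample(("frame", i))
print(N_FRAMES, len(buffer), ready)  # → 45 45 True
```

Once the window is full, the buffered frames would be handed to the object detection step; the ring buffer simply keeps the most recent 3 s available.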
  • The frequency components perceived as unpleasant are mainly high-frequency components.
  • In the frequency-converted data, the white portion represents the frequency components perceived as unpleasant, and the black portion represents the remaining low-frequency components.
  • FIG. 10 shows a threshold value 44 for cutting off the remaining low-frequency components so that only the frequency components perceived as unpleasant remain.
  • The threshold value 44 can be adjusted as appropriate while checking the final image 6 so that no unnatural appearance results.
  • When such components exceed the threshold value 44, the flicker determination unit 45 determines that there is flicker in the data 43a of FIG. 10. When flicker is determined to be present, the flicker correction unit 22 performs the blurring process on the flickering portions 31 and 32.
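The determination step can be illustrated numerically. In this sketch the "unpleasant" components are taken as everything above a cutoff frequency, and both the cutoff and the threshold (standing in for threshold value 44) are arbitrary assumptions, not values from the patent:

```python
import numpy as np

def has_flicker(signal, fs, cutoff_hz=5.0, threshold=10.0):
    """Flag flicker when high-frequency spectral energy exceeds a threshold."""
    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return float(spectrum[freqs >= cutoff_hz].sum()) > threshold

fs = 15.0                                       # 15 frames per second
t = np.arange(45) / fs                          # 3 s of sampled frames
flickery = 50 + 30 * np.sin(2 * np.pi * 6 * t)  # strong 6 Hz light/dark cycle
steady = np.full(45, 50.0)                      # constant brightness
print(has_flicker(flickery, fs), has_flicker(steady, fs))  # → True False
```

Raising the threshold makes the check more tolerant, mirroring how the text says threshold 44 is tuned against the final displayed image.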
  • (4) The object detection unit 42 detects the object 24 that is displaced relative to the host vehicle 2 by applying a discrete cosine transform to the image 3 sampled by the sampling unit 41. The frequency conversion unit 43 then frequency-converts the data 42a used by the object detection unit 42 for this detection using a Fourier transform.
  • Specifically, each frame of the image 3 sampled by the sampling unit 41 is first meshed, that is, divided into a fine grid of vertical and horizontal cells 3a.
  • The object detection unit 42 then performs the discrete cosine transform on each grid cell 3a.
  • The size of the grid cells 3a can be set arbitrarily according to the processing capability of the image processing unit 5 and the resolution required of the display unit 7.
  • The discrete cosine transform is one of the transform methods for converting a discrete signal into the frequency domain.
  • The discrete cosine transform is widely used for data compression and the like because it can reduce data volume while losing little of the original signal.
  • The discrete cosine transform has the property that the frequency components of the transformed signal are concentrated in the low-frequency region. Here, this property is used to detect the object 24 that is displaced relative to the host vehicle 2.
  • Applying the transform yields a state such as that shown in FIG. 8.
  • FIG. 8 is a black-and-white representation in which the vertical axis corresponds to the vertical direction of the image 3 and the horizontal axis to its horizontal direction; the upper left contains the low-frequency components and the lower right the high-frequency components.
  • A changing object 24 appears as a white portion, and the larger the white portion, the larger the change in the object 24.
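The energy-compaction property this detection relies on can be checked with a hand-rolled, unnormalized DCT-II over a single row of one grid cell. The 8-sample cell size is an assumption, since the patent leaves the grid size open:

```python
import numpy as np

def dct2_1d(x):
    """Unnormalized 1-D DCT-II: X[u] = sum_k x[k] * cos(pi*(k+0.5)*u/n)."""
    n = len(x)
    k = np.arange(n)
    return np.array([np.sum(x * np.cos(np.pi * (k + 0.5) * u / n))
                     for u in range(n)])

smooth = np.linspace(0.0, 1.0, 8)   # slowly varying brightness in one cell
energy = dct2_1d(smooth) ** 2
low_share = energy[:2].sum() / energy.sum()
print(low_share > 0.95)             # → True: energy sits in the lowest terms
```

For a smooth (slowly varying) cell, nearly all energy lands in the first coefficients; a cell whose content changes sharply spreads energy into higher terms, which is what makes a displaced object stand out.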
  • The waveform data 42a thus obtained is Fourier-transformed by the frequency conversion unit 43 to obtain waveform data 43a as shown in FIG. 10.
  • The Fourier transform is a transform technique that expresses a given waveform as a superposition of sine waves.
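As a small numeric illustration of this superposition view (the 5 Hz rate and frame count are invented for the example): the per-cell brightness produced by regularly spaced objects passing the camera yields one dominant spectral peak at their repetition frequency.

```python
import numpy as np

fs = 15.0                # frame rate of the imaging unit
t = np.arange(45) / fs   # 45 sampled frames, i.e. 3 s
# Brightness of one cell as equally spaced girders pass at 5 Hz.
brightness = 100 + 40 * np.sin(2 * np.pi * 5 * t)

spectrum = np.abs(np.fft.rfft(brightness - brightness.mean()))
freqs = np.fft.rfftfreq(len(brightness), d=1.0 / fs)
peak_hz = freqs[np.argmax(spectrum)]
print(peak_hz)  # → 5.0, the repetition frequency of the passing objects
```

The flicker determination then reduces to asking whether such a peak lies in the frequency range perceived as unpleasant and exceeds the threshold.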
  • When the flicker determination unit 45 determines that flicker is present, the blurring process is performed.
  • In the vehicle periphery monitoring method, the imaging unit 4 captures the image 3 of the surroundings of the host vehicle 2,
  • the image processing unit 5 performs image processing on the image 3 captured by the imaging unit 4,
  • and the display unit 7 displays the image 6 processed by the image processing unit 5.
  • When the image 3 captured by the imaging unit 4 contains flicker, the image processing unit 5 corrects the image so that flicker of the image 6 displayed on the display unit 7 is reduced.
  • The flicker correction unit 22 determines that a displacement, relative to the host vehicle 2, of regularly arranged objects 24 in the image 3 captured by the imaging unit 4 is flicker.
  • The flicker correction unit 22 reduces the flicker displayed on the display unit 7 by blurring the flickering portions 31 and 32 of the image 3.
  • Within the flicker correction unit 22, the sampling unit 41 samples the image 3 captured by the imaging unit 4,
  • the object detection unit 42 detects, from the image 3 sampled by the sampling unit 41, the object 24 that is displaced relative to the host vehicle 2,
  • the frequency conversion unit 43 frequency-converts the data used by the object detection unit 42 to detect the relatively displaced object 24,
  • and the flicker determination unit 45 determines that flicker is present when the frequency components perceived as unpleasant in the data frequency-converted by the frequency conversion unit 43 exceed the preset threshold value 44.
  • In step S1, image data (image 3) from the camera (imaging unit 4) is acquired;
  • in step S2, the sampling unit 41 samples several frames;
  • in step S3, the object detection unit 42 extracts objects (objects 24) that appear at regular intervals. If no objects appear at regular intervals in step S4, the screen is output as-is in step S6. If objects do appear at regular intervals, an effect (blurring process) is applied to the equally spaced portions in step S5.
  • In step S6, the screen is output, after which the process returns to the beginning and the above processing repeats.
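Steps S1 to S6 can be tied together in one hedged sketch. Everything here is an illustrative stand-in: the periodicity test uses the per-pixel temporal spectrum rather than the patent's DCT-plus-Fourier pipeline, the cutoff and threshold are invented, and the blur is a simple temporal average.

```python
import numpy as np

def process(frames, fs=15.0, cutoff_hz=4.0, threshold=50.0):
    stack = np.stack(frames).astype(float)        # S1/S2: sampled frames (t, h, w)
    temporal = stack - stack.mean(axis=0)         # per-pixel brightness change
    spec = np.abs(np.fft.rfft(temporal, axis=0))  # S3: temporal spectrum per pixel
    freqs = np.fft.rfftfreq(stack.shape[0], d=1.0 / fs)
    high = spec[freqs >= cutoff_hz].sum(axis=0)   # high-frequency energy map
    mask = high > threshold                       # S4: equal-interval content here?
    out = stack[-1].copy()
    out[mask] = stack.mean(axis=0)[mask]          # S5: blur the flagged pixels
    return out                                    # S6: frame sent to the display

# 30 frames of a 4x4 region whose brightness alternates 90/10 every frame.
frames = [np.full((4, 4), 50.0) + 40 * ((-1) ** i) for i in range(30)]
result = process(frames)
print(result[0, 0])  # → 50.0: the harsh alternation is averaged away
```

Pixels without high-frequency energy keep their latest value, so only the portions that would flicker on the display are softened, as in step S5.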
  • The object detection unit 42 detects the object 24 that is displaced relative to the host vehicle 2 by applying the discrete cosine transform to the image 3 sampled by the sampling unit 41,
  • and the frequency conversion unit 43 frequency-converts the data used for this detection using the Fourier transform.
  • the vehicle periphery monitoring device captures an image around the host vehicle with a camera mounted on the vehicle and displays the image on the display unit.
  • the vehicle periphery monitoring device is, for example, a side mirror monitor.
  • the imaging unit 4 captures an image 3 around the host vehicle 2
  • the image processing unit 5 performs image processing on the image 3 captured by the imaging unit 4
  • the display unit 7 displays the image 6 processed by the image processing unit 5.
  • When the image 3 captured by the imaging unit 4 contains flicker, the image processing unit 5 can reduce, via the flicker correction unit 22, the flicker of the image 6 displayed on the display unit 7. Portions that appear bothersome due to flicker can thereby be eliminated (or reduced) on the display unit 7. Even when the display unit 7 is arranged at a highly noticeable position within the driver's field of view, the annoyance caused by flicker can be eliminated or reduced, allowing the driver to concentrate on driving.
  • The flicker correction unit 22 determines that a displacement, relative to the host vehicle 2, of regularly arranged external objects 24 in the image 3 captured by the imaging unit 4 is flicker.
  • This makes it possible to correct flicker caused by objects 24 that form the scenery around the host vehicle 2, such as the framework 24a of an iron bridge, plantings placed at roughly equal intervals on both sides of an expressway, landmarks 24b, and lights 24c arranged at roughly equal intervals inside a tunnel.
  • Flicker caused by a train or freight cars passing near the host vehicle 2 while it is stopped can also be corrected in the same way.
  • The flicker correction unit 22 blurs the flickering portions 31 and 32 of the image 3 before they are shown on the display unit 7. The flicker displayed on the display unit 7 can thereby be reliably reduced.
  • The image 3 can be blurred by motion blur (a process that turns motion into an afterimage), transparency, contrast reduction, or the like. These blurring processes reduce the discomfort of the display and can also prevent the car sickness that may result from continuously watching a flickering display.
  • The sampling unit 41 samples the image 3 captured by the imaging unit 4, and the object detection unit 42 detects, from the image 3 sampled by the sampling unit 41, the object 24 that is displaced relative to the host vehicle 2.
  • The frequency conversion unit 43 frequency-converts the data used by the object detection unit 42 to detect the relatively displaced object 24, and the flicker determination unit 45 determines that flicker is present when the frequency components perceived as unpleasant in the frequency-converted data exceed the preset threshold value 44.
  • The object detection unit 42 detects the object 24 that is displaced relative to the host vehicle 2 by applying the discrete cosine transform to the image 3 sampled by the sampling unit 41, and the frequency conversion unit 43 frequency-converts the data used for this detection using the Fourier transform. The flickering object 24 can thereby actually be detected.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Mechanical Engineering (AREA)
  • Signal Processing (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)
  • Studio Devices (AREA)

Abstract

The principal objective of the present invention is to make it possible to alleviate flicker resulting from the scenery around a vehicle, for example. The present invention relates to a vehicle periphery monitoring device (1) provided with: an image capturing unit (4) capable of capturing an image (3) of the periphery of the vehicle (2); an image processing unit (5) which subjects the image (3) captured by the image capturing unit (4) to image processing; and a display unit (7) which displays an image (6) which has been subjected to image processing by the image processing unit (5) (control unit). The image processing unit (5) is provided with a flicker correcting unit (22) which is capable of carrying out correction if the image (3) captured by the image capturing unit (4) contains flicker, in such a way as to alleviate flicker in the image (6) displayed by the display unit (7). The flicker correcting unit (22) determines that a displacement relative to the vehicle (2) of regularly arranged objects (24) in the image (3) captured by the image capturing unit (4) is flicker.

Description

Vehicle periphery monitoring device and vehicle periphery monitoring method
The present invention relates to a vehicle periphery monitoring device and a vehicle periphery monitoring method.
In recent years, development has been underway of vehicle periphery monitoring devices in which an image of the surroundings of the host vehicle is captured with a camera mounted on the vehicle and displayed on a display unit (see, for example, Patent Document 1). Such a vehicle periphery monitoring device is, for example, a side mirror monitor.
In the vehicle periphery monitoring device described in Patent Document 1, boundary information indicating the boundary between the host vehicle and the surrounding scenery is used to perform a process for clarifying that boundary.
JP 2014-116756 A
However, since the vehicle periphery monitoring device described in Patent Document 1 only clarifies the boundary between the host vehicle and the surrounding scenery, it could not, for example, eliminate flicker caused by the scenery around the host vehicle.
The main object of the present invention is therefore to solve the above-mentioned problem.
In order to solve the above problem, the present invention is characterized by a vehicle periphery monitoring device comprising:
an imaging unit capable of capturing an image of the surroundings of the host vehicle;
an image processing unit that performs image processing on the image captured by the imaging unit; and
a display unit that displays the image processed by the image processing unit,
wherein the image processing unit includes a flicker correction unit capable of correcting the image so that flicker of the image displayed on the display unit is reduced when the image captured by the imaging unit contains flicker, and
the flicker correction unit determines that a displacement, relative to the host vehicle, of regularly arranged objects in the image captured by the imaging unit is flicker.
The present invention is further characterized by a vehicle periphery monitoring method using this vehicle periphery monitoring device.
According to the present invention, the above configuration makes it possible, for example, to reduce flicker caused by the scenery around the host vehicle.
FIG. 1 is an overall plan view of a vehicle on which a vehicle periphery monitoring device according to an embodiment is mounted. FIG. 2 is a block diagram of the vehicle periphery monitoring device of FIG. 1. FIG. 3 is a block diagram of the flicker correction unit of FIG. 1. FIG. 4 is a diagram showing an example of an image captured by the imaging unit. FIG. 5 is a diagram showing a state in which the image of FIG. 4 has been converted so that objects can be detected by the object detection unit. FIG. 6 is a diagram showing a state in which the data of FIG. 5 has been frequency-converted by the frequency conversion unit. FIG. 7 is a diagram showing an example of an image in which flicker has been reduced. FIG. 8 is a diagram showing the processing (for each grid cell) in the object detection unit. FIG. 9 is a diagram showing the state of the data processed by the object detection unit. FIG. 10 is a diagram showing the waveform frequency-converted by the frequency conversion unit and the processing applied to this waveform. The remaining figures show: an original image in a specific example in which the framework of an iron bridge appears to flicker; the detection of objects in that example; an original image in a specific example in which landmarks appear to flicker; the detection of objects in that example;
an image with reduced flicker in the example in which landmarks appear to flicker; an original image in a specific example in which lights arranged at roughly equal intervals inside a tunnel appear to flicker; the detection of objects in that example; an image with reduced flicker in that example; and a flowchart showing the processing in the flicker correction unit.
Hereinafter, the present embodiment will be described in detail with reference to the drawings.
FIGS. 1 to 14 are for explaining this embodiment.
<Configuration> The configuration of this embodiment will be described below.
A vehicle periphery monitoring device is provided that captures an image of the surroundings of the host vehicle with a camera mounted on the vehicle and displays the image on a display unit. This vehicle periphery monitoring device is configured as, for example, a side mirror monitor.
Specifically, as shown in FIGS. 1 and 2, the vehicle periphery monitoring device 1 includes: an imaging unit 4 capable of capturing an image 3 of the surroundings of the host vehicle 2; an image processing unit 5 that performs image processing on the image 3 captured by the imaging unit 4; and a display unit 7 that displays the image 6 processed by the image processing unit 5.
Here, the imaging unit 4 is, for example, a pair of monitor cameras provided on the left and right sides of the host vehicle 2. One imaging unit 4 is installed at the front edge of each front door 11 of the host vehicle 2, near the boundary between the door body 12 and the front window glass 13, facing substantially toward the rear of the vehicle. The image 3 captured by the imaging unit 4 therefore shows the area behind the vehicle as seen from the left and right sides of the host vehicle 2. The image processing unit 5 is an arithmetic control unit, such as a computer, having an image processing function.
The display unit 7 is a pair of left and right display panels, such as monitors (liquid crystal or organic EL panels), corresponding to the pair of left and right monitor cameras. The pair of left and right display units 7 are installed in part of the instrument panel provided in front of the driver's seat (for example, at a position directly in front of the driver's seat). The image 6 of the area behind the vehicle shown on the display unit 7 is therefore viewed at a position close to the center of the field of view of a driver facing forward; it is far more noticeable than the mirror image of an existing side mirror, which lies outside that field of view, and depending on the situation it may even become a distraction. However, the number and positions of the display units 7 are not limited to the above.
In addition to the basic configuration described above, this embodiment has the following configuration.
(1) As shown in FIG. 2 (FIG. 3), the image processing unit 5 includes a flicker correction unit 22 capable of correcting the image so that flicker of the image 6 displayed on the display unit 7 is reduced when the image 3 captured by the imaging unit 4 contains flicker.
The flicker correction unit 22 determines, for example, that a displacement, relative to the host vehicle 2, of regularly arranged objects 24 in the image 3 (see FIG. 4) captured by the imaging unit 4 is flicker.
Here, "flicker" refers, for example, to portions of the image 3 captured by the imaging unit 4 that appear with light and dark alternately repeating. The main target here is flicker caused by objects 24, in particular the repetition of light and dark caused by regularly arranged objects 24. When the difference between light and dark is sufficiently small, even regularly arranged objects 24 may not be perceived as flickering.
Examples of regularly arranged objects 24 include the framework 24a of an iron bridge, as shown in FIG. 4, and plantings placed at roughly equal intervals on both sides of an expressway. Other examples are described later. Regularly arranged objects 24 tend to flicker more strongly as the spacing between the objects 24 becomes narrower.
 The relative displacement includes, for example, the case where the objects 24 are stationary and the host vehicle 2 is moving, the case where the objects 24 are moving and the host vehicle 2 is stationary, and the case where both the objects 24 and the host vehicle 2 are moving. The faster the relative displacement, the greater the flicker tends to be.
 (2) The flicker correction unit 22 is configured to reduce the flicker displayed on the display unit 7 by blurring the flicker portions 31 and 32 in the image 3 of FIG. 4 (through the processing shown in FIGS. 5 and 6, resulting in the state shown in FIG. 7), producing blurred portions 35 and 36.
 Here, the flicker portions 31 and 32 are mainly the direct flicker caused by the objects 24 in the image 3 (flicker portion 31), but depending on the situation they can also include indirect flicker, such as reflections of the objects 24 on the front window glass 13 (flicker portion 32) or on the door body 12. Whether to include such indirect flicker can be decided based on the magnitude of the difference between light and dark.
 "Blurring" may be performed, for example, by applying effects (blurring processes) to the image 3 such as motion blur (a process that renders motion as an afterimage), increased transparency, or reduced contrast. However, the blurring process is not limited to these.
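As an illustration of the simplest such effect, the following sketch applies a plain box blur to a rectangular flicker region of a grayscale frame. It is a minimal stand-in written with NumPy; the function name, the box-blur choice, and the kernel size are assumptions for illustration, not the patent's actual blurring process (which may instead use motion blur, transparency, or contrast reduction).

```python
import numpy as np

def box_blur_region(img, y0, y1, x0, x1, k=5):
    """Apply a simple k x k box blur to the rectangle img[y0:y1, x0:x1].

    A minimal stand-in for the 'blurring process' applied to a flicker
    portion; pixels outside the rectangle are left untouched.
    """
    out = img.astype(float).copy()
    # Pad the region with edge values so the kernel stays inside bounds
    region = np.pad(out[y0:y1, x0:x1], k // 2, mode="edge")
    blurred = np.zeros((y1 - y0, x1 - x0))
    for dy in range(k):
        for dx in range(k):
            blurred += region[dy:dy + y1 - y0, dx:dx + x1 - x0]
    out[y0:y1, x0:x1] = blurred / (k * k)
    return out
```

Averaging over the kernel flattens the alternating light/dark pattern inside the region, which is exactly the property that makes the flicker less conspicuous on the display.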
 (3) As shown in FIG. 3, the flicker correction unit 22 includes:
  a sampling unit 41 that samples the image 3 captured by the imaging unit 4;
  an object detection unit 42 that detects, from the image 3 sampled by the sampling unit 41, an object 24 displaced relative to the host vehicle 2 (see FIGS. 5, 8, and 9);
  a frequency conversion unit 43 that frequency-converts the data 42a used by the object detection unit 42 to detect the relatively displaced object 24 (see FIG. 10); and
  a flicker determination unit 45 that determines that flicker is present when a frequency component perceived as unpleasant in the data 43a frequency-converted by the frequency conversion unit 43 exceeds a preset threshold 44 (see FIGS. 6 and 10).
 Here, the sampling unit 41, the object detection unit 42, the frequency conversion unit 43, and the flicker determination unit 45 may be configured as software functional blocks within the image processing unit 5. The sampling unit 41 samples the image 3 for about several seconds. For example, with an imaging unit 4 capable of capturing 15 frames per second, 45 frames of the image 3 are sampled over 3 seconds. However, the frame rate of the imaging unit 4 and the sampling time and number of samples of the image 3 are not limited to these values.
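The sampling described above can be sketched as a fixed-length frame buffer; with the text's example values of 15 frames per second over 3 seconds, the buffer holds 45 frames. The helper name and the deque-based design are illustrative, not taken from the patent.

```python
from collections import deque

def make_sample_buffer(fps=15, seconds=3):
    """Fixed-length frame buffer for the sampling unit 41.

    Old frames fall off automatically once fps * seconds frames
    (45 in the text's example) have accumulated.
    """
    return deque(maxlen=fps * seconds)
```

A new frame is appended on every capture; once the window is full, the oldest frame is discarded, so the buffer always holds the most recent few seconds of video.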
 The frequency components perceived as unpleasant are mainly the high-frequency components. In FIG. 6, the white portions are the frequency components perceived as unpleasant and the black portions are the remaining low-frequency components. FIG. 10 shows the threshold 44 used to cut those low-frequency components and retain only the frequency components perceived as unpleasant.
 The threshold 44 can be adjusted as appropriate while checking the final image 6 so that no visual discomfort arises.
 The flicker determination unit 45 determines that flicker is present in the data 43a of FIG. 10 when a frequency component exceeding the threshold 44 exists. When flicker is judged to be present, the flicker correction unit 22 applies the blurring process to the flicker portions 31 and 32.
 (4) More specifically, the object detection unit 42 detects an object 24 displaced relative to the host vehicle 2 by applying a discrete cosine transform to the image 3 sampled by the sampling unit 41.
  The frequency conversion unit 43 frequency-converts the data 42a used by the object detection unit 42 to detect the relatively displaced object 24 by means of a Fourier transform.
 Here, as shown in FIG. 5, the image 3 sampled by the sampling unit 41 is first overlaid with a mesh and divided, frame by frame, into fine vertical and horizontal cells 3a. In this state, the object detection unit 42 performs a discrete cosine transform on each cell 3a. The size of the cells 3a can be set arbitrarily according to the processing capacity of the image processing unit 5 and the resolution required by the display unit 7.
 The discrete cosine transform is a technique for converting a discrete signal into the frequency domain. Because it can reduce the volume of data without losing most of the original signal, it is widely used for data compression and the like. A characteristic of the discrete cosine transform is that the frequency components of the transformed signal are concentrated in the low-frequency region. In this case, the discrete cosine transform described above is used to detect objects 24 displaced relative to the host vehicle 2.
 More specifically, performing a discrete cosine transform on each cell 3a yields the state shown in FIG. 8. With the vertical direction as the vertical axis of the image 3 and the horizontal direction as its horizontal axis, the result is rendered in black and white so that the upper left corresponds to the low-frequency components and the lower right to the high-frequency components. Changing objects 24 appear as white portions, and the greater the change in the objects 24, the larger the white portions become.
 Then, by arranging the cells 3a at the same position in time series and computing the area of the white portion of each cell 3a, the graph shown in FIG. 9 is obtained. Examining whether the data 42a of this graph shows a waveform that repeatedly expands and contracts vertically (alternating between state (1) and state (2)) reveals whether there is relative displacement of regularly arranged objects 24. In the waveform of the data 42a, the appearance of periodic peaks indicates that regularly arranged objects 24 are present in the image.
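The per-cell processing described above can be sketched as a 2-D DCT of one cell (low frequencies at the top left, as in FIG. 8) followed by a measure of the "white area", i.e. the fraction of significant non-DC coefficients. This is a hedged illustration: the function names, the orthonormal DCT-II formulation, and the coefficient threshold are assumptions, not taken from the patent.

```python
import numpy as np

def dct2(block):
    """2-D DCT-II of a square cell, built from an explicit orthonormal
    basis matrix so no external FFT library is required."""
    n = block.shape[0]
    k = np.arange(n)
    c = np.sqrt(2.0 / n) * np.cos(
        np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    c[0, :] = 1.0 / np.sqrt(n)  # DC row of the orthonormal basis
    return c @ block @ c.T

def high_freq_area(block, thresh=1.0):
    """Fraction of DCT coefficients above `thresh`, ignoring the DC term:
    a proxy for the 'white area' of FIG. 8. The threshold value is
    illustrative."""
    coeffs = np.abs(dct2(block))
    coeffs[0, 0] = 0.0  # drop the DC (mean brightness) coefficient
    return np.mean(coeffs > thresh)
```

Evaluating high_freq_area for the same cell position over successive sampled frames yields the time series (data 42a) whose periodic peaks indicate regularly arranged objects.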
 Next, the waveform data 42a thus obtained is Fourier-transformed by the frequency conversion unit 43 to obtain the waveform data 43a shown in FIG. 10. The Fourier transform is a technique that expresses a given waveform as a superposition of sine waves.
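A minimal sketch of this frequency conversion and the subsequent threshold comparison (units 43 and 45), assuming the per-cell white-area values are collected into a NumPy array and sampled at the camera frame rate; the function names, the 15 fps default, and the use of the dominant non-DC spectral peak are illustrative assumptions, not the patent's fixed implementation.

```python
import numpy as np

def dominant_flicker_frequency(area_series, fps=15.0):
    """Fourier-transform the per-cell white-area time series (data 42a)
    and return (peak_frequency_hz, peak_magnitude), excluding DC."""
    spectrum = np.abs(np.fft.rfft(area_series - np.mean(area_series)))
    freqs = np.fft.rfftfreq(len(area_series), d=1.0 / fps)
    i = int(np.argmax(spectrum[1:])) + 1  # skip the zero-frequency bin
    return freqs[i], spectrum[i]

def is_flicker(area_series, threshold, fps=15.0):
    """Flicker determination (unit 45): flicker is judged present when a
    non-DC frequency component exceeds the preset threshold 44."""
    _, magnitude = dominant_flicker_frequency(area_series, fps)
    return magnitude > threshold
```

With the text's example of 45 samples over 3 seconds, a structure passing at 5 crossings per second shows up as a clear 5 Hz peak, while a cell with a constant white area produces no peak at all.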
 These processes are performed for each cell 3a. Such processing is effective not only in the case of the iron-bridge framework 24a described above (see FIGS. 11A and 11B), but also, for example, for distance markers 24b as shown in FIGS. 12A to 12C and for lights 24c arranged at roughly equal intervals inside a tunnel as shown in FIGS. 13A to 13C.
 Thereafter, the blurring process is applied in accordance with the determination of the flicker determination unit 45.
 (5) A vehicle periphery monitoring method using the vehicle periphery monitoring device described above will now be described.
  In this method, the imaging unit 4 captures an image 3 of the surroundings of the host vehicle 2;
  the image processing unit 5 (control unit) performs image processing on the image 3 captured by the imaging unit 4; and
  the display unit 7 displays the image 6 processed by the image processing unit 5.
  When the image 3 captured by the imaging unit 4 contains flicker, the image processing unit 5 uses the flicker correction unit 22 to correct the image 6 displayed on the display unit 7 so that the flicker is reduced.
  In doing so, the flicker correction unit 22 determines that the displacement of regularly arranged objects 24 relative to the host vehicle 2 in the image 3 captured by the imaging unit 4 constitutes flicker.
 (6) The flicker correction unit 22 then reduces the flicker displayed on the display unit 7 by blurring the flicker portions 31 and 32 of the image 3.
 (7) In the flicker correction unit 22, the sampling unit 41 samples the image 3 captured by the imaging unit 4;
  the object detection unit 42 detects, from the image 3 sampled by the sampling unit 41, objects 24 displaced relative to the host vehicle 2;
  the frequency conversion unit 43 frequency-converts the data used by the object detection unit 42 to detect the relatively displaced objects 24; and
  the flicker determination unit 45 determines that flicker is present when a frequency component perceived as unpleasant in the data frequency-converted by the frequency conversion unit 43 exceeds the preset threshold 44.
 Specifically, as shown in the flowchart of FIG. 14:
  in step S1, image data (image 3) from the camera (imaging unit 4) is acquired;
  in step S2, the sampling unit 41 samples several frames;
  in step S3, the object detection unit 42 extracts objects (objects 24) appearing at regular intervals;
  in step S4, if no objects appear at regular intervals, the image is output to the screen as is in step S6;
  if objects do appear at regular intervals, an effect (blurring process) is applied in step S5 to the locations where they appear; and
  in step S6, the image is output to the screen, after which the process returns to the beginning and the above steps are repeated.
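The S1 to S6 loop can be sketched as a skeleton in which the operations the patent leaves open are injected as callables; everything here (the function names, the None stop condition, the buffer length defaults) is illustrative, not part of the claimed method.

```python
from collections import deque

def run_monitor(capture_frame, detect_regular_cells, blur_cells, output,
                fps=15, seconds=3):
    """Skeleton of the FIG. 14 loop.

    capture_frame()            -> frame or None          (S1)
    detect_regular_cells(list) -> flickering cell coords (S3/S4)
    blur_cells(frame, cells)   -> blurred frame          (S5)
    output(frame)              -> display the frame      (S6)
    """
    frames = deque(maxlen=fps * seconds)   # S2: sample several frames
    while True:
        frame = capture_frame()            # S1: acquire camera image
        if frame is None:                  # illustrative stop condition
            return
        frames.append(frame)
        cells = detect_regular_cells(list(frames))  # S3: regular objects?
        if cells:                          # S4: found some
            frame = blur_cells(frame, cells)        # S5: apply effect
        output(frame)                      # S6: screen output, then repeat
```

Injecting the four operations keeps the loop testable in isolation: a fake camera and trivial detector are enough to exercise the S4 branch both ways.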
 (8) The object detection unit 42 detects objects 24 displaced relative to the host vehicle 2 by applying a discrete cosine transform to the image 3 sampled by the sampling unit 41, and
  the frequency conversion unit 43 frequency-converts the data used by the object detection unit 42 to detect the relatively displaced objects 24 by means of a Fourier transform.
 <Operation and Effects> According to this embodiment, the following operation and effects can be obtained.
 The vehicle periphery monitoring device captures images of the surroundings of the host vehicle with a camera mounted on the vehicle and displays them on a display unit. The vehicle periphery monitoring device is, for example, a side mirror monitor.
 Specifically, the imaging unit 4 captures an image 3 of the surroundings of the host vehicle 2, the image processing unit 5 performs image processing on the image 3 captured by the imaging unit 4, and the display unit 7 displays the processed image 6.
 (Effect 1) When the image 3 captured by the imaging unit 4 contains flicker, the image processing unit 5 can reduce the flicker of the image 6 displayed on the display unit 7 by means of the flicker correction unit 22. This partially eliminates (or reduces) the portions of the display that would otherwise appear bothersome due to flicker. Even when the display unit 7 is placed in a conspicuous position within the driver's field of view, the annoyance caused by flicker can thus be eliminated or reduced, allowing the driver to concentrate on driving.
 At this time, the flicker correction unit 22 determines that the displacement of regularly arranged external objects 24 relative to the host vehicle 2 in the image 3 captured by the imaging unit 4 constitutes flicker. This makes it possible to correct, while driving, flicker caused by objects 24 that make up the scenery around the host vehicle 2, such as the framework 24a of an iron bridge, shrubbery planted at roughly equal intervals along both sides of an expressway, distance markers 24b, and lights 24c arranged at roughly equal intervals inside a tunnel.
 Flicker arising while the vehicle is stopped, for example from a train or freight cars passing near the host vehicle 2, can also be corrected in the same way.
 (Effect 2) The flicker correction unit 22 blurs the flicker portions 31 and 32 of the image 3 before display on the display unit 7. This reliably reduces the flicker shown on the display unit 7. The image 3 can be blurred by motion blur (a process that renders motion as an afterimage), increased transparency, reduced contrast, or the like. Applying such blurring reduces the sense of visual discomfort and can also prevent car sickness caused by continuously watching a flickering display.
 (Effect 3) In the flicker correction unit 22, the sampling unit 41 samples the image 3 captured by the imaging unit 4; the object detection unit 42 detects, from the sampled image 3, objects 24 displaced relative to the host vehicle 2; the frequency conversion unit 43 frequency-converts the data used for that detection; and the flicker determination unit 45 determines that flicker is present when a frequency component perceived as unpleasant in the frequency-converted data exceeds the preset threshold 44. This makes it possible to concretely detect flicker in the image 3 captured by the imaging unit 4 and to blur it.
 (Effect 4) The object detection unit 42 detects objects 24 displaced relative to the host vehicle 2 by applying a discrete cosine transform to the image 3 sampled by the sampling unit 41, and the frequency conversion unit 43 frequency-converts the data used for that detection by means of a Fourier transform. This enables the flickering objects 24 to actually be detected.
 DESCRIPTION OF SYMBOLS
 1   Vehicle periphery monitoring device
 2   Host vehicle
 3   Image
 4   Imaging unit
 5   Image processing unit
 7   Display unit
 22  Flicker correction unit
 24  Regularly arranged objects
 31  Flicker portion
 32  Flicker portion
 35  Blurred portion
 36  Blurred portion
 41  Sampling unit
 42  Object detection unit
 42a Data
 43  Frequency conversion unit
 43a Data
 44  Threshold
 45  Flicker determination unit
Cross-Reference to Related Applications
 This application claims priority based on Japanese Patent Application No. 2016-148058 filed with the Japan Patent Office on July 28, 2016, the entire disclosure of which is incorporated herein by reference.

Claims (8)

  1.  A vehicle periphery monitoring device comprising:
      an imaging unit capable of capturing an image of the surroundings of a host vehicle;
      an image processing unit that performs image processing on the image captured by the imaging unit; and
      a display unit that displays the image processed by the image processing unit,
      wherein the image processing unit includes a flicker correction unit capable of correcting the image displayed on the display unit so that, when the image captured by the imaging unit contains flicker, the flicker is reduced, and
      the flicker correction unit determines that the displacement of regularly arranged objects relative to the host vehicle in the image captured by the imaging unit constitutes flicker.
  2.  The vehicle periphery monitoring device according to claim 1,
      wherein the flicker correction unit is configured to reduce the flicker displayed on the display unit by blurring the flicker portion of the image.
  3.  The vehicle periphery monitoring device according to claim 1,
      wherein the flicker correction unit includes:
      a sampling unit that samples the image captured by the imaging unit;
      an object detection unit that detects, from the image sampled by the sampling unit, an object displaced relative to the host vehicle;
      a frequency conversion unit that frequency-converts the data used by the object detection unit to detect the relatively displaced object; and
      a flicker determination unit that determines that flicker is present when a frequency component perceived as unpleasant in the data frequency-converted by the frequency conversion unit exceeds a preset threshold.
  4.  The vehicle periphery monitoring device according to claim 3,
      wherein the object detection unit detects an object displaced relative to the host vehicle by applying a discrete cosine transform to the image sampled by the sampling unit, and
      the frequency conversion unit frequency-converts the data used by the object detection unit to detect the relatively displaced object by means of a Fourier transform.
  5.  A vehicle periphery monitoring method comprising:
      capturing, by an imaging unit, an image of the surroundings of a host vehicle;
      performing, by an image processing unit, image processing on the image captured by the imaging unit; and
      displaying, by a display unit, the image processed by the image processing unit,
      wherein the image processing unit corrects, by means of a flicker correction unit, the image displayed on the display unit so that, when the image captured by the imaging unit contains flicker, the flicker is reduced, and
      the flicker correction unit determines that the displacement of regularly arranged objects relative to the host vehicle in the image captured by the imaging unit constitutes flicker.
  6.  The vehicle periphery monitoring method according to claim 5,
      wherein the flicker correction unit reduces the flicker displayed on the display unit by blurring the flicker portion of the image.
  7.  The vehicle periphery monitoring method according to claim 5,
      wherein, in the flicker correction unit, a sampling unit samples the image captured by the imaging unit;
      an object detection unit detects, from the image sampled by the sampling unit, an object displaced relative to the host vehicle;
      a frequency conversion unit frequency-converts the data used by the object detection unit to detect the relatively displaced object; and
      a flicker determination unit determines that flicker is present when a frequency component perceived as unpleasant in the data frequency-converted by the frequency conversion unit exceeds a preset threshold.
  8.  The vehicle periphery monitoring method according to claim 7,
      wherein the object detection unit detects an object displaced relative to the host vehicle by applying a discrete cosine transform to the image sampled by the sampling unit, and
      the frequency conversion unit frequency-converts the data used by the object detection unit to detect the relatively displaced object by means of a Fourier transform.
PCT/JP2017/023559 2016-07-28 2017-06-27 Vehicle periphery monitoring device, and method for monitoring periphery of vehicle WO2018020936A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
DE112017003775.1T DE112017003775T5 (en) 2016-07-28 2017-06-27 Vehicle environment monitor and vehicle environment monitor process
CN201780037399.2A CN109479122A (en) 2016-07-28 2017-06-27 Vehicle periphery monitoring device and vehicle periphery monitoring method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016-148058 2016-07-28
JP2016148058A JP6691012B2 (en) 2016-07-28 2016-07-28 Vehicle periphery monitoring device and vehicle periphery monitoring method

Publications (1)

Publication Number Publication Date
WO2018020936A1 true WO2018020936A1 (en) 2018-02-01

Family

ID=61016767

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/023559 WO2018020936A1 (en) 2016-07-28 2017-06-27 Vehicle periphery monitoring device, and method for monitoring periphery of vehicle

Country Status (4)

Country Link
JP (1) JP6691012B2 (en)
CN (1) CN109479122A (en)
DE (1) DE112017003775T5 (en)
WO (1) WO2018020936A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP4339027A1 (en) * 2022-09-09 2024-03-20 Alps Alpine Co., Ltd. Vehicle rear side image display device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11341476A (en) * 1998-05-21 1999-12-10 Toyo Commun Equip Co Ltd Monitoring system
JP2002259985A (en) * 2001-03-02 2002-09-13 Hitachi Ltd Image monitoring method, image monitoring device and storage medium
WO2012172842A1 (en) * 2011-06-13 2012-12-20 本田技研工業株式会社 Driving assistance device
JP2016021649A (en) * 2014-07-14 2016-02-04 パナソニックIpマネジメント株式会社 Image processing system, image processing device and image processing method

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH02214382A (en) * 1989-02-15 1990-08-27 Nec Corp Television camera device
JP5259277B2 (en) * 2008-07-02 2013-08-07 本田技研工業株式会社 Driving assistance device
JP5625371B2 (en) * 2010-01-29 2014-11-19 ソニー株式会社 Image processing apparatus, signal processing method, and program
US9255183B2 (en) 2010-03-10 2016-02-09 Sk Chemicals Co., Ltd. Polyarylene sulfide having reduced outgassing and preparation method thereof
JP5320331B2 (en) * 2010-03-17 2013-10-23 日立オートモティブシステムズ株式会社 In-vehicle environment recognition device and in-vehicle environment recognition system
JP5945395B2 (en) * 2011-10-13 2016-07-05 オリンパス株式会社 Imaging device
CN103295219B (en) * 2012-03-02 2017-05-10 北京数码视讯科技股份有限公司 Method and device for segmenting image
JP2014116756A (en) 2012-12-07 2014-06-26 Toyota Motor Corp Periphery monitoring system
CN104104882B (en) * 2013-04-09 2017-08-11 展讯通信(上海)有限公司 Image flicker detection method and device, image capture device
US9686505B2 (en) * 2014-01-03 2017-06-20 Mediatek Singapore Pte. Ltd. Method for flicker detection and associated circuit

Also Published As

Publication number Publication date
JP2018019250A (en) 2018-02-01
DE112017003775T5 (en) 2019-05-09
CN109479122A (en) 2019-03-15
JP6691012B2 (en) 2020-04-28


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17833945

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 17833945

Country of ref document: EP

Kind code of ref document: A1