WO2019235004A1 - Calibration device and electronic mirror system - Google Patents

Calibration device and electronic mirror system

Info

Publication number
WO2019235004A1
WO2019235004A1 (PCT/JP2019/008266)
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
road
image
lane
camera
Prior art date
Application number
PCT/JP2019/008266
Other languages
French (fr)
Japanese (ja)
Inventor
彩乃 宮下 (Ayano Miyashita)
清水 直樹 (Naoki Shimizu)
Original Assignee
クラリオン株式会社 (Clarion Co., Ltd.)
Priority date
Filing date
Publication date
Application filed by クラリオン株式会社 (Clarion Co., Ltd.)
Publication of WO2019235004A1 publication Critical patent/WO2019235004A1/en

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R11/00 Arrangements for holding or mounting articles, not otherwise provided for
    • B60R11/02 Arrangements for holding or mounting articles, not otherwise provided for, for radio sets, television sets, telephones, or the like; Arrangement of controls thereof
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/08 Cursor circuits
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • The present invention relates to a calibration device and an electronic mirror system.
  • When changing lanes or the like, a vehicle driver looks at the side mirrors provided facing rearward on both sides of the vehicle and visually checks the image of the rear area reflected in them.
  • In recent years, a so-called electronic mirror system has been developed that replaces the side mirrors with a rearward-facing camera and a monitor, installed in the vehicle interior, that displays the image of the rear area captured by the camera (for example, see Patent Document 1).
  • With an electronic mirror system, the aerodynamic resistance of the vehicle can be reduced, because the side mirrors are replaced by cameras that are physically smaller.
  • An electronic mirror system can also detect a following vehicle present in the rear area based on the image captured by the camera. However, when the road being traveled is curved, it is difficult to detect the following vehicle's exact position along the road (its lane and its distance from the host vehicle) from its position in the image alone.
  • The present invention has been made in view of the above circumstances, and its object is to provide a calibration device and an electronic mirror system that can accurately determine, based on an image from a camera provided on the host vehicle, the positional relationship in real space between the host vehicle and another vehicle shown in the camera image.
  • A first aspect of the present invention is a calibration device comprising: a road state detection unit that detects the state of the curve of a road on which a host vehicle is traveling, based on a plurality of images captured by a camera mounted on the host vehicle at a plurality of chronologically different points in time; a storage unit that stores, for a reference state in which the road is assumed to be straight, the correspondence between the position on the image of a portion shown in the image and its position along the road in real space; and a correction unit that corrects the correspondence stored in the storage unit based on the state of the road curve detected by the road state detection unit.
  • A second aspect of the present invention is an electronic mirror system comprising: a camera that is mounted on a host vehicle, at a location where a reflecting mirror would otherwise be provided, and that captures images of the outside of the host vehicle; an image display device, provided in the cabin of the host vehicle, that displays the image captured by the camera horizontally inverted, as if viewed reflected in the mirror; and the calibration device according to the present invention, the image display device performing display that emphasizes an approaching other vehicle in response to a signal output from a notification output unit of the calibration device.
  • According to the calibration device and electronic mirror system of the present invention, the positional relationship in real space between the host vehicle and another vehicle shown in the camera image can be determined accurately based on the image of the camera provided on the host vehicle.
  • FIG. 1 is a block diagram showing an electronic mirror system 1 that is an example of an electronic mirror system according to the present invention, including a camera ECU (Electronic Control Unit) 30 that is an example of a calibration device according to the present invention.
  • FIG. 2 is a schematic top view of a vehicle (hereinafter, the host vehicle) equipped with the electronic mirror system shown in FIG. 1, traveling in the left lane of a straight, horizontal road with, for example, three lanes in each direction. FIG. 3 is an example of an image obtained by horizontally inverting the image captured by the camera shown in FIG. 2.
  • FIG. 8 is a diagram showing a state in which a grid corrected to follow the road curve shown in FIGS. 6 and 7 is superimposed on the image shown in FIG. 6.
  • FIG. 10 is an example of an image obtained by horizontally inverting the image captured by the camera shown in FIG. 9. FIG. 11 is a diagram showing a state in which the corrected grid is superimposed on that image.
  • FIG. 1 is a block diagram showing an electronic mirror system 1 which is an example of an electronic mirror system according to the present invention, including a camera ECU (Electronic Control Unit) 30 which is an example of a calibration apparatus according to the present invention.
  • FIG. 2 schematically shows, from a viewpoint above, a state in which a vehicle 200 equipped with the electronic mirror system 1 (hereinafter, the host vehicle 200) is traveling in the left lane L1 of a straight, horizontal road 400 with, for example, three lanes in each direction. FIG. 3 is an example of an image P1 obtained by horizontally inverting the image captured by the camera 10 shown in FIG. 2.
  • the electronic mirror system 1 shown in FIG. 1 includes a camera 10, an in-vehicle network (CAN: Controller Area Network) 20, a camera ECU 30, and a monitor 90.
  • The cameras 10 are provided, as alternatives to the left and right side mirrors, at the positions on both sides of the host vehicle 200 where the side mirrors would otherwise be attached. That is, one camera 10 is provided on each side, and each camera 10 mainly captures the rear region on the side on which it is installed. Specifically, as shown in FIG. 2, the camera 10 provided on the right side, with respect to the traveling direction indicated by the arrow, faces the right rear of the host vehicle 200 and captures the right rear region. Although not shown, the camera 10 provided on the left side likewise faces the left rear of the host vehicle 200 and captures the left rear region.
  • If the image captured by the camera 10 were displayed on the monitor 90 as it is, it would be left-right reversed compared with the image the driver is accustomed to seeing in a side mirror, and the driver might feel uncomfortable or confused.
  • Therefore, the camera ECU 30 horizontally inverts the image captured by the camera 10 and displays the inverted image P1 on the monitor 90, as shown in FIG. 3.
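The horizontal inversion described above amounts to reversing each pixel row. A minimal sketch (illustrative only; the actual ECU processing is not specified in this document):

```python
def mirror_horizontally(image):
    """Reverse each pixel row, producing the left-right inverted image P1
    that matches what a driver is used to seeing in a physical side mirror."""
    return [list(reversed(row)) for row in image]

# Tiny 2x3 "image" of labeled pixels (stand-ins for real pixel values).
p0 = [["a", "b", "c"],
      ["d", "e", "f"]]
p1 = mirror_horizontally(p0)
print(p1)  # [['c', 'b', 'a'], ['f', 'e', 'd']]
```

Applying the function twice recovers the original image, as expected of a mirror operation.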
  • The in-vehicle network 20 is a bus over which various kinds of information about the host vehicle 200 (vehicle speed, steering angle, gear position, etc.) are shared.
  • The camera ECU 30 includes a road state detection unit 31, a storage unit 32, a correction unit 33, an object recognition unit 34, a traveling lane detection unit 35, a distance calculation unit 36, and a notification output unit 37.
  • The road state detection unit 31 detects the curve state of the road 400 on which the host vehicle 200 is traveling. Specifically, the curve state is detected based on a plurality of horizontally inverted images P1 captured by the camera 10 at chronologically different points in time on the road 400.
  • More specifically, the degree of the curve of the road 400 is detected based on the time-series change in position, across the plurality of horizontally inverted images P1, of an end point 411 (for example, the upper-left corner point) of a white line 410 separating the lanes L1 and L2 of the road 400 (see FIG. 2).
  • Note that the road state detection unit 31 may instead detect the curve state based on the time-series change in position of some photographed object other than the end point 411 of the white line 410.
  • FIG. 4 is a schematic diagram showing the calibration grid 40 stored in the storage unit 32.
  • FIG. 5 is a schematic diagram showing an example in which the calibration grid 40 is superimposed on the horizontally inverted image P1 captured by the camera 10.
  • The storage unit 32 stores the calibration grid 40 shown in FIG. 4.
  • This grid 40 is an example of a table (map) representing, for a reference state in which the road 400 on which the host vehicle 200 is traveling is assumed to be straight, the correspondence between the position on the image P1 of each part of the road surface shown in the image P1 and its position along the road 400 in real space.
  • The lines M1, M2, M3, M4, M5, M6, and M7 extending in the lateral direction of the grid 40 correspond to equal intervals in the front-rear direction of the host vehicle 200 in real space (for example, 10 m intervals), and the lines N1, N2, N3, N4, N5, and N6 extending in the vertical direction correspond to equal intervals in the vehicle-width direction of the host vehicle 200 in real space (for example, 1 m intervals). That is, the grid 40 corresponds to a rectangular area in real space measuring 60 m in the front-rear direction and 5 m in the vehicle-width direction of the host vehicle 200.
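The real-space side of this correspondence can be sketched as a small lookup table. The node indexing and function name below are illustrative assumptions; the image-side coordinates of the actual grid 40 would come from the camera calibration, which this document does not detail:

```python
# Real-space layout of the reference-state grid 40: lateral lines M1..M7 at
# 10 m intervals in the front-rear direction, vertical lines N1..N6 at 1 m
# intervals in the vehicle-width direction, spanning a 60 m x 5 m rectangle.
M_SPACING_M = 10.0
N_SPACING_M = 1.0

def reference_grid_nodes(n_lateral=7, n_vertical=6):
    """Return {(i, j): (distance_behind_m, lateral_offset_m)} for each node
    where lateral line M(i+1) crosses vertical line N(j+1)."""
    return {(i, j): (i * M_SPACING_M, j * N_SPACING_M)
            for i in range(n_lateral) for j in range(n_vertical)}

grid40 = reference_grid_nodes()
print(len(grid40), grid40[(6, 5)])  # 42 (60.0, 5.0)
```

The far corner node sits 60 m behind and 5 m across, matching the rectangular area the text describes.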
  • The correction unit 33 corrects the grid 40 stored in the storage unit 32 based on the curve state of the road 400 detected by the road state detection unit 31. Specifically, when the road state detection unit 31 detects that the road 400 is straight and not curved, the correction unit 33 superimposes the reference-state grid 40 stored in the storage unit 32 directly on the image P1 without correction, as shown in FIG. 5.
  • On the other hand, when the road state detection unit 31 detects that the road 400 is curved, the correction unit 33 corrects the reference-state grid 40 so as to follow the detected curve of the road 400.
  • FIG. 6 schematically shows, from a viewpoint above, a state in which the host vehicle 200 is traveling in the left lane L1 of a horizontal road with, for example, three lanes in each direction that curves gently to the right.
  • FIG. 7 is an example of an image P1 obtained by horizontally inverting the image P0 captured by the camera 10 shown in FIG. 6, and FIG. 8 is a diagram showing a state in which a grid 40′ corrected to follow the curve of the road 400 shown in FIGS. 6 and 7 is superimposed on the image P1.
  • FIG. 9 is a diagram schematically showing, from above, a state in which the host vehicle 200 exits onto a side-road lane L1 that branches to the left from, for example, a two-lane expressway, the side lane L1 curving to the right.
  • FIG. 10 is an example of an image P1 obtained by horizontally inverting the image P0 captured by the camera 10 shown in FIG. 9, and FIG. 11 is a diagram showing a state in which a grid 40′ corrected to follow the curve of the road 400 is superimposed on the image P1 shown in FIG. 10.
  • In the case of FIG. 6, the correction unit 33 transforms the grid 40 to be superimposed on the image P1 into a grid 40′ that follows the detected curve of the road 400 and superimposes it on the image P1, as shown in FIG. 8. This makes it possible to specify, along the curve of the road 400 in real space, the positional relationship between the host vehicle 200 and each part of the road surface of the road 400, and any object in contact with the road surface.
  • Similarly, in the case of FIG. 9, the correction unit 33 transforms the grid 40 to be superimposed on the image P1 into a grid 40′ that follows the detected curve of the road 400 and superimposes it on the image P1, as shown in FIG. 11. The positional relationship between the host vehicle 200 and each part of the road surface of the road 400 along the curve of the road 400 in real space can thereby be specified.
  • The object recognition unit 34 detects following vehicles 301, 302, ... (hereinafter, the following vehicle 301 etc.) traveling behind the host vehicle 200 in the image P1.
  • For example, patterns such as the contour shape or appearance of mainly the front of vehicles, including motorcycles, are stored in advance in the storage unit 32, and the following vehicle 301 etc. is detected by applying pattern matching with the stored patterns to the image P1.
  • The object recognition unit 34 also detects the position in the image P1 of the point at which each detected following vehicle contacts the road surface.
  • The traveling lane detection unit 35 converts the position on the image P1 of the following vehicle 301 etc. recognized by the object recognition unit 34 into a position along the road 400 in real space, based on the grid 40 corrected by the correction unit 33, and thereby specifies the lane in which the following vehicle 301 etc. is traveling in real space.
  • When the road 400 is straight and not curved, the grid 40 remains the reference-state grid 40, and the lane of each following vehicle in real space is specified according to the state in which the reference-state grid 40 is superimposed, as in FIG. 5.
  • In the example of FIG. 5, the traveling lane detection unit 35 determines that the following vehicle 301 is traveling in the same lane L1 as the host vehicle 200, that the following vehicle 302 is traveling in the lane (adjacent lane) L2 adjacent on the right of the traveling lane L1 of the host vehicle 200, and that the following vehicle 303 is traveling in the lane L3 adjacent on the right of the adjacent lane L2.
  • When the road 400 is curved, as in FIG. 8, the traveling lane detection unit 35 specifies the lane of each following vehicle in real space according to the state in which the deformed grid 40′ is superimposed.
  • In the example of FIG. 8, the traveling lane detection unit 35 specifies that the following vehicle 302 is traveling in the lane (adjacent lane) L2 adjacent on the right of the traveling lane L1 of the host vehicle 200, and that the following vehicles 303, 304, and 305 are traveling in the lane L3 adjacent on the right of the adjacent lane L2.
  • Likewise, in the case of FIG. 11, the traveling lane detection unit 35 specifies the lane of each following vehicle in real space according to the state in which the deformed grid 40′ is superimposed; as shown in FIG. 11, it specifies that the following vehicles 301 and 306 are traveling in the same lane L1 as the host vehicle 200.
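Once a following vehicle's road-surface contact point has been converted to a lateral offset along the (possibly corrected) grid, identifying its lane reduces to bucketing that offset by lane width. A simplified sketch, assuming a uniform 3.5 m lane width (a value this document does not state):

```python
def lane_of(lateral_offset_m, lane_width_m=3.5):
    """Map a vehicle's lateral offset along the corrected grid, measured
    from the left edge of the host vehicle's lane L1, to a lane index:
    0 -> L1 (own lane), 1 -> adjacent lane L2, 2 -> lane L3, and so on."""
    return int(lateral_offset_m // lane_width_m)

print(lane_of(1.2), lane_of(4.0), lane_of(8.0))  # 0 1 2
```

Because the offset is measured along the deformed grid 40′, the same bucketing works whether the road is straight or curved.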
  • The distance calculation unit 36 obtains the distance along the curve of the road 400 between the position of the host vehicle 200 and the position of the following vehicle 301 etc. in real space, based on the corrected grid obtained by the correction unit 33.
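The distance "along the curve of the road" can be approximated by summing segment lengths along a polyline sampled from the corrected grid's centerline. An illustrative sketch (the patent does not prescribe a numerical method):

```python
import math

def distance_along(path):
    """Arc length of a polyline of real-space (x, y) points sampled along
    the road between the host vehicle and a following vehicle."""
    return sum(math.dist(a, b) for a, b in zip(path, path[1:]))

# Straight 30 m stretch sampled every 10 m: arc length is exactly 30 m.
straight = [(0, 0), (0, 10), (0, 20), (0, 30)]
print(distance_along(straight))  # 30.0
```

On a curved path the arc length exceeds the straight-line chord, which is precisely why the grid must be deformed before measuring distance.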
  • When the lane in which a following vehicle detected by the traveling lane detection unit 35 is traveling is the lane L2 adjacent to the lane L1 in which the host vehicle 200 is traveling, and the distance calculated by the distance calculation unit 36 is decreasing in time series, that is, when a following vehicle traveling in the adjacent lane L2 is approaching the host vehicle 200, the notification output unit 37 outputs to the monitor 90 a signal notifying of the approach of the following vehicle.
  • The monitor 90 displays the image P1 obtained by the camera ECU 30 horizontally inverting the image P0 captured by the camera 10. When the camera ECU 30 outputs a signal notifying that a following vehicle in the adjacent lane L2 traveling faster than the host vehicle 200 is approaching, the monitor 90 superimposes a vehicle detection frame 350 surrounding the detected following vehicle 302 on the image P1, as shown, for example, in FIG. 3.
  • the camera 10 installed on the right side of the host vehicle 200 captures a right rear area of the host vehicle 200 at a predetermined cycle, and the captured image is input to the camera ECU 30.
  • The camera ECU 30 converts each image periodically input from the camera 10 into a horizontally inverted image P1.
  • The road state detection unit 31 detects the curve state of the road 400 on which the host vehicle 200 is traveling based on the plurality of horizontally inverted images P1 periodically input to the camera ECU 30. The curve state is determined from the transition of the position, in the images P1 captured at chronologically different timings, of the end point 411 of the specific white line 410 shown in FIG. 2 as it moves rearward with the travel of the host vehicle 200.
  • FIG. 12 schematically shows how the end point 411 moves in the image P1 from the initial position R1 to the position R2′, the position R3′, the position R4′, and the position R5′ as the host vehicle 200 travels.
  • The road state detection unit 31 detects the positions R1, R2′, R3′, R4′, and R5′ of the end point 411 moving with time on the image P1, for example as positions in the X-Y orthogonal coordinate system of FIG. 12.
  • Here, the positions R1, R2, R3, R4, and R5 are the predicted positions of the end point 411 in the reference state in which the road 400 is not curved, calculated for a steering angle of 0 degrees using the vehicle speed of the host vehicle 200 input at that time from the in-vehicle network 20.
  • The predicted positions R1, R2, R3, R4, and R5 in this reference state lie on a straight line.
  • In contrast, the positions R1, R2′, R3′, R4′, and R5′ detected by the road state detection unit 31 do not lie on a straight line: the position R2′ is shifted from the position R2 in both the X-axis and Y-axis directions, and the subsequent positions R3′, R4′, and R5′ are likewise shifted in the X-axis and Y-axis directions from the corresponding positions R3, R4, and R5.
  • These positional deviations in the X-axis and Y-axis directions indicate that the direction of the host vehicle 200 is changing in time series, that is, that the road 400 on which the host vehicle 200 is traveling is curved.
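The comparison between predicted and observed end-point positions can be sketched as follows; all coordinate values are hypothetical, since the document gives no numbers:

```python
def position_deviations(predicted, observed):
    """Per-time-step (dx, dy) deviation of the observed end point 411 from
    its straight-road prediction; non-zero deviations mean the road curves."""
    return [(ox - px, oy - py)
            for (px, py), (ox, oy) in zip(predicted, observed)]

def road_is_curved(deviations, tol=0.5):
    """Simple threshold test on the deviations (tolerance is illustrative)."""
    return any(abs(dx) > tol or abs(dy) > tol for dx, dy in deviations)

# Hypothetical pixel coordinates: R1..R5 predicted on a line, the observed
# positions R2'..R5' drifting progressively as the road bends.
predicted = [(0, 0), (10, 20), (20, 40), (30, 60), (40, 80)]
observed = [(0, 0), (12, 19), (25, 37), (39, 54), (55, 70)]
devs = position_deviations(predicted, observed)
print(devs[1])  # (2, -1)
```

A perfectly straight road would yield all-zero deviations, leaving the reference grid 40 unchanged.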
  • the correction unit 33 corrects the calibration grid 40 based on the curve state of the road 400 detected by the road state detection unit 31.
  • Specifically, the grid 40 shown in FIG. 4 is deformed based on: the amount ΔR2x by which the position R2′ detected by the road state detection unit 31 is displaced in the X-axis direction from the predicted position R2 in the reference state (noted beside the X axis in FIG. 12; the same applies below) and the amount ΔR2y by which it is displaced in the Y-axis direction (noted beside the Y axis in FIG. 12; the same applies below); the amounts ΔR3x and ΔR3y by which the position R3′ is displaced in the X-axis and Y-axis directions from the predicted position R3; the amounts ΔR4x and ΔR4y by which the position R4′ is displaced from the predicted position R4; and the amounts ΔR5x and ΔR5y by which the position R5′ is displaced from the predicted position R5.
  • The destinations of points of the grid 40 other than the positions R2, R3, R4, and R5 can be calculated using the destinations R2′, R3′, R4′, and R5′ of those positions as representative points, whereby the grid 40 can be deformed along the curved road 400.
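One way to realize this deformation is to shift each lateral grid line by the deviation measured at its representative point. The sketch below applies the shift row by row; treating a whole row uniformly is an illustrative simplification, not the patent's prescribed method:

```python
def deform_grid(rows, deviations):
    """Shift each lateral grid line (a list of (x, y) image points) by the
    (dx, dy) deviation measured at its representative point, turning the
    reference grid 40 into the curved-road grid 40'."""
    return [[(x + dx, y + dy) for (x, y) in row]
            for row, (dx, dy) in zip(rows, deviations)]

rows = [[(0, 100), (50, 100)],   # nearest lateral line, no deviation
        [(10, 60), (40, 60)]]    # farther line, shifted by the curve
devs = [(0, 0), (5, -2)]
grid_prime = deform_grid(rows, devs)
print(grid_prime[1])  # [(15, 58), (45, 58)]
```

Intermediate rows without a measured representative point could be handled by interpolating between neighboring deviations.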
  • Note that the correction amounts for deforming the grid 40 need not be calculated over the entire image P1; the calculation may be performed only for a partial region P2 behind the host vehicle 200 shown in the figure. The amount of calculation for the correction can thereby be reduced, which matters because suppressing the calculation load is one of the important issues.
  • Next, the object recognition unit 34 detects following vehicles from the image P1.
  • The traveling lane detection unit 35 identifies the lane in which each following vehicle is traveling based on the corrected grid 40′.
  • The distance calculation unit 36 obtains, based on the corrected grid 40′, the distance along the curve of the road 400 from the host vehicle 200 to each following vehicle detected in the image P1.
  • In this way, the camera ECU 30 specifies, for each detected following vehicle, the lane in which it travels and the distance along the curve of the road 400 from the host vehicle 200.
  • When the notification output unit 37 detects a following vehicle traveling in the adjacent lane L2 approaching the host vehicle 200, it outputs a notification signal to the monitor 90.
  • Upon receiving the notification signal, the monitor 90 superimposes a vehicle detection frame 350 surrounding the following vehicle 302 on the image P1, as shown in FIG. 3.
  • On the other hand, for a following vehicle traveling in the lane L1 in which the host vehicle 200 travels, or in the adjacent lane L3, the notification output unit 37 outputs no notification signal even if that vehicle approaches the host vehicle 200. Similarly, even for a following vehicle traveling in the adjacent lane L2, the notification output unit 37 outputs no notification signal when that vehicle is not approaching the host vehicle 200.
  • In these cases, since no notification signal is output from the notification output unit 37, the monitor 90 displays the image P1 without adding a vehicle detection frame 350 that would alert the driver to any following vehicle.
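The notification rule just described, alerting only for a vehicle in the adjacent lane L2 whose distance shrinks over time, can be sketched as:

```python
def should_notify(lane, distance_history):
    """Emit a notification signal only for a following vehicle in the
    adjacent lane L2 whose distance to the host vehicle is shrinking
    over consecutive measurements (hypothetical lane labels and values)."""
    approaching = (len(distance_history) >= 2 and
                   all(b < a for a, b in zip(distance_history,
                                             distance_history[1:])))
    return lane == "L2" and approaching

print(should_notify("L2", [50.0, 46.5, 43.0]))  # True  -> frame 350 drawn
print(should_notify("L3", [50.0, 46.5, 43.0]))  # False -> no signal
print(should_notify("L2", [43.0, 46.5, 50.0]))  # False -> not approaching
```

The distance history here is the time series produced by the distance calculation unit 36 over successive camera frames.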
  • As described above, according to the camera ECU 30 and the electronic mirror system 1 of the present embodiment, the positional relationship along the road 400 in real space between the host vehicle 200 and the following vehicle 301 etc. shown in the image P1 of the camera 10 provided on the host vehicle 200 can be obtained accurately.
  • Although the present embodiment has described only the positional relationship between the host vehicle 200 and following vehicles mainly to its right rear, captured by the camera 10 provided on the right side, the positional relationship between the host vehicle 200 and following vehicles mainly to its left rear, captured by the camera 10 provided on the left side, is handled in the same manner.
  • In addition, although the present embodiment has described only the positional relationship between the host vehicle 200 and following vehicles present behind it, the calibration device and electronic mirror system according to the present invention are not limited to following vehicles: they can be similarly applied to the positional relationship between the host vehicle 200 and a preceding vehicle ahead of it. In that case, a camera facing the front of the host vehicle is used.
  • In the present embodiment, the grid 40 is deformed in accordance with the degree of bending of the road 400, and the position of a following vehicle in the image P1 is obtained as a lane and a distance along the lane represented by the deformed grid 40′. Alternatively, the image P1 may be converted into an image in another coordinate system specified by the horizontal lines M1, M2, ... and the vertical lines N1, N2, ... of the grid, and the position of the following vehicle (its lane, its distance from the host vehicle 200, etc.) may be specified on the image in that other coordinate system.
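The alternative described above, locating a vehicle in a road-aligned coordinate system defined by the grid lines, can be sketched as a lookup-and-interpolate step. The line positions below are hypothetical, and the sketch assumes the grid lines are axis-aligned in the image, which the deformed grid 40′ generally is not:

```python
from bisect import bisect_right

def to_road_coords(x_img, y_img, col_xs, row_ys,
                   n_spacing=1.0, m_spacing=10.0):
    """Convert an image position into road-aligned coordinates
    (lateral offset in m, distance behind the vehicle in m) by locating it
    between the grid's vertical lines (col_xs) and lateral lines (row_ys)
    and interpolating linearly between neighbouring lines."""
    def interp(v, knots, spacing):
        i = min(max(bisect_right(knots, v) - 1, 0), len(knots) - 2)
        frac = (v - knots[i]) / (knots[i + 1] - knots[i])
        return (i + frac) * spacing

    return interp(x_img, col_xs, n_spacing), interp(y_img, row_ys, m_spacing)

# Hypothetical image positions of the grid lines; the non-uniform row
# spacing stands in for perspective compression with distance.
col_xs = [0, 40, 90]        # vertical lines N1..N3
row_ys = [0, 10, 30, 60]    # lateral lines M1..M4
print(to_road_coords(40, 20, col_xs, row_ys))  # (1.0, 15.0)
```

A point halfway between two lateral lines thus maps to the midpoint of the corresponding 10 m real-space interval, even though the image spacing of those lines is unequal.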

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Multimedia (AREA)
  • Mechanical Engineering (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)
  • Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Image Processing (AREA)

Abstract

In order for a calibration device to precisely determine, from an image of a camera installed in a vehicle, the real-space positional relation of the vehicle to another vehicle captured by the camera, a camera ECU (30) is provided with: a road state detection unit (31) for detecting the state of a curve on a road (400) on which the vehicle (200) is traveling, on the basis of multiple images (P1) of the road (400) captured at chronologically different points in time by the camera (10) installed in the vehicle (200); a storage unit (32) for storing, in the form of a table consisting of a calibration grid (40), correspondence relations between positions in the images (P1) and positions along the road (400) in real space in a reference state in which the road (400) is assumed to be straight and level; and a correction unit (33) for correcting the grid (40) stored in the storage unit (32) on the basis of the state of the curve on the road (400) detected by the road state detection unit (31).

Description

キャリブレーション装置及び電子ミラーシステムCalibration apparatus and electronic mirror system
 本発明は、キャリブレーション装置及び電子ミラーシステムに関する。 The present invention relates to a calibration device and an electronic mirror system.
 車両の運転者は、車線変更等の際に、車両の両側部に後方を向いて設けられたサイドミラーを見て、そのサイドミラーで反射した後方領域の像を目視で確認している。 When the vehicle driver changes lanes, the driver looks at the side mirrors provided facing both sides of the vehicle and visually confirms the image of the rear area reflected by the side mirror.
 近年、サイドミラーに代えて、後方を向いたカメラと、そのカメラで撮像された後方領域の画像を表示する、車室内に設置されたモニタと、を有するいわゆる電子ミラーシステムが開発されている(例えば、特許文献1参照)。電子ミラーシステムによれば、物理的なサイズがサイドミラーよりも小さいカメラに置き換わることで、車両の空力抵抗を低減することができる。 In recent years, instead of a side mirror, a so-called electronic mirror system has been developed that includes a camera facing rearward and a monitor installed in a vehicle interior that displays an image of a rear area captured by the camera ( For example, see Patent Document 1). According to the electronic mirror system, the aerodynamic resistance of the vehicle can be reduced by replacing the camera with a physical size smaller than that of the side mirror.
特開2018-6936号公報Japanese Patent Laid-Open No. 2018-6936
 電子ミラーシステムによれば、カメラで撮像された画像に基づいて、後方領域に存在する後続車を検知することもできる。しかし、走行している道路がカーブしていると、後続車の、画像における位置だけで、実際の後続車の、道路に沿った正確な位置関係(走行車線、自車からの距離)を検知することは難しい。 According to the electronic mirror system, it is possible to detect the following vehicle existing in the rear area based on the image captured by the camera. However, if the driving road is curved, the exact position of the following vehicle along the road (the driving lane and the distance from the vehicle) is detected based on the position of the following vehicle in the image. Difficult to do.
The present invention has been made in view of the above circumstances, and an object thereof is to provide a calibration device and an electronic mirror system capable of accurately determining, on the basis of an image from a camera provided on a host vehicle, the positional relationship in real space between the host vehicle and another vehicle captured in the camera image.

A first aspect of the present invention is a calibration device comprising: a road state detection unit that detects the state of a curve of a road on which a host vehicle is traveling, based on a plurality of images respectively captured by a camera mounted on the host vehicle at a plurality of chronologically different points in time; a storage unit that stores a correspondence, in a reference state in which the road is assumed to be straight, between the position on the image of a portion captured in the image and its position along the road in real space; and a correction unit that corrects the correspondence stored in the storage unit based on the state of the road curve detected by the road state detection unit.

A second aspect of the present invention is an electronic mirror system comprising: a camera mounted on a host vehicle at a site where a reflecting mirror would otherwise be provided, which captures images of the outside of the host vehicle; an image display device provided in the cabin of the host vehicle, which displays the image captured by the camera horizontally inverted, as if viewed reflected in the mirror; and a calibration device according to the present invention, wherein the image display device performs a display that highlights an approaching other vehicle in response to a signal output from a notification output unit of the calibration device.

According to the calibration device and the electronic mirror system of the present invention, the positional relationship in real space between the host vehicle and another vehicle captured in the image of a camera provided on the host vehicle can be accurately determined based on that image.
FIG. 1 is a block diagram showing an electronic mirror system 1, an example of an electronic mirror system according to the present invention, including a camera ECU (Electronic Control Unit) 30, an example of a calibration device according to the present invention.
FIG. 2 is a schematic top view of a vehicle equipped with the electronic mirror system of FIG. 1 (hereinafter, the host vehicle) traveling in the left lane of a horizontal, uncurved road with, for example, three lanes in each direction.
FIG. 3 is an example of a horizontally inverted version of an image captured by the camera shown in FIG. 2.
FIG. 4 is a schematic diagram showing the calibration grid stored in the storage unit.
FIG. 5 is a schematic diagram showing an example in which the calibration grid is superimposed on a horizontally inverted image captured by the camera.
FIG. 6 is a schematic top view of the host vehicle traveling in the left lane of a horizontal road with, for example, three lanes in each direction that curves gently to the right.
FIG. 7 is an example of a horizontally inverted version of an image captured by the camera shown in FIG. 6.
FIG. 8 shows a state in which a grid corrected to follow the road curve of FIGS. 6 and 7 is superimposed on the image of FIG. 6.
FIG. 9 is a schematic top view of the host vehicle having exited, for example, a two-lane expressway onto a side-road lane branching to the left and traveling where that side-road lane curves to the right.
FIG. 10 is an example of a horizontally inverted version of an image captured by the camera shown in FIG. 9.
FIG. 11 shows a state in which a grid corrected to follow the road curve of FIGS. 9 and 10 is superimposed on the image of FIG. 9.
FIG. 12 schematically shows how an end point moves in the image from its initial position as the host vehicle travels.
Hereinafter, specific embodiments of the electronic mirror system according to the present invention will be described with reference to the drawings.
<Configuration of Electronic Mirror System>
FIG. 1 is a block diagram showing an electronic mirror system 1, an example of an electronic mirror system according to the present invention, including a camera ECU (Electronic Control Unit) 30, an example of a calibration device according to the present invention. FIG. 2 is a schematic top view of a vehicle 200 equipped with the electronic mirror system 1 (hereinafter, the host vehicle 200) traveling in the left lane L1 of a horizontal, uncurved road 400 with, for example, three lanes in each direction. FIG. 3 is an example of an image P1 obtained by horizontally inverting an image captured by the camera 10 shown in FIG. 2.
The electronic mirror system 1 shown in FIG. 1 includes a camera 10, an in-vehicle network (CAN: Controller Area Network) 20, a camera ECU 30, and a monitor 90.

The cameras 10 are provided, in place of the left and right side mirrors, at the positions on both sides of the host vehicle 200 where those side mirrors would otherwise be attached. That is, one camera 10 is provided on each side, and each camera 10 mainly captures the rear region on the side on which it is installed. Specifically, as shown in FIG. 2, the camera 10 provided on the right side of the host vehicle 200 relative to the traveling direction indicated by the arrow faces the right rear of the host vehicle 200 and captures the right rear region. Although not illustrated, the camera 10 provided on the left side of the host vehicle 200 relative to the traveling direction likewise faces the left rear and captures the left rear region.

When the image captured by the right camera 10 shown in FIG. 2 is horizontally inverted, the image P1 shown in FIG. 3, for example, is obtained. If a right side mirror were installed on the host vehicle 200, its driver would view the right rear image reflected by that mirror, and an image reflected by a mirror is horizontally inverted with respect to the image incident on it.

For a driver accustomed to viewing images reflected in side mirrors, displaying the image captured by the camera 10 on the monitor 90 as-is could therefore cause discomfort or confusion, since it would be the mirror image of what the driver is used to seeing.

The camera ECU 30 therefore horizontally inverts the image captured by the camera 10 and causes the monitor 90 to display the inverted image P1, as shown in FIG. 3.
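As a minimal illustrative sketch (not taken from the patent itself), the horizontal inversion performed by the camera ECU 30 amounts to reversing the column order of the image array. In Python with NumPy this could look like:

```python
import numpy as np

def mirror_image(frame: np.ndarray) -> np.ndarray:
    """Horizontally invert a camera frame (H x W or H x W x C),
    reproducing the left-right reversal seen in a side mirror."""
    return frame[:, ::-1]

# A tiny 2x3 single-channel frame: the columns are reversed.
frame = np.array([[1, 2, 3],
                  [4, 5, 6]])
print(mirror_image(frame))  # [[3 2 1] [6 5 4]]
```

The same operation applies unchanged to a full H x W x 3 color frame, since only the column axis is reversed.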
The in-vehicle network 20 is a bus over which various items of information about the host vehicle 200 (vehicle speed, steering angle, gear position, and the like) are shared.
<Configuration of Camera ECU>
The camera ECU 30 includes a road state detection unit 31, a storage unit 32, a correction unit 33, an object recognition unit 34, a travel lane detection unit 35, a distance calculation unit 36, and a notification output unit 37.
The road state detection unit 31 detects the state of the curve of the road 400 on which the host vehicle 200 is traveling. Specifically, it detects the curve state based on a plurality of horizontally inverted images P1 captured by the camera 10 at chronologically different points in time while the host vehicle 200 travels on the road 400. In this embodiment, the degree of the curve of the road 400 is detected from the chronological change in position, across these images P1, of an end point 411 (for example, the upper-left corner point) of a white line 410 separating the lanes L1 and L2 of the road 400 (see FIG. 2).

Note that the road state detection unit 31 may detect the curve state based on the chronological change in position of a photographed object other than the end point 411 of the white line 410.
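To make the idea concrete, the following sketch (an assumption-laden simplification, not the patent's actual algorithm) classifies a road as straight or curved by checking whether the tracked end-point positions are collinear across frames; the tolerance `tol` and the chord-based distance test are illustrative choices:

```python
def is_road_straight(points, tol=0.5):
    """Judge whether tracked image positions of a lane-line end point
    lie on one straight line (straight road) or deviate (curved road).
    points: chronological list of (x, y) image positions.
    tol: maximum allowed perpendicular distance from the chord, in pixels."""
    (x0, y0), (x1, y1) = points[0], points[-1]
    dx, dy = x1 - x0, y1 - y0          # chord from first to last observation
    norm = (dx * dx + dy * dy) ** 0.5
    if norm == 0:
        return True
    for (x, y) in points[1:-1]:
        # Perpendicular distance of each intermediate point to the chord.
        dist = abs(dy * (x - x0) - dx * (y - y0)) / norm
        if dist > tol:
            return False
    return True

print(is_road_straight([(0, 0), (1, 1), (2, 2), (3, 3)]))  # True: collinear track
print(is_road_straight([(0, 0), (1, 2), (2, 3), (3, 3)]))  # False: bowed track
```

A production system would work with noisy sub-pixel detections and many tracked points, but the collinearity test captures the distinction the road state detection unit 31 draws between the straight reference state and a curve.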
FIG. 4 is a schematic diagram showing the calibration grid 40 stored in the storage unit 32, and FIG. 5 is a schematic diagram showing an example in which the calibration grid 40 is superimposed on a horizontally inverted image P1 captured by the camera 10.

The storage unit 32 stores the calibration grid 40 shown in FIG. 4. The grid 40 is an example of a table (map) representing, in a reference state in which the road 400 on which the host vehicle 200 travels is assumed to be straight, the correspondence between the position on the image P1 of each portion of the road surface of the road 400 captured in the image P1 and its position along the road 400 in real space.

That is, when the grid 40 is superimposed on the image P1 captured by the camera 10, as shown for example in FIG. 5, associating each position on the road surface of the road 400 in the image P1 with a position within the grid 40 maps each road-surface position (a position in the image coordinate space) to a position in real space (a position in the real-space coordinate space).

The grid 40 shown in FIG. 4 is set based on so-called camera parameters, such as the angle of view and focal length of the camera 10 and its mounting position and orientation on the host vehicle 200. The lines M1, M2, M3, M4, M5, M6, and M7 extending in the lateral direction of the grid 40 correspond to equidistant intervals rearward in the front-rear direction of the host vehicle 200 in real space (for example, 10 m intervals), and the lines N1, N2, N3, N4, N5, and N6 extending in the vertical direction correspond to equidistant intervals in the vehicle width direction of the host vehicle 200 in real space (for example, 1 m intervals). That is, the grid 40 corresponds to a rectangular region in real space measuring 60 m in the front-rear direction and 5 m in the vehicle width direction of the host vehicle 200.
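The correspondence the grid encodes can be sketched as a lookup table from image pixels to road coordinates. The following toy implementation (a hypothetical stand-in: the node layout, the nearest-node lookup, and all pixel values are invented for illustration, whereas a real system would interpolate within grid cells) returns the (rearward, lateral) position in metres of the grid node nearest an image point:

```python
def image_to_road(grid_px, point):
    """Map an image position to the (rearward, lateral) real-space
    position, in metres, of the nearest calibration-grid node.
    grid_px[i][j]: image (x, y) of the node i*10 m rearward and
    j*1 m lateral, fixed once from the camera parameters."""
    best, best_d2 = None, float("inf")
    for i, row in enumerate(grid_px):
        for j, (gx, gy) in enumerate(row):
            d2 = (gx - point[0]) ** 2 + (gy - point[1]) ** 2
            if d2 < best_d2:
                best, best_d2 = (10 * i, 1 * j), d2
    return best

# Toy 3x3 grid whose nodes happen to lie on integer pixels.
grid_px = [[(0, 0), (10, 0), (20, 0)],
           [(0, 5), (10, 5), (20, 5)],
           [(0, 9), (10, 9), (20, 9)]]
print(image_to_road(grid_px, (11, 6)))  # (10, 1): nearest node is row 1, col 1
```

This inverse lookup (image position to road position) is the operation the travel lane detection unit and distance calculation unit described below rely on.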
The correction unit 33 corrects the grid 40 stored in the storage unit 32 based on the curve state of the road 400 detected by the road state detection unit 31. Specifically, when the road state detection unit 31 detects that the road 400 is straight and not curved, the correction unit 33 superimposes the reference-state grid 40 stored in the storage unit 32 on the image P1 as-is, without correction, as shown in FIG. 5.

When, on the other hand, the road 400 detected by the road state detection unit 31 is curved, the correction unit 33 corrects the reference-state grid 40 by deforming it so as to follow the detected curve of the road 400.
Here, FIG. 6 is a schematic top view of the host vehicle 200 traveling in the left lane L1 of a horizontal road with, for example, three lanes in each direction that curves gently to the right; FIG. 7 is an example of an image P1 obtained by horizontally inverting the image P0 captured by the camera 10 shown in FIG. 6; and FIG. 8 shows a state in which a grid 40′ corrected to follow the curve of the road 400 shown in FIGS. 6 and 7 is superimposed on the image P1 of FIG. 6.

Likewise, FIG. 9 is a schematic top view of the host vehicle 200 having exited, for example, a two-lane expressway onto a side-road lane L1 branching to the left and traveling where that side-road lane L1 curves to the right; FIG. 10 is an example of an image P1 obtained by horizontally inverting the image P0 captured by the camera 10 shown in FIG. 9; and FIG. 11 shows a state in which a grid 40′ corrected to follow the curve of the road 400 shown in FIGS. 9 and 10 is superimposed on the image P1 of FIG. 9.
For example, when the road state detection unit 31 detects that the road 400 curves gently to the right as shown in FIGS. 6 and 7, the correction unit 33 deforms the grid 40 to be superimposed on the image P1 of FIG. 7 into a grid 40′ that follows the detected curve of the road 400, and superimposes it on the image P1 of FIG. 6 as shown in FIG. 8. This makes it possible to determine, for each portion of the road surface of the road 400 and for any target object in contact with the road surface, its positional relationship with the host vehicle 200 along the curve of the road 400 in real space.

Similarly, when the road state detection unit 31 detects that the road 400 branches to the left and then curves to the right as shown in FIGS. 9 and 10, the correction unit 33 deforms the grid 40 to be superimposed on the image P1 of FIG. 10 into a grid 40′ that follows the detected curve of the road 400, and superimposes it on the image P1 of FIG. 9 as shown in FIG. 11. This makes it possible to determine the positional relationship of each portion of the road surface of the road 400 with the host vehicle 200 along the curve of the road 400 in real space.
The object recognition unit 34 detects following vehicles 301, 302, ... (hereinafter, following vehicle 301, etc.) traveling behind the host vehicle 200 and captured in the image P1, as shown in FIGS. 3, 7, 10, and elsewhere. The object recognition unit 34 holds in advance, in the storage unit 32, patterns such as the contour shapes and markings of mainly the front faces of vehicles, including motorcycles, and detects the following vehicle 301, etc. by applying pattern matching or the like of these stored patterns against the image P1. The object recognition unit 34 also determines the position of the following vehicle 301, etc. in the image P1 by detecting its position on the road surface.

The travel lane detection unit 35 converts the position on the image P1 of the following vehicle 301, etc. recognized by the object recognition unit 34 into a position along the road 400 in real space, based on the grid obtained through correction by the correction unit 33, and thereby identifies the lane in which the following vehicle 301, etc. is traveling in real space.
That is, for example, when the road 400 is straight and not curved as shown in FIG. 3, the grid 40 remains in the reference state, and the lane of each following vehicle 301, etc. in real space is identified according to FIG. 5, in which the reference-state grid 40 is superimposed.

In this case, as shown in FIG. 5, the travel lane detection unit 35 determines that the following vehicle 301 is traveling in the same lane L1 as the host vehicle 200, that the following vehicle 302 is traveling in the lane L2 adjacent to the right of the travel lane L1 of the host vehicle 200 (the adjacent lane), and that the following vehicle 303 is traveling in the lane L3 adjacent to the right of the adjacent lane L2 (the next-adjacent lane).

In the case shown in FIG. 8, the reference-state grid 40 has been deformed into the grid 40′, so the travel lane detection unit 35 identifies the lane of each following vehicle 301, etc. in real space according to the superimposed deformed grid 40′. In this case, as shown in FIG. 8, the travel lane detection unit 35 determines that the following vehicle 302 is traveling in the adjacent lane L2 to the right of the travel lane L1 of the host vehicle 200, and that the following vehicles 303, 304, and 305 are traveling in the next-adjacent lane L3 to the right of the adjacent lane L2.

Similarly, in the case shown in FIG. 11, the reference-state grid 40 has been deformed into the grid 40′, and the travel lane detection unit 35 identifies the lane of each following vehicle 301, etc. in real space according to the superimposed deformed grid 40′. In this case, as shown in FIG. 11, the travel lane detection unit 35 determines that the following vehicles 301 and 306 are traveling in the same lane L1 as the host vehicle 200.
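Once a following vehicle's lateral offset in metres is known from the (possibly deformed) grid, assigning it to a lane reduces to binning that offset by lane width. The sketch below assumes a uniform 3.5 m lane width and an offset measured from the centre of the host vehicle's lane; both are illustrative assumptions, not values from the patent:

```python
def identify_lane(lateral_offset_m, lane_width_m=3.5):
    """Classify a following vehicle's lane from its lateral offset
    (metres, measured from the centre of the host lane, positive
    toward the adjacent lanes). Returns 0 for the host lane L1,
    1 for the adjacent lane L2, 2 for the next-adjacent lane L3."""
    return int((lateral_offset_m + lane_width_m / 2) // lane_width_m)

print(identify_lane(0.8))  # 0: same lane as the host vehicle
print(identify_lane(3.4))  # 1: adjacent lane
print(identify_lane(7.2))  # 2: next-adjacent lane
```

The key point is that the offset is read through the curve-corrected grid 40′, so a vehicle on a curving lane is still binned into the correct lane rather than the one a straight-road assumption would suggest.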
The distance calculation unit 36 determines, based on the grid obtained through correction by the correction unit 33, the distance along the curve of the road 400 between the position of the host vehicle 200 and the position of the following vehicle 301, etc. in real space.

When the lane in which the following vehicle 301, etc. detected by the travel lane detection unit 35 is traveling is the lane L2 adjacent to the lane L1 in which the host vehicle 200 is traveling, and the distance calculated by the distance calculation unit 36 is decreasing over time, that is, when a following vehicle traveling in the adjacent lane L2 is approaching the host vehicle 200, the notification output unit 37 outputs to the monitor 90 a signal notifying of the approach of the following vehicle.
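The notification decision combines the two conditions just described: the vehicle must be in the adjacent lane and its along-the-curve distance must be shrinking. A minimal sketch of that logic, with the lane encoding and the strict monotone-decrease test as illustrative assumptions:

```python
def should_notify(lane_index, distances_m):
    """Decide whether to alert the driver about a following vehicle.
    lane_index: 0 = host lane, 1 = adjacent lane, 2 = next-adjacent lane.
    distances_m: along-the-curve distances to the vehicle, oldest first.
    Alert only for an adjacent-lane vehicle whose distance is shrinking."""
    if lane_index != 1:
        return False  # host-lane and next-adjacent vehicles are not reported
    closing = all(d1 > d2 for d1, d2 in zip(distances_m, distances_m[1:]))
    return closing

print(should_notify(1, [40.0, 34.0, 29.0]))  # True: adjacent lane, approaching
print(should_notify(1, [30.0, 33.0, 36.0]))  # False: falling behind
print(should_notify(2, [40.0, 34.0, 29.0]))  # False: next-adjacent lane
```

Filtering on both conditions is what lets the system warn only against an unsafe lane change, rather than flagging every detected vehicle.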
The monitor 90 visibly displays the image P1 obtained by the camera ECU 30 horizontally inverting the image P0 captured by the camera 10. At this time, when the camera ECU 30 is outputting a signal notifying of a following vehicle in the adjacent lane L2 that is approaching the host vehicle 200 at a higher travel speed, the monitor 90 superimposes on the image P1 a vehicle detection frame 350 surrounding the detected following vehicle 302, as shown for example in FIG. 8.
<Action>
Next, the operation of the electronic mirror system 1 of the present embodiment will be described. The camera 10 installed on the right side of the host vehicle 200 captures the right rear region of the host vehicle 200 at a predetermined cycle, and the captured images are input to the camera ECU 30. The camera ECU 30 converts each image periodically input from the camera 10 into a horizontally inverted image P1.
Based on the plurality of horizontally inverted images P1 periodically input to the camera ECU 30, the road state detection unit 31 detects the curve state of the road 400 on which the host vehicle 200 is traveling. The curve state is determined from the transition of the position, across images P1 captured at chronologically different timings, of the end point 411 of the specific white line 410 shown in FIG. 2 as it moves rearward with the travel of the host vehicle 200.

FIG. 12 schematically shows how the end point 411 moves in the image P1 from its initial position R1 through positions R2′, R3′, R4′, and R5′ as the host vehicle 200 travels. The road state detection unit 31 detects the positions R1, R2′, R3′, R4′, and R5′ of the end point 411 moving over time in the image P1, for example as positions in the XY orthogonal coordinate system of FIG. 12.

In FIG. 12, positions R1, R2, R3, R4, and R5 represent the positions to which the end point 411 is predicted to move, given the current vehicle speed of the host vehicle 200 (input from the CAN 20), in the reference state in which the road 400 is not curved. These predicted positions are computed from the vehicle speed of the host vehicle 200 input from the in-vehicle network 20 on the premise of a steering angle of 0 degrees, and in this reference state the predicted positions R1, R2, R3, R4, and R5 lie on a single straight line.

However, the positions R1, R2′, R3′, R4′, and R5′ detected by the road state detection unit 31 do not lie on a straight line: position R2′ deviates from position R2 in both the X-axis and Y-axis directions, and the subsequent positions R3′, R4′, and R5′ likewise deviate from the corresponding positions R3, R4, and R5 in the X-axis and Y-axis directions. These deviations in the X-axis and Y-axis directions indicate that the orientation of the host vehicle 200 is changing over time, that is, that the road 400 on which it is traveling is curved.
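The per-point deviations between the straight-road predictions R2..R5 and the observations R2′..R5′ are simple coordinate differences. A sketch with hypothetical pixel values (the numbers are invented for illustration):

```python
def endpoint_deviations(predicted, observed):
    """Deviation of each observed end-point position from its
    straight-road prediction. predicted/observed: chronological lists
    of (x, y) image positions, e.g. R2..R5 and R2'..R5'. Returns the
    per-point (dRx, dRy) pairs later used to deform the grid."""
    return [(xo - xp, yo - yp)
            for (xp, yp), (xo, yo) in zip(predicted, observed)]

# Hypothetical pixel positions: predictions on a straight line,
# observations drifting increasingly to one side as the road curves.
predicted = [(100, 50), (110, 60), (120, 70), (130, 80)]
observed  = [(103, 49), (117, 58), (133, 66), (151, 73)]
print(endpoint_deviations(predicted, observed))
# [(3, -1), (7, -2), (13, -4), (21, -7)]
```

Note how the deviations grow with each step; a straight road would yield deviations near zero throughout, which is exactly the distinction the road state detection unit 31 exploits.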
The correction unit 33 corrects the calibration grid 40 based on this curve state of the road 400 detected by the road state detection unit 31.

Specifically, the grid 40 shown in FIG. 4 is deformed based on the deviations ΔR2x and ΔR2y by which the position R2′ detected by the road state detection unit 31 deviates, in the X-axis and Y-axis directions respectively, from the position R2 predicted in the reference state (ΔR2x is noted beside the X-axis in FIG. 12 and ΔR2y beside the Y-axis; likewise below); the deviations ΔR3x and ΔR3y of position R3′ from the predicted position R3; the deviations ΔR4x and ΔR4y of position R4′ from the predicted position R4; and the deviations ΔR5x and ΔR5y of position R5′ from the predicted position R5.

The destinations of positions within the grid 40 other than R2, R3, R4, and R5 can be calculated using the destinations R2′, R3′, R4′, and R5′ of the positions R2, R3, R4, and R5 as representative points, whereby the grid 40 can be deformed to follow the curved road 400.
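One plausible way to propagate the representative-point deviations to the rest of the grid is to interpolate an offset for every grid row between the representative rows. The sketch below uses linear interpolation as a simple stand-in; the patent does not specify the interpolation scheme, so treat this as an assumption:

```python
def deform_grid(grid_px, rep_rows, deviations):
    """Deform the reference grid so it follows the detected curve.
    grid_px: nested list, grid_px[i][j] = (x, y) image position of a node.
    rep_rows: row indices of the representative points (e.g. R2..R5).
    deviations: measured (dx, dy) at each representative row.
    Rows between representatives receive linearly interpolated offsets."""
    pairs = list(zip(rep_rows, deviations))
    offsets = {}
    for (r0, (dx0, dy0)), (r1, (dx1, dy1)) in zip(pairs[:-1], pairs[1:]):
        for r in range(r0, r1 + 1):
            t = (r - r0) / (r1 - r0)
            offsets[r] = (dx0 + t * (dx1 - dx0), dy0 + t * (dy1 - dy0))
    return [[(x + offsets.get(i, (0, 0))[0], y + offsets.get(i, (0, 0))[1])
             for (x, y) in row]
            for i, row in enumerate(grid_px)]

grid = [[(0, 0), (5, 0)], [(0, 10), (5, 10)], [(0, 20), (5, 20)]]
warped = deform_grid(grid, rep_rows=[0, 2], deviations=[(0, 0), (4, 0)])
print(warped[1])  # middle row shifted halfway: [(2.0, 10.0), (7.0, 10.0)]
```

Shifting whole rows by interpolated offsets bends the grid sideways in the image, which is the qualitative effect shown by the grid 40′ in FIGS. 8 and 11.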
In this way, based on the image P1 of the camera 10 provided on the host vehicle 200, the positional relationship of a following vehicle captured in the image P1 with the host vehicle 200 along the curve of the road 400 in real space can be accurately determined.

Since the displacement of positions within the grid 40 caused by the curve of the road 400 is small immediately behind the host vehicle 200 and grows with distance from it, the correction amounts for deforming the grid 40 may be calculated only for the partial region P2 behind the host vehicle 200 shown in FIG. 12. This reduces the amount of computation required for the correction and suppresses the computational load. For a camera ECU 30 mounted on a moving body such as a vehicle, where resources are limited, suppressing the computational load is an important concern.
Meanwhile, the object recognition unit 34 detects following vehicles in the image P1, and the travel lane detection unit 35 identifies the lane in which each following vehicle is traveling based on the corrected grid 40′.

Similarly, the distance calculation unit 36 determines the distance along the curve of the road 400 from the host vehicle 200 to each following vehicle detected in the image P1, based on the corrected grid 40′.

In this way, the camera ECU 30 identifies the lane in which each detected following vehicle travels and the distance along the curve of the road 400 from the host vehicle 200 to each following vehicle. The notification output unit 37 then outputs a notification signal to the monitor 90 when, among the detected following vehicles, one traveling in the lane L2 adjacent to the lane L1 of the host vehicle 200 has a distance to the host vehicle along the curve of the road 400 that is shrinking over time, that is, when a following vehicle in the adjacent lane L2 approaching the host vehicle 200 is detected.
For the following vehicle that is the subject of the notification signal, traveling in the adjacent lane L2 and approaching the host vehicle, the monitor 90 superimposes on the image P1 a vehicle detection frame 350 surrounding that following vehicle 302, as shown in FIG. 8.

This can alert the driver of the host vehicle 200 viewing the monitor 90 not to change lanes into the adjacent lane L2.

By contrast, among the detected following vehicles, the notification output unit 37 outputs no notification signal for a following vehicle traveling in the host vehicle's own lane L1 or in the next-adjacent lane L3, even if that vehicle is approaching the host vehicle 200. Likewise, even for a following vehicle in the adjacent lane L2, the notification output unit 37 outputs no notification signal when that vehicle is not approaching the host vehicle 200.

With no notification signal output from the notification output unit 37, the monitor 90 displays the image P1 without adding a vehicle detection frame 350 calling the driver's attention to any following vehicle.

This avoids needlessly warning the driver of the host vehicle 200 viewing the monitor 90 against changing lanes into the adjacent lane L2, and can prevent or suppress frequent unnecessary notifications to the driver.
 As described above, according to the camera ECU 30 and the electronic mirror system 1 of the present embodiment, the positional relationship in real space along the road 400 between the host vehicle 200 and a following vehicle 301 or the like captured in the image P1 of the camera 10 mounted on the host vehicle 200 can be determined accurately.
 Although this embodiment has described only the positional relationship between the host vehicle 200 and following vehicles mainly to its right rear, captured by the camera 10 provided on the right side of the host vehicle 200, the same processing applies to the positional relationship between the host vehicle 200 and following vehicles mainly to its left rear, captured by the camera 10 provided on the left side of the host vehicle 200.
 Further, although this embodiment has described only the positional relationship between the host vehicle 200 and following vehicles behind it, the calibration device and electronic mirror system according to the present invention are equally applicable to the positional relationship between the host vehicle 200 and a preceding vehicle ahead of it, not only to following vehicles. In that case, a camera facing forward from the host vehicle is used.
 In the embodiment described above, the grid 40 is deformed to match the curvature of the road 400, and the position of a following vehicle in the image P1 is determined as a lane and a distance along the lanes represented by the deformed grid 40′. Alternatively, the image P1 may be transformed into an image in another coordinate system defined by the horizontal lines M1, M2, ... and the vertical lines N1, N2, ... of the deformed grid 40′, and the position of the following vehicle (its lane, its distance from the host vehicle 200, and so on) may be identified on that transformed image.
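The idea of reading a vehicle's lane and along-road distance off the deformed grid can be illustrated with a minimal sketch. Here the grid is simplified to a few sample points: each horizontal line M_i is an image row paired with an along-road distance, and each vertical line N_j is given by its x-position at those rows so that it can bend with the curve. Every name and number below is illustrative, not taken from the embodiment.

```python
# Deformed grid, simplified to sample points (all values illustrative).
M_ROWS = [(400, 5.0), (300, 10.0), (220, 20.0), (160, 40.0)]  # (image y, distance m)
N_COLS = {                 # lane-boundary x-positions at each of the rows above
    "N1": [100, 140, 170, 190],
    "N2": [250, 260, 265, 268],
    "N3": [400, 380, 360, 345],
}

def grid_coords(x, y):
    """Map an image point to (lane, along-road distance) via the deformed grid."""
    # Find the grid row band containing y (rows are ordered bottom-to-top).
    for i in range(len(M_ROWS) - 1):
        (y0, d0), (y1, d1) = M_ROWS[i], M_ROWS[i + 1]
        if y1 <= y <= y0:
            t = (y0 - y) / (y0 - y1)             # position within the band
            distance = d0 + t * (d1 - d0)        # interpolated along-road distance
            # Interpolate each boundary's x at this y, then find the enclosing pair.
            names = sorted(N_COLS)
            xs = [N_COLS[n][i] + t * (N_COLS[n][i + 1] - N_COLS[n][i]) for n in names]
            for j in range(len(xs) - 1):
                if xs[j] <= x < xs[j + 1]:
                    return f"lane {j + 1}", round(distance, 1)
            return None, round(distance, 1)      # outside the gridded lanes
    return None, None                            # outside the gridded distance range

print(grid_coords(200, 350))  # ('lane 1', 7.5)
```

Transforming the whole image into this (lane, distance) coordinate system, as the paragraph above suggests, amounts to applying such a mapping to every pixel rather than only to a detected vehicle's position.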
Cross-reference to related applications
 This application claims priority based on Japanese Patent Application No. 2018-109139 filed with the Japan Patent Office on June 7, 2018, the entire disclosure of which is fully incorporated herein by reference.

Claims (6)

  1.  A calibration device comprising:
     a road state detection unit that detects a state of a curve of a road on which a host vehicle is traveling, based on a plurality of images respectively captured by a camera mounted on the host vehicle at a plurality of time points different in time series;
     a storage unit that stores, for a reference state in which the road is assumed to be straight, a correspondence relationship between a position on the image of a portion appearing in the image and a position along the road in real space; and
     a correction unit that corrects the correspondence relationship stored in the storage unit based on the state of the curve of the road detected by the road state detection unit.
  2.  The calibration device according to claim 1, wherein the road state detection unit detects the state of the curve of the road based on positions of end points of lines dividing lanes in the plurality of images respectively captured at the plurality of time points different in time series.
  3.  The calibration device according to claim 1 or 2, further comprising:
     an object recognition unit that detects another vehicle appearing in the image; and
     a distance calculation unit that identifies, based on the correspondence relationship corrected by the correction unit, the position along the road in real space of the other vehicle recognized by the object recognition unit, and calculates a distance along the road between the host vehicle and the other vehicle in real space.
  4.  The calibration device according to any one of claims 1 to 3, further comprising:
     an object recognition unit that detects another vehicle appearing in the image; and
     a travel lane detection unit that identifies, based on the correspondence relationship corrected by the correction unit, the position along the road in real space of the other vehicle recognized by the object recognition unit, and identifies a lane in which the other vehicle is traveling in real space.
  5.  The calibration device according to claim 3, further comprising:
     a travel lane detection unit that identifies, based on the correspondence relationship corrected by the correction unit, the lane in which the other vehicle is traveling in real space; and
     a notification output unit that outputs a signal notifying of the approach of the other vehicle when the lane in which the other vehicle detected by the travel lane detection unit is traveling is adjacent to the lane in which the host vehicle is traveling, and the distance calculated by the distance calculation unit is becoming shorter in time series.
  6.  An electronic mirror system comprising:
     a camera, mounted on a host vehicle and provided at a location where a reflecting mirror would otherwise be provided, that captures images of the exterior of the host vehicle;
     an image display device, provided in the cabin of the host vehicle, that displays an image captured by the camera, reversed left to right as if viewed in the reflecting mirror; and
     the calibration device according to claim 5,
     wherein the image display device, in response to the signal output from the notification output unit, displays the approaching other vehicle with emphasis.
PCT/JP2019/008266 2018-06-07 2019-03-04 Calibration device and electronic mirror system WO2019235004A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018109139A JP7226930B2 (en) 2018-06-07 2018-06-07 Calibration device and electronic mirror system
JP2018-109139 2018-06-07

Publications (1)

Publication Number Publication Date
WO2019235004A1 true WO2019235004A1 (en) 2019-12-12

Family

ID=68770232

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/008266 WO2019235004A1 (en) 2018-06-07 2019-03-04 Calibration device and electronic mirror system

Country Status (2)

Country Link
JP (1) JP7226930B2 (en)
WO (1) WO2019235004A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113911035A (en) * 2021-11-08 2022-01-11 广州优创电子有限公司 Intelligent rearview mirror and intelligent rearview mirror control method

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7304334B2 (en) 2020-12-03 2023-07-06 本田技研工業株式会社 VEHICLE CONTROL DEVICE, VEHICLE CONTROL METHOD, AND PROGRAM

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07223487A (en) * 1994-02-14 1995-08-22 Mitsubishi Motors Corp Situation display device for vehicle
JP2007164636A (en) * 2005-12-15 2007-06-28 Toyota Motor Corp Lane line detection device
WO2012147187A1 (en) * 2011-04-27 2012-11-01 トヨタ自動車株式会社 Periphery vehicle detection device
JP2015028696A (en) * 2013-07-30 2015-02-12 アルパイン株式会社 Vehicle rear side alarm device, vehicle rear side alarm method and other vehicles distance detection device

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4638369B2 (en) 2006-03-28 2011-02-23 富士重工業株式会社 Lane position detector

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113911035A (en) * 2021-11-08 2022-01-11 广州优创电子有限公司 Intelligent rearview mirror and intelligent rearview mirror control method
CN113911035B (en) * 2021-11-08 2023-08-25 广州优创电子有限公司 Intelligent rearview mirror and intelligent rearview mirror control method

Also Published As

Publication number Publication date
JP7226930B2 (en) 2023-02-21
JP2019213108A (en) 2019-12-12

Similar Documents

Publication Publication Date Title
JP5172314B2 (en) Stereo camera device
US11210533B1 (en) Method of predicting trajectory of vehicle
JP5229451B2 (en) Automobile lane departure prevention method
US20140043466A1 (en) Environment image display apparatus for transport machine
US10099617B2 (en) Driving assistance device and driving assistance method
JP2009085651A (en) Image processing system
JP6471522B2 (en) Camera parameter adjustment device
TWI533694B (en) Obstacle detection and display system for vehicle
JP6778620B2 (en) Road marking device, road marking system, and road marking method
WO2015129280A1 (en) Image processing device and image processing method
WO2019235004A1 (en) Calibration device and electronic mirror system
CN103381825B (en) Use the full speed lane sensing of multiple photographic camera
JP2012176656A (en) Parking support device
JP2018081363A (en) Driving support device
JP6407596B2 (en) Image processing apparatus and driving support system
US12083959B2 (en) Image control system
JP6489645B2 (en) Image processing device
US20230242137A1 (en) Notification device and notification method
JP5831331B2 (en) Rear side photographing device for vehicle
JP6439233B2 (en) Image display apparatus for vehicle and image processing method
JP6032141B2 (en) Travel road marking detection device and travel road marking detection method
CN115943287A (en) Vehicle attitude estimation system and vehicle attitude estimation method
JP2012178639A (en) Image processing device, parking control system, and image processing method
JP6586972B2 (en) Image display apparatus for vehicle and image processing method
US20230222813A1 (en) Road surface marking detection device, notification system provided with the same, and road surface marking detection

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 19815915

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 EP: PCT application non-entry in European phase

Ref document number: 19815915

Country of ref document: EP

Kind code of ref document: A1