WO2022044294A1 - In-vehicle device and method for calibrating in-vehicle camera - Google Patents

In-vehicle device and method for calibrating in-vehicle camera Download PDF

Info

Publication number
WO2022044294A1
WO2022044294A1 (PCT/JP2020/032757)
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
calibration
camera
marker
mobile terminal
Prior art date
Application number
PCT/JP2020/032757
Other languages
French (fr)
Japanese (ja)
Inventor
隆幸 小笹
康司 大西
直士 垣田
輝彦 上林
修久 池田
Original Assignee
株式会社デンソーテン
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社デンソーテン
Priority to PCT/JP2020/032757 (WO2022044294A1)
Priority to JP2022545229A (JP7449393B2)
Publication of WO2022044294A1

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00: Measuring arrangements characterised by the use of optical techniques
    • G01B11/26: Measuring arrangements characterised by the use of optical techniques for measuring angles or tapers; for testing the alignment of axes
    • G: PHYSICS
    • G03: PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B: APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00: Special procedures for taking photographs; Apparatus therefor
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00: Television systems
    • H04N7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • The embodiments of the disclosure relate to an in-vehicle device and a method for calibrating an in-vehicle camera.
  • One aspect of the embodiments has as its object to provide an in-vehicle device and an in-vehicle camera calibration method that can reduce the labor required for calibration.
  • The in-vehicle device according to one aspect of the embodiments includes a calculation unit, an estimation unit, and an execution unit.
  • The calculation unit calculates the posture of a marker that is displayed on a mobile terminal and captured by the vehicle-mounted camera.
  • The estimation unit estimates the mounting angle of the vehicle-mounted camera from the calculation result calculated by the calculation unit.
  • The execution unit calibrates the vehicle-mounted camera based on the estimation result estimated by the estimation unit.
  • According to one aspect of the embodiments, the labor required for calibration can be reduced.
  • FIG. 1 is an explanatory diagram (No. 1) of a calibration method for an in-vehicle camera according to a first comparative example.
  • FIG. 2 is an explanatory diagram (No. 2) of the calibration method of the vehicle-mounted camera according to the first comparative example.
  • FIG. 3 is an explanatory diagram of a calibration method for an in-vehicle camera according to a second comparative example.
  • FIG. 4 is a schematic explanatory view of the calibration method of the vehicle-mounted camera according to the embodiment.
  • FIG. 5 is a block diagram showing a configuration example of the in-vehicle system according to the embodiment.
  • FIG. 6 is a block diagram showing a configuration example of the mobile terminal according to the embodiment.
  • FIG. 7 is an explanatory diagram in the three-axis direction relating to the mobile terminal.
  • FIG. 8 is a diagram showing a display example of the marker M related to the TILT direction.
  • FIG. 9 is a diagram showing a display example of the marker M related to the ROLL direction.
  • FIG. 10 is a diagram showing a display example of the marker M in the PAN direction.
  • FIG. 11 is a diagram showing a modified example of the display of the marker M in the ROLL direction.
  • FIG. 12 is a diagram showing a modified example of the display of the marker M in the TILT direction.
  • FIG. 13 is an explanatory diagram (No. 1) of the camera mounting angle estimation process.
  • FIG. 14 is an explanatory diagram (No. 2) of the camera mounting angle estimation process.
  • FIG. 15 is an explanatory diagram (No. 3) of the camera mounting angle estimation process.
  • FIG. 16 is a flowchart showing a processing procedure executed by the drive recorder according to the embodiment.
  • In the following, the case where the vehicle-mounted device according to the embodiment is the drive recorder 10 and the vehicle-mounted camera is the camera 11 mounted on the drive recorder 10 will be described as an example.
  • FIG. 1 is an explanatory diagram (No. 1) of the calibration method of the camera 11 according to the first comparative example.
  • FIG. 2 is an explanatory diagram (No. 2) of the calibration method of the camera 11 according to the first comparative example.
  • FIG. 3 is an explanatory diagram of a calibration method of the camera 11 according to the second comparative example.
  • In the calibration method of the camera 11 according to the first comparative example, calibration is performed using a structure ST provided at a predetermined position in front of the vehicle V.
  • The structure ST is assembled, for example, from a plurality of poles and joints and placed at the predetermined position in front of the vehicle V.
  • The worker then launches a dedicated application on a mobile terminal 20 such as a smartphone and, while designating positions on the structure ST shown in the captured image of the camera 11 displayed on the mobile terminal 20, detects coordinates P1, P2, and P3, based on which calibration is performed.
  • The reference numeral "WP" in FIG. 2 corresponds to the tire positions of the vehicle V.
  • In the calibration method of the camera 11 according to the embodiment, by contrast, the posture of the marker M displayed on the mobile terminal 20 and captured by the camera 11 is calculated, the mounting angle of the camera 11 is estimated from the calculation result, and calibration is performed based on the estimation result.
  • FIG. 4 is a schematic explanatory diagram of the calibration method of the camera 11 according to the embodiment. Specifically, as shown in FIG. 4, in the calibration method of the camera 11 according to the embodiment, the worker first launches the dedicated application on the mobile terminal 20 that the worker holds, causing the marker M to be displayed on the mobile terminal 20.
  • The marker M is a quadrangle having two sets of parallel sides. That is, the marker M may be a square, a rectangle, or a parallelogram. In this embodiment, the marker M is assumed to be a square. The mobile terminal 20 displaying the marker M may be located at an arbitrary position around the vehicle V.
  • The drive recorder 10 acquires the image of the mobile terminal 20 captured by the camera 11 and extracts the marker M from the captured image. The drive recorder 10 then calculates the posture of the extracted marker M. That is, in the calibration method of the camera 11 according to the embodiment, first, the posture of the marker M displayed on the mobile terminal 20 is calculated (step S1).
  • Next, the drive recorder 10 estimates the mounting angle of the camera 11 from the calculation result (step S2).
  • At this time, a known algorithm used in AR (Augmented Reality) technology can be employed: two sets of parallel lines are extracted from the feature points of an AR marker, and the camera angle is estimated based on these parallel lines.
  • The estimation process using such an algorithm will be described later with reference to FIGS. 13 to 15.
  • Then, the drive recorder 10 performs calibration based on the estimation result (step S3).
  • As a result, the camera 11 can be calibrated without providing the structure ST required by the first comparative example and without the limited conditions required by the second comparative example.
  • To ensure calibration accuracy, it is preferable that the marker M be displayed only when the angle of the mobile terminal 20 in the TILT direction or the ROLL direction is within a predetermined angle with respect to the horizontal plane. This point will be described later with reference to FIGS. 7 to 9, 11, and 12.
  • Further, since the angle of the mobile terminal 20 in the PAN direction cannot be accurately aligned with respect to the vehicle V, relocation of the mobile terminal 20 may be requested when it is determined, based on the side length ratio of the extracted marker M, that the marker M is not suitable for calibration. This point will be described later with reference to FIG. 10.
  • As described above, in the calibration method of the camera 11 according to the embodiment, the posture of the marker M displayed on the mobile terminal 20 and captured by the camera 11 is calculated, the mounting angle of the camera 11 is estimated from the calculation result, and calibration is performed based on the estimation result.
  • Therefore, according to the calibration method of the camera 11 of the embodiment, the labor required for calibration can be reduced.
  • FIG. 5 is a block diagram showing a configuration example of the in-vehicle system 1 according to the embodiment. Further, FIG. 6 is a block diagram showing a configuration example of the mobile terminal 20 according to the embodiment. Note that FIGS. 5 and 6 show only the components necessary for explaining the features of the present embodiment, and the description of general components is omitted.
  • Each component shown in FIGS. 5 and 6 is a functional concept and does not necessarily have to be physically configured as illustrated.
  • The specific form of distribution and integration of the blocks is not limited to that shown in the figures; all or part of them may be functionally or physically distributed or integrated in arbitrary units according to various loads and usage conditions.
  • As shown in FIG. 5, the in-vehicle system 1 includes the drive recorder 10 and the mobile terminal 20.
  • First, a configuration example of the mobile terminal 20 will be described with reference to FIG. 6.
  • The mobile terminal 20 is a terminal device carried by the calibration worker, and is, for example, an information processing terminal such as a smartphone, a tablet terminal, or a notebook PC (Personal Computer).
  • The mobile terminal 20 includes a gyro sensor 21, a display unit 22, a storage unit 23, and a control unit 24.
  • The gyro sensor 21 outputs signals indicating the angles of the mobile terminal 20 about three axes to the control unit 24. FIG. 7 is an explanatory diagram of the three axes of the mobile terminal 20.
  • As shown in FIG. 7, a three-axis Cartesian coordinate system is assumed in which the front direction of the mobile terminal 20 is the positive direction of the X axis, the width direction is the Y axis direction, and the height direction is the Z axis direction.
  • According to the posture of the mobile terminal 20, the gyro sensor 21 outputs to the control unit 24 signals indicating the angles in the ROLL direction about the X axis, the TILT direction about the Y axis, and the PAN direction about the Z axis in this coordinate system.
  • The display unit 22 is realized by, for example, a touch panel display, and displays the marker M under the control of the control unit 24.
  • The storage unit 23 is realized by, for example, a semiconductor memory element such as a RAM (Random Access Memory) or a flash memory, or a storage device such as a hard disk or an optical disk; in the example of FIG. 6, it stores application information 23a.
  • The application information 23a includes the program of the dedicated application that realizes the functions of the mobile terminal 20 in the calibration method according to the embodiment.
  • The control unit 24 is a controller realized, for example, by a CPU (Central Processing Unit) or an MPU (Micro Processing Unit) executing various programs stored in a storage device inside the mobile terminal 20, using a RAM as a work area. The control unit 24 can also be realized by an integrated circuit such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field Programmable Gate Array).
  • The control unit 24 has an application execution unit 24a, and realizes or executes the information processing functions and operations described below.
  • The application execution unit 24a reads the program of the dedicated application from the application information 23a and executes it. The application execution unit 24a also causes the display unit 22 to display the marker M based on the angles about the three axes acquired from the gyro sensor 21.
  • FIG. 8 is a diagram showing a display example of the marker M related to the TILT direction.
  • FIG. 9 is a diagram showing a display example of the marker M related to the ROLL direction.
  • FIG. 10 is a diagram showing a display example of the marker M related to the PAN direction.
  • FIG. 11 is a diagram showing a modified example of the display of the marker M related to the ROLL direction.
  • FIG. 12 is a diagram showing a modified example of the display of the marker M related to the TILT direction.
  • As shown in FIG. 8, in the TILT direction, the application execution unit 24a displays the marker M on the display unit 22 of the mobile terminal 20 if the angle of the mobile terminal 20 is within a predetermined angle θ1 with respect to the XY plane, which is the horizontal plane (see "Marker display" in the figure).
  • On the other hand, the application execution unit 24a does not display the marker M on the display unit 22 if the angle of the mobile terminal 20 in the TILT direction exceeds the predetermined angle θ1 (see "Marker non-display" in the figure).
  • Likewise, as shown in FIG. 9, in the ROLL direction, the application execution unit 24a displays the marker M on the display unit 22 of the mobile terminal 20 if the angle of the mobile terminal 20 is within a predetermined angle θ2 with respect to the XY plane, which is the horizontal plane (see "Marker display" in the figure).
  • On the other hand, the application execution unit 24a does not display the marker M on the display unit 22 if the angle of the mobile terminal 20 in the ROLL direction exceeds the predetermined angle θ2 (see "Marker non-display" in the figure).
  • In the present embodiment, calibration is performed based on the marker M, which is displayed on the display unit 22 only when the mobile terminal 20 is within the predetermined angles θ1 and θ2 at least in the TILT direction and the ROLL direction.
  • This is because, for the angle in the PAN direction, there is no reference plane corresponding to the horizontal plane used for the TILT and ROLL directions; that is, as described above, the PAN angle cannot be accurately aligned with respect to the vehicle V. In that sense, the calibration method according to the present embodiment can be said to be a simplified calibration method. A minimal sketch of the display gating follows.
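  • The following is a minimal sketch, in Python, of how this gating of the marker display could be implemented. The function name and the threshold values are illustrative assumptions; the disclosure only specifies "predetermined angles" θ1 and θ2:

        # Illustrative thresholds for the predetermined angles θ1 and θ2.
        THETA1_DEG = 5.0  # allowed TILT deviation from the horizontal plane
        THETA2_DEG = 5.0  # allowed ROLL deviation from the horizontal plane

        def should_display_marker(tilt_deg: float, roll_deg: float) -> bool:
            """Return True if the terminal posture permits showing the marker M.

            tilt_deg and roll_deg are the angles of the terminal relative to
            the horizontal XY plane (TILT about the Y axis, ROLL about the
            X axis), as output by the gyro sensor 21.
            """
            return abs(tilt_deg) <= THETA1_DEG and abs(roll_deg) <= THETA2_DEG

        print(should_display_marker(tilt_deg=2.3, roll_deg=-1.1))  # True: show marker
        print(should_display_marker(tilt_deg=8.0, roll_deg=0.5))   # False: hide marker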
  • For the PAN direction, however, the angle can be roughly aligned based on, for example, the side length ratio of the marker M extracted on the drive recorder 10 side. Specifically, when the marker M is a square as described above, as shown in FIG. 10, the marker M can be regarded as usable for calibration if the side length ratio of the extracted marker M is within a predetermined range, that is, if the four side lengths do not differ significantly.
  • On the other hand, if the side length ratio of the extracted marker M is out of the predetermined range, that is, if the four side lengths differ significantly, the marker M is regarded as unusable for calibration. In such a case, the drive recorder 10 requests relocation of the mobile terminal 20 with respect to the PAN direction.
  • As described with reference to FIGS. 8 to 10, calibration accuracy can be ensured by displaying the marker M only when the angles in the TILT and ROLL directions are within the predetermined angles θ1 and θ2, and by requesting relocation as needed for the PAN direction based on the side length ratio of the marker M, as sketched below.
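  • A minimal Python sketch of the side-length-ratio check described above. The corner representation and the bound standing in for the "predetermined range" are illustrative assumptions:

        import math

        def side_lengths(corners):
            """Lengths of the four sides of a quadrilateral given as four
            (x, y) corners in order, e.g. the contour of the extracted marker M."""
            return [math.dist(corners[i], corners[(i + 1) % 4]) for i in range(4)]

        def usable_for_calibration(corners, max_ratio=1.2):
            """A square marker viewed nearly head-on in PAN keeps its four sides
            similar in length; max_ratio is an assumed bound for that range."""
            lengths = side_lengths(corners)
            return max(lengths) / min(lengths) <= max_ratio

        # A nearly square detection passes; a strongly foreshortened one does not,
        # and would trigger a request to relocate the mobile terminal 20.
        print(usable_for_calibration([(0, 0), (100, 2), (101, 99), (1, 101)]))  # True
        print(usable_for_calibration([(0, 0), (60, 10), (70, 90), (5, 100)]))   # False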
  • In the display examples so far, for example as shown in FIG. 9, the inclination of the marker M changes to follow changes in the inclination of the mobile terminal 20. Alternatively, the marker M may be displayed so as to always remain horizontal or of the same size with respect to the vehicle V, independently of the inclination of the mobile terminal 20.
  • That is, as shown in FIG. 11, the application execution unit 24a may display the marker M on the display unit 22 of the mobile terminal 20 so that the marker M always remains horizontal with respect to the vehicle V even if the angle in the ROLL direction changes.
  • Further, as shown in FIG. 12, the application execution unit 24a may display the marker M on the display unit 22 of the mobile terminal 20 so that the marker M always has the same size with respect to the vehicle V even if the angle in the TILT direction changes.
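  • One way to realize such a compensated display is to counter-rotate the drawn marker by the ROLL angle and to enlarge it as the screen tips away in TILT. The following Python sketch illustrates this; both compensation formulas are assumptions for illustration, not taken from the disclosure:

        import math

        def compensated_marker_transform(roll_deg: float, tilt_deg: float,
                                         base_size_px: float = 400.0):
            """Rotation and size to use when drawing the marker M so that it
            stays horizontal (FIG. 11) and keeps the same apparent size
            (FIG. 12) with respect to the vehicle V."""
            rotation_deg = -roll_deg                     # counter-rotate against ROLL
            scale = 1.0 / max(math.cos(math.radians(tilt_deg)), 0.5)  # clamped
            return rotation_deg, base_size_px * scale

        print(compensated_marker_transform(roll_deg=10.0, tilt_deg=0.0))  # (-10.0, 400.0)
        print(compensated_marker_transform(roll_deg=0.0, tilt_deg=20.0))  # (0.0, ~425.7)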
  • Returning to FIG. 5, the configuration of the drive recorder 10 will now be described. The drive recorder 10 includes the camera 11, a notification device 12, a storage unit 13, and a control unit 14.
  • The camera 11 includes an image sensor such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor, and images a predetermined imaging range using the sensor.
  • The notification device 12 is a device that notifies information about calibration, and is realized by, for example, a display or a speaker.
  • The storage unit 13 is realized by, for example, a semiconductor memory element such as a RAM or flash memory, or a storage device such as a hard disk or an optical disk.
  • In the example of FIG. 5, the storage unit 13 stores design information 13a and internal parameter information 13b.
  • The design information 13a is information including design values related to the camera 11, for example, design values for the mounting angle of the camera 11.
  • The internal parameter information 13b is information about the internal parameters of the camera 11; when calibration is performed, the result is reflected in this information.
  • Like the control unit 24 described above, the control unit 14 is a controller realized, for example, by a CPU or MPU executing various programs stored in a storage device inside the drive recorder 10, using a RAM as a work area. The control unit 14 can also be realized by an integrated circuit such as an ASIC or FPGA.
  • The control unit 14 has an acquisition unit 14a, a calculation unit 14b, an estimation unit 14c, and a calibration execution unit 14d, and realizes or executes the information processing functions and operations described below.
  • The acquisition unit 14a acquires the image of the mobile terminal 20 captured by the camera 11.
  • The calculation unit 14b extracts the marker M from the captured image acquired by the acquisition unit 14a and calculates the posture of the extracted marker M.
  • The estimation unit 14c estimates the mounting angle of the camera 11 based on the posture of the marker M calculated by the calculation unit 14b.
  • FIG. 13 is an explanatory diagram (No. 1) of the estimation process of the mounting angle of the camera 11.
  • FIG. 14 is an explanatory diagram (No. 2) of the estimation process of the mounting angle of the camera 11.
  • FIG. 15 is an explanatory diagram (No. 3) of the estimation process of the mounting angle of the camera 11.
  • First, the calculation unit 14b binarizes, for example, the captured image acquired by the acquisition unit 14a and extracts the contour of the marker M from the binarized region. Then, as shown in FIG. 13, the calculation unit 14b projects the quadrangle corresponding to the extracted contour onto the projection surface IMG of the camera 11 in three-dimensional space as the quadrangle QL1, using the internal parameter information 13b.
  • Here, as shown in FIG. 13, the sides of the quadrangle QL1 are defined as the first side SD1, the second side SD2, the third side SD3, and the fourth side SD4.
  • Next, the calculation unit 14b generates the quadrangle QL2 by connecting each of these sides with the optical center OC of the camera 11.
  • The quadrangle QL2 is the quadrangle corresponding to the marker M in three-dimensional real-world space.
  • Similarly, the sides of the quadrangle QL2 are defined as the first side SD11, the second side SD12, the third side SD13, and the fourth side SD14. Then, as shown in FIG. 13, four surfaces are generated as follows.
  • The first surface F1 is a surface including the optical center OC, the first side SD1 of the quadrangle QL1, and the first side SD11 of the quadrangle QL2.
  • The second surface F2 is a surface including the optical center OC, the second side SD2 of the quadrangle QL1, and the second side SD12 of the quadrangle QL2.
  • The third surface F3 is a surface including the optical center OC, the third side SD3 of the quadrangle QL1, and the third side SD13 of the quadrangle QL2.
  • The fourth surface F4 is a surface including the optical center OC, the fourth side SD4 of the quadrangle QL1, and the fourth side SD14 of the quadrangle QL2.
  • Next, the calculation unit 14b identifies two pairs of surfaces whose lines of intersection with a predetermined plane are parallel to each other.
  • Here, the predetermined plane is a plane whose normal direction is known in advance, for example, the road surface.
  • Specifically, the calculation unit 14b identifies, as the two pairs of surfaces whose lines of intersection with the road surface are parallel, the pair of the second surface F2 and the fourth surface F4 and the pair of the first surface F1 and the third surface F3. The calculation unit 14b further assumes that the quadrangle QL2 is a parallelogram formed on the road surface.
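  • Because each surface F1 to F4 contains the optical center OC and one side of the quadrangle QL1, its normal can be computed as the cross product of the two viewing rays from OC through that side's endpoints. A minimal Python sketch follows; the pinhole intrinsics fx, fy, cx, cy stand in for the internal parameter information 13b, and the function name is an illustrative assumption:

        import numpy as np

        def surface_normals(corners_img, fx, fy, cx, cy):
            """Unit normals of the four surfaces F1 to F4 of FIG. 13.

            corners_img: the four pixel corners of the quadrangle QL1, in order.
            Each surface contains the optical center OC and one side of QL1,
            so its normal is the cross product of the viewing rays through
            that side's two endpoints.
            """
            # Back-project each pixel to a viewing ray from OC (camera coordinates).
            rays = [np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
                    for u, v in corners_img]
            normals = []
            for i in range(4):
                n = np.cross(rays[i], rays[(i + 1) % 4])
                normals.append(n / np.linalg.norm(n))
            return normals  # [n_F1, n_F2, n_F3, n_F4]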
  • Based on these, the estimation unit 14c calculates the normal of the road surface.
  • First, the estimation unit 14c obtains the direction of the line of intersection for one of the pairs of surfaces identified by the calculation unit 14b, namely the first surface F1 and the third surface F3. Specifically, as shown in FIG. 14, the estimation unit 14c obtains the direction of the line of intersection CL1 between the first surface F1 and the third surface F3.
  • Here, the direction vector V1 of the line of intersection CL1 is perpendicular to both the normal vector of the first surface F1 and the normal vector of the third surface F3. Therefore, the estimation unit 14c obtains the direction vector V1 of the line of intersection CL1 as the cross product of the normal vectors of the first surface F1 and the third surface F3. Since the line of intersection of the first surface F1 and the third surface F3 is parallel to the road surface, the direction vector V1 is parallel to the road surface.
  • Next, the estimation unit 14c obtains the direction of the line of intersection for the other pair of surfaces identified by the calculation unit 14b, namely the second surface F2 and the fourth surface F4. Specifically, as shown in FIG. 15, the estimation unit 14c obtains the direction of the line of intersection CL2 between the second surface F2 and the fourth surface F4.
  • The direction vector V2 of the line of intersection CL2 is perpendicular to both the normal vector of the second surface F2 and the normal vector of the fourth surface F4. Therefore, the estimation unit 14c obtains the direction vector V2 of the line of intersection CL2 as the cross product of the normal vectors of the second surface F2 and the fourth surface F4.
  • Since the lines of intersection of the second surface F2 and the fourth surface F4 with the road surface are parallel, the direction vector V2 is also parallel to the road surface.
  • Then, the estimation unit 14c calculates the normal of the plane of the quadrangle QL2, that is, the normal of the road surface, as the cross product of the direction vector V1 and the direction vector V2.
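  • Writing n_F1 to n_F4 for the normal vectors of the four surfaces (notation introduced here for illustration, not used in the disclosure), the chain of cross products described above can be summarized in LaTeX form as:

        \vec{v}_1 = \vec{n}_{F1} \times \vec{n}_{F3}, \qquad
        \vec{v}_2 = \vec{n}_{F2} \times \vec{n}_{F4}, \qquad
        \vec{n}_{\mathrm{road}} = \vec{v}_1 \times \vec{v}_2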
  • Since the normal of the road surface calculated by the estimation unit 14c is expressed in the camera coordinate system of the camera 11, the posture of the camera 11 with respect to the road surface can be obtained from the difference between this normal and the actual normal direction of the road surface in the three-dimensional world coordinate system.
  • Thereby, the posture of the camera 11 with respect to the road surface can be estimated.
  • In this way, the estimation unit 14c estimates the mounting angle of the camera 11 with respect to the vehicle V.
  • The estimation process described with reference to FIGS. 13 to 15 can be executed using, for example, the known "ARToolKit" algorithm from AR technology.
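  • Putting the steps of FIGS. 13 to 15 together, the following Python sketch (reusing surface_normals from the sketch above) derives the road normal in camera coordinates and then TILT and ROLL angles of the camera. The camera-frame convention (X right, Y down, Z forward) and the angle sign conventions are illustrative assumptions; a production implementation would follow the referenced ARToolKit-style algorithm:

        import math
        import numpy as np

        # surface_normals is defined in the earlier sketch.

        def estimate_camera_posture(corners_img, fx, fy, cx, cy):
            """Estimate TILT and ROLL of the camera 11 relative to the road."""
            n_f1, n_f2, n_f3, n_f4 = surface_normals(corners_img, fx, fy, cx, cy)
            v1 = np.cross(n_f1, n_f3)    # direction vector V1 of line CL1
            v2 = np.cross(n_f2, n_f4)    # direction vector V2 of line CL2
            n_road = np.cross(v1, v2)    # normal of quadrangle QL2, i.e. the road
            n_road = n_road / np.linalg.norm(n_road)
            if n_road[1] > 0:            # orient the normal upward (-Y is "up"
                n_road = -n_road         # in the assumed camera frame)
            nx, ny, nz = n_road
            tilt_deg = math.degrees(math.atan2(nz, -ny))  # fore/aft lean
            roll_deg = math.degrees(math.atan2(nx, -ny))  # sideways lean
            return tilt_deg, roll_deg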
  • The calibration execution unit 14d performs calibration based on the estimation result of the estimation unit 14c. Specifically, the calibration execution unit 14d compares the mounting angle of the camera 11 estimated by the estimation unit 14c with the design information 13a, and adjusts the internal parameter information 13b accordingly.
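  • A minimal sketch of this comparison step; the dictionary keys, the tolerance, and the return shape are illustrative assumptions, not the disclosed interface:

        def execute_calibration(estimated, design, tolerance_deg=3.0):
            """Compare estimated mounting angles (degrees) against the design
            values (design information 13a) and produce the correction to be
            reflected in the internal parameter information 13b."""
            correction = {axis: design[axis] - estimated[axis] for axis in design}
            within_tolerance = all(abs(c) <= tolerance_deg
                                   for c in correction.values())
            return correction, within_tolerance

        correction, ok = execute_calibration(
            estimated={"tilt": 1.8, "roll": -0.4},
            design={"tilt": 0.0, "roll": 0.0},
        )
        print(correction, ok)  # {'tilt': -1.8, 'roll': 0.4} True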
  • Further, the calibration execution unit 14d notifies the worker of the calibration result via the notification device 12. The worker adjusts the mounting angle of the camera 11 based on the content of the notification.
  • Alternatively, when the camera 11 is equipped with an aiming mechanism, the calibration execution unit 14d may cause the aiming mechanism to adjust the mounting angle of the camera 11 based on the calibration result.
  • The calibration execution unit 14d may also judge the situation at the time of calibration and give a notification via the notification device 12 according to the judgment result. For example, when it is determined that strong reflection of sunlight in the acquired captured image makes the image unsuitable for extracting the marker M, the calibration execution unit 14d may notify guidance information such as an instruction to change the orientation of the mobile terminal 20 or an instruction to move the vehicle V.
  • FIG. 16 is a flowchart showing a processing procedure executed by the drive recorder 10 according to the embodiment.
  • As shown in FIG. 16, first, the calculation unit 14b calculates the posture of the marker M displayed on the mobile terminal 20 and captured by the camera 11 (step S101).
  • Next, the estimation unit 14c estimates the mounting angle of the camera 11 from the calculation result of the calculation unit 14b (step S102).
  • Then, the calibration execution unit 14d executes the calibration of the camera 11 based on the estimation result of the estimation unit 14c (step S103), and the process ends.
  • As described above, the drive recorder 10 (corresponding to an example of the "vehicle-mounted device") according to the embodiment includes the calculation unit 14b, the estimation unit 14c, and the calibration execution unit 14d (corresponding to an example of the "execution unit").
  • The calculation unit 14b calculates the posture of the marker M displayed on the mobile terminal 20 and captured by the camera 11 (corresponding to an example of the "vehicle-mounted camera").
  • The estimation unit 14c estimates the mounting angle of the camera 11 from the calculation result calculated by the calculation unit 14b.
  • The calibration execution unit 14d executes the calibration of the camera 11 based on the estimation result estimated by the estimation unit 14c.
  • Further, the calibration execution unit 14d executes calibration based on the marker M displayed on the mobile terminal 20 arranged at an arbitrary position.
  • Further, the calibration execution unit 14d executes calibration based on the marker M displayed according to the posture of the mobile terminal 20.
  • Therefore, according to the drive recorder 10, it is possible to reduce the labor of calibration while ensuring calibration accuracy.
  • Further, the calibration execution unit 14d executes calibration based on the marker M displayed when the mobile terminal 20 is within a predetermined angle with respect to the horizontal plane in the ROLL direction.
  • Therefore, according to the drive recorder 10, it is possible to reduce the labor of calibration while ensuring calibration accuracy in the ROLL direction.
  • Further, the calibration execution unit 14d executes calibration based on the marker M displayed when the mobile terminal 20 is within a predetermined angle with respect to the horizontal plane in the TILT direction.
  • Therefore, according to the drive recorder 10, it is possible to reduce the labor of calibration while ensuring calibration accuracy in the TILT direction.
  • Further, the calibration execution unit 14d requests relocation of the mobile terminal 20 with respect to the PAN direction when the side length ratio of the marker M is out of a predetermined range.
  • Therefore, according to the drive recorder 10, it is possible to reduce the labor of calibration while ensuring calibration accuracy in the PAN direction.
  • Further, the calibration execution unit 14d executes calibration based on the marker M displayed so as to always be horizontal or of the same size with respect to the vehicle V.
  • Therefore, according to the drive recorder 10, it is possible to reduce the labor of calibration without the calibration accuracy deteriorating depending on the posture of the mobile terminal 20.
  • Further, the marker M is a quadrangle having two sets of parallel sides.
  • Therefore, according to the drive recorder 10, calibration can be performed using an angle estimation algorithm from AR technology, reducing the labor of calibration.
  • In the above-described embodiment, the camera 11 mounted on the drive recorder 10 is given as an example of the in-vehicle camera to be calibrated, but the present invention is not limited to this.
  • The present embodiment may be applied to calibration of various in-vehicle cameras, such as a front camera provided at the front of the vehicle V, a rear camera provided at the rear, and side cameras provided at the sides.
  • Further, the drive recorder 10 is given as an example of the in-vehicle device, but the in-vehicle device is not limited to this; it may be, for example, an ECU (Electronic Control Unit) to which the above-mentioned various in-vehicle cameras are connected.

Abstract

The present invention addresses the problem of reducing the labor for calibration. In order to solve such a problem, an in-vehicle device according to the embodiment comprises a calculation unit (14b), an estimation unit (14c), and a calibration execution unit (14d). The calculation unit (14b) calculates the posture of a marker (M) captured by a camera (11) and displayed on a mobile terminal (20). The estimation unit (14c) estimates an attachment angle of the camera (11) from the calculation result calculated by the calculation unit (14b). The calibration execution unit (14d) executes the calibration of the camera (11) on the basis of the estimation result estimated by the estimation unit (14c).

Description

In-vehicle device and method for calibrating in-vehicle camera
The embodiments of the disclosure relate to an in-vehicle device and a method for calibrating an in-vehicle camera.
Conventionally, methods for calibrating an in-vehicle camera such as a drive recorder at installation time are known that use a dedicated structure provided at a predetermined position around the vehicle, or a display placed on the floor around the vehicle that shows various display patterns (see, for example, Patent Document 1).
[Patent Document 1] Japanese Unexamined Patent Publication No. 2018-203039
However, the above-mentioned conventional technology requires a dedicated structure or display for performing calibration, as well as installation space for them, and thus has the problem of being laborious.
One aspect of the embodiments has been made in view of the above, and an object thereof is to provide an in-vehicle device and an in-vehicle camera calibration method that can reduce the labor required for calibration.
An in-vehicle device according to one aspect of the embodiments includes a calculation unit, an estimation unit, and an execution unit. The calculation unit calculates the posture of a marker that is displayed on a mobile terminal and captured by the vehicle-mounted camera. The estimation unit estimates the mounting angle of the vehicle-mounted camera from the calculation result calculated by the calculation unit. The execution unit calibrates the vehicle-mounted camera based on the estimation result estimated by the estimation unit.
According to one aspect of the embodiments, the labor required for calibration can be reduced.
FIG. 1 is an explanatory diagram (No. 1) of a calibration method for an in-vehicle camera according to a first comparative example.
FIG. 2 is an explanatory diagram (No. 2) of the calibration method of the in-vehicle camera according to the first comparative example.
FIG. 3 is an explanatory diagram of a calibration method for an in-vehicle camera according to a second comparative example.
FIG. 4 is a schematic explanatory diagram of the calibration method of the in-vehicle camera according to the embodiment.
FIG. 5 is a block diagram showing a configuration example of the in-vehicle system according to the embodiment.
FIG. 6 is a block diagram showing a configuration example of the mobile terminal according to the embodiment.
FIG. 7 is an explanatory diagram of the three axes of the mobile terminal.
FIG. 8 is a diagram showing a display example of the marker M related to the TILT direction.
FIG. 9 is a diagram showing a display example of the marker M related to the ROLL direction.
FIG. 10 is a diagram showing a display example of the marker M related to the PAN direction.
FIG. 11 is a diagram showing a modified example of the display of the marker M related to the ROLL direction.
FIG. 12 is a diagram showing a modified example of the display of the marker M related to the TILT direction.
FIG. 13 is an explanatory diagram (No. 1) of the camera mounting angle estimation process.
FIG. 14 is an explanatory diagram (No. 2) of the camera mounting angle estimation process.
FIG. 15 is an explanatory diagram (No. 3) of the camera mounting angle estimation process.
FIG. 16 is a flowchart showing a processing procedure executed by the drive recorder according to the embodiment.
Hereinafter, embodiments of the in-vehicle device and the in-vehicle camera calibration method disclosed in the present application will be described in detail with reference to the accompanying drawings. The present invention is not limited to the embodiments shown below.
In the following, the case where the vehicle-mounted device according to the embodiment is the drive recorder 10 and the vehicle-mounted camera is the camera 11 mounted on the drive recorder 10 will be described as an example.
First, prior to the description of the embodiment, comparative examples will be described. FIG. 1 is an explanatory diagram (No. 1) of the calibration method of the camera 11 according to the first comparative example. FIG. 2 is an explanatory diagram (No. 2) of the calibration method of the camera 11 according to the first comparative example. FIG. 3 is an explanatory diagram of the calibration method of the camera 11 according to the second comparative example.
As shown in FIGS. 1 and 2, in the calibration method of the camera 11 according to the first comparative example, calibration is performed using a structure ST provided at a predetermined position in front of the vehicle V. As shown in FIG. 2, the structure ST is assembled, for example, from a plurality of poles and joints and placed at a predetermined position in front of the vehicle V.
Then, the worker launches a dedicated application on a mobile terminal 20 such as a smartphone and, while designating positions on the structure ST shown in the captured image of the camera 11 displayed on the mobile terminal 20, detects coordinates P1, P2, and P3, based on which calibration is performed. The reference numeral "WP" in FIG. 2 corresponds to the tire positions of the vehicle V.
However, this first comparative example has the problem that installing the structure ST is laborious and that calibration requires a correspondingly large space, including the installation space for the structure ST.
Further, as shown in FIG. 3, in the calibration method of the camera 11 according to the second comparative example, two horizontal lines H1 and H2 are set on the captured image of the camera 11, and calibration is performed while driving so that the actual horizon in front of the vehicle V falls between the horizontal lines H1 and H2.
However, this second comparative example has the problem that the horizon is not always actually visible in front of the vehicle V when calibration is desired, and that calibration immediately after the start of driving cannot be guaranteed.
Therefore, in the calibration method of the camera 11 according to the embodiment, the posture of the marker M displayed on the mobile terminal 20 and captured by the camera 11 is calculated, the mounting angle of the camera 11 is estimated from the calculation result, and calibration is performed based on the estimation result.
FIG. 4 is a schematic explanatory diagram of the calibration method of the camera 11 according to the embodiment. Specifically, as shown in FIG. 4, in the calibration method of the camera 11 according to the embodiment, the worker first launches the dedicated application on the mobile terminal 20 that the worker holds, causing the marker M to be displayed on the mobile terminal 20.
The marker M is a quadrangle having two sets of parallel sides. That is, the marker M may be a square, a rectangle, or a parallelogram. In this embodiment, the marker M is assumed to be a square. The mobile terminal 20 displaying the marker M may be located at an arbitrary position around the vehicle V.
Then, the drive recorder 10 acquires the image of the mobile terminal 20 captured by the camera 11 and extracts the marker M from the captured image. The drive recorder 10 then calculates the posture of the extracted marker M. That is, in the calibration method of the camera 11 according to the embodiment, first, the posture of the marker M displayed on the mobile terminal 20 is calculated (step S1).
Next, the drive recorder 10 estimates the mounting angle of the camera 11 from the calculation result (step S2).
At this time, the calibration method of the camera 11 according to the embodiment can use a known algorithm employed, for example, in AR (Augmented Reality) technology, which extracts two sets of parallel lines from the feature points of an AR marker and estimates the camera angle based on them. The estimation process using such an algorithm will be described later with reference to FIGS. 13 to 15.
Then, the drive recorder 10 performs calibration based on the estimation result (step S3). As a result, the camera 11 can be calibrated without providing the structure ST as in the first comparative example, and without the limited conditions of the second comparative example.
To ensure calibration accuracy, it is preferable that the marker M be displayed, for example, only when the angle of the mobile terminal 20 in the TILT direction or the ROLL direction is within a predetermined angle with respect to the horizontal plane. This point will be described later with reference to FIGS. 7 to 9, 11, and 12.
Further, since the angle of the mobile terminal 20 in the PAN direction cannot be accurately aligned with respect to the vehicle V, relocation of the mobile terminal 20 may be requested when it is determined, based on the side length ratio of the extracted marker M, that the marker M is not suitable for calibration. This point will be described later with reference to FIG. 10.
As described above, in the calibration method of the camera 11 according to the embodiment, the posture of the marker M displayed on the mobile terminal 20 and captured by the camera 11 is calculated, the mounting angle of the camera 11 is estimated from the calculation result, and calibration is performed based on the estimation result.
Therefore, according to the calibration method of the camera 11 of the embodiment, the labor required for calibration can be reduced.
Hereinafter, a configuration example of the in-vehicle system 1 to which the calibration method of the camera 11 according to the above-described embodiment is applied will be described more specifically.
FIG. 5 is a block diagram showing a configuration example of the in-vehicle system 1 according to the embodiment. FIG. 6 is a block diagram showing a configuration example of the mobile terminal 20 according to the embodiment. Note that FIGS. 5 and 6 show only the components necessary for explaining the features of the present embodiment, and general components are omitted.
In other words, each component shown in FIGS. 5 and 6 is a functional concept and does not necessarily have to be physically configured as illustrated. For example, the specific form of distribution and integration of the blocks is not limited to that shown in the figures; all or part of them may be functionally or physically distributed or integrated in arbitrary units according to various loads and usage conditions.
In the description with reference to FIGS. 5 and 6, the explanation of components already described may be simplified or omitted.
As shown in FIG. 5, the in-vehicle system 1 according to the embodiment includes the drive recorder 10 and the mobile terminal 20. First, a configuration example of the mobile terminal 20 will be described with reference to FIG. 6.
The mobile terminal 20 is a terminal device carried by the calibration worker, and is, for example, an information processing terminal such as a smartphone, a tablet terminal, or a notebook PC (Personal Computer).
As shown in FIG. 6, the mobile terminal 20 includes a gyro sensor 21, a display unit 22, a storage unit 23, and a control unit 24.
The gyro sensor 21 outputs signals indicating the angles of the mobile terminal 20 about three axes to the control unit 24. FIG. 7 is an explanatory diagram of the three axes of the mobile terminal 20. As shown in FIG. 7, a three-axis Cartesian coordinate system is assumed in which the front direction of the mobile terminal 20 is the positive direction of the X axis, the width direction is the Y axis direction, and the height direction is the Z axis direction.
According to the posture of the mobile terminal 20, the gyro sensor 21 outputs to the control unit 24 signals indicating the angles in the ROLL direction about the X axis, the TILT direction about the Y axis, and the PAN direction about the Z axis in this coordinate system.
Returning to FIG. 6, the display unit 22 is realized by, for example, a touch panel display, and displays the marker M under the control of the control unit 24.
The storage unit 23 is realized by, for example, a semiconductor memory element such as a RAM (Random Access Memory) or a flash memory, or a storage device such as a hard disk or an optical disk; in the example of FIG. 6, it stores application information 23a.
The application information 23a includes the program of the dedicated application that realizes the functions of the mobile terminal 20 in the calibration method according to the embodiment.
The control unit 24 is a controller realized, for example, by a CPU (Central Processing Unit) or an MPU (Micro Processing Unit) executing various programs stored in a storage device inside the mobile terminal 20, using a RAM as a work area. The control unit 24 can also be realized by an integrated circuit such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field Programmable Gate Array).
The control unit 24 has an application execution unit 24a, and realizes or executes the information processing functions and operations described below.
The application execution unit 24a reads the program of the dedicated application from the application information 23a and executes it. The application execution unit 24a also causes the display unit 22 to display the marker M based on the angles about the three axes acquired from the gyro sensor 21.
Here, display examples of the marker M will be described with reference to FIGS. 8 to 12. FIG. 8 is a diagram showing a display example of the marker M related to the TILT direction. FIG. 9 is a diagram showing a display example of the marker M related to the ROLL direction.
FIG. 10 is a diagram showing a display example of the marker M related to the PAN direction. FIG. 11 is a diagram showing a modified example of the display of the marker M related to the ROLL direction. FIG. 12 is a diagram showing a modified example of the display of the marker M related to the TILT direction.
As shown in FIG. 8, in the TILT direction, the application execution unit 24a displays the marker M on the display unit 22 of the mobile terminal 20 if the angle of the mobile terminal 20 is within a predetermined angle θ1 with respect to the XY plane, which is the horizontal plane (see "Marker display" in the figure).
On the other hand, the application execution unit 24a does not display the marker M on the display unit 22 if the angle of the mobile terminal 20 in the TILT direction exceeds the predetermined angle θ1 (see "Marker non-display" in the figure).
Further, as shown in FIG. 9, in the ROLL direction, the application execution unit 24a displays the marker M on the display unit 22 of the mobile terminal 20 if the angle of the mobile terminal 20 is within a predetermined angle θ2 with respect to the XY plane, which is the horizontal plane (see "Marker display" in the figure).
On the other hand, the application execution unit 24a does not display the marker M on the display unit 22 if the angle of the mobile terminal 20 in the ROLL direction exceeds the predetermined angle θ2 (see "Marker non-display" in the figure).
In the present embodiment, calibration is performed based on the marker M, which is displayed on the display unit 22 only when the mobile terminal 20 is within the predetermined angles θ1 and θ2 at least in the TILT direction and the ROLL direction.
This is because, for the angle in the PAN direction, there is no reference plane corresponding to the horizontal plane used for the TILT and ROLL directions; that is, as described above, the PAN angle cannot be accurately aligned with respect to the vehicle V. In that sense, the calibration method according to the present embodiment can be said to be a simplified calibration method.
For the PAN direction, however, the angle can be roughly aligned based on, for example, the side length ratio of the marker M extracted on the drive recorder 10 side. Specifically, when the marker M is a square as described above, as shown in FIG. 10, the marker M can be regarded as usable for calibration if the side length ratio of the extracted marker M is within a predetermined range, that is, if the four side lengths do not differ significantly.
On the other hand, if the side length ratio of the marker M extracted on the drive recorder 10 side is out of the predetermined range, that is, if the four side lengths differ significantly, the marker M is regarded as unusable for calibration. In such a case, the drive recorder 10, for example, requests relocation of the mobile terminal 20 with respect to the PAN direction.
As described with reference to FIGS. 8 to 10, calibration accuracy can be ensured by displaying the marker M only when the angles in the TILT and ROLL directions are within the predetermined angles θ1 and θ2, and by requesting relocation as needed for the PAN direction based on the side length ratio of the marker M.
Up to this point, display examples have been shown in which, as in FIG. 9, the inclination of the marker M changes to follow changes in the inclination of the mobile terminal 20; alternatively, the marker M may be displayed so as to always remain horizontal or of the same size with respect to the vehicle V, independently of the inclination of the mobile terminal 20.
That is, as shown in FIG. 11, the application execution unit 24a may display the marker M on the display unit 22 of the mobile terminal 20 so that the marker M always remains horizontal with respect to the vehicle V even if the angle in the ROLL direction changes.
Further, as shown in FIG. 12, the application execution unit 24a may display the marker M on the display unit 22 of the mobile terminal 20 so that the marker M always has the same size with respect to the vehicle V even if the angle in the TILT direction changes.
 図5の説明に戻り、つづいてドライブレコーダ10の構成例について説明する。図5に示すように、ドライブレコーダ10は、カメラ11と、通知デバイス12と、記憶部13と、制御部14とを備える。 Returning to the explanation of FIG. 5, the configuration example of the drive recorder 10 will be described next. As shown in FIG. 5, the drive recorder 10 includes a camera 11, a notification device 12, a storage unit 13, and a control unit 14.
 カメラ11は、たとえば、CCD(Charge Coupled Device)やCMOS(Complementary Metal Oxide Semiconductor)などの撮像素子を備え、かかる撮像素子を用いて所定の撮像範囲を撮像する。 The camera 11 is provided with an image pickup device such as a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor), and uses such an image pickup device to image a predetermined imaging range.
 通知デバイス12は、キャリブレーションに関する情報を通知するデバイスであって、たとえばディスプレイやスピーカ等によって実現される。 The notification device 12 is a device that notifies information about calibration, and is realized by, for example, a display or a speaker.
Like the storage unit 23 described above, the storage unit 13 is realized by, for example, a semiconductor memory element such as a RAM or flash memory, or by a storage device such as a hard disk or an optical disk. In the example of FIG. 5, it stores design information 13a and internal parameter information 13b.
The design information 13a contains design values related to the camera 11, including, for example, the design value of the mounting angle of the camera 11. The internal parameter information 13b contains the internal parameters of the camera 11 and is updated to reflect the result whenever calibration is performed.
Like the control unit 24 described above, the control unit 14 is a controller realized, for example, by a CPU or MPU executing various programs stored in a storage device inside the drive recorder 10, using a RAM as a work area. The control unit 14 can also be realized by an integrated circuit such as an ASIC or FPGA.
The control unit 14 has an acquisition unit 14a, a calculation unit 14b, an estimation unit 14c, and a calibration execution unit 14d, and realizes or executes the information-processing functions and operations described below.
The acquisition unit 14a acquires the image of the mobile terminal 20 captured by the camera 11. The calculation unit 14b extracts the marker M from the captured image acquired by the acquisition unit 14a, and calculates the posture of the extracted marker M.
The estimation unit 14c estimates the mounting angle of the camera 11 based on the posture of the marker M calculated by the calculation unit 14b. This estimation process is described with reference to FIGS. 13 to 15, which are explanatory diagrams (No. 1 to No. 3) of the process of estimating the mounting angle of the camera 11.
First, the calculation unit 14b binarizes the captured image acquired by the acquisition unit 14a, for example, and extracts the contour of the marker M from the binarized region. Then, as shown in FIG. 13, the calculation unit 14b projects the quadrilateral corresponding to the extracted contour onto the projection plane IMG of the camera 11 in three-dimensional space as a quadrilateral QL1, using the internal parameter information 13b.
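A minimal sketch of this extraction step, written against OpenCV: binarize the frame, find contours, and keep the largest four-cornered one as the marker outline. The threshold mode, the area filter, and the function name are illustrative choices, not values from the patent.

```python
import cv2
import numpy as np

def extract_marker_quad(gray_frame):
    """Binarize the captured 8-bit grayscale frame and return the marker
    outline as a (4, 2) array of corner coordinates, or None if no
    quadrilateral of sufficient area is found."""
    _, binary = cv2.threshold(gray_frame, 0, 255,
                              cv2.THRESH_BINARY | cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    # Examine contours from largest to smallest; accept the first one
    # that simplifies to a quadrilateral of non-trivial area.
    for c in sorted(contours, key=cv2.contourArea, reverse=True):
        approx = cv2.approxPolyDP(c, 0.02 * cv2.arcLength(c, True), True)
        if len(approx) == 4 and cv2.contourArea(approx) > 1000.0:
            return approx.reshape(4, 2).astype(np.float64)
    return None
```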
As shown in FIG. 13, the sides of the quadrilateral QL1 are defined as a first side SD1, a second side SD2, a third side SD3, and a fourth side SD4. Next, the calculation unit 14b generates a quadrilateral QL2 by connecting each of these sides with the optical center OC of the camera 11. The quadrilateral QL2 corresponds to the marker M in real-world three-dimensional space.
The sides of the quadrilateral QL2 are likewise defined, as shown in FIG. 13, as a first side SD11, a second side SD12, a third side SD13, and a fourth side SD14. This construction yields four planes, as shown in FIG. 13.
The first plane F1 contains the optical center OC, the first side SD1 of the quadrilateral QL1, and the first side SD11 of the quadrilateral QL2. The second plane F2 contains the optical center OC, the second side SD2 of the quadrilateral QL1, and the second side SD12 of the quadrilateral QL2.
Similarly, the third plane F3 contains the optical center OC, the third side SD3 of the quadrilateral QL1, and the third side SD13 of the quadrilateral QL2, and the fourth plane F4 contains the optical center OC, the fourth side SD4 of the quadrilateral QL1, and the fourth side SD14 of the quadrilateral QL2.
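With the optical center OC taken as the origin of camera coordinates, each of the planes F1 to F4 is spanned by the viewing rays through the two endpoints of one side of QL1, so its normal is simply the cross product of those two rays. Below is a sketch under the pinhole model, with the internal parameter information represented by an intrinsic matrix K (an assumed representation, for illustration only).

```python
import numpy as np

def face_normals(quad_px, K):
    """Unit normals of the planes F1..F4, each containing the optical
    center OC and one side of the projected quadrilateral QL1.

    quad_px : (4, 2) marker corners in pixels, ordered around the quad.
    K       : 3x3 intrinsic matrix (stands in for the internal
              parameter information 13b).
    """
    # Back-project the pixel corners to viewing rays through OC (origin).
    rays = (np.linalg.inv(K) @ np.hstack([quad_px, np.ones((4, 1))]).T).T
    normals = []
    for i in range(4):
        n = np.cross(rays[i], rays[(i + 1) % 4])  # plane through OC
        normals.append(n / np.linalg.norm(n))
    return normals  # normals[0] -> F1 (side SD1), ..., normals[3] -> F4
```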
Next, the calculation unit 14b identifies two pairs of planes whose lines of intersection with a predetermined plane are parallel to each other. The predetermined plane is a plane whose normal is known in advance, for example the road surface. Specifically, the calculation unit 14b identifies the pair of the second plane F2 and the fourth plane F4, and the pair of the first plane F1 and the third plane F3, as the pairs whose lines of intersection with the road surface are parallel. The calculation unit 14b also assumes that the quadrilateral QL2 is a parallelogram formed on the road surface.
The estimation unit 14c then calculates the normal of the road surface. First, based on the first plane F1 and the third plane F3, one of the pairs identified by the calculation unit 14b, the estimation unit 14c obtains the direction of the line of intersection of the two planes. Specifically, as shown in FIG. 14, the estimation unit 14c obtains the direction of the line of intersection CL1 between the first plane F1 and the third plane F3. The direction vector V1 of the line of intersection CL1 is perpendicular to both the normal vector of the first plane F1 and the normal vector of the third plane F3, so the estimation unit 14c obtains V1 as the cross product of those two normal vectors. Since the lines of intersection of the first plane F1 and the third plane F3 with the road surface are parallel, the direction vector V1 is parallel to the road surface.
Similarly, the estimation unit 14c obtains the direction of the line of intersection of the other pair, the second plane F2 and the fourth plane F4. Specifically, as shown in FIG. 15, the estimation unit 14c obtains the direction of the line of intersection CL2 between the second plane F2 and the fourth plane F4. The direction vector V2 of the line of intersection CL2 is perpendicular to both the normal vector of the second plane F2 and the normal vector of the fourth plane F4, so the estimation unit 14c obtains V2 as the cross product of those two normal vectors. Since the lines of intersection of the second plane F2 and the fourth plane F4 with the road surface are also parallel, the direction vector V2 is likewise parallel to the road surface.
The estimation unit 14c then calculates the normal of the plane of the quadrilateral QL2, that is, the normal of the road surface, as the cross product of the direction vectors V1 and V2. Since this road-surface normal is obtained in the camera coordinate system of the camera 11, the posture of the camera 11 with respect to the road surface can be estimated from its deviation from the true vertical direction, by determining the corresponding three-dimensional coordinate transformation. From this result, the estimation unit 14c estimates the mounting angle of the camera 11 with respect to the vehicle V. The estimation process described with reference to FIGS. 13 to 15 can be performed using, for example, the known "ARToolkit" algorithm from AR technology.
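The two cross-product steps above reduce to a few lines of vector algebra. The sketch below recovers the road normal in camera coordinates and decomposes it into TILT and ROLL angles; the axis convention (x right, y down, z forward, road normal ideally (0, -1, 0)) is an assumption made for illustration, as the patent does not fix one.

```python
import numpy as np

def road_normal_from_faces(n1, n2, n3, n4):
    """Road-surface normal in camera coordinates from the four face
    normals: V1 = n(F1) x n(F3), V2 = n(F2) x n(F4), normal = V1 x V2."""
    v1 = np.cross(n1, n3)  # direction vector V1 of intersection line CL1
    v2 = np.cross(n2, n4)  # direction vector V2 of intersection line CL2
    n = np.cross(v1, v2)   # normal of the parallelogram QL2 / road surface
    return n / np.linalg.norm(n)

def tilt_roll_from_normal(n):
    """Illustrative decomposition of the camera attitude relative to the
    road, assuming camera axes x right, y down, z forward, so that an
    ideally mounted camera sees the road normal as (0, -1, 0)."""
    tilt = np.degrees(np.arctan2(n[2], -n[1]))  # rotation about x-axis
    roll = np.degrees(np.arctan2(n[0], -n[1]))  # rotation about z-axis
    return tilt, roll
```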
Returning to FIG. 5: the calibration execution unit 14d performs calibration based on the estimation result from the estimation unit 14c. Specifically, the calibration execution unit 14d compares the mounting angle of the camera 11 estimated by the estimation unit 14c with the design information 13a, and adjusts the internal parameter information 13b.
Alternatively, the calibration execution unit 14d notifies the operator of the calibration result via the notification device 12, and the operator adjusts the mounting angle of the camera 11 based on the notification. When the drive recorder 10 is provided with an aiming mechanism (not shown), the calibration execution unit 14d may instead have the aiming mechanism adjust the mounting angle of the camera 11 based on the calibration result.
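As a rough sketch of this comparison step, the following function subtracts the design values from the estimated angles and decides whether the residual can be absorbed into the parameter information or whether the operator should be notified. The field names and tolerance are hypothetical; the patent does not specify the data layout of the design information 13a or the internal parameter information 13b.

```python
def run_calibration(estimated, design, params, tol_deg=0.5):
    """Compare estimated mounting angles (degrees) with design values and
    fold the residual into the camera parameter dictionary.

    Returns (residual, needs_manual); needs_manual being True would
    trigger a notification via the notification device 12."""
    residual = {axis: estimated[axis] - design[axis]
                for axis in ("pan", "tilt", "roll")}
    # Reflect the result in the parameter information (hypothetical keys).
    params.update({f"offset_{axis}": r for axis, r in residual.items()})
    needs_manual = any(abs(r) > tol_deg for r in residual.values())
    return residual, needs_manual
```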
The calibration execution unit 14d may also assess the conditions at the time of calibration and issue a notification via the notification device 12 according to the result of that assessment. For example, when strong reflections of sunlight in the captured image are judged to make it unsuitable for extracting the marker M, the calibration execution unit 14d may issue guidance information such as an instruction to change the orientation of the mobile terminal 20 or to move the vehicle V.
Next, the processing procedure executed by the drive recorder 10 according to the embodiment is described with reference to FIG. 16, a flowchart of that procedure.
As shown in FIG. 16, the calculation unit 14b calculates the posture of the marker M displayed on the mobile terminal 20, as captured by the camera 11 (step S101).
The estimation unit 14c then estimates the mounting angle of the camera 11 from the calculation result of the calculation unit 14b (step S102).
Finally, the calibration execution unit 14d executes the calibration of the camera 11 based on the estimation result of the estimation unit 14c (step S103), and the process ends.
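Tying the three steps together, a minimal end-to-end sketch using the illustrative helpers above (none of which are the patent's own code) might look as follows:

```python
def calibrate_from_frame(gray_frame, K, design, params):
    """Steps S101-S103 in sequence; returns None when the marker should
    be repositioned rather than used."""
    quad = extract_marker_quad(gray_frame)           # S101: marker posture
    if quad is None or not usable_for_pan(quad):
        return None                                  # request repositioning
    n1, n2, n3, n4 = face_normals(quad, K)           # S102: estimate angle
    normal = road_normal_from_faces(n1, n2, n3, n4)
    tilt, roll = tilt_roll_from_normal(normal)
    # PAN is only coarsely constrained by the side-length check above,
    # so the design value is used here as a placeholder.
    estimated = {"pan": design["pan"], "tilt": tilt, "roll": roll}
    return run_calibration(estimated, design, params)  # S103: calibrate
```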
As described above, the drive recorder 10 according to the embodiment (corresponding to an example of the "in-vehicle device") includes the calculation unit 14b, the estimation unit 14c, and the calibration execution unit 14d (corresponding to an example of the "execution unit"). The calculation unit 14b calculates the posture of the marker M displayed on the mobile terminal 20, as captured by the camera 11 (corresponding to an example of the "in-vehicle camera"). The estimation unit 14c estimates the mounting angle of the camera 11 from the calculation result of the calculation unit 14b. The calibration execution unit 14d executes the calibration of the camera 11 based on the estimation result of the estimation unit 14c.
Therefore, the drive recorder 10 according to the embodiment can reduce the labor required for calibration.
The calibration execution unit 14d also executes the calibration based on the marker M displayed on the mobile terminal 20 placed at an arbitrary position.
Therefore, the drive recorder 10 according to the embodiment can save space during calibration.
The calibration execution unit 14d also executes the calibration based on the marker M displayed according to the posture of the mobile terminal 20.
Therefore, the drive recorder 10 according to the embodiment can reduce the labor of calibration while ensuring calibration accuracy.
The calibration execution unit 14d also executes the calibration based on the marker M displayed when the mobile terminal 20 is within a predetermined angle with respect to the horizontal plane in the ROLL direction.
Therefore, the drive recorder 10 according to the embodiment can reduce the labor of calibration while ensuring calibration accuracy in the ROLL direction.
The calibration execution unit 14d also executes the calibration based on the marker M displayed when the mobile terminal 20 is within a predetermined angle with respect to the horizontal plane in the TILT direction.
Therefore, the drive recorder 10 according to the embodiment can reduce the labor of calibration while ensuring calibration accuracy in the TILT direction.
The calibration execution unit 14d also requests that the mobile terminal 20 be repositioned with respect to the PAN direction when the side-length ratio of the marker M is outside the predetermined range.
Therefore, the drive recorder 10 according to the embodiment can reduce the labor of calibration while ensuring calibration accuracy in the PAN direction.
The calibration execution unit 14d also executes the calibration based on the marker M displayed so as to always remain horizontal, or of the same size, with respect to the vehicle V.
Therefore, the drive recorder 10 according to the embodiment can reduce the labor of calibration without the posture of the mobile terminal 20 degrading calibration accuracy.
The marker M is a quadrilateral having two pairs of parallel sides.
Therefore, the drive recorder 10 according to the embodiment can perform calibration using an angle-estimation algorithm from AR technology, reducing the labor of calibration.
In the embodiment described above, the camera 11 mounted in the drive recorder 10 was given as an example of the in-vehicle camera to be calibrated, but the embodiment is not limited to this. It may also be applied to the calibration of various other in-vehicle cameras, such as a front camera mounted at the front of the vehicle V, a rear camera at the rear, or side cameras at the sides.
Likewise, in the embodiment described above, the drive recorder 10 was given as an example of the in-vehicle device, but the in-vehicle device is not limited to this; it may be, for example, an ECU (Electronic Control Unit) to which the various in-vehicle cameras mentioned above are connected.
Further effects and modifications can be readily derived by those skilled in the art. The broader aspects of the invention are therefore not limited to the specific details and representative embodiments shown and described above, and various modifications are possible without departing from the spirit or scope of the general inventive concept defined by the appended claims and their equivalents.
Reference Signs List
1   In-vehicle system
10  Drive recorder
11  Camera
12  Notification device
13  Storage unit
13a Design information
13b Internal parameter information
14  Control unit
14a Acquisition unit
14b Calculation unit
14c Estimation unit
14d Calibration execution unit
20  Mobile terminal
21  Gyro sensor
22  Display unit
23  Storage unit
23a Application information
24  Control unit
24a Application execution unit
V   Vehicle

Claims (9)

1. An in-vehicle device comprising:
   a calculation unit that calculates the posture of a marker displayed on a mobile terminal, as captured by an in-vehicle camera;
   an estimation unit that estimates a mounting angle of the in-vehicle camera from the calculation result calculated by the calculation unit; and
   an execution unit that executes calibration of the in-vehicle camera based on the estimation result estimated by the estimation unit.
2. The in-vehicle device according to claim 1, wherein the execution unit executes the calibration based on the marker displayed on the mobile terminal placed at an arbitrary position.
3. The in-vehicle device according to claim 1 or 2, wherein the execution unit executes the calibration based on the marker displayed according to the posture of the mobile terminal.
4. The in-vehicle device according to claim 3, wherein the execution unit executes the calibration based on the marker displayed when the mobile terminal is within a predetermined angle with respect to the horizontal plane in the ROLL direction.
5. The in-vehicle device according to claim 3, wherein the execution unit executes the calibration based on the marker displayed when the mobile terminal is within a predetermined angle with respect to the horizontal plane in the TILT direction.
6. The in-vehicle device according to claim 3, wherein the execution unit requests repositioning of the mobile terminal with respect to the PAN direction when the side-length ratio of the marker is outside a predetermined range.
7. The in-vehicle device according to claim 3, wherein the execution unit executes the calibration based on the marker displayed so as to always remain horizontal, or of the same size, with respect to the vehicle.
8. The in-vehicle device according to claim 1, wherein the marker is a quadrilateral having two pairs of parallel sides.
9. A method for calibrating an in-vehicle camera, comprising:
   a calculation step of calculating the posture of a marker displayed on a mobile terminal, as captured by the in-vehicle camera;
   an estimation step of estimating a mounting angle of the in-vehicle camera from the calculation result calculated in the calculation step; and
   an execution step of executing calibration of the in-vehicle camera based on the estimation result estimated in the estimation step.