WO2015115770A1 - Calibration device and camera system - Google Patents

Calibration device and camera system (Dispositif d'étalonnage et système de caméra)

Info

Publication number
WO2015115770A1
Authority
WO
WIPO (PCT)
Prior art keywords
camera
displacement
reflecting member
information
interference pattern
Prior art date
Application number
PCT/KR2015/000835
Other languages
English (en)
Korean (ko)
Inventor
윤여찬
Original Assignee
엘지이노텍 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 엘지이노텍 주식회사
Publication of WO2015115770A1

Links

Images

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/20 Image signal generators
    • H04N 13/204 Image signal generators using stereoscopic image cameras
    • H04N 13/239 Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/20 Image signal generators
    • H04N 13/204 Image signal generators using stereoscopic image cameras
    • H04N 13/246 Calibration of cameras

Definitions

  • The present invention relates to a calibration apparatus and a camera system, and more particularly to a calibration apparatus for a stereo camera system and a stereo camera system including the same.
  • A stereo camera system has been proposed that detects objects from images captured in front of or behind a vehicle to help the driver drive safely, and that extracts distance information between the vehicle and the objects in front of it using two cameras.
  • a calibration process according to a distance between cameras and a relative posture needs to be performed at initial installation.
  • calibration of a stereo camera system is performed using a checker board on which a specific pattern is drawn.
  • Specifically, a checkerboard is photographed from different angles with the stereo camera to acquire ten or more stereo camera images of different scenes, and a displacement parameter between the two cameras is derived through mathematical calculations using the acquired images.
  • In addition, the distance between the checkerboard and the stereo camera system needs to cover the actual usage distance of 5 m to 100 m of a vehicle stereo camera system, which requires a large space for calibration.
  • An embodiment of the present invention provides a calibration apparatus and a camera system capable of real-time calibration of a stereo camera system.
  • a first reflecting member disposed on the first element
  • a second reflecting member disposed on the second element
  • a light source unit emitting the first beam
  • a beam splitter for splitting the first beam into a second beam and a third beam having different propagation paths, and for combining the first and second reflected beams, reflected respectively by the first and second reflecting members, into a fourth beam
  • a detector for detecting an image signal corresponding to the fourth beam
  • a displacement detection unit for detecting an interference pattern from the image signal, and comparing the detected interference pattern with a previously stored interference pattern to calculate displacement information between the first element and the second element.
  • the apparatus may further include a case accommodating the light source unit, the beam splitter, and the detector.
  • the apparatus may further include a case accommodating the light source unit, the beam splitter, the mirror, the detector, and the second reflecting member.
  • the first element and the second element may include a camera.
  • the displacement information may include distance movement information between the first element and the second element, and the displacement detection unit may obtain the distance movement information based on a circular fringe included in the interference pattern.
  • the displacement information may include relative rotation information between the first element and the second element, and the displacement detection unit may acquire the rotation information based on a straight fringe included in the interference pattern.
  • the camera system includes a camera housing including a first camera and a second camera spaced apart from each other, and a calibration device for calculating displacement information between the first camera and the second camera.
  • the calibration apparatus includes a first reflecting member disposed on the first camera, and a second reflecting member disposed on the second camera.
  • The camera system may further include a controller for correcting a displacement parameter used to calculate distance information with respect to an object, based on the calculated displacement information.
  • The calibration device includes an interferometer for obtaining an image signal including an interference pattern between a plurality of beams reflected by the first and second reflecting members, and a displacement detector configured to detect the interference pattern from the image signal and compare it with a previously stored interference pattern to calculate displacement information between the first camera and the second camera.
  • The interferometer may further include a light source unit and a beam splitter which separates the beam emitted from the light source unit into a plurality of beams having different optical paths and transmits them to the first reflecting member and the second reflecting member, respectively.
  • the interferometer may further include a mirror that transmits the beam separated by the beam splitter to the second reflecting member.
  • the light source unit may include a laser or a laser diode.
  • the displacement information may include distance movement information between the first camera and the second camera, and the displacement detection unit may acquire the distance movement information based on a circular fringe included in the interference pattern.
  • the displacement information may include relative rotation information between the first camera and the second camera, and the displacement detection unit may acquire the rotation information based on a straight fringe included in the interference pattern.
  • The camera system may be configured to acquire a plurality of image data through the plurality of cameras, and to calculate distance information with respect to an object detected from the plurality of image data based on distance information and relative rotation information between the plurality of cameras.
  • the control apparatus may further update the distance information and the rotation information based on the displacement information.
  • FIG. 1 is a block diagram schematically illustrating a camera calibration apparatus according to an embodiment of the present invention.
  • FIG. 2 is a block diagram schematically illustrating an interferometer according to an exemplary embodiment of the present invention.
  • FIG. 3 is a view for explaining a method of obtaining displacement information from an interference pattern in a calibration device according to an embodiment of the present invention.
  • FIG. 4 is a block diagram schematically illustrating a stereo camera system including a calibration device according to an exemplary embodiment.
  • FIGS. 5 and 6 illustrate examples in which a calibration device according to an embodiment of the present invention is coupled to a camera.
  • FIG. 7 is a diagram for describing a method of calculating a distance from an object in a stereo camera system according to an exemplary embodiment.
  • FIG. 8 is a flowchart illustrating a calibration method of a stereo camera system including a calibration device according to an exemplary embodiment.
  • Terms including ordinal numbers such as first and second may be used to describe various components, but the components are not limited by the terms. The terms are used only for the purpose of distinguishing one component from another.
  • For example, the second component may be referred to as the first component, and similarly, the first component may also be referred to as the second component.
  • the camera system and the calibration device according to the embodiment of the present invention may be provided to include more or fewer components.
  • FIG. 1 is a block diagram schematically illustrating a camera calibration apparatus according to an embodiment of the present invention.
  • FIG. 2 is a block diagram schematically illustrating an interferometer according to an embodiment of the present invention.
  • FIG. 3 is a view for explaining a method of obtaining displacement information from an interference pattern in a calibration device according to an embodiment of the present invention.
  • the calibration device 10 may include an interferometer 11, a displacement detector 12, and the like.
  • The interferometer 11 includes a plurality of reflecting members 114 and 115 disposed on each of a plurality of elements spaced apart from each other, separates the beam emitted from one light source unit 111 into a plurality of beams (or a plurality of light beams), directs each separated beam onto one of the reflecting members 114 and 115, and obtains an image signal corresponding to the interference pattern formed when the beams reflected by the reflecting members 114 and 115 interfere.
  • Each element may be one of a plurality of cameras mounted in a stereo camera system, or a component installed at a predetermined distance and angle, such as a vehicle head lamp.
  • Hereinafter, the camera will be described as an example.
  • For example, the interferometer 11 may be configured as a Michelson interferometer.
  • the interferometer 11 may include a light source 111, a beam splitter 112, a plurality of reflectors 114 and 115, a detector 116, and the like. In addition, the interferometer 11 may further include a mirror 113.
  • the light source unit 111 emits a beam L1.
  • the beam emitted from the light source unit 111 may include a coherent beam.
  • The displacement is a parameter indicating a change in the relative position between two objects, that is, how much one of the two objects has rotated and how much it has translated with respect to the other.
  • For example, the displacement indicates how much the remaining camera has rotated and translated with respect to one of the plurality of cameras.
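  • As an illustration of this displacement, the relative pose between two rigidly mounted elements can be captured by a rotation and a translation. The minimal Python sketch below is not taken from the patent; the names Displacement, rotation_deg, and translation_mm are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Displacement:
    """Relative pose change of element 2 with respect to element 1.

    rotation_deg:   rotation about each axis (rx, ry, rz), in degrees
    translation_mm: translation along each axis (tx, ty, tz), in millimetres
    """
    rotation_deg: tuple
    translation_mm: tuple

# Example: camera 2 has tilted 0.05 degrees about the vertical axis and drifted
# 0.002 mm along the baseline relative to camera 1.
d = Displacement(rotation_deg=(0.0, 0.05, 0.0), translation_mm=(0.002, 0.0, 0.0))
```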
  • the light source unit 111 may include a laser, a laser diode, and the like, which emit a coherent beam.
  • the beam L1 emitted from the light source unit 111 may enter the beam splitter 112. To this end, the emission surface of the light source unit 111 is disposed to face the beam splitter 112.
  • The beam splitter 112 may split the coherent beam L1 emitted from the light source unit 111 into a plurality of beams L11 and L12 having different travel directions. For example, the beam splitter 112 may transmit a portion L11 of the beam L1 incident from the light source unit 111 and reflect the remaining portion L12, thereby separating the beam into a plurality of beams.
  • The beam splitter 112 may include a semi-transparent dielectric thin plate, such as a semi-transparent mirror.
  • the beams L11 and L12 separated by different beam paths by the beam splitter 112 may be transmitted to the first and second reflecting members 114 and 115, respectively.
  • the beam transmitted by the beam splitter 112 may be transmitted to the first reflecting member 114, and the beam reflected by the beam splitter 112 may be transmitted to the second reflecting member 115.
  • the first and second reflecting members 114 and 115 may reflect the beams to the beam splitter 112.
  • The beams separated by the beam splitter 112 proceed at a predetermined angle to each other rather than in parallel.
  • For example, the beams separated by the beam splitter 112 may proceed so as to be orthogonal to each other. Therefore, when the first and second reflecting members 114 and 115 are respectively disposed on a plurality of cameras (not shown) spaced apart in a line, and the beam splitter 112 is disposed between them for space efficiency, the interferometer 11 may further include a mirror 113 that allows the beams separated by the beam splitter 112 to be incident on the first and second reflecting members 114 and 115, respectively.
  • The mirror 113 is disposed on the optical path between the beam splitter 112 and the second reflecting member 115, and serves to transfer the beam between the beam splitter 112 and the second reflecting member 115.
  • That is, the mirror 113 is disposed on the path of the beam L12 reflected by the beam splitter 112, and reflects that beam so that it proceeds to the second reflecting member 115. In addition, when the beam L22 reflected by the second reflecting member 115 is incident, the mirror 113 reflects the incident beam so that it enters the beam splitter 112.
  • Accordingly, the beam L11 transmitted by the beam splitter 112 proceeds directly to the first reflecting member 114, whereas the beam L12 reflected by the beam splitter 112 is reflected once more by the mirror 113 and then proceeds to the second reflecting member 115.
  • Likewise, the beam L21 reflected by the first reflecting member 114 proceeds directly to the beam splitter 112, whereas the beam L22 reflected by the second reflecting member 115 is reflected once more by the mirror 113 before proceeding to the beam splitter 112.
  • In this way, the beam emitted from one light source unit 111 is separated by the beam splitter 112 into beams with different travel paths, and the beams reflected by the first and second reflecting members 114 and 115 interfere at the beam splitter 112.
  • the beams L21 and L22 interfering in the beam splitter 112 may be combined into one beam L2 by the beam splitter 112, and the beam splitter 112 may transmit it to the detector 116.
  • When the detector 116 receives the beam L2 reflected and recombined by the beam splitter 112, the detector 116 converts the received beam L2 into an image signal.
  • the image signal obtained by the detector 116 includes an interference fringe.
  • the interference pattern represents a pattern of light and dark shades formed in a lattice or concentric shape caused by the interference of the beam.
  • The beams reflected by the first reflecting member 114 and the second reflecting member 115 interfere when they meet at the beam splitter 112 because of the difference in their optical paths. This interference generates contrast patterns in the image signal of the beam incident on the detector 116.
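  • The contrast pattern follows the standard two-beam interference relation I = I1 + I2 + 2·sqrt(I1·I2)·cos(Δφ), where Δφ = 2πΔL/λ is the phase difference produced by the optical path difference ΔL. The sketch below is only an illustration of that textbook relation, not the patent's own signal processing.

```python
import math

def interference_intensity(i1: float, i2: float,
                           path_diff_nm: float, wavelength_nm: float) -> float:
    """Two-beam interference intensity for a given optical path difference.

    i1, i2:        intensities of the two recombined beams
    path_diff_nm:  optical path difference between the two arms, in nanometres
    wavelength_nm: wavelength of the coherent source, in nanometres
    """
    phase = 2.0 * math.pi * path_diff_nm / wavelength_nm
    return i1 + i2 + 2.0 * math.sqrt(i1 * i2) * math.cos(phase)

# A path difference of half a wavelength turns a bright fringe into a dark one.
print(interference_intensity(1.0, 1.0, 0.0, 650.0))    # ~4.0 (bright)
print(interference_intensity(1.0, 1.0, 325.0, 650.0))  # ~0.0 (dark)
```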
  • the displacement detector 12 processes the image signal obtained through the interferometer 11 to obtain image data, and extracts an interference pattern therefrom.
  • the displacement detection unit 12 may calculate displacement information including a change in distance and rotation between the first and second reflecting members 114 and 115 by comparing the previously stored interference pattern with the newly extracted interference pattern.
  • the pre-stored interference pattern used for comparison is an interference pattern which is a reference for detecting displacement information, and is referred to as a 'reference interference pattern' for convenience of description below.
  • the reference interference pattern may be an interference pattern acquired by the displacement detector 12 using the interferometer 11 when the calibration apparatus 10 is initially mounted in a camera system (not shown).
  • the reference interference pattern may be an interference pattern obtained by the displacement detector 12 using the interferometer 11 during the previous calibration.
  • For example, the displacement detector 12 may store, in an internal memory (not shown) or an external memory (not shown), the interference pattern obtained using the interferometer 11 when the calibration apparatus 10 is initially installed in a camera system (not shown), and use it as the reference interference pattern.
  • Alternatively, the displacement detection unit 12 may store the interference pattern obtained through the interferometer in an internal memory (not shown) or an external memory (not shown) each time, or only when there is a displacement change, and use the stored pattern as the reference interference pattern.
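  • A minimal sketch of the reference-pattern handling described above, assuming the pattern is held as an image array: the pattern captured at first installation becomes the reference, and it may optionally be replaced after a displacement change. The class and method names are hypothetical, not defined by the patent.

```python
import numpy as np

class ReferencePatternStore:
    """Holds the reference interference pattern used for displacement comparison."""

    def __init__(self) -> None:
        self._reference = None

    def set_initial(self, pattern: np.ndarray) -> None:
        # Pattern captured when the calibration device is first mounted.
        self._reference = pattern.copy()

    def update_if_changed(self, pattern: np.ndarray, displacement_detected: bool) -> None:
        # Optionally roll the reference forward, e.g. only when a displacement
        # change was actually detected, as described in the text above.
        if displacement_detected:
            self._reference = pattern.copy()

    @property
    def reference(self):
        return self._reference
```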
  • The relationship between the interference pattern detected by the displacement detection unit 12 and the displacement of the measurement target may be expressed by Equation 1 below.
  • [Equation 1] d = nλ/2
  • Here, d represents the displacement of the measurement target, that is, the displacement between the first reflecting member 114 and the second reflecting member 115, n represents the number of fringe changes in the interference pattern, and λ represents the wavelength of the beam emitted from the light source unit 111.
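  • A minimal numerical sketch of Equation 1, assuming the usual Michelson relation that each fringe change corresponds to a half-wavelength change in the position of one reflecting member (the beam travels the changed path twice). The function name is illustrative.

```python
def displacement_from_fringes(n_fringe_changes: int, wavelength_nm: float) -> float:
    """Equation 1: displacement d = n * wavelength / 2, returned in nanometres."""
    return n_fringe_changes * wavelength_nm / 2.0

# Example: 3 fringe changes observed with a 650 nm laser diode
# -> 975 nm of relative displacement between the reflecting members.
print(displacement_from_fringes(3, 650.0))
```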
  • When the relative distance between the first and second reflecting members 114 and 115 changes, the beam path changes. Therefore, the position of the virtual image of the source beam emitted from the light source unit 111 changes from S2′ to S1′. Accordingly, at least some of the circular fringes in the image signal detected by the detector 116 converge as the distance changes.
  • The displacement detector 12 may detect a change in the relative distance between the first reflecting member 114 and the second reflecting member 115 based on the change of the circular fringe pattern in the image signal detected by the detector 116.
  • the displacement detector 12 may detect a change in the relative inclination between the first reflective member 114 and the second reflective member 115 based on the change of the linear fringe pattern in the image signal detected by the detector 116.
  • the wavelength of a laser is generally 400 nm-800 nm. In the case of UV lasers, wavelengths of approximately 400 nm or less are possible.
  • the resolution of the interferometer 11 may satisfy 10 nm or less. That is, if the phase is changed by half wavelength due to the optical path difference, one fringe in the interference pattern may change.
  • displacement information between the first and second reflecting members 114 and 115 can be detected in units of 10 nm or less.
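  • For the straight fringes mentioned above, a standard small-angle result for a tilted mirror in a Michelson arrangement relates the fringe spacing Λ to the relative tilt α by Λ ≈ λ/(2α). The sketch below estimates the tilt from a measured fringe spacing; it is an illustration under that assumption, not the patent's own algorithm.

```python
import math

def tilt_from_fringe_spacing(fringe_spacing_um: float, wavelength_nm: float) -> float:
    """Estimate the relative tilt (in radians) of the two reflecting members
    from the spacing of straight fringes, using the small-angle relation
    fringe_spacing = wavelength / (2 * tilt)."""
    wavelength_um = wavelength_nm / 1000.0
    return wavelength_um / (2.0 * fringe_spacing_um)

# Example: 650 nm source, straight fringes spaced 500 um apart
# -> a relative tilt of about 0.65 mrad (~0.037 degrees).
print(math.degrees(tilt_from_fringe_spacing(500.0, 650.0)))
```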
  • the calibration device 10 may detect displacement information between the first and second reflecting members 114 and 115 spaced apart from each other using the interferometer 11.
  • displacement information between a plurality of cameras (not shown) in which the first and second reflection members 114 and 115 are disposed may be detected based on the displacement information between the first and second reflection members 114 and 115.
  • FIG. 4 is a block diagram schematically illustrating a stereo camera system including a calibration device according to an exemplary embodiment.
  • 5 and 6 illustrate examples in which a calibration device according to an embodiment of the present invention is coupled to cameras of a stereo camera system.
  • FIG. 7 is a diagram for describing a method of calculating a distance from an object in a stereo camera system according to an exemplary embodiment.
  • the stereo camera system may include a plurality of cameras 21 and 22, a calibration device 10, a control device 30, and the like.
  • The plurality of cameras 21 and 22 may be spaced apart from each other by a predetermined interval and may acquire image data of external objects.
  • For example, the plurality of cameras 21 and 22 may be mounted at the front or rear of the vehicle to acquire image data in front of or behind the vehicle.
  • As described with reference to FIGS. 1 to 3, the calibration device 10 includes an interferometer (see reference numeral 11 of FIG. 1) and detects displacement information between the plurality of cameras 21 and 22 using the interferometer 11.
  • the displacement information may include a relative distance translation between the plurality of cameras 21 and 22 and a relative angular rotation.
  • To this end, the reflecting members of the interferometer 11 (reference numerals 114 and 115 of FIG. 1) need to be arranged integrally with the cameras 21 and 22.
  • FIGS. 5 and 6 show examples of coupling the interferometer 11 with the cameras 21 and 22.
  • the first camera 21 and the second camera 22 are spaced apart by a predetermined interval by the camera housing 200. If necessary, three or more cameras may be mounted in the camera housing 200. In this case, the reflective members may also be arranged in the same number.
  • the first and second reflecting members 114 and 115 of the interferometer 11 are disposed in the bodies of the cameras 21 and 22, respectively.
  • the first and second reflecting members 114 and 115 may be integrally coupled to the bodies of the cameras 21 and 22 so as to move or rotate in response to the body movements of the cameras 21 and 22.
  • An interferometer case 100 is disposed on the optical path between the first and second reflecting members 114 and 115, and the interferometer case 100 accommodates therein the light source unit 111, the beam splitter 112, the mirror 113, the detector 116, and the like of the interferometer 11.
  • the interferometer case 100 in which the light source unit 111, the beam splitter 112, the mirror 113, and the detector 116 are integrally accommodated is disposed between the first camera 21 and the second camera 22.
  • the first reflecting member 114 and the second reflecting member 115 may be disposed independently of the first camera 21 and the second camera 22, respectively.
  • However, the present invention is not limited thereto; as shown in FIG. 6, the first reflecting member 114 of the interferometer 11 may be disposed on the body of the first camera 21, and the interferometer case 100′ may be disposed on the second camera 22.
  • In this case, the interferometer case 100′ may accommodate the light source unit 111, the beam splitter 112, the mirror 113, the second reflecting member 115, the detector 116, and the like of the interferometer 11. That is, the light source unit 111, the beam splitter 112, the mirror 113, the second reflecting member 115, and the detector 116 may be integrally coupled by the interferometer case 100′.
  • The controller 30 continuously receives image data of external objects through the plurality of cameras 21 and 22.
  • the controller 30 detects an object from image data received from the plurality of cameras 21 and 22 through image recognition.
  • The controller 30 obtains the position information of the object in each image data, and obtains distance information to the object based on the position information of the object in each image data and the distance information between the plurality of cameras 21 and 22.
  • Referring to FIG. 7, let the focal length of each of the first and second cameras 21 and 22 be f, the distance between the first and second cameras 21 and 22 be b, and the distances, on the image data input through the first and second cameras 21 and 22, between the point corresponding to the center of each camera and the object OB be DL and DR, respectively. Then the distance information Z to the object OB may be obtained as in Equation 2 below.
  • the focal length f of each of the first and second cameras 21 and 22 may be a preset value according to the specifications of the first and second cameras 21 and 22.
  • The distance b between the first and second cameras 21 and 22 is the distance between the center points of the first and second cameras 21 and 22, and the value set when the first and second cameras 21 and 22 are installed may be used as its initial value.
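  • Equation 2 itself is not reproduced in this text, but with DL and DR defined as above the standard stereo triangulation gives Z = b·f/(DL + DR), assuming DL and DR are measured on opposite sides of the respective image centres so that their sum is the disparity. The sketch below applies that relation; the sign convention is an assumption, not stated in the source.

```python
def object_distance(baseline_m: float, focal_length_px: float,
                    dl_px: float, dr_px: float) -> float:
    """Stereo triangulation: Z = b * f / (DL + DR).

    baseline_m:      distance b between the two camera centres, in metres
    focal_length_px: focal length f expressed in pixels
    dl_px, dr_px:    offsets DL and DR of the object from the image centres of
                     the first and second cameras (assumed to lie on opposite
                     sides, so DL + DR is the disparity in pixels)
    """
    disparity_px = dl_px + dr_px
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return baseline_m * focal_length_px / disparity_px

# Example: 0.30 m baseline, 1400 px focal length, 20 px total disparity -> 21 m.
print(object_distance(0.30, 1400.0, 12.0, 8.0))
```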
  • However, the position and posture of the first and second cameras 21 and 22 may change due to vehicle vibration and temperature changes. Therefore, if the stereo parameters set at the initial installation of the first and second cameras 21 and 22 are used as they are, without any process for reflecting the position or posture change of the cameras, the accuracy of the distance information calculated with respect to the object is reduced.
  • the stereo parameter used to calculate distance information with the object may include distance information between the first and second cameras 21 and 22 and relative tilting information between the first and second cameras 21 and 22.
  • the relative inclination information of the first and second cameras 21 and 22 may indicate the degree of inclination of the other cameras with respect to the optical axis of any one of the first and second cameras 21 and 22.
  • To prevent the accuracy of the distance information calculated with respect to the object from degrading as the position, posture, and the like of the first and second cameras 21 and 22 change, the control device 30 may correct the displacement parameter used to calculate the distance information based on the displacement information obtained through the calibration device 10. That is, the controller 30 may update the previously stored stereo parameters between the first and second cameras 21 and 22 based on the displacement information obtained by the calibration device 10.
  • The above-described displacement detection unit 12 of the calibration device 10 may be implemented in the control device 30 or separately from the control device 30.
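  • A minimal sketch of the correction step described above: the stored stereo parameters (baseline and relative tilt) are updated with the distance and rotation changes reported by the calibration device. The structure and names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class StereoParameters:
    baseline_m: float         # distance between the first and second cameras
    relative_tilt_deg: float  # relative inclination between the two optical axes

def correct_parameters(params: StereoParameters,
                       baseline_change_m: float,
                       tilt_change_deg: float) -> StereoParameters:
    """Apply the displacement information (distance and rotation changes)
    measured by the calibration device to the stored stereo parameters."""
    return StereoParameters(
        baseline_m=params.baseline_m + baseline_change_m,
        relative_tilt_deg=params.relative_tilt_deg + tilt_change_deg,
    )

# Example: the interferometer reports a 2 um baseline drift and a 0.01 deg tilt.
updated = correct_parameters(StereoParameters(0.30, 0.0), 2e-6, 0.01)
print(updated)
```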
  • FIG. 8 is a flowchart illustrating a calibration method of a stereo camera system including a calibration device according to an exemplary embodiment.
  • the light source unit 111 of the interferometer 11 irradiates a beam for generating an interference pattern (S110).
  • Then, the detector 116 of the interferometer 11 of the calibration device 10 outputs an image signal corresponding to the reflected beams (S120).
  • In step S120, the beam emitted from the light source unit 111 is separated into a plurality of beams (or a plurality of light beams) having different propagation paths by the beam splitter 112.
  • The plurality of beams separated by the beam splitter 112 proceed to the first and second reflecting members 114 and 115, respectively, and are reflected by the first and second reflecting members 114 and 115 back toward the beam splitter 112.
  • The beam splitter 112 combines the reflected beams and transmits the combined beam to the detector 116, which receives it and outputs a corresponding image signal.
  • The image signal generated in step S120 is input to the displacement detector 12 of the calibration device 10.
  • The displacement detector 12 obtains image data by signal-processing the image signal input from the detector 116, and detects an interference pattern between the reflected beams (S130).
  • the displacement detection unit 12 obtains displacement information by analyzing a circular fringe or a straight fringe of the detected interference pattern (S140). Since the displacement information detection method has been described in detail with reference to FIG. 3, the detailed description will be omitted below.
  • The displacement information obtained by the displacement detection unit 12 in step S140 is transmitted to the control device 30.
  • the controller 30 corrects the stereo parameter used to detect the distance to the object based on the displacement information (S150). That is, distance information and slope information between the first and second cameras 21 and 22 are corrected based on the displacement information.
  • the stereo parameter corrected in step S150 is stored in an internal memory (not shown) or an external memory (not shown) of the control device 30.
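  • Putting steps S110 to S150 together, the following sketch shows one pass of the calibration flow of FIG. 8. Every function name is a hypothetical placeholder for the corresponding step, not an API defined by the patent.

```python
def calibration_pass(interferometer, displacement_detector, controller, memory):
    """One pass of the calibration flow of FIG. 8 (S110-S150), as a sketch."""
    interferometer.emit_beam()                                            # S110
    image_signal = interferometer.read_detector()                         # S120
    pattern = displacement_detector.extract_pattern(image_signal)         # S130
    displacement = displacement_detector.compare_with_reference(pattern)  # S140
    params = controller.correct_stereo_parameters(displacement)           # S150
    memory.store(params)   # store the corrected stereo parameters
    return params
```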
  • the stereo camera system may detect in real time that a distance and a slope between a plurality of cameras change due to vibration, temperature change, etc. generated during driving of a vehicle. Further, accuracy of calculating distance information with an object may be improved by acquiring displacement information between a plurality of cameras in real time and correcting a stereo parameter used to calculate distance information with an object based on the obtained displacement information.
  • the calibration apparatus and the camera system disclosed in this document may perform calibration of the camera system in real time while driving a vehicle.
  • The term '~unit' used in the present embodiment refers to software or a hardware component such as a field-programmable gate array (FPGA) or an ASIC, and a '~unit' performs certain roles.
  • However, '~unit' is not limited to software or hardware. A '~unit' may be configured to reside in an addressable recording medium or may be configured to run on one or more processors.
  • Thus, a '~unit' includes components such as software components, object-oriented software components, class components, and task components, as well as processes, functions, properties, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuits, data, databases, data structures, tables, arrays, and variables.
  • The functionality provided within the components and '~units' may be combined into a smaller number of components and '~units', or further separated into additional components and '~units'.
  • In addition, the components and '~units' may be implemented so as to run on one or more CPUs in a device or a secure multimedia card.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Instruments For Measurement Of Length By Optical Means (AREA)

Abstract

According to an embodiment of the present invention, a calibration device comprises: a first reflecting member disposed on a first element; a second reflecting member disposed on a second element; a light source unit for emitting a first beam; a beam splitter for splitting the first beam into a second beam and a third beam having different propagation paths and, when a first reflected beam in which the second beam is reflected by the first reflecting member and a second reflected beam in which the third beam is reflected by the second reflecting member are incident, combining the first reflected beam and the second reflected beam into a fourth beam; a detector for detecting an image signal corresponding to the fourth beam; and a displacement detection unit for detecting an interference pattern from the image signal and calculating displacement information between the first and second elements by comparing the interference pattern with a previously stored interference pattern.
PCT/KR2015/000835 2014-01-28 2015-01-27 Dispositif d'étalonnage et système de caméra WO2015115770A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020140010590A KR102158026B1 (ko) 2014-01-28 2014-01-28 캘리브레이션 장치 및 카메라 시스템
KR10-2014-0010590 2014-01-28

Publications (1)

Publication Number Publication Date
WO2015115770A1 true WO2015115770A1 (fr) 2015-08-06

Family

ID=53757305

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2015/000835 WO2015115770A1 (fr) 2014-01-28 2015-01-27 Dispositif d'étalonnage et système de caméra

Country Status (2)

Country Link
KR (1) KR102158026B1 (fr)
WO (1) WO2015115770A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020037179A1 (fr) * 2018-08-17 2020-02-20 Veoneer Us, Inc. Système de surveillance d'habitacle de véhicule

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113255643A (zh) * 2021-05-08 2021-08-13 上海砼测检测技术有限公司 一种应用于位移监测的机器视觉识别算法
WO2023219440A1 (fr) * 2022-05-11 2023-11-16 엘지이노텍 주식회사 Appareil de caméra
KR102621435B1 (ko) * 2022-07-12 2024-01-09 주식회사 래비노 레이저광을 이용한 다중 스테레오 카메라 캘리브레이션 방법 및 시스템

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR19980015490A (ko) * 1996-08-22 1998-05-25 미노루 이나바 스테레오 카메라
JP2003504607A (ja) * 1999-07-13 2003-02-04 ビーティ エルウィン エム 電子部品の三次元検査装置及び方法
KR20030089542A (ko) * 2002-05-15 2003-11-22 옵토 다이나믹스(주) 다차원 물체 측정 시스템
US20100295926A1 (en) * 2006-11-28 2010-11-25 Prefixa International Inc. Fast Three Dimensional Recovery Method and Apparatus
JP2012132739A (ja) * 2010-12-21 2012-07-12 Ricoh Co Ltd ステレオカメラの校正装置および校正方法

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5185424B1 (ja) * 2011-09-30 2013-04-17 株式会社東芝 キャリブレーション方法および映像表示装置
KR101333161B1 (ko) * 2012-02-15 2013-11-27 이연태 공초점을 이용한 영상 처리 장치 및 이를 이용한 영상 처리 방법

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR19980015490A (ko) * 1996-08-22 1998-05-25 미노루 이나바 스테레오 카메라
JP2003504607A (ja) * 1999-07-13 2003-02-04 ビーティ エルウィン エム 電子部品の三次元検査装置及び方法
KR20030089542A (ko) * 2002-05-15 2003-11-22 옵토 다이나믹스(주) 다차원 물체 측정 시스템
US20100295926A1 (en) * 2006-11-28 2010-11-25 Prefixa International Inc. Fast Three Dimensional Recovery Method and Apparatus
JP2012132739A (ja) * 2010-12-21 2012-07-12 Ricoh Co Ltd ステレオカメラの校正装置および校正方法

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020037179A1 (fr) * 2018-08-17 2020-02-20 Veoneer Us, Inc. Système de surveillance d'habitacle de véhicule
US11155226B2 (en) 2018-08-17 2021-10-26 Veoneer Us, Inc. Vehicle cabin monitoring system

Also Published As

Publication number Publication date
KR102158026B1 (ko) 2020-09-21
KR20150089678A (ko) 2015-08-05

Similar Documents

Publication Publication Date Title
US20170280132A1 (en) System and method of acquiring three-dimensional coordinates using multiple coordinate measurment devices
WO2015115770A1 (fr) Dispositif d'étalonnage et système de caméra
WO2016200096A1 (fr) Appareil de mesure de forme tridimensionnelle
CN107957237B (zh) 具有闪光对准的激光投影仪
US7800643B2 (en) Image obtaining apparatus
EP3617644A1 (fr) Système de mesure tridimensionnelle et méthode de fonctionnement correspondante
CN108650447B (zh) 图像传感器、深度数据测量头及测量系统
JP5944719B2 (ja) 配光特性測定装置および配光特性測定方法
US10068348B2 (en) Method and apparatus for indentifying structural elements of a projected structural pattern in camera images
JP2586931B2 (ja) カメラの測距装置
US20200348127A1 (en) 3-d environment sensing by means of projector and camera modules
WO2013036076A2 (fr) Dispositif et procédé permettant de mesurer des formes tridimensionnelles au moyen de l'amplitude d'une grille de projection
WO2015080480A1 (fr) Appareil d'inspection d'image de tranche de semi-conducteur
JP6287231B2 (ja) 測距装置及びロボットピッキングシステム
WO2016163840A1 (fr) Appareil de mesure de forme tridimensionnelle
WO2018212395A1 (fr) Dispositif lidar et système lidar comprenant ce dernier
WO2021096328A2 (fr) Dispositif de localisation laser présentant une fonction destinée à la détection de position initiale d'une cible et procédé de localisation
JP2013257162A (ja) 測距装置
JPWO2020183711A1 (ja) 画像処理装置及び3次元計測システム
WO2015115771A1 (fr) Module de correction d'appareil de prise de vues, système d'appareil de prise de vues et procédé de commande de système d'appareil de prise de vues
JP2001082940A (ja) 立体モデル生成装置及び方法
JP7252755B2 (ja) アクティブセンサ、物体識別システム、車両、車両用灯具
US20210156881A1 (en) Dynamic machine vision sensor (dmvs) that performs integrated 3d tracking
WO2017204459A1 (fr) Appareil optique lidar ayant une structure améliorée
WO2014129760A1 (fr) Système de poursuite et procédé de poursuite utilisant celui-ci

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15743476

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15743476

Country of ref document: EP

Kind code of ref document: A1