WO2015115770A1 - Calibration device and camera system - Google Patents

Calibration device and camera system

Info

Publication number
WO2015115770A1
WO2015115770A1 (PCT/KR2015/000835)
Authority
WO
WIPO (PCT)
Prior art keywords
camera
displacement
reflecting member
information
interference pattern
Prior art date
Application number
PCT/KR2015/000835
Other languages
French (fr)
Korean (ko)
Inventor
윤여찬
Original Assignee
엘지이노텍 주식회사
Application filed by 엘지이노텍 주식회사
Publication of WO2015115770A1 publication Critical patent/WO2015115770A1/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/239Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/246Calibration of cameras

Definitions

  • The present invention relates to a calibration apparatus and a camera system, and more particularly to a calibration apparatus for a stereo camera system and a stereo camera system including the same.
  • As part of systems that detect objects in images captured in front of or behind a vehicle to help the driver drive safely, a stereo camera system has been proposed that extracts distance information between the vehicle and objects ahead using two cameras.
  • To acquire accurate distance information, a calibration process accounting for the distance between the cameras and their relative posture needs to be performed at initial installation.
  • calibration of a stereo camera system is performed using a checker board on which a specific pattern is drawn.
  • In this process, a checker board is photographed from different angles with the stereo camera to acquire ten or more stereo camera images of different scenes, and a displacement parameter between the two cameras is then derived through mathematical calculations on the acquired images.
  • the distance between the checker board and the stereo camera system needs to satisfy the actual usage distance of 5m to 100m of the vehicle stereo camera system. This requires a large space for calibration.
  • An embodiment of the present invention provides a calibration apparatus and a camera system capable of real-time calibration of a stereo camera system.
  • a first reflecting member disposed on the first element
  • a second reflecting member disposed on the second element
  • a light source unit emitting the first beam
  • a beam splitter that separates the first beam into a second beam and a third beam having different propagation paths and, when the resulting first and second reflected beams are incident, combines the first and second reflected beams into a fourth beam
  • a detector for detecting an image signal corresponding to the fourth beam
  • and a displacement detection unit for detecting an interference pattern from the image signal and comparing the detected interference pattern with a previously stored interference pattern to calculate displacement information between the first element and the second element.
  • the apparatus may further include a case accommodating the light source unit, the beam splitter, and the detector.
  • the apparatus may further include a case accommodating the light source unit, the beam splitter, the mirror, the detector, and the second reflecting member.
  • the first element and the second element may include a camera.
  • the displacement information may include distance movement information between the first element and the second element, and the displacement detection unit may obtain the distance movement information based on a circular fringe included in the interference pattern.
  • the displacement information may include relative rotation information between the first element and the second element, and the displacement detection unit may acquire the rotation information based on a straight fringe included in the interference pattern.
  • the camera system includes a camera housing including a first camera and a second camera spaced apart from each other, and a calibration device for calculating displacement information between the first camera and the second camera.
  • the calibration apparatus includes a first reflecting member disposed on the first camera, and a second reflecting member disposed on the second camera.
  • The camera system may further include a controller for correcting a displacement parameter used to calculate distance information with respect to an object based on the calculated displacement information.
  • The calibration device includes an interferometer for obtaining an image signal including an interference pattern between a plurality of beams reflected by the first and second reflecting members, and a displacement detector that detects the interference pattern from the image signal and compares it with a previously stored interference pattern to calculate displacement information between the first camera and the second camera.
  • The interferometer may further include a light source unit and a beam splitter that separates the beam emitted from the light source unit into a plurality of beams having different optical paths and transmits them to the first reflecting member and the second reflecting member, respectively.
  • the interferometer may further include a mirror that transmits the beam separated by the beam splitter to the second reflecting member.
  • the light source unit may include a laser or a laser diode.
  • the displacement information may include distance movement information between the first camera and the second camera, and the displacement detection unit may acquire the distance movement information based on a circular fringe included in the interference pattern.
  • the displacement information may include relative rotation information between the first camera and the second camera, and the displacement detection unit may acquire the rotation information based on a straight fringe included in the interference pattern.
  • The camera system may be configured to acquire a plurality of image data through the plurality of cameras, and to calculate distance information with respect to an object detected from the plurality of image data based on distance information and relative rotation information between the plurality of cameras.
  • the control apparatus may further update the distance information and the rotation information based on the displacement information.
  • FIG. 1 is a block diagram schematically illustrating a camera calibration apparatus according to an embodiment of the present invention.
  • FIG. 2 is a block diagram schematically illustrating an interferometer according to an exemplary embodiment of the present invention.
  • FIG. 3 is a view for explaining a method of obtaining displacement information from an interference pattern in a calibration device according to an embodiment of the present invention.
  • FIG. 4 is a block diagram schematically illustrating a stereo camera system including a calibration device according to an exemplary embodiment.
  • FIGS. 5 and 6 illustrate examples in which a calibration device according to an embodiment of the present invention is coupled to a camera.
  • FIG. 7 is a diagram for describing a method of calculating a distance from an object in a stereo camera system according to an exemplary embodiment.
  • FIG. 8 is a flowchart illustrating a calibration method of a stereo camera system including a calibration device according to an exemplary embodiment.
  • Terms including ordinal numbers such as 'first' and 'second' may be used to describe various components, but the components are not limited by these terms; the terms are used only to distinguish one component from another. For example, a second component may be referred to as a first component, and similarly, a first component may be referred to as a second component.
  • the camera system and the calibration device according to the embodiment of the present invention may be provided to include more or fewer components.
  • FIG. 1 is a block diagram schematically illustrating a camera calibration apparatus according to an embodiment of the present invention.
  • FIG. 2 is a block diagram schematically illustrating an interferometer according to an embodiment of the present invention.
  • FIG. 3 is a view for explaining a method of obtaining displacement information from an interference pattern in a calibration device according to an embodiment of the present invention.
  • the calibration device 10 may include an interferometer 11, a displacement detector 12, and the like.
  • The interferometer 11 includes a plurality of reflecting members 114 and 115 disposed on a plurality of elements spaced apart from each other; it separates the beam emitted from a single light source unit 111 into a plurality of beams, directs one beam onto each of the reflecting members 114 and 115, and obtains an image signal corresponding to the interference pattern formed when the beams reflected by the reflecting members 114 and 115 interfere.
  • The elements may be a plurality of cameras mounted in a stereo camera, or components mounted with a predetermined distance and angle between them, such as vehicle head lamps. Hereinafter, cameras will be described as an example.
  • For example, the interferometer 11 may have the configuration of a Michelson interferometer.
  • the interferometer 11 may include a light source 111, a beam splitter 112, a plurality of reflectors 114 and 115, a detector 116, and the like. In addition, the interferometer 11 may further include a mirror 113.
  • the light source unit 111 emits a beam L1.
  • the beam emitted from the light source unit 111 may include a coherent beam.
  • Here, displacement is a parameter indicating a change in the relative position between two objects, that is, how much one object has rotated and how much it has translated with respect to the other.
  • In the case of a plurality of cameras, the displacement indicates how much the remaining camera has rotated and how much it has translated with respect to a reference camera among the plurality of cameras.
  • the light source unit 111 may include a laser, a laser diode, and the like, which emit a coherent beam.
  • the beam L1 emitted from the light source unit 111 may enter the beam splitter 112. To this end, the emission surface of the light source unit 111 is disposed to face the beam splitter 112.
  • The beam splitter 112 may split the coherent beam L1 emitted from the light source unit 111 into a plurality of beams L11 and L12 having different travel directions. For example, the beam splitter 112 may transmit a portion L11 of the beam L1 incident from the light source unit 111 and reflect the remaining portion L12, thereby separating the beam into a plurality of beams.
  • The beam splitter 112 may include a semi-transparent dielectric thin plate such as a half mirror.
  • the beams L11 and L12 separated by different beam paths by the beam splitter 112 may be transmitted to the first and second reflecting members 114 and 115, respectively.
  • the beam transmitted by the beam splitter 112 may be transmitted to the first reflecting member 114, and the beam reflected by the beam splitter 112 may be transmitted to the second reflecting member 115.
  • the first and second reflecting members 114 and 115 may reflect the beams to the beam splitter 112.
  • The beams separated by the beam splitter 112 propagate at a predetermined angle to each other rather than in parallel.
  • For example, the beams separated by the beam splitter 112 may propagate orthogonally to each other. Therefore, when the first and second reflecting members 114 and 115 are disposed on a plurality of cameras (not shown) spaced apart in a line, with the beam splitter 112 disposed between them for space efficiency, the interferometer 11 may further include a mirror 113 so that the beams separated by the beam splitter 112 are incident on the first and second reflecting members 114 and 115, respectively.
  • The mirror 113 is disposed on the optical path between the beam splitter 112 and the second reflecting member 115, and may perform the function of transferring a beam between the beam splitter 112 and the second reflecting member 115.
  • That is, the mirror 113 is disposed on the path of the beam L12 reflected by the beam splitter 112, and reflects that beam so that it proceeds to the second reflecting member 115. In addition, when the beam L22 reflected by the second reflecting member 115 is incident, the mirror 113 may reflect it so that it enters the beam splitter 112.
  • Accordingly, the beam L11 transmitted by the beam splitter 112 proceeds directly to the first reflecting member 114, whereas the beam L12 reflected by the beam splitter 112 is reflected once more by the mirror 113 before proceeding to the second reflecting member 115.
  • Likewise, the beam L21 reflected by the first reflecting member 114 proceeds directly to the beam splitter 112, whereas the beam L22 reflected by the second reflecting member 115 is reflected once more by the mirror 113 before proceeding to the beam splitter 112.
  • In this manner, the beam emitted from the single light source unit 111 is split by the beam splitter 112 so that a difference arises between the travel paths; after being reflected by the first and second reflecting members 114 and 115, the beams interfere at the beam splitter 112.
  • The beams L21 and L22 interfering at the beam splitter 112 may be combined into one beam L2 by the beam splitter 112, which transmits it to the detector 116.
  • When the detector 116 receives the recombined beam L2 from the beam splitter 112, it converts the beam into an image signal.
  • the image signal obtained by the detector 116 includes an interference fringe.
  • the interference pattern represents a pattern of light and dark shades formed in a lattice or concentric shape caused by the interference of the beam.
  • the beams reflected by the first reflecting member 114 and the second reflecting member 115 generate interference when they are met at the beam splitter 112 due to the difference in optical paths. This interference generates contrast patterns in the image signal of the beam incident on the detector 116.
  • the displacement detector 12 processes the image signal obtained through the interferometer 11 to obtain image data, and extracts an interference pattern therefrom.
  • the displacement detection unit 12 may calculate displacement information including a change in distance and rotation between the first and second reflecting members 114 and 115 by comparing the previously stored interference pattern with the newly extracted interference pattern.
  • the pre-stored interference pattern used for comparison is an interference pattern which is a reference for detecting displacement information, and is referred to as a 'reference interference pattern' for convenience of description below.
  • the reference interference pattern may be an interference pattern acquired by the displacement detector 12 using the interferometer 11 when the calibration apparatus 10 is initially mounted in a camera system (not shown).
  • the reference interference pattern may be an interference pattern obtained by the displacement detector 12 using the interferometer 11 during the previous calibration.
  • For example, the displacement detector 12 may store, in an internal memory (not shown) or an external memory (not shown), the interference pattern acquired through the interferometer 11 when the calibration apparatus 10 is initially installed in a camera system (not shown), and use it as the reference interference pattern.
  • Alternatively, the displacement detection unit 12 may store the interference pattern acquired through the interferometer, either at every measurement or only when the displacement changes, in an internal memory (not shown) or an external memory (not shown), and use the stored pattern as the reference interference pattern.
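To make the reference-pattern bookkeeping above concrete, the following Python sketch (not part of the patent; the class, tolerance, and 1-D intensity-profile abstraction are illustrative assumptions) stores a reference interference pattern and flags when a newly acquired pattern deviates from it:

```python
class ReferenceStore:
    """Illustrative store for the reference interference pattern.
    A pattern is abstracted here as a 1-D intensity profile (list of floats)."""

    def __init__(self):
        self.reference = None

    def set_reference(self, pattern):
        # Stored at initial installation, or refreshed after a calibration.
        self.reference = list(pattern)

    def has_changed(self, pattern, tol=1e-3):
        # A displacement shows up as a change relative to the stored pattern.
        return any(abs(a - b) > tol for a, b in zip(self.reference, pattern))


store = ReferenceStore()
store.set_reference([0.0, 1.0, 0.0, 1.0])
changed = store.has_changed([0.0, 0.9, 0.1, 1.0])  # True: fringes shifted
```

In a real system the comparison would operate on detector images rather than short lists, but the control flow — store once, compare on every measurement — is the same.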
  • The relationship between the interference pattern detected by the displacement detection unit 12 and the displacement of the measurement target may be expressed by Equation 1 below.
  • [Equation 1] d = n × λ / 2
  • Here, d represents the displacement of the measurement target, that is, the displacement between the first reflecting member 114 and the second reflecting member 115; n represents the number of fringe transitions in the interference pattern; and λ represents the wavelength of the beam emitted from the light source unit 111.
  • When the relative distance between the first reflecting member 114 and the second reflecting member 115 changes, the beam path changes. Therefore, the position of the virtual image of the source beam emitted from the light source unit 111 is changed, for example from S2′ to S1′. Accordingly, at least some of the circular fringes in the image signal detected by the detector 116 converge while changing.
  • The displacement detector 12 may detect a change in the relative distance between the first reflecting member 114 and the second reflecting member 115 based on the change of the circular fringe pattern in the image signal detected by the detector 116.
  • the displacement detector 12 may detect a change in the relative inclination between the first reflective member 114 and the second reflective member 115 based on the change of the linear fringe pattern in the image signal detected by the detector 116.
  • the wavelength of a laser is generally 400 nm-800 nm. In the case of UV lasers, wavelengths of approximately 400 nm or less are possible.
  • the resolution of the interferometer 11 may satisfy 10 nm or less. That is, if the phase is changed by half wavelength due to the optical path difference, one fringe in the interference pattern may change.
  • displacement information between the first and second reflecting members 114 and 115 can be detected in units of 10 nm or less.
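As an illustrative sketch outside the patent text, the fringe-counting relation of Equation 1 can be written directly in Python; the function name and the 633 nm example wavelength (a common HeNe laser line, within the 400–800 nm range mentioned above) are assumptions:

```python
def displacement_from_fringes(n_fringes, wavelength_m):
    """Equation 1 for a Michelson-type interferometer: each full fringe
    transition corresponds to a half-wavelength change in path length,
    so d = n * lambda / 2."""
    return n_fringes * wavelength_m / 2.0


# Example: 4 fringe transitions observed with a 633 nm laser
d = displacement_from_fringes(4, 633e-9)  # 1.266e-6 m, i.e. about 1.27 micrometers
```

The sub-10 nm resolution claimed above would require sub-fringe interpolation of the pattern rather than whole-fringe counting, which this sketch deliberately omits.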
  • the calibration device 10 may detect displacement information between the first and second reflecting members 114 and 115 spaced apart from each other using the interferometer 11.
  • displacement information between a plurality of cameras (not shown) in which the first and second reflection members 114 and 115 are disposed may be detected based on the displacement information between the first and second reflection members 114 and 115.
  • FIG. 4 is a block diagram schematically illustrating a stereo camera system including a calibration device according to an exemplary embodiment.
  • FIGS. 5 and 6 illustrate examples in which a calibration device according to an embodiment of the present invention is coupled to cameras of a stereo camera system.
  • FIG. 7 is a diagram for describing a method of calculating a distance from an object in a stereo camera system according to an exemplary embodiment.
  • the stereo camera system may include a plurality of cameras 21 and 22, a calibration device 10, a control device 30, and the like.
  • The plurality of cameras 21 and 22 may be spaced apart from each other by a predetermined interval, and may acquire image data of the exterior, including objects.
  • For example, the plurality of cameras 21 and 22 may be mounted at the front or rear of a vehicle to acquire image data in front of or behind the vehicle.
  • The calibration device 10 includes an interferometer (see reference numeral 11 of FIG. 1), as described with reference to FIGS. 1 to 3, and obtains displacement information between the plurality of cameras 21 and 22 using the interferometer 11.
  • the displacement information may include a relative distance translation between the plurality of cameras 21 and 22 and a relative angular rotation.
  • To this end, the reflecting members of the interferometer 11 (reference numerals 114 and 115 of FIG. 2) need to be arranged integrally with the cameras 21 and 22.
  • FIGS. 5 and 6 show examples of coupling the interferometer 11 with the cameras 21 and 22.
  • the first camera 21 and the second camera 22 are spaced apart by a predetermined interval by the camera housing 200. If necessary, three or more cameras may be mounted in the camera housing 200. In this case, the reflective members may also be arranged in the same number.
  • the first and second reflecting members 114 and 115 of the interferometer 11 are disposed in the bodies of the cameras 21 and 22, respectively.
  • the first and second reflecting members 114 and 115 may be integrally coupled to the bodies of the cameras 21 and 22 so as to move or rotate in response to the body movements of the cameras 21 and 22.
  • An interferometer case 100 is disposed on the optical path between the first and second reflecting members 114 and 115, and accommodates therein the light source unit 111, the beam splitter 112, the mirror 113, the detector 116, and the like of the interferometer 11.
  • the interferometer case 100 in which the light source unit 111, the beam splitter 112, the mirror 113, and the detector 116 are integrally accommodated is disposed between the first camera 21 and the second camera 22.
  • Meanwhile, the first reflecting member 114 and the second reflecting member 115 may be disposed on the first camera 21 and the second camera 22, respectively, separately from the interferometer case 100.
  • However, the present invention is not limited thereto; as shown in FIG. 6, the first reflecting member 114 of the interferometer 11 may be disposed on the body of the first camera 21, and an interferometer case 100′ may be disposed on the second camera 22.
  • In this case, the interferometer case 100′ may accommodate the light source unit 111, the beam splitter 112, the mirror 113, the second reflecting member 115, the detector 116, and the like of the interferometer 11. That is, the light source unit 111, the beam splitter 112, the mirror 113, the second reflecting member 115, and the detector 116 may be integrally coupled by the interferometer case 100′.
  • The controller 30 continuously receives image data of the exterior, including objects, through the plurality of cameras 21 and 22.
  • The controller 30 detects an object in the image data received from the plurality of cameras 21 and 22 through image recognition.
  • The controller 30 then obtains the position of the object in each image, and may calculate the distance to the object based on the object's position in each image and the distance information between the plurality of cameras 21 and 22.
  • Referring to FIG. 7, let the focal length of each of the first and second cameras 21 and 22 be f, the distance (baseline) between the first and second cameras 21 and 22 be b, and the distances on the image data between each camera's image center and the object OB be DL and DR, respectively.
  • Then the distance information Z to the object OB may be obtained as in Equation 2 below.
  • [Equation 2] Z = (b × f) / (DL + DR)
  • the focal length f of each of the first and second cameras 21 and 22 may be a preset value according to the specifications of the first and second cameras 21 and 22.
  • The baseline b between the first and second cameras 21 and 22 is the distance between the center points of the two cameras; the value set when the first and second cameras 21 and 22 were installed can be used as its initial value.
  • However, the position and posture of the first and second cameras 21 and 22 may change due to vehicle vibration and temperature changes. Therefore, if the stereo parameters set at the initial installation of the first and second cameras 21 and 22 are used as they are, without any process to reflect such changes in position or posture, the accuracy of the calculated distance information to an object decreases.
  • the stereo parameter used to calculate distance information with the object may include distance information between the first and second cameras 21 and 22 and relative tilting information between the first and second cameras 21 and 22.
  • Here, the relative inclination information of the first and second cameras 21 and 22 may indicate the degree of inclination of the other camera with respect to the optical axis of either one of the first and second cameras 21 and 22.
  • To prevent the accuracy of the distance calculation from degrading as the position, posture, etc. of the first and second cameras 21 and 22 change, the control device 30 may correct the displacement parameter used to calculate the distance to an object based on the displacement information obtained through the calibration device 10. That is, the controller 30 may update the pre-stored stereo parameters between the first and second cameras 21 and 22 based on the displacement information obtained by the calibration device 10.
  • The above-described displacement detection unit 12 of the calibration device 10 may be implemented in the control device 30, or may be implemented separately from the control device 30.
  • FIG. 8 is a flowchart illustrating a calibration method of a stereo camera system including a calibration device according to an exemplary embodiment.
  • the light source unit 111 of the interferometer 11 irradiates a beam for generating an interference pattern (S110).
  • Next, the calibration device 10 outputs, through the detector 116 of the interferometer 11, an image signal corresponding to the reflected beams (S120).
  • In step S120, the beam irradiated from the light source unit 111 is separated by the beam splitter 112 into a plurality of beams having different propagation paths.
  • The plurality of beams separated by the beam splitter 112 proceed to the first and second reflecting members 114 and 115, respectively, and are reflected by the first and second reflecting members 114 and 115 back toward the beam splitter 112.
  • The beam splitter 112 combines the reflected beams and directs the combined beam to the detector 116, which receives it and outputs a corresponding image signal.
  • the video signal generated in step S120 is input to the displacement detector 12 of the calibration device 10.
  • The displacement detector 12 obtains image data by signal-processing the image signal input from the detector 116, and detects an interference pattern between the reflected beams (S130).
  • the displacement detection unit 12 obtains displacement information by analyzing a circular fringe or a straight fringe of the detected interference pattern (S140). Since the displacement information detection method has been described in detail with reference to FIG. 3, the detailed description will be omitted below.
  • step S140 the displacement information obtained by the displacement detection unit 12 is transmitted to the control device 30.
  • the controller 30 corrects the stereo parameter used to detect the distance to the object based on the displacement information (S150). That is, distance information and slope information between the first and second cameras 21 and 22 are corrected based on the displacement information.
  • the stereo parameter corrected in step S150 is stored in an internal memory (not shown) or an external memory (not shown) of the control device 30.
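The steps S110 to S150 can be condensed into a single numeric sketch (hypothetical; the patent does not prescribe an implementation), where a change in fringe count relative to the stored reference yields a displacement via Equation 1 and the stereo baseline is corrected accordingly:

```python
def run_calibration_cycle(reference_fringes, measured_fringes,
                          wavelength_m, baseline_m):
    """S130-S150 in miniature: compare the new fringe count against the
    stored reference (S140), convert the difference to a displacement
    (Equation 1: d = n * lambda / 2), and correct the baseline (S150)."""
    delta_fringes = measured_fringes - reference_fringes   # S140: compare patterns
    displacement = delta_fringes * wavelength_m / 2.0      # Equation 1
    return baseline_m + displacement                       # S150: corrected parameter


# Two extra fringes with a 633 nm laser lengthen a 0.30 m baseline by 633 nm
new_b = run_calibration_cycle(10, 12, 633e-9, 0.30)
```

A production system would run this cycle continuously while the vehicle is driven, persisting the corrected parameter (as in the memory step above) and likewise updating the tilt term from the straight-fringe analysis.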
  • As described above, the stereo camera system can detect in real time changes in the distance and tilt between the plurality of cameras caused by vibration, temperature changes, and the like that occur while the vehicle is driven. Furthermore, by acquiring displacement information between the cameras in real time and correcting the stereo parameters used to calculate the distance to an object based on that information, the accuracy of the distance calculation can be improved.
  • the calibration apparatus and the camera system disclosed in this document may perform calibration of the camera system in real time while driving a vehicle.
  • The term '~part' used in the present embodiment refers to software or a hardware component such as a field-programmable gate array (FPGA) or an ASIC, and a '~part' performs certain roles.
  • However, '~part' is not limited to software or hardware. A '~part' may be configured to reside in an addressable storage medium or configured to execute on one or more processors.
  • Thus, as an example, '~part' includes components such as software components, object-oriented software components, class components, and task components, as well as processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuits, data, databases, data structures, tables, arrays, and variables.
  • the functionality provided within the components and the 'parts' may be combined into a smaller number of components and the 'parts' or further separated into additional components and the 'parts'.
  • In addition, the components and '~parts' may be implemented to execute on one or more CPUs in a device or in a secure multimedia card.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Instruments For Measurement Of Length By Optical Means (AREA)

Abstract

According to one embodiment of the present invention, a calibration device comprises: a first reflection member arranged on a first element; a second reflection member arranged on a second element; a light source unit for emitting a first beam; a beam splitter for dividing the first beam into a second beam and a third beam having different propagation paths and, when a first reflected beam (the second beam reflected by the first reflection member) and a second reflected beam (the third beam reflected by the second reflection member) are incident, combining the first reflected beam and the second reflected beam into a fourth beam; a detector for detecting an image signal corresponding to the fourth beam; and a displacement detection unit for detecting an interference pattern from the image signal and calculating displacement information between the first and second elements by comparing the detected interference pattern with a pre-stored interference pattern.

Description

Calibration device and camera system
본 발명은 캘리브레이션 장치 및 카메라 시스템에 관한 것으로서, 스테레오카메라 시스템의 캘리브레이션 장치 및 이를 포함하는 스테레오 카메라 시스템에 관한 것이다. The present invention relates to a calibration apparatus and a camera system, and to a calibration apparatus of a stereo camera system and a stereo camera system including the same.
차량 전방 또는 후방을 촬영한 영상으로부터 물체를 감지하여 운전자의 안전 운전을 돕는 시스템의 일환으로, 두 대의 카메라를 이용하여 차량과 차량 전방의 물체와의 거리정보를 추출하는 스테레오 카메라 시스템(stereo camera system)이 제안되었다. Stereo camera system that detects objects from images taken in front of or behind the vehicle to help the driver to drive safely, and extracts distance information between the vehicle and the objects in front of the vehicle using two cameras ) Has been proposed.
스테레오 카메라 시스템에서, 정확한 거리정보를 획득하기 위해서는 초기 설치 시 카메라 간의 거리와 상대적인 자세에 따른 캘리브레이션(calibration) 과정이 수행될 필요가 있다. In a stereo camera system, in order to acquire accurate distance information, a calibration process according to a distance between cameras and a relative posture needs to be performed at initial installation.
In general, calibration of a stereo camera system is performed using a checker board on which a specific pattern is drawn. In a checker-board calibration process, the board is photographed from different angles with the stereo camera to acquire ten or more stereo camera images of different scenes, and the displacement parameters between the two cameras are then derived through mathematical calculations on the acquired images.
In this calibration process, the distance between the checker board and the stereo camera system needs to cover 5 m to 100 m, the actual working distance of a vehicle stereo camera system. A large space is therefore required for calibration.
In addition, since the displacement parameters are derived by feeding ten or more captured images into a calculation program, calibration takes a long time.
Furthermore, even after a calibrated stereo camera system is mounted on a vehicle, the calibration parameters may change due to mechanical and thermal deformation of the vehicle during use, and real-time recalibration is difficult.
An embodiment of the present invention provides a calibration device and a camera system capable of real-time calibration of a stereo camera system.
According to an embodiment of the present invention, a calibration device includes: a first reflecting member disposed on a first element; a second reflecting member disposed on a second element; a light source unit emitting a first beam; a beam splitter that splits the first beam into a second beam and a third beam having different propagation paths and, when a first reflected beam produced by the first reflecting member reflecting the second beam and a second reflected beam produced by the second reflecting member reflecting the third beam are incident, combines the first reflected beam and the second reflected beam into a fourth beam; a detector that detects an image signal corresponding to the fourth beam; and a displacement detection unit that detects an interference pattern from the image signal and calculates displacement information between the first element and the second element by comparing the interference pattern with a pre-stored interference pattern.
The calibration device may further include a mirror disposed on the optical paths of the third beam and the second reflected beam, the mirror transferring the third beam to the second reflecting member and transferring the second reflected beam to the beam splitter.
The calibration device may further include a case accommodating the light source unit, the beam splitter, and the detector.
The calibration device may further include a case accommodating the light source unit, the beam splitter, the mirror, the detector, and the second reflecting member.
The first element and the second element may each include a camera.
The displacement information may include distance movement information between the first element and the second element, and the displacement detection unit may obtain the distance movement information based on circular fringes included in the interference pattern.
The displacement information may include relative rotation information between the first element and the second element, and the displacement detection unit may obtain the rotation information based on linear fringes included in the interference pattern.
A camera system according to an embodiment of the present invention includes: a camera housing including a first camera and a second camera spaced apart from each other; and a calibration device that calculates displacement information between the first camera and the second camera, wherein the calibration device includes a first reflecting member disposed on the first camera and a second reflecting member disposed on the second camera.
The camera system may further include a control device that corrects, based on the calculated displacement information, a displacement parameter used to calculate distance information to an object.
The calibration device may include an interferometer that acquires an image signal including an interference pattern between a plurality of beams reflected by the first and second reflecting members, and a displacement detection unit that detects the interference pattern from the image signal and calculates displacement information between the first camera and the second camera by comparing the interference pattern with a pre-stored interference pattern.
The interferometer may further include a light source unit and a beam splitter that splits the beam emitted from the light source unit into a plurality of beams having different optical paths and transfers them to the first reflecting member and the second reflecting member, respectively.
The interferometer may further include a mirror that transfers a beam split off by the beam splitter to the second reflecting member.
The light source unit may include a laser or a laser diode.
The displacement information may include distance movement information between the first camera and the second camera, and the displacement detection unit may obtain the distance movement information based on circular fringes included in the interference pattern.
The displacement information may include relative rotation information between the first camera and the second camera, and the displacement detection unit may obtain the rotation information based on linear fringes included in the interference pattern.
The camera system may further include a control device that acquires a plurality of image data through the plurality of cameras and calculates distance information to an object detected from the plurality of image data based on the distance information and relative rotation information between the plurality of cameras, and the control device may update the distance information and the rotation information based on the displacement information.
FIG. 1 is a block diagram schematically illustrating a camera calibration device according to an embodiment of the present invention.

FIG. 2 is a block diagram schematically illustrating an interferometer according to an embodiment of the present invention.

FIG. 3 is a diagram for explaining a method of obtaining displacement information from an interference pattern in a calibration device according to an embodiment of the present invention.

FIG. 4 is a block diagram schematically illustrating a stereo camera system including a calibration device according to an embodiment of the present invention.

FIGS. 5 and 6 illustrate examples in which a calibration device according to an embodiment of the present invention is coupled to cameras.

FIG. 7 is a diagram for explaining a method of calculating a distance to an object in a stereo camera system according to an embodiment of the present invention.

FIG. 8 is a flowchart illustrating a calibration method of a stereo camera system including a calibration device according to an embodiment of the present invention.
As the invention allows for various changes and numerous embodiments, particular embodiments will be illustrated in the drawings and described. This is not, however, intended to limit the present invention to the specific embodiments; the invention should be understood to cover all modifications, equivalents, and substitutes falling within its spirit and technical scope.
Terms including ordinal numbers, such as "second" and "first", may be used to describe various components, but the components are not limited by these terms. The terms are used only to distinguish one component from another. For example, without departing from the scope of the present invention, a second component may be referred to as a first component, and similarly, a first component may be referred to as a second component. The term "and/or" includes any combination of a plurality of related listed items or any one of a plurality of related listed items.
In addition, the suffixes "module" and "unit" for components used in the following description are given or used interchangeably only for ease of drafting the specification, and do not by themselves have distinct meanings or roles.
When a component is referred to as being "connected" or "coupled" to another component, it may be directly connected or coupled to that other component, but intervening components may also be present. In contrast, when a component is referred to as being "directly connected" or "directly coupled" to another component, it should be understood that no intervening components are present.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to limit the present invention. Singular expressions include plural expressions unless the context clearly indicates otherwise. In this application, terms such as "comprise" or "have" are intended to indicate the presence of the features, numbers, steps, operations, components, parts, or combinations thereof described in the specification, and do not preclude in advance the presence or addition of one or more other features, numbers, steps, operations, components, parts, or combinations thereof.
Unless defined otherwise, all terms used herein, including technical and scientific terms, have the same meanings as commonly understood by one of ordinary skill in the art to which this invention belongs. Terms such as those defined in commonly used dictionaries should be interpreted as having meanings consistent with their meanings in the context of the related art, and are not to be interpreted in an idealized or overly formal sense unless expressly so defined in this application.
Hereinafter, embodiments will be described in detail with reference to the accompanying drawings; identical or corresponding components are given the same reference numerals regardless of figure number, and redundant descriptions thereof are omitted.
The components described below with reference to the drawings are not essential, so a camera system and a calibration device according to an embodiment of the present invention may be provided with more or fewer components.
FIG. 1 is a block diagram schematically illustrating a camera calibration device according to an embodiment of the present invention. FIG. 2 is a block diagram schematically illustrating an interferometer according to an embodiment of the present invention. FIG. 3 is a diagram for explaining a method of obtaining displacement information from an interference pattern in a calibration device according to an embodiment of the present invention.
Referring to FIGS. 1 and 2, a calibration device 10 according to an embodiment of the present invention may include an interferometer 11, a displacement detection unit 12, and the like.
The interferometer 11 includes a plurality of reflecting members 114 and 115 disposed on a plurality of elements spaced apart from each other. It splits the beam emitted from a single light source unit 111 into a plurality of beams (or a plurality of light fluxes), directs them onto the respective reflecting members 114 and 115, and acquires an image signal corresponding to the interference pattern produced when the plurality of beams reflected by the reflecting members 114 and 115 interfere. The elements may be a plurality of cameras mounted in a stereo camera, or components whose distance and angle are adjusted in advance, such as vehicle head lamps. Hereinafter, cameras are described as an example.
The interferometer 11 may adopt the configuration of a Michelson interferometer.
The interferometer 11 may include a light source unit 111, a beam splitter 112, a plurality of reflecting members 114 and 115, a detector 116, and the like. The interferometer 11 may further include a mirror 113.
The light source unit 111 emits a beam L1.
To detect displacement from the interference pattern detected by the interferometer 11, the beam emitted from the light source unit 111 may be a coherent beam. Here, displacement is a parameter representing the change in relative position between two objects, that is, a parameter indicating how much one object has rotated and how much it has translated with respect to the other.
In an embodiment of the present invention, the displacement is a parameter indicating how much the remaining cameras have rotated and translated with respect to one of the plurality of cameras.
The light source unit 111 may include a laser, a laser diode, or the like that emits a coherent beam.
The beam L1 emitted from the light source unit 111 may be incident on the beam splitter 112. To this end, the emission surface of the light source unit 111 is disposed to face the beam splitter 112.
The beam splitter 112 may split the coherent beam L1 emitted from the light source unit 111 into a plurality of beams L11 and L12 having different propagation directions. For example, the beam splitter 112 may transmit a portion L11 of the beam L1 incident from the light source unit 111 and reflect the remaining portion L12, thereby separating it into a plurality of beams. In this case, the beam splitter 112 may include a semi-transmissive dielectric thin plate such as a half mirror.
The beams L11 and L12 separated onto different propagation paths by the beam splitter 112 may be transferred to the first and second reflecting members 114 and 115, respectively. For example, the beam transmitted by the beam splitter 112 may be transferred to the first reflecting member 114, and the beam reflected by the beam splitter 112 may be transferred to the second reflecting member 115.
When the beams L11 and L12 separated by the beam splitter 112 are incident, the first and second reflecting members 114 and 115 may reflect them back to the beam splitter 112.
Meanwhile, although this depends on the characteristics of the beam splitter 112, the beams separated by the beam splitter 112 generally propagate not in parallel but at a predetermined angle to each other. For example, the beams separated by the beam splitter 112 may propagate orthogonally to each other. Therefore, when the first and second reflecting members 114 and 115 are respectively disposed on a plurality of cameras (not shown) arranged in a line and spaced apart, and the beam splitter 112 is disposed between them for space efficiency, the interferometer 11 may further include a mirror 113 to make the beams separated by the beam splitter 112 incident on the first and second reflecting members 114 and 115, respectively.
The mirror 113 is disposed on the optical path between the beam splitter 112 and the second reflecting member 115, and may perform the function of transferring beams between the beam splitter 112 and the second reflecting member 115.
Referring to FIG. 2, the mirror 113 is disposed on the propagation path of the beam L12 reflected by the beam splitter 112, and may reflect that beam again so that it propagates to the second reflecting member 115. In addition, when the beam L22 reflected by the second reflecting member 115 is incident, the mirror 113 may reflect it so that it is incident on the beam splitter 112.
That is, the beam L11 transmitted by the beam splitter 112 propagates directly to the first reflecting member 114, whereas the beam L12 reflected by the beam splitter 112 is reflected once more by the mirror 113 and propagates to the second reflecting member 115. Likewise, the beam L21 reflected by the first reflecting member 114 propagates directly to the beam splitter 112, whereas the beam L22 reflected by the second reflecting member 115 is reflected once more by the mirror 113 and propagates to the beam splitter 112.
As described above, the beam emitted from the single light source unit 111 is split by the beam splitter 112 so that a path difference arises, and after being reflected by the first and second reflecting members 114 and 115, the beams interfere at the beam splitter 112.
The beams L21 and L22 that interfere at the beam splitter 112 are combined into a single beam L2 by the beam splitter 112, which may transfer it to the detector 116.
When the recombined beam L2 from the beam splitter 112 is incident, the detector 116 receives it and acquires an image signal. The image signal acquired by the detector 116 includes an interference pattern (interference fringes). Here, the interference pattern refers to a lattice-shaped or concentric pattern of light and dark produced by interference of the beams.
Due to their optical path difference, the beams reflected by the first reflecting member 114 and the second reflecting member 115 interfere when they meet at the beam splitter 112. This interference produces light and dark fringes in the image signal of the beam incident on the detector 116.
Referring again to FIG. 1, the displacement detection unit 12 processes the image signal acquired through the interferometer 11 to obtain image data, and extracts the interference pattern from it.
In addition, the displacement detection unit 12 may compare a pre-stored interference pattern with the newly extracted interference pattern to calculate displacement information including the change in distance and the rotation between the first and second reflecting members 114 and 115. The pre-stored interference pattern used for the comparison serves as the reference for detecting displacement information, and is hereinafter referred to as the "reference interference pattern" for convenience of description.
The reference interference pattern may be an interference pattern that the displacement detection unit 12 acquired using the interferometer 11 when the calibration device 10 was initially mounted in a camera system (not shown), or an interference pattern that it acquired using the interferometer 11 during the previous calibration. In the former case, the displacement detection unit 12 may store the interference pattern acquired at initial mounting in an internal memory (not shown) or an external memory (not shown) and use it as the reference interference pattern. In the latter case, the displacement detection unit 12 may store the interference pattern acquired through the interferometer in the internal or external memory every time, or only when there is a change in displacement, and use it as the reference interference pattern.
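The reference-pattern handling described above can be sketched as follows. This is a minimal illustration only: the class and method names are hypothetical, and a real implementation would compare fringe images rather than the scalar fringe counts used here as a stand-in.

```python
class ReferencePatternStore:
    """Keeps the reference interference pattern used as the comparison baseline.

    update_on_change_only=True mimics the variant in which the stored
    reference is overwritten only when a displacement change is detected.
    """

    def __init__(self, update_on_change_only=False):
        self.reference = None
        self.update_on_change_only = update_on_change_only

    def compare(self, pattern):
        # On first use (initial mounting), the captured pattern becomes the reference.
        if self.reference is None:
            self.reference = pattern
            return 0
        change = pattern - self.reference  # stand-in for real fringe comparison
        if not self.update_on_change_only or change != 0:
            self.reference = pattern       # latest calibration becomes the new reference
        return change


store = ReferencePatternStore(update_on_change_only=True)
first = store.compare(10)   # initial mounting: stored as reference, no change reported
second = store.compare(13)  # 3 fringe counts of change relative to the reference
```

Either storage policy yields the same comparison logic; they differ only in how often the reference is refreshed.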
Meanwhile, the relationship between the interference pattern detected by the displacement detection unit 12 and the displacement of the measurement target can be expressed by Equation 1 below.
[Equation 1]  d = nλ/2
Here, d denotes the displacement of the measurement target, that is, the displacement between the first reflecting member 114 and the second reflecting member 115, n denotes the number of fringe changes in the interference pattern, and λ denotes the wavelength of the beam emitted from the light source unit 111.
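Equation 1 can be evaluated directly from a fringe count. In the sketch below, the 632.8 nm helium-neon wavelength is purely an illustrative assumption; the patent does not fix a specific source wavelength.

```python
def displacement_from_fringes(n_fringes, wavelength_m):
    """Equation 1: each fringe change corresponds to a half-wavelength
    change in the optical path between the two reflecting members."""
    return n_fringes * wavelength_m / 2


WAVELENGTH = 632.8e-9  # illustrative He-Ne laser wavelength (assumption)
d = displacement_from_fringes(4, WAVELENGTH)  # 4 fringes -> 2 full wavelengths of displacement
```

With these values, four fringe changes correspond to about 1.27 micrometres of relative displacement between the reflecting members.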
Referring to FIG. 3(a), when the position of the first reflecting member 114 moves from M1 to M2′, that is, when the first reflecting member 114 moves away from the second reflecting member 115, the position of the virtual image of the source beam emitted from the light source unit 111 changes from S1′ to S2′ due to the change in the beam path. Accordingly, at least some of the circular fringes in the image signal detected by the detector 116 change by spreading outward.
Conversely, when the position of the first reflecting member 114 moves from M2′ to M1, that is, when the first reflecting member 114 moves closer to the second reflecting member 115, the position of the virtual image of the source beam emitted from the light source unit 111 changes from S2′ to S1′ due to the change in the beam path. Accordingly, at least some of the circular fringes in the image signal detected by the detector 116 change by converging inward.
Therefore, the displacement detection unit 12 may detect the change in relative distance between the first reflecting member 114 and the second reflecting member 115 based on the change of the circular fringe pattern in the image signal detected by the detector 116.
Referring to FIG. 3(b), when the angle of the first reflecting member 114 rotates from M1 to M2′, the position of the virtual image of the source beam emitted from the light source unit 111 changes from S1′ to S2′. Accordingly, in the image signal detected by the detector 116, the linear fringes near the optical axis change by shifting to the right (or left).
Conversely, when the angle of the first reflecting member 114 rotates from M2′ to M1, the position of the virtual image of the source beam emitted from the light source unit 111 changes from S2′ to S1′. Accordingly, in the image signal detected by the detector 116, the linear fringes near the optical axis change by shifting to the left (or right).
Therefore, the displacement detection unit 12 may detect the change in relative tilt between the first reflecting member 114 and the second reflecting member 115 based on the change of the linear fringe pattern in the image signal detected by the detector 116.
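The two fringe observations above can be turned into a combined (translation, tilt) estimate. The translation term follows Equation 1; the tilt relation θ ≈ Nλ/(2D), with D the illuminated beam width, is the standard small-angle Michelson result and is an assumption introduced here for illustration, not a formula stated in the patent. The numeric values are likewise illustrative.

```python
def estimate_displacement(circular_fringe_changes, linear_fringes_across_beam,
                          wavelength_m, beam_width_m):
    """Translate fringe observations into a (translation, tilt) estimate.

    Translation uses Equation 1 (d = n * lambda / 2). The tilt relation
    theta ~ N * lambda / (2 * D) is the textbook small-angle Michelson
    result, assumed here; the patent only states that linear fringe
    shifts indicate relative rotation.
    """
    translation_m = circular_fringe_changes * wavelength_m / 2
    tilt_rad = linear_fringes_across_beam * wavelength_m / (2 * beam_width_m)
    return translation_m, tilt_rad


t, theta = estimate_displacement(
    circular_fringe_changes=10,
    linear_fringes_across_beam=5,
    wavelength_m=632.8e-9,   # illustrative He-Ne wavelength (assumption)
    beam_width_m=5e-3,       # illustrative 5 mm beam width (assumption)
)
```

Here ten circular fringe changes give a few micrometres of translation, while five linear fringes across a 5 mm beam correspond to a tilt of a few hundred microradians.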
Meanwhile, the wavelength of a typical laser is 400 nm to 800 nm. For UV lasers, wavelengths of approximately 400 nm or less are also possible.
Therefore, when the beam irradiated from the light source unit 111 is a laser, Equation 1 shows that the resolution of the interferometer 11 can satisfy 10 nm or less. That is, when the phase changes by a half wavelength due to the optical path difference, one fringe in the interference pattern changes.
Therefore, when the above-described interferometer 11 is used, the displacement information between the first and second reflecting members 114 and 115 can be detected in units of 10 nm or less.
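As a rough numeric check of the resolution claim: one whole fringe corresponds to a half-wavelength of displacement (Equation 1 with n = 1), so reaching 10 nm at a 400 nm source requires resolving fractions of a fringe. The 1/20-fringe figure below is an illustrative assumption; the patent does not state how sub-fringe resolution is achieved.

```python
def fringe_resolution(wavelength_m, fringe_fraction=1.0):
    """Smallest detectable displacement when a change of `fringe_fraction`
    of one fringe can be resolved; one full fringe corresponds to a
    half-wavelength path change (Equation 1 with n = 1)."""
    return fringe_fraction * wavelength_m / 2


# One whole fringe at a 400 nm source corresponds to 200 nm of displacement;
# resolving 1/20 of a fringe (an illustrative assumption) reaches 10 nm.
whole = fringe_resolution(400e-9)
sub = fringe_resolution(400e-9, 1 / 20)
```

Shorter (e.g. UV) wavelengths tighten both figures proportionally.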
As described above, the calibration device 10 may use the interferometer 11 to detect displacement information between the first and second reflecting members 114 and 115 disposed apart from each other. Based on this displacement information, it may also detect displacement information between the plurality of cameras (not shown) on which the first and second reflecting members 114 and 115 are disposed.
전술한 캘리브레이션 장치(10)를 스테레오 카메라 시스템에 적용하는 경우, 스테레오 카메라 시스템의 실시간 카메라 캘리브레이션이 가능한 효과가 있다. When the above-described calibration device 10 is applied to a stereo camera system, real time camera calibration of the stereo camera system is possible.
아래에서는 도 4 내지 도 6을 참조하여, 본 발명의 일 실시 예에 따른 캘리브레이션 장치가 스테레오 카메라 시스템에 적용되는 경우를 설명하기로 한다. Hereinafter, a case in which the calibration device according to an embodiment of the present invention is applied to a stereo camera system will be described with reference to FIGS. 4 to 6.
FIG. 4 is a block diagram schematically illustrating a stereo camera system including a calibration device according to an embodiment of the present invention. FIGS. 5 and 6 illustrate examples in which the calibration device according to an embodiment of the present invention is coupled to the cameras of the stereo camera system. FIG. 7 is a diagram for describing a method of calculating the distance to an object in a stereo camera system according to an embodiment of the present invention.
Referring to FIG. 4, the stereo camera system according to an embodiment of the present invention may include a plurality of cameras 21 and 22, a calibration device 10, a control device 30, and the like.
The plurality of cameras 21 and 22 are spaced apart from each other by a predetermined interval and can acquire image data of the exterior of a host object. For example, when the host object is a vehicle, the cameras 21 and 22 may be mounted at the front or rear of the vehicle to acquire image data of the area in front of or behind the vehicle.
As described above with reference to FIGS. 1 to 3, the calibration device 10 includes an interferometer (reference numeral 11 in FIG. 1) and uses the interferometer 11 to acquire displacement information between the plurality of cameras 21 and 22. Here, the displacement information may include the relative translation and the relative rotation between the cameras 21 and 22.
Meanwhile, in order to detect the relative displacement information between the cameras 21 and 22 through the interferometer 11, the reflecting members of the interferometer 11 (reference numerals 114 and 115 in FIG. 2) need to be integrally disposed on the bodies of the respective cameras 21 and 22.
FIGS. 5 and 6 show examples of coupling the interferometer 11 to the cameras 21 and 22.
Referring to FIG. 5, the first camera 21 and the second camera 22 are spaced apart from each other by a predetermined interval by the camera housing 200. If necessary, three or more cameras may be mounted in the camera housing 200, in which case the same number of reflecting members may be disposed.
The first and second reflecting members 114 and 115 of the interferometer 11 are disposed on the bodies of the cameras 21 and 22, respectively. The first and second reflecting members 114 and 115 may be integrally coupled to the bodies of the cameras 21 and 22 so as to move or rotate in correspondence with the body movement of each camera.
In addition, an interferometer case 100 is disposed on the optical path between the first and second reflecting members 114 and 115, and the interferometer case 100 may house the light source unit 111, the beam splitter 112, the mirror 113, the detector 116, and the like of the interferometer 11.
Specifically, the interferometer case 100, which integrally houses the light source unit 111, the beam splitter 112, the mirror 113, and the detector 116, is disposed between the first camera 21 and the second camera 22, while the first reflecting member 114 and the second reflecting member 115 may be disposed on the first camera 21 and the second camera 22, respectively, independently of the case.
However, the arrangement is not necessarily limited thereto. As shown in FIG. 6, the first reflecting member 114 of the interferometer 11 may be disposed on the body of the first camera 21, and an interferometer case 100′ may be disposed on the second camera 22.
In this case, the interferometer case 100′ may house the light source unit 111, the beam splitter 112, the mirror 113, the second reflecting member 115, the detector 116, and the like of the interferometer 11. That is, the light source unit 111, the beam splitter 112, the mirror 113, the second reflecting member 115, and the detector 116 may be integrally coupled by the interferometer case 100′.
Referring again to FIG. 4, the control device 30 continuously receives image data of the exterior of the host object through the cameras 21 and 22. In addition, the control device 30 detects objects in the image data received from the cameras 21 and 22 through image recognition.
When an object is detected in the image data, the control device 30 obtains the position information of the object within each set of image data, and can obtain the distance information to the object based on that position information and the distance information between the cameras 21 and 22.
Referring to FIG. 7, let f denote the focal length of each of the first and second cameras 21 and 22, b the distance between them, and DL and DR the distances, on the image data input through the first and second cameras 21 and 22, between the object OB and the points corresponding to the centers of the first and second cameras, respectively. Then the distance Z between the cameras 21 and 22 and the object OB can be obtained as in Equation 2 below.
Figure PCTKR2015000835-appb-M000002
In Equation 2, a preset value according to the specifications of the first and second cameras 21 and 22 may be used for the focal length f of each camera.
In addition, the distance b between the first and second cameras 21 and 22 is the distance between their center points, and the value set when the first and second cameras 21 and 22 were installed may be used as its initial value.
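Since Equation 2 is reproduced only as an image in the original, the sketch below assumes the standard pinhole-stereo triangulation Z = b·f/disparity, with the disparity taken as DL + DR when DL and DR are measured from each image center toward the object on opposite sides. The function name and the sign convention are assumptions; the convention in the original equation image may differ.

```python
def stereo_distance(f_px: float, b_m: float,
                    d_l_px: float, d_r_px: float) -> float:
    """Distance Z to an object from a rectified stereo pair.

    Assumes Z = b * f / disparity with disparity = DL + DR (DL, DR measured
    from each image center toward the object, on opposite sides). A sketch
    of Equation 2, not a verbatim reproduction of it.
    """
    disparity = d_l_px + d_r_px
    if disparity <= 0:
        raise ValueError("non-positive disparity: object at or beyond infinity")
    return b_m * f_px / disparity

# Example: 1000 px focal length, 0.30 m baseline, 12 px + 8 px disparity
z = stereo_distance(1000.0, 0.30, 12.0, 8.0)   # 15.0 m
```

Note how Z depends directly on both b and f: any uncorrected drift in the baseline or relative orientation propagates straight into the distance estimate, which is the motivation for the calibration described next.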
Meanwhile, after the first and second cameras 21 and 22 are installed on the host object, e.g., a vehicle, their positions and orientations may change due to vehicle vibration, temperature changes, and the like. Therefore, if the distance to an object is calculated using the stereo parameters set at the initial installation of the first and second cameras 21 and 22 as they are, without reflecting changes in the cameras' positions or orientations, the accuracy of the distance information deteriorates.
Here, the stereo parameters used to calculate the distance to an object may include the distance information between the first and second cameras 21 and 22 and the relative tilting information between them. The relative tilting information of the first and second cameras 21 and 22 may indicate the degree to which the optical axis of one camera is tilted with respect to the optical axis of the other.
To prevent the accuracy of the object distance calculation from deteriorating as the positions and orientations of the first and second cameras 21 and 22 change, the control device 30 may correct the stereo parameters used for the distance calculation based on the displacement information acquired through the calibration device 10. That is, the control device 30 may update the previously stored stereo parameters between the first and second cameras 21 and 22 based on the displacement information acquired by the calibration device 10.
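The parameter update described above can be sketched as follows: the stored baseline and relative tilt are corrected by the translation and rotation reported by the displacement detector. All function and parameter names here are illustrative placeholders, not APIs from the patent.

```python
def update_stereo_params(baseline_m: float, tilt_rad: float,
                         d_baseline_m: float, d_tilt_rad: float):
    """Apply interferometer-measured displacement to the stored stereo parameters.

    A minimal sketch: the previously stored baseline (camera-to-camera
    distance) and relative tilt are each corrected by the measured change.
    """
    return baseline_m + d_baseline_m, tilt_rad + d_tilt_rad

# Example: a 150 nm baseline drift and a 2 microradian tilt drift are folded
# into the stored parameters before the next distance calculation.
b, t = update_stereo_params(0.300, 0.0, 150e-9, 2e-6)
```

In a running system this update would be applied whenever the displacement detector reports a change, so the triangulation always uses the current geometry rather than the installation-time values.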
Meanwhile, the displacement detector 12 of the above-described calibration device 10 may be implemented within the control device 30 or separately from it.
FIG. 8 is a flowchart illustrating a calibration method of a stereo camera system including a calibration device according to an embodiment of the present invention.
Referring to FIG. 8, when calibration starts, the light source unit 111 of the interferometer 11 emits a beam for generating an interference pattern (S110).
Then, when the beam emitted from the light source unit 111 is reflected along different optical paths by the first and second reflecting members 114 and 115, the calibration device 10 detects the reflected beams through the detector 116 of the interferometer 11 (S120), and the detector 116 outputs an image signal corresponding to the reflected beams.
In step S120, the beam emitted from the light source unit 111 is split by the beam splitter 112 into a plurality of beams (or light fluxes) having different propagation paths. The split beams travel to the first and second reflecting members 114 and 115, respectively, are reflected by them, and are recombined at the beam splitter 112. The beam splitter 112 combines the reflected beams and directs the result to the detector 116, which receives it and outputs a corresponding image signal.
The image signal generated in step S120 is input to the displacement detector 12 of the calibration device 10.
The displacement detector 12 processes the image signal input from the detector 116 to obtain image data, and detects the interference pattern between the reflected beams from that data (S130).
The displacement detector 12 then analyzes the circular or linear fringes of the detected interference pattern to obtain displacement information (S140). Since the displacement information detection method has been described in detail above with reference to FIG. 3, a detailed description is omitted here.
The displacement information obtained by the displacement detector 12 in step S140 is transferred to the control device 30.
When the displacement information is received from the displacement detector 12, the control device 30 corrects the stereo parameters used for detecting the distance to an object based on the displacement information (S150). That is, the distance information and the tilting information between the first and second cameras 21 and 22 are corrected based on the displacement information.
The stereo parameters corrected in step S150 are stored in an internal memory (not shown) or an external memory (not shown) of the control device 30.
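The S110–S150 flow above can be summarized in a short sketch. All class and method names here are illustrative placeholders standing in for the components of FIG. 8, not APIs defined by the patent.

```python
def run_calibration(light_source, detector, displacement_detector, controller):
    """One pass of the FIG. 8 calibration flow (names are illustrative)."""
    light_source.emit_beam()                                  # S110: emit beam
    signal = detector.capture()                               # S120: image signal of reflected beams
    pattern = displacement_detector.extract_fringes(signal)   # S130: detect interference pattern
    displacement = displacement_detector.analyze(pattern)     # S140: fringes -> displacement info
    params = controller.correct_stereo_params(displacement)   # S150: correct stereo parameters
    controller.store(params)                                  # save to internal/external memory
    return params
```

In a vehicle system this loop would run repeatedly (or on demand) so that the stored stereo parameters track vibration- and temperature-induced drift in real time.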
The stereo camera system according to the embodiment of the present invention described above can detect in real time changes in the distance and tilt between the plurality of cameras caused by vibration, temperature changes, and the like occurring while the vehicle is driven. Furthermore, by acquiring the displacement information between the cameras in real time and correcting the stereo parameters used to calculate the distance to an object based on the acquired displacement information, the accuracy of the object distance calculation can be improved.
In addition, owing to its characteristics, the interferometer does not require much installation space, so it can be integrally mounted in and used with the camera system.
The calibration device and camera system disclosed in this document can thus perform calibration of the camera system in real time while the vehicle is driven.
The term 'unit' ('~부') used in this embodiment refers to software or a hardware component such as a field-programmable gate array (FPGA) or an ASIC, and a 'unit' performs certain roles. However, a 'unit' is not limited to software or hardware. A 'unit' may be configured to reside on an addressable recording medium and may be configured to run on one or more processors. Thus, as an example, a 'unit' includes components such as software components, object-oriented software components, class components, and task components, as well as processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuits, data, databases, data structures, tables, arrays, and variables. The functionality provided within the components and 'units' may be combined into a smaller number of components and 'units' or further separated into additional components and 'units'. Furthermore, the components and 'units' may be implemented to run on one or more CPUs within a device or a secure multimedia card.
While the present invention has been described above with reference to preferred embodiments, those skilled in the art will understand that the present invention may be variously modified and changed without departing from the spirit and scope of the invention as set forth in the claims below.

Claims (16)

  1. A calibration device comprising:
    a first reflecting member disposed on a first element;
    a second reflecting member disposed on a second element;
    a light source unit configured to emit a first beam;
    a beam splitter configured to split the first beam into a second beam and a third beam having different propagation paths and, when a first reflected beam produced by the first reflecting member reflecting the second beam and a second reflected beam produced by the second reflecting member reflecting the third beam are incident thereon, to combine the first reflected beam and the second reflected beam into a fourth beam;
    a detector configured to detect an image signal corresponding to the fourth beam; and
    a displacement detector configured to detect an interference pattern from the image signal and to calculate displacement information between the first element and the second element by comparing the interference pattern with a previously stored interference pattern.
  2. The calibration device of claim 1, further comprising a mirror disposed on the optical paths of the third beam and the second reflected beam, the mirror transferring the third beam to the second reflecting member and transferring the second reflected beam to the beam splitter.
  3. The calibration device of claim 2, further comprising a case accommodating the light source unit, the beam splitter, and the detector.
  4. The calibration device of claim 2, further comprising a case accommodating the light source unit, the beam splitter, the mirror, the detector, and the second reflecting member.
  5. The calibration device of claim 1, wherein the first element and the second element each comprise a camera.
  6. The calibration device of claim 1, wherein the displacement information includes distance movement information between the first element and the second element, and
    the displacement detector obtains the distance movement information based on circular fringes included in the interference pattern.
  7. The calibration device of claim 1, wherein the displacement information includes relative rotation information between the first element and the second element, and
    the displacement detector obtains the rotation information based on linear fringes included in the interference pattern.
  8. A camera system comprising:
    a camera housing including a first camera and a second camera spaced apart from each other; and
    a calibration device configured to calculate displacement information between the first camera and the second camera,
    wherein the calibration device includes a first reflecting member disposed on the first camera and a second reflecting member disposed on the second camera.
  9. The camera system of claim 8, further comprising a control device configured to correct, based on the calculated displacement information, a displacement parameter used to calculate distance information to an object.
  10. The camera system of claim 8, wherein the calibration device includes:
    an interferometer configured to acquire an image signal including an interference pattern between a plurality of beams reflected by the first reflecting member and the second reflecting member; and
    a displacement detector configured to detect the interference pattern from the image signal and to calculate the displacement information between the first camera and the second camera by comparing the interference pattern with a previously stored interference pattern.
  11. The camera system of claim 10, wherein the interferometer further includes:
    a light source unit; and
    a beam splitter configured to split a beam emitted from the light source unit into a plurality of beams having different optical paths and to transfer the split beams to the first reflecting member and the second reflecting member, respectively.
  12. The camera system of claim 11, wherein the interferometer further includes a mirror configured to transfer a beam split by the beam splitter to the second reflecting member.
  13. The camera system of claim 11, wherein the light source unit includes a laser or a laser diode.
  14. The camera system of claim 10, wherein the displacement information includes distance movement information between the first camera and the second camera, and
    the displacement detector obtains the distance movement information based on circular fringes included in the interference pattern.
  15. The camera system of claim 10, wherein the displacement information includes relative rotation information between the first camera and the second camera, and
    the displacement detector obtains the rotation information based on linear fringes included in the interference pattern.
  16. The camera system of claim 15, further comprising a control device configured to acquire a plurality of image data through the plurality of cameras and to calculate distance information to an object detected from the plurality of image data based on distance information and relative rotation information between the plurality of cameras,
    wherein the control device updates the distance information and the rotation information based on the displacement information.
PCT/KR2015/000835 2014-01-28 2015-01-27 Calibration device and camera system WO2015115770A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020140010590A KR102158026B1 (en) 2014-01-28 2014-01-28 Calibration device and camera system
KR10-2014-0010590 2014-01-28

Publications (1)

Publication Number Publication Date
WO2015115770A1 true WO2015115770A1 (en) 2015-08-06

Family

ID=53757305

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2015/000835 WO2015115770A1 (en) 2014-01-28 2015-01-27 Calibration device and camera system

Country Status (2)

Country Link
KR (1) KR102158026B1 (en)
WO (1) WO2015115770A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020037179A1 (en) * 2018-08-17 2020-02-20 Veoneer Us, Inc. Vehicle cabin monitoring system

Families Citing this family (4)

Publication number Priority date Publication date Assignee Title
CN113255643A (en) * 2021-05-08 2021-08-13 上海砼测检测技术有限公司 Machine vision recognition algorithm applied to displacement monitoring
KR102676294B1 (en) * 2021-12-01 2024-06-19 광주과학기술원 Diffuser camera with conical lens
WO2023219440A1 (en) * 2022-05-11 2023-11-16 엘지이노텍 주식회사 Camera apparatus
KR102621435B1 (en) * 2022-07-12 2024-01-09 주식회사 래비노 Multi-stereo camera calibration method and system using laser light

Citations (5)

Publication number Priority date Publication date Assignee Title
KR19980015490A (en) * 1996-08-22 1998-05-25 미노루 이나바 Stereo camera
JP2003504607A (en) * 1999-07-13 2003-02-04 ビーティ エルウィン エム Apparatus and method for three-dimensional inspection of electronic components
KR20030089542A (en) * 2002-05-15 2003-11-22 옵토 다이나믹스(주) Multi-Dimensional Measurement System For Object Features
US20100295926A1 (en) * 2006-11-28 2010-11-25 Prefixa International Inc. Fast Three Dimensional Recovery Method and Apparatus
JP2012132739A (en) * 2010-12-21 2012-07-12 Ricoh Co Ltd Stereo camera calibrating device and calibrating method

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
JP5185424B1 (en) * 2011-09-30 2013-04-17 株式会社東芝 Calibration method and video display device
KR101333161B1 (en) * 2012-02-15 2013-11-27 이연태 Apparatus of processing image based on confocal and method thereof


Cited By (2)

Publication number Priority date Publication date Assignee Title
WO2020037179A1 (en) * 2018-08-17 2020-02-20 Veoneer Us, Inc. Vehicle cabin monitoring system
US11155226B2 (en) 2018-08-17 2021-10-26 Veoneer Us, Inc. Vehicle cabin monitoring system

Also Published As

Publication number Publication date
KR20150089678A (en) 2015-08-05
KR102158026B1 (en) 2020-09-21

Similar Documents

Publication Publication Date Title
WO2015115770A1 (en) Calibration device and camera system
US9967545B2 (en) System and method of acquiring three-dimensional coordinates using multiple coordinate measurment devices
CN107957237B (en) Laser projector with flash alignment
WO2016200096A1 (en) Three-dimensional shape measurement apparatus
JP5944719B2 (en) Light distribution characteristic measuring apparatus and light distribution characteristic measuring method
US11073379B2 (en) 3-D environment sensing by means of projector and camera modules
CN108650447B (en) Image sensor, depth data measuring head and measuring system
US20060290781A1 (en) Image obtaining apparatus
JP2018525638A (en) 3D imager
WO2013036076A2 (en) Device and method for measuring three-dimensional shapes using amplitude of a projection grid
US10068348B2 (en) Method and apparatus for indentifying structural elements of a projected structural pattern in camera images
JP2586931B2 (en) Camera ranging device
JP2002543411A (en) Optical detection method of object shape
WO2015080480A1 (en) Wafer image inspection apparatus
JP6287231B2 (en) Ranging device and robot picking system
WO2016163840A1 (en) Three-dimensional shape measuring apparatus
WO2018212395A1 (en) Lidar device and lidar system including the same
WO2021096328A2 (en) Laser tracking device having function for sensing initial position of target, and tracking method
WO2015115771A1 (en) Camera correction module, camera system and method for controlling camera system
JP2013257162A (en) Distance measuring device
JP2001082940A (en) Apparatus and method for generating three-dimensional model
JP7252755B2 (en) Active sensors, object identification systems, vehicles, vehicle lighting
WO2024154931A1 (en) Vision scanning device having 3d shape recognition function
US20210156881A1 (en) Dynamic machine vision sensor (dmvs) that performs integrated 3d tracking
WO2017204459A1 (en) Lidar optical apparatus having improved structure

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15743476

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15743476

Country of ref document: EP

Kind code of ref document: A1