WO2009119229A1 - Three-dimensional imaging device and method for calibrating a three-dimensional imaging device - Google Patents
Three-dimensional imaging device and method for calibrating a three-dimensional imaging device
- Publication number
- WO2009119229A1 (PCT/JP2009/053369)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- light
- calibration
- imaging device
- dimensional imaging
- laser
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C3/00—Measuring distances in line of sight; Optical rangefinders
- G01C3/02—Details
- G01C3/06—Use of electric means to obtain final indication
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
- G06T7/85—Stereo camera calibration
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/239—Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/246—Calibration of cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N17/00—Diagnosis, testing or measuring for television systems or their details
- H04N17/002—Diagnosis, testing or measuring for television systems or their details for television cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/67—Focus control based on electronic image sensor signals
- H04N23/671—Focus control based on electronic image sensor signals in combination with active ranging signals, e.g. using light or sound signals emitted toward objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
- G06T2207/10021—Stereoscopic video; Stereoscopic image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
Definitions
- the present invention relates to a three-dimensional imaging apparatus including a plurality of imaging apparatuses and a calibration method thereof.
- a vehicle-mounted stereo camera that measures the distance between vehicles using a plurality of cameras mounted on the vehicle is known.
- Such an in-vehicle stereo camera is required to continue to operate for a long time (several years or more) once it is installed in the vehicle.
- calibration is usually performed before shipment.
- over time, changes in the mounting position of the lens and image sensor, and in the dimensions and shapes of components such as the housing, may occur and cause a deviation from the initially set conditions.
- a reference subject is therefore extracted from the photographed subjects and used to calibrate the in-vehicle stereo camera, maintaining measurement accuracy over the long term.
- Patent Document 1 discloses a calibration method for an in-vehicle stereo camera using a traffic light or the like.
- Patent Documents 2 and 3 disclose a stereo camera having an automatic calibration function using a license plate or the like.
- Patent Document 4 discloses a calibration method and apparatus for a stereo camera.
- Japanese Patent Laid-Open No. 10-341458; JP 2004-354257 A; JP 2004-354256 A; Japanese Patent Laid-Open No. 2005-17286
- a reference subject is extracted from the captured video, and calibration is performed using this.
- however, a reference subject is not always available, so the calibration timing becomes irregular, for example being postponed until a reference subject is obtained.
- moreover, because the reference subject is not always at the same position, complicated signal processing is required, and good accuracy is not always obtained.
- accordingly, an object of the present invention is to provide a three-dimensional imaging apparatus that can always perform calibration at the necessary timing regardless of subject conditions and with a certain accuracy, and a calibration method for such a three-dimensional imaging apparatus.
- the three-dimensional imaging device includes a plurality of imaging devices each having an imaging device that converts incident light into an electrical signal, and a light emitter that emits laser light.
- a light emission point by plasma is formed in the air in front of the imaging devices by the laser light from the light emitter, and deviations in the positional relationship among the plurality of imaging devices are calibrated using the light emission point as a reference point.
- in this three-dimensional imaging device, by emitting laser light from the light emitter, a plasma light emission point is formed in the air in front of the imaging devices, and deviations in the positional relationship among the plurality of imaging devices can be calibrated using this light emission point as a reference point. The three-dimensional imaging apparatus can therefore be calibrated anytime and anywhere, calibration can be performed at the necessary timing regardless of subject conditions, and a certain accuracy is maintained.
- the imaging device and the light emitter are integrally configured.
- when a plurality of light emission points are formed, a plurality of calibrations can be performed using the respective light emission points as reference points, which improves calibration accuracy.
- the light emission pattern may display information for the driver.
- the calibration can be executed periodically when the apparatus is started.
- the calibration may be performed at regular time intervals by emitting the laser light at regular time intervals.
- invisible light, i.e. light of a longer (infrared) or shorter (ultraviolet) wavelength than visible light, can be used as the laser light.
- the calibration method for a three-dimensional imaging device is a method for calibrating a three-dimensional imaging device including a plurality of imaging devices each having an imaging element that converts incident light into an electrical signal. Laser light is emitted from a light emitter in front of the imaging devices, a light emission point by plasma is formed in the air in front of the imaging devices by the laser light, and a deviation in the positional relationship among the plurality of imaging devices is calibrated using the light emission point as a reference point.
- according to this method, laser light emitted from the light emitter forms a plasma light emission point in the air in front of the imaging devices, and the positional deviation of the plurality of imaging devices can be calibrated using the light emission point as a reference point; the three-dimensional imaging device can therefore be calibrated anytime and anywhere, at the required timing regardless of the condition of the subject, while maintaining a certain level of accuracy.
- in this way, calibration can always be performed at the necessary timing regardless of subject conditions, and calibration with a certain accuracy is possible.
- FIG. 1 is a diagram showing the configuration of the main part of a three-dimensional imaging apparatus. FIG. 2 is a block diagram schematically showing the overall configuration of the three-dimensional imaging apparatus in FIG. 1.
- FIG. 3 is a flowchart for explaining the calibration steps of the stereo camera in the three-dimensional imaging apparatus of FIGS. 1 and 2. FIG. 4 is a diagram showing the configuration of the main part of another three-dimensional imaging apparatus. FIG. 5 is a diagram showing the schematic configuration of the laser emitter of the three-dimensional imaging apparatus of FIG. 4. FIG. 6 is a diagram showing the configuration of the main part of yet another three-dimensional imaging apparatus.
- FIG. 1 is a diagram illustrating a configuration of a main part of a three-dimensional imaging apparatus.
- FIG. 2 is a block diagram schematically showing the overall configuration of the three-dimensional imaging apparatus.
- the three-dimensional imaging apparatus 10 includes a stereo camera 11 and a laser light emitter (light emitter) 14.
- the stereo camera 11 includes a standard camera (imaging device) 11a having a lens 1 and an image sensor 2, and a reference camera (imaging device) 11b having a lens 3 and an image sensor 4.
- the laser emitter 14 includes a laser light source 14a made of a semiconductor laser that generates invisible light such as infrared or ultraviolet light, and a lens optical system 14b made of a lens.
- the three-dimensional imaging device 10 includes the stereo camera 11, an image input unit 12 for inputting the data of the standard image and the reference image from the cameras 11a and 11b, and a distance image generation unit 13 that generates a distance image by stereo matching of the standard image and the reference image.
- it further includes an obstacle detection unit 18 that detects a preceding vehicle, a pedestrian, and the like based on the obtained distance image, and a control unit 19 that controls each of the units 11 to 18; the device is mounted on a vehicle such as an automobile.
- the standard camera 11a of the stereo camera 11 is composed of an optical system including a lens 1 with a focal length f and an image sensor 2 such as a CCD or CMOS image sensor, and the reference camera 11b is composed of a lens 3 with the same focal length f and an image sensor 4 such as a CCD or CMOS image sensor.
- image data signals captured by the image sensors 2 and 4 are output: a standard image is obtained from the image sensor 2 of the standard camera 11a, and a reference image from the image sensor 4 of the reference camera 11b.
- the standard camera 11a, the reference camera 11b, and the laser emitter 14 are installed on a common substrate 21 and fixed integrally so as to maintain a predetermined positional relationship.
- the laser emitter 14 is disposed between the standard camera 11a and the reference camera 11b, and the laser light B from the laser light source 14a is condensed by the lens optical system 14b at a point A in the air on the optical axis p. At this condensing point (light emission point) A, light emission by plasma occurs.
- Plasma emission by focusing laser light in the air is a well-known physical phenomenon.
- this phenomenon is reported, for example, in "3D images floating in the air" by AIST (AIST TODAY 2006-04, Vol. 6, No. 4, pp. 16-19; http://www.aist.go.jp/aist_j/aistinfo/aist_today/vol06_04/vol06_04_topics/vol06_04_topics.html).
- the condensing point (light emitting point) A by the laser emitter 14 is fixed at a constant distance within a range of 0.5 to 3 m, for example, from the front surface of the three-dimensional imaging device 10. This distance can be appropriately set according to the focal length of the lens optical system 14b of the laser emitter 14. By fixing the light emission point A, the laser light emitter 14 does not require a drive system and can be configured simply.
- the laser emitter 14 is provided at the center of the two cameras 11a and 11b, the light emission point A is formed in the air at a fixed distance from the cameras 11a and 11b, and the light emission point A is used as a reference point. By doing so, the positional deviation between the two cameras 11a and 11b can be calibrated.
- the image sensors 2 and 4 of the standard camera 11a and the reference camera 11b have their imaging surfaces 2a and 4a arranged on a common surface g, and the lenses 1 and 3 are arranged so that the optical axes a and b passing through their lens centers O1 and O3 are parallel and separated laterally by the lens center interval L.
- the lenses 1 and 3 are arranged on a common lens surface h that is orthogonal to the optical axes a and b and passes through the lens centers O1 and O3.
- the common surface g of the imaging surfaces 2a and 4a and the lens surface h are separated by a focal length f and are parallel.
- the horizontal interval between the reference points 2b and 4b where the optical axes a and b are orthogonal to each other on the imaging surfaces 2a and 4a is equal to the lens center interval L.
- the optical axis p of the laser emitter 14 in FIG. 1 is orthogonal to the common surface g of the imaging surfaces 2a and 4a, and the distance L1 between the optical axis p and the optical axis a of the lens 1 and the distance L2 between the optical axis p and the optical axis b of the lens 3 satisfy:
- L1 + L2 = L … (1)
- let the distance measurement target be the light emission point A on the optical axis p, and let H be the distance from the lens surface h to the light emission point A.
- assume that the light from the light emission point A passes through the center O1 of the lens 1 of the standard camera 11a and forms an image at the imaging position 2c on the imaging surface 2a, as shown by the broken line in FIG. 1, while it also passes through the center O3 of the lens 3 of the reference camera 11b and forms an image at the imaging position 4c on the imaging surface 4a.
- the distance m from the reference point 2b on the imaging surface 2a of the standard camera 11a to the imaging position 2c and the distance n from the reference point 4b on the imaging surface 4a of the reference camera 11b to the imaging position 4c are the shift amounts (parallax) resulting from the standard camera 11a and the reference camera 11b being arranged at the interval L. From FIG. 1, H/L1 = f/m and H/L2 = f/n hold, giving the following equations (2) and (3):
- H = (L1 · f) / m … (2)
- H = (L2 · f) / n … (3)
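The triangulation of equations (1) to (3) can be checked numerically. The following sketch uses illustrative numbers (f, L, and H values are assumptions, not taken from the patent) and recovers the distance H from the parallax shifts m and n:

```python
# Triangulation of equations (1)-(3); symbols follow the text (f, L1, L2, m, n).
def distance_from_parallax(f, L1, L2, m, n):
    """Return the distance H to the emission point A, estimated three ways."""
    H_standard = (L1 * f) / m               # equation (2), standard camera 11a
    H_reference = (L2 * f) / n              # equation (3), reference camera 11b
    H_baseline = ((L1 + L2) * f) / (m + n)  # combining both via L = L1 + L2
    return H_standard, H_reference, H_baseline

# Illustrative numbers: f = 8 mm, L = 120 mm, true distance H = 1200 mm.
f, L = 8.0, 120.0
L1 = L2 = L / 2                 # emitter centred between the cameras (L1 = L2)
H_true = 1200.0
m = (L1 * f) / H_true           # forward model: the shift each camera would see
n = (L2 * f) / H_true
print(distance_from_parallax(f, L1, L2, m, n))  # all three estimates ≈ 1200 mm
```

The third estimate uses the full baseline L = L1 + L2 and so remains valid even when the emitter is arranged off-centre (L1 ≠ L2), as the patent later allows.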
- the distance image generation unit 13 processes the calculation by the SAD method or the POC method in hardware using an integrated element or the like, but the processing may instead be performed in software by a CPU (Central Processing Unit); in that case, the CPU executes a predetermined calculation according to a predetermined program.
- SAD: Sum of Absolute Differences
- POC: Phase-Only Correlation
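As a concrete illustration of the SAD matching mentioned above, here is a minimal one-dimensional block-matching sketch (parameters and the synthetic data are assumptions; a real stereo matcher compares 2-D blocks along epipolar lines):

```python
import numpy as np

def sad_disparity(base_row, ref_row, x, block=5, max_disp=16):
    """Find the disparity at pixel x of base_row by minimising the Sum of
    Absolute Differences (SAD) against ref_row."""
    half = block // 2
    patch = base_row[x - half : x + half + 1].astype(np.int64)
    best_d, best_cost = 0, np.inf
    for d in range(max_disp):
        lo = x - d - half
        if lo < 0:
            break                              # candidate would fall off the image
        cand = ref_row[lo : lo + block].astype(np.int64)
        cost = np.abs(patch - cand).sum()      # SAD cost for this disparity
        if cost < best_cost:
            best_cost, best_d = cost, d
    return best_d

# Synthetic pair: the reference image is the base image shifted by 3 pixels,
# so the SAD minimum should sit at disparity 3.
base_row = np.arange(64)
ref_row = np.zeros(64, dtype=np.int64)
ref_row[:61] = base_row[3:]                    # ref_row[i] = base_row[i + 3]
print(sad_disparity(base_row, ref_row, x=30))  # → 3
```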
- the light emission point A is used as the reference point.
- the three-dimensional imaging device 10 detects the positional deviation between the two cameras 11a and 11b using the known distance H0 to the light emission point A, and performs calibration as follows.
- the calibration deviation determination unit 16 in FIG. 2 detects a positional deviation in the stereo camera 11 and determines the presence or absence of the positional deviation.
- the positional deviation in the stereo camera 11 in FIG. 1 means a deviation between the camera 11a and the camera 11b, such as an inclination of the optical axes a and b, a loss of their parallelism, or a shift of the lens center interval L; it causes an error in the distance value detected by the three-dimensional imaging device 10 and a shift of the epipolar line on the image.
- the calibration data holding unit 15 stores and holds a known distance H0 to the light emitting point A formed by the laser beam B from the laser emitter 14 and calibration data.
- the distance image generation unit 13 measures the distance H to the light emission point A from the distance image, and the calibration deviation determination unit 16 compares the measured distance H with the known distance H0 to determine whether there is a positional deviation. For example, if the distances H and H0 match, or deviate only within a predetermined range, it is determined that there is no misalignment; otherwise, it is determined that there is misalignment. The deviation determination result is output to the calibration data calculation / generation unit 17.
- the calibration data calculation / generation unit 17 calculates and generates calibration data such as the degree of parallelism of the stereo camera 11 based on the deviation determination result, and the calibration data holding unit 15 stores and holds the calibration data.
- the distance image generation unit 13 corrects the distance error based on the calibration data from the calibration data holding unit 15 and generates a distance image so as to correct the epipolar line on the image.
- the control unit 19 in FIG. 2 includes a CPU (Central Processing Unit) and a storage medium such as a ROM in which a program for the distance image generation and calibration described above is stored; the CPU reads the program from the storage medium and performs control so as to execute each step shown in the flowchart of FIG. 3.
- a calibration step of the stereo camera 11 in the three-dimensional imaging device 10 of FIGS. 1 and 2 will be described with reference to the flowchart of FIG.
- when the apparatus is started, the three-dimensional imaging device 10 shifts to the calibration mode (S02), and the laser emitter 14 is activated (S03). Thereby, the light emission point A of FIG. 1 is formed by plasma in the air in front of the vehicle (S04).
- the distance H to the light emission point A is measured by the distance image generation unit 13 in FIG. 2 (S05), the calibration deviation determination unit 16 compares the measured distance H with the known distance H0 (S06), and if there is a positional deviation (S07), calibration is performed as follows (S08).
- the deviation determination result of the calibration deviation determination unit 16 is output to the calibration data calculation / generation unit 17, which calculates calibration data such as the degree of parallelism of the stereo camera 11 based on the deviation determination result.
- the calibration data holding unit 15 stores and holds the calibration data.
- the distance image generation unit 13 corrects the distance error based on the calibration data from the calibration data holding unit 15 and generates a distance image so as to correct the epipolar line on the image.
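The calibration flow S02 to S08 described above can be sketched as follows. The classes, method names, the 2 % tolerance, and the scale-style calibration data are all illustrative assumptions standing in for the hardware units and the patent's actual calibration data:

```python
class LaserEmitter:
    """Stub for the laser emitter 14: forms the plasma emission point A (S03-S04)."""
    def fire(self):
        return True  # in hardware, the beam is condensed at the fixed point A

class StereoPair:
    """Stub for the stereo camera 11 plus the distance image generation unit 13."""
    def __init__(self, simulated_distance):
        self.simulated_distance = simulated_distance
    def measure_distance(self):
        return self.simulated_distance  # S05: distance H read from the distance image

def calibration_cycle(emitter, pair, H0, tolerance=0.02):
    """One pass of the flowchart: returns (deviation_found, calibration_data)."""
    emitter.fire()                            # S03-S04: form the emission point
    H = pair.measure_distance()               # S05: measure H
    deviated = abs(H - H0) > tolerance * H0   # S06-S07: compare H with known H0
    # S08: a simple scale correction stands in for the real calibration data
    calibration_data = {"scale": H0 / H} if deviated else None
    return deviated, calibration_data

# A camera pair that now measures 1.26 m against a known H0 of 1.20 m:
print(calibration_cycle(LaserEmitter(), StereoPair(1.26), 1.20))
```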
- in this way, the light emission point A by plasma is formed in the air in front of the vehicle by the laser light from the laser emitter 14, and the positional deviation of the stereo camera 11 can be calibrated using the light emission point A as a reference point. The location where the three-dimensional imaging apparatus 10 can be calibrated is therefore not limited, and calibration can be performed anytime and anywhere. Furthermore, calibration can be performed at the necessary timing regardless of the condition of the subject in front of the vehicle, and with a certain accuracy.
- the obstacle detection unit 18 detects a preceding vehicle, a pedestrian, and the like, measures the distance to them, and outputs the detection / measurement information as an image or sound.
- the detection / measurement information can be made more accurate by appropriately executing the calibration.
- FIG. 4 is a diagram illustrating a configuration of a main part of another three-dimensional imaging apparatus.
- FIG. 5 is a diagram showing a schematic configuration of a laser emitter of the three-dimensional imaging apparatus of FIG.
- the laser emitter 24 is arranged between the standard camera 11a and the reference camera 11b of the stereo camera 11, and is controlled by the control unit 19 in FIG. 2.
- the laser emitter 24 includes a laser light source 25 made of a semiconductor laser that generates invisible light such as infrared light or ultraviolet light, a lens optical system 26, and an optical scanning unit 27.
- the optical scanning unit 27 includes a rotating reflecting member 28, which can be rotated about a rotation axis 28a in a rotation direction r and the opposite direction r′ by a driving means (not shown) such as a motor and on which the laser light from the laser light source 25 is incident, and a reflecting member 29 that reflects the light from the rotating reflecting member 28.
- the laser light from the laser light source 25 is reflected by the rotating reflecting member 28 and the reflecting member 29 and is emitted to the outside from the lens optical system 26.
- when the rotating reflecting member 28 is rotated about the rotation axis 28a, the laser light is scanned in the rotation direction, enters the lens optical system 26 so as to diverge with respect to the optical axis p, and is emitted from the lens optical system 26 while being inclined with respect to the optical axis p, as shown in FIG. 5.
- in this way, a plurality of light emission points C, D, and E can be formed in the air, as shown in FIG. 4. Since the distances to the light emission points C, D, and E are constant and fixed, calibration can be performed a plurality of times in the same manner as described above using the points C, D, and E as reference points, so that more accurate calibration is possible.
- the plurality of light emitting points C, D, and E need only be formed at the time of calibration, and do not need to be formed at the same time.
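One simple way several reference points can improve accuracy is to average a per-point correction; the estimator below is an assumed model for illustration, not the patent's specified method:

```python
import statistics

def scale_correction(measured_distances, known_distances):
    """Estimate a single distance-scale correction from several emission
    points (e.g. C, D, E) by averaging per-point ratios; averaging over
    points reduces the effect of noise in any single measurement."""
    ratios = [known / measured
              for measured, known in zip(measured_distances, known_distances)]
    return statistics.mean(ratios)

# Three points whose measured distances all read roughly 5 % long:
print(scale_correction([1.05, 2.11, 3.14], [1.0, 2.0, 3.0]))
```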
- for example, the light emission point C may be formed by rotating the rotating reflecting member 28 by a predetermined angle and stopping it, the light emission point D by stopping it at the neutral position, and the light emission point E by rotating it by a predetermined angle to the opposite side.
- although the rotating reflecting member 28 is used here as the optical scanning unit 27, the present invention is not limited to this, and other optical scanning means may be used.
- for example, a polarizing member such as a prism may be arranged on the optical axis p, and optical scanning may be performed by changing the position of the member around the optical axis p.
- a micro-electro-mechanical systems (MEMS) type optical scanner may also be used, for example at the position of the rotating reflecting member 28 in FIG. 5.
- MEMS: Micro Electro Mechanical Systems
- FIG. 6 is a diagram illustrating a configuration of a main part of another three-dimensional imaging apparatus.
- the three-dimensional imaging device 40 of FIG. 6 has the same configuration as that of FIGS. 1 and 2, except that the laser emitter 34 forms a light emission pattern composed of a plurality of light emission points in the air. The laser emitter 34 is arranged between the standard camera 11a and the reference camera 11b of the stereo camera 11, and is controlled by the control unit 19 in FIG. 2.
- the laser emitter 34 includes a laser light source 25 made of a semiconductor laser that generates invisible light such as infrared or ultraviolet light, a lens optical system 26, and an optical scanning unit 27.
- the optical scanning unit 27 can scan the laser light from the laser light source 25 in two different directions.
- specifically, the reflecting member 29 is configured to be rotatable in the same manner as the rotating reflecting member 28, but in a rotation direction different from that of the rotating reflecting member 28, so that an arbitrary two-dimensional pattern, such as the lattice pattern Z shown in FIG. 6, can be formed in the air.
- the pattern formed in the air can be used not only for the calibration of the stereo camera 11 but also for displaying information for the driver of the vehicle in the air in front of the vehicle.
- the information for the driver is not particularly limited; examples include caution information for wearing a seat belt and vehicle maintenance information, and, in conjunction with a navigation system mounted on the vehicle, direction indication information, road congestion information, place names, and the like may be displayed.
- a MEMS type optical scanner may be used in the same manner as described above: a one-dimensional type may be provided at each of the positions of the reflecting members 28 and 29 in FIG. 5, or a two-dimensional type may be arranged at the position of the reflecting member 28 or 29. Other optical scanning means, such as a galvanometer mirror or a polygon mirror, may also be used.
- the present invention is not limited to these, and various modifications are possible within the scope of the technical idea of the present invention.
- although the three-dimensional imaging apparatus of FIGS. 1 and 2 includes a stereo camera consisting of two cameras, the present invention is not limited to this and may include three or more cameras.
- in the above, calibration is automatically performed when the vehicle is started and automatically repeated after a predetermined time; however, it may instead be performed only at startup, only when a predetermined time has elapsed after startup, or every time a predetermined time elapses without being performed at startup.
- alternatively, a manual button may be provided on the three-dimensional imaging apparatus 10 so that calibration is performed manually by operating the button.
- although L1 = L2 above for the distance L1 between the optical axis p of the laser emitter 14 and the optical axis a of the lens 1 and the distance L2 between the optical axis p and the optical axis b of the lens 3, the laser emitter 14 may also be arranged so that L1 ≠ L2.
Description
1, 3 Lens
2, 4 Image sensor
11 Stereo camera
11a Standard camera
11b Reference camera
14, 24, 34 Laser emitter
27 Optical scanning unit
A Light emission point (condensing point)
B Laser light
C to I Light emission points
L1 + L2 = L … (1)
Here, let the distance measurement target be the light emission point A on the optical axis p, and let H be the distance from the lens surface h to the light emission point A. Assume that the light from the light emission point A passes through the center O1 of the lens 1 of the standard camera 11a and forms an image at the imaging position 2c on the imaging surface 2a, as shown by the broken line in FIG. 1, while it also passes through the center O3 of the lens 3 of the reference camera 11b and forms an image at the imaging position 4c on the imaging surface 4a. The distance m from the reference point 2b on the imaging surface 2a of the standard camera 11a to the imaging position 2c and the distance n from the reference point 4b on the imaging surface 4a of the reference camera 11b to the imaging position 4c are the shift amounts (parallax) caused by the standard camera 11a and the reference camera 11b being arranged at the interval L. From FIG. 1, H/L1 = f/m and H/L2 = f/n hold, giving the following equations (2) and (3):
H = (L1 · f) / m … (2)
H = (L2 · f) / n … (3)
In FIG. 1 of the present embodiment, L1 = L2, so from equation (1), L1 = L2 = L/2. Therefore, from equations (2) and (3), the following equations (4) and (5) are obtained:
H = (L · f) / 2m … (4)
H = (L · f) / 2n … (5)
From equations (4) and (5), since the lens center interval L and the focal length f are constant, the distance H to the light emission point A can be measured from the shift amounts m and n. In this way, the distance H to the light emission point A can be measured from the image information of the stereo camera 11 by the principle of triangulation.
Claims (9)
- 1. A three-dimensional imaging device comprising: a plurality of imaging devices each having an imaging element that converts incident light into an electrical signal; and a light emitter that emits laser light, wherein a light emission point by plasma is formed in the air in front of the imaging devices by the laser light from the light emitter, and a deviation in the positional relationship among the plurality of imaging devices is calibrated using the light emission point as a reference point.
- 2. The three-dimensional imaging device according to claim 1, wherein the imaging devices and the light emitter are integrally configured.
- 3. The three-dimensional imaging device according to claim 1 or 2, wherein a plurality of light emission points are formed in the air by the laser light, and the calibration is performed based on the plurality of light emission points.
- 4. The three-dimensional imaging device according to claim 1 or 2, wherein a light emission pattern is formed in the air by the laser light, and the calibration is performed based on the light emission pattern.
- 5. The three-dimensional imaging device according to any one of claims 1 to 4, wherein the laser light is emitted when the apparatus is started, and the calibration is performed.
- 6. The three-dimensional imaging device according to any one of claims 1 to 5, wherein the laser light is emitted at regular time intervals, and the calibration is performed at regular time intervals.
- 7. The three-dimensional imaging device according to any one of claims 1 to 6, wherein invisible light is used as the laser light.
- 8. The three-dimensional imaging device according to claim 4, wherein the light emission pattern displays information for the driver.
- 9. A method for calibrating a three-dimensional imaging device comprising a plurality of imaging devices each having an imaging element that converts incident light into an electrical signal, the method comprising: emitting laser light from a light emitter in front of the imaging devices; forming a light emission point by plasma in the air in front of the imaging devices by the laser light; and calibrating a deviation in the positional relationship among the plurality of imaging devices using the light emission point as a reference point.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010505467A JPWO2009119229A1 (ja) | 2008-03-26 | 2009-02-25 | 3次元撮像装置及び3次元撮像装置の校正方法 |
US12/933,696 US20110018973A1 (en) | 2008-03-26 | 2009-02-25 | Three-dimensional imaging device and method for calibrating three-dimensional imaging device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2008-080153 | 2008-03-26 | ||
JP2008080153 | 2008-03-26 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2009119229A1 true WO2009119229A1 (ja) | 2009-10-01 |
Family
ID=41113435
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2009/053369 WO2009119229A1 (ja) | 2008-03-26 | 2009-02-25 | 3次元撮像装置及び3次元撮像装置の校正方法 |
Country Status (3)
Country | Link |
---|---|
US (1) | US20110018973A1 (ja) |
JP (1) | JPWO2009119229A1 (ja) |
WO (1) | WO2009119229A1 (ja) |
Families Citing this family (76)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3328048B1 (en) | 2008-05-20 | 2021-04-21 | FotoNation Limited | Capturing and processing of images using monolithic camera array with heterogeneous imagers |
US8866920B2 (en) | 2008-05-20 | 2014-10-21 | Pelican Imaging Corporation | Capturing and processing of images using monolithic camera array with heterogeneous imagers |
US11792538B2 (en) | 2008-05-20 | 2023-10-17 | Adeia Imaging Llc | Capturing and processing of images including occlusions focused on an image sensor by a lens stack array |
US8514491B2 (en) | 2009-11-20 | 2013-08-20 | Pelican Imaging Corporation | Capturing and processing of images using monolithic camera array with heterogeneous imagers |
US20120012748A1 (en) | 2010-05-12 | 2012-01-19 | Pelican Imaging Corporation | Architectures for imager arrays and array cameras |
TWI448666B (zh) * | 2010-06-15 | 2014-08-11 | Pixart Imaging Inc | 依據環境溫度以校正測距裝置所量測之待測物之待測距離之校正方法與其相關裝置 |
US8878950B2 (en) | 2010-12-14 | 2014-11-04 | Pelican Imaging Corporation | Systems and methods for synthesizing high resolution images using super-resolution processes |
WO2012155119A1 (en) | 2011-05-11 | 2012-11-15 | Pelican Imaging Corporation | Systems and methods for transmitting and receiving array camera image data |
US20130265459A1 (en) | 2011-06-28 | 2013-10-10 | Pelican Imaging Corporation | Optical arrangements for use with an array camera |
EP2726930A4 (en) | 2011-06-28 | 2015-03-04 | Pelican Imaging Corp | OPTICAL ARRANGEMENTS FOR USE WITH AN ARRAY CAMERA |
US9270974B2 (en) * | 2011-07-08 | 2016-02-23 | Microsoft Technology Licensing, Llc | Calibration between depth and color sensors for depth cameras |
KR101300350B1 (ko) * | 2011-08-09 | 2013-08-28 | Samsung Electro-Mechanics Co., Ltd. | Image processing apparatus and image processing method |
US20130070060A1 (en) | 2011-09-19 | 2013-03-21 | Pelican Imaging Corporation | Systems and methods for determining depth from multiple views of a scene that include aliasing using hypothesized fusion |
US9129183B2 (en) | 2011-09-28 | 2015-09-08 | Pelican Imaging Corporation | Systems and methods for encoding light field image files |
US9412206B2 (en) | 2012-02-21 | 2016-08-09 | Pelican Imaging Corporation | Systems and methods for the manipulation of captured light field image data |
US9324190B2 (en) * | 2012-02-24 | 2016-04-26 | Matterport, Inc. | Capturing and aligning three-dimensional scenes |
US10848731B2 (en) | 2012-02-24 | 2020-11-24 | Matterport, Inc. | Capturing and aligning panoramic image and depth data |
US11282287B2 (en) | 2012-02-24 | 2022-03-22 | Matterport, Inc. | Employing three-dimensional (3D) data predicted from two-dimensional (2D) images using neural networks for 3D modeling applications and other applications |
US9210392B2 (en) | 2012-05-01 | 2015-12-08 | Pelican Imaging Corporation | Camera modules patterned with pi filter groups |
JP2015534734A (ja) | 2012-06-28 | 2015-12-03 | Pelican Imaging Corporation | Systems and methods for detecting defective camera arrays, optic arrays, and sensors |
US20140002674A1 (en) | 2012-06-30 | 2014-01-02 | Pelican Imaging Corporation | Systems and Methods for Manufacturing Camera Modules Using Active Alignment of Lens Stack Arrays and Sensors |
KR102111181B1 (ko) | 2012-08-21 | 2020-05-15 | FotoNation Limited | Systems and methods for parallax detection and correction in images captured using array cameras |
CN104685513B (zh) | 2012-08-23 | 2018-04-27 | Pelican Imaging Corporation | Feature-based high-resolution motion estimation from low-resolution images captured using an array source |
US9214013B2 (en) | 2012-09-14 | 2015-12-15 | Pelican Imaging Corporation | Systems and methods for correcting user identified artifacts in light field images |
EP4307659A1 (en) | 2012-09-28 | 2024-01-17 | Adeia Imaging LLC | Generating images from light fields utilizing virtual viewpoints |
US9143711B2 (en) | 2012-11-13 | 2015-09-22 | Pelican Imaging Corporation | Systems and methods for array camera focal plane control |
WO2014130849A1 (en) | 2013-02-21 | 2014-08-28 | Pelican Imaging Corporation | Generating compressed light field representation data |
WO2014133974A1 (en) | 2013-02-24 | 2014-09-04 | Pelican Imaging Corporation | Thin form computational and modular array cameras |
US9638883B1 (en) | 2013-03-04 | 2017-05-02 | Fotonation Cayman Limited | Passive alignment of array camera modules constructed from lens stack arrays and sensors based upon alignment information obtained during manufacture of array camera modules using an active alignment process |
US9917998B2 (en) | 2013-03-08 | 2018-03-13 | Fotonation Cayman Limited | Systems and methods for measuring scene information while capturing images using array cameras |
US8866912B2 (en) | 2013-03-10 | 2014-10-21 | Pelican Imaging Corporation | System and methods for calibration of an array camera using a single captured image |
US9519972B2 (en) | 2013-03-13 | 2016-12-13 | Kip Peli P1 Lp | Systems and methods for synthesizing images from image data captured by an array camera using restricted depth of field depth maps in which depth estimation precision varies |
WO2014164550A2 (en) | 2013-03-13 | 2014-10-09 | Pelican Imaging Corporation | System and methods for calibration of an array camera |
WO2014164909A1 (en) | 2013-03-13 | 2014-10-09 | Pelican Imaging Corporation | Array camera architecture implementing quantum film sensors |
US9106784B2 (en) | 2013-03-13 | 2015-08-11 | Pelican Imaging Corporation | Systems and methods for controlling aliasing in images captured by an array camera for use in super-resolution processing |
WO2014153098A1 (en) | 2013-03-14 | 2014-09-25 | Pelican Imaging Corporation | Photmetric normalization in array cameras |
US9578259B2 (en) | 2013-03-14 | 2017-02-21 | Fotonation Cayman Limited | Systems and methods for reducing motion blur in images or video in ultra low light with array cameras |
US9497370B2 (en) | 2013-03-15 | 2016-11-15 | Pelican Imaging Corporation | Array camera architecture implementing quantum dot color filters |
US10122993B2 (en) | 2013-03-15 | 2018-11-06 | Fotonation Limited | Autofocus system for a conventional camera that uses depth information from an array camera |
JP2016524125A (ja) | 2013-03-15 | 2016-08-12 | Pelican Imaging Corporation | Systems and methods for stereoscopic imaging using a camera array |
US9497429B2 (en) | 2013-03-15 | 2016-11-15 | Pelican Imaging Corporation | Extended color processing on pelican array cameras |
US9445003B1 (en) | 2013-03-15 | 2016-09-13 | Pelican Imaging Corporation | Systems and methods for synthesizing high resolution images using image deconvolution based on motion and depth information |
US9633442B2 (en) | 2013-03-15 | 2017-04-25 | Fotonation Cayman Limited | Array cameras including an array camera module augmented with a separate camera |
JP6210748B2 (ja) * | 2013-06-17 | 2017-10-11 | Canon Inc. | Three-dimensional position measuring apparatus, and method for determining calibration deviation of a three-dimensional position measuring apparatus |
US9898856B2 (en) | 2013-09-27 | 2018-02-20 | Fotonation Cayman Limited | Systems and methods for depth-assisted perspective distortion correction |
US9264592B2 (en) | 2013-11-07 | 2016-02-16 | Pelican Imaging Corporation | Array camera modules incorporating independently aligned lens stacks |
US10119808B2 (en) | 2013-11-18 | 2018-11-06 | Fotonation Limited | Systems and methods for estimating depth from projected texture using camera arrays |
US9456134B2 (en) | 2013-11-26 | 2016-09-27 | Pelican Imaging Corporation | Array camera configurations incorporating constituent array cameras and constituent cameras |
WO2015134996A1 (en) | 2014-03-07 | 2015-09-11 | Pelican Imaging Corporation | System and methods for depth regularization and semiautomatic interactive matting using rgb-d images |
US9247117B2 (en) | 2014-04-07 | 2016-01-26 | Pelican Imaging Corporation | Systems and methods for correcting for warpage of a sensor array in an array camera module by introducing warpage into a focal plane of a lens stack array |
US9521319B2 (en) | 2014-06-18 | 2016-12-13 | Pelican Imaging Corporation | Array cameras and array camera modules including spectral filters disposed outside of a constituent image sensor |
CN113256730B (zh) | 2014-09-29 | 2023-09-05 | FotoNation Limited | Systems and methods for dynamic calibration of array cameras |
US9942474B2 (en) | 2015-04-17 | 2018-04-10 | Fotonation Cayman Limited | Systems and methods for performing high speed video capture and depth estimation using array cameras |
JP6158258B2 (ja) * | 2015-08-07 | 2017-07-05 | Hitachi Automotive Systems, Ltd. | In-vehicle image processing device |
US10531073B2 (en) | 2016-03-17 | 2020-01-07 | Samsung Electronics Co., Ltd. | Method and apparatus for automatic calibration of RGBZ sensors utilizing epipolar geometry and scanning beam projector |
ES2614228B2 (es) * | 2016-09-13 | 2018-01-09 | Defensya Ingeniería Internacional, S.L. | Device for creating luminous signaling in the space surrounding one or more vehicles |
US10261515B2 (en) * | 2017-01-24 | 2019-04-16 | Wipro Limited | System and method for controlling navigation of a vehicle |
US10482618B2 (en) | 2017-08-21 | 2019-11-19 | Fotonation Limited | Systems and methods for hybrid depth regularization |
JPWO2019065218A1 (ja) * | 2017-09-28 | 2020-10-22 | Koito Manufacturing Co., Ltd. | Sensor system |
US11423572B2 (en) * | 2018-12-12 | 2022-08-23 | Analog Devices, Inc. | Built-in calibration of time-of-flight depth imaging systems |
CN118033607A (zh) * | 2018-12-12 | 2024-05-14 | Analog Devices, Inc. | Built-in calibration of time-of-flight depth imaging systems |
EP3821267A4 (en) | 2019-09-17 | 2022-04-13 | Boston Polarimetrics, Inc. | SURFACE MODELING SYSTEMS AND METHODS USING POLARIZATION MARKERS |
CN114766003B (zh) | 2019-10-07 | 2024-03-26 | Boston Polarimetrics, Inc. | Systems and methods for augmenting sensor systems and imaging systems with polarization |
KR102558903B1 (ko) | 2019-11-30 | 2023-07-24 | Boston Polarimetrics, Inc. | Systems and methods for transparent object segmentation using polarization cues |
JP7462769B2 (ja) | 2020-01-29 | 2024-04-05 | Intrinsic Innovation LLC | Systems and methods for characterizing object pose detection and measurement systems |
US11797863B2 (en) | 2020-01-30 | 2023-10-24 | Intrinsic Innovation Llc | Systems and methods for synthesizing data for training statistical models on different imaging modalities including polarized images |
DE102020007645A1 (de) * | 2020-04-03 | 2021-10-07 | Daimler Ag | Method for calibrating a lidar sensor |
US11953700B2 (en) | 2020-05-27 | 2024-04-09 | Intrinsic Innovation Llc | Multi-aperture polarization optical systems using beam splitters |
US11620937B2 (en) * | 2020-07-14 | 2023-04-04 | Samsung Electronics Co., Ltd. | Light source device and light emission control method |
US11587260B2 (en) * | 2020-10-05 | 2023-02-21 | Zebra Technologies Corporation | Method and apparatus for in-field stereo calibration |
US12069227B2 (en) | 2021-03-10 | 2024-08-20 | Intrinsic Innovation Llc | Multi-modal and multi-spectral stereo camera arrays |
US12020455B2 (en) | 2021-03-10 | 2024-06-25 | Intrinsic Innovation Llc | Systems and methods for high dynamic range image reconstruction |
US11954886B2 (en) | 2021-04-15 | 2024-04-09 | Intrinsic Innovation Llc | Systems and methods for six-degree of freedom pose estimation of deformable objects |
US11290658B1 (en) | 2021-04-15 | 2022-03-29 | Boston Polarimetrics, Inc. | Systems and methods for camera exposure control |
US12067746B2 (en) | 2021-05-07 | 2024-08-20 | Intrinsic Innovation Llc | Systems and methods for using computer vision to pick up small objects |
US11689813B2 (en) | 2021-07-01 | 2023-06-27 | Intrinsic Innovation Llc | Systems and methods for high dynamic range imaging using crossed polarizers |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH03503680A (ja) * | 1988-04-12 | 1991-08-15 | Metronor AS | Optoelectronic angle measuring system |
JPH0771956A (ja) * | 1993-09-06 | 1995-03-17 | Fuji Film Micro Devices Co., Ltd. | Distance measuring device |
JP2000234926A (ja) * | 1999-02-16 | 2000-08-29 | Honda Motor Co Ltd | Stereoscopic image processing apparatus and image-region matching method therefor |
JP2004354256A (ja) * | 2003-05-29 | 2004-12-16 | Olympus Corp | Calibration deviation detecting device, and stereo camera and stereo camera system equipped with the device |
JP2007206588A (ja) * | 2006-02-06 | 2007-08-16 | National Institute Of Advanced Industrial & Technology | Aerial visible image forming device and aerial visible image forming method |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6542840B2 (en) * | 2000-01-27 | 2003-04-01 | Matsushita Electric Industrial Co., Ltd. | Calibration system, target apparatus and calibration method |
JP3650811B2 (ja) * | 2002-02-13 | 2005-05-25 | Topcon Corporation | Aerial visible image forming device |
DE10246067B4 (de) * | 2002-10-02 | 2008-01-03 | Robert Bosch Gmbh | Method and device for calibrating an image sensor system in a motor vehicle |
WO2005115017A1 (en) * | 2003-02-14 | 2005-12-01 | Lee Charles C | 3d camera system and method |
2009
- 2009-02-25 US US12/933,696 patent/US20110018973A1/en not_active Abandoned
- 2009-02-25 WO PCT/JP2009/053369 patent/WO2009119229A1/ja active Application Filing
- 2009-02-25 JP JP2010505467A patent/JPWO2009119229A1/ja active Pending
Cited By (30)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102597693B (zh) * | 2009-11-13 | 2015-04-01 | Fujifilm Corporation | Distance measuring device, distance measuring method, distance measuring program, distance measuring system, and imaging device |
CN102597693A (zh) * | 2009-11-13 | 2012-07-18 | Fujifilm Corporation | Distance measuring device, distance measuring method, distance measuring program, distance measuring system, and imaging device |
JP5214811B2 (ja) * | 2009-11-13 | 2013-06-19 | Fujifilm Corporation | Distance measuring device, distance measuring method, distance measuring program, distance measuring system, and imaging device |
US8654195B2 (en) | 2009-11-13 | 2014-02-18 | Fujifilm Corporation | Distance measuring apparatus, distance measuring method, distance measuring program, distance measuring system, and image pickup apparatus |
WO2011058876A1 (ja) * | 2009-11-13 | 2011-05-19 | Fujifilm Corporation | Distance measuring device, distance measuring method, distance measuring program, distance measuring system, and imaging device |
DE102010042821A1 (de) * | 2010-10-22 | 2012-04-26 | Robert Bosch Gmbh | Method and device for determining a base width of a stereo detection system |
DE102010042821B4 (de) * | 2010-10-22 | 2014-11-20 | Robert Bosch Gmbh | Method and device for determining a base width of a stereo detection system |
JP2014098625A (ja) * | 2012-11-14 | 2014-05-29 | Toshiba Corp | Measuring device, method, and program |
US9866819B2 (en) | 2013-06-27 | 2018-01-09 | Ricoh Company, Ltd. | Distance measuring apparatus, vehicle and method of calibration in distance measuring apparatus |
EP2818826A1 (en) | 2013-06-27 | 2014-12-31 | Ricoh Company, Ltd. | Distance measuring apparatus, vehicle and method of calibration in distance measuring apparatus |
JP2015132540A (ja) * | 2014-01-14 | 2015-07-23 | Ricoh Co., Ltd. | Ranging device and robot picking system |
TWI781109B (zh) * | 2016-08-02 | 2022-10-21 | Samsung Electronics Co., Ltd. | System and method of stereo triangulation |
JPWO2018043225A1 (ja) * | 2016-09-01 | 2019-06-24 | Panasonic Intellectual Property Management Co., Ltd. | Multi-viewpoint imaging system, three-dimensional space reconstruction system, and three-dimensional space recognition system |
JP7113294B2 (ja) | 2016-09-01 | 2022-08-05 | Panasonic Intellectual Property Management Co., Ltd. | Multi-viewpoint imaging system |
WO2018147340A1 (ja) * | 2017-02-09 | 2018-08-16 | Komatsu Ltd. | Position measurement system, work machine, and position measurement method |
JP2018128397A (ja) * | 2017-02-09 | 2018-08-16 | Komatsu Ltd. | Position measurement system, work machine, and position measurement method |
CN108700402A (zh) * | 2017-02-09 | 2018-10-23 | Komatsu Ltd. | Position measurement system, work machine, and position measurement method |
US11120577B2 (en) | 2017-02-09 | 2021-09-14 | Komatsu Ltd. | Position measurement system, work machine, and position measurement method |
WO2019124750A1 (ko) * | 2017-12-19 | 2019-06-27 | Replay Co., Ltd. | Camera calibration method for time-slice imaging and apparatus therefor |
KR101988630B1 (ko) * | 2017-12-19 | 2019-09-30 | Replay Co., Ltd. | Camera calibration method for time-slice imaging and apparatus therefor |
JPWO2020053936A1 (ja) * | 2018-09-10 | 2021-05-13 | Mitsubishi Electric Corporation | Camera installation assistance device and method, installation angle calculation method, program, and recording medium |
CN112913229A (zh) * | 2018-09-10 | 2021-06-04 | Mitsubishi Electric Corporation | Camera installation assistance device and method, installation angle calculation method, program, and recording medium |
JP7019064B2 (ja) | 2018-09-10 | 2022-02-14 | Mitsubishi Electric Corporation | Camera installation assistance device and method, installation angle calculation method, program, and recording medium |
US11259013B2 (en) | 2018-09-10 | 2022-02-22 | Mitsubishi Electric Corporation | Camera installation assistance device and method, and installation angle calculation method, and program and recording medium |
WO2020053936A1 (ja) * | 2018-09-10 | 2020-03-19 | Mitsubishi Electric Corporation | Camera installation assistance device and method, installation angle calculation method, program, and recording medium |
CN112913229B (zh) * | 2018-09-10 | 2023-04-21 | Mitsubishi Electric Corporation | Camera installation assistance device and method, installation angle calculation method, and recording medium |
CN109916279B (zh) * | 2019-03-04 | 2020-09-22 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Flatness detection method and device for a terminal cover plate, test machine, and storage medium |
CN109916279A (zh) * | 2019-03-04 | 2019-06-21 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Flatness detection method and device for a terminal cover plate, test machine, and storage medium |
JP2020204583A (ja) * | 2019-06-19 | 2020-12-24 | Subaru Corporation | Image processing device |
WO2022004248A1 (ja) * | 2020-06-30 | 2022-01-06 | Sony Group Corporation | Information processing device, information processing method, and program |
Also Published As
Publication number | Publication date |
---|---|
US20110018973A1 (en) | 2011-01-27 |
JPWO2009119229A1 (ja) | 2011-07-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2009119229A1 (ja) | Three-dimensional imaging device and calibration method for three-dimensional imaging device | |
US20200278202A1 (en) | Ranging apparatus and moving object capable of high-accuracy ranging | |
CN109416399B (zh) | 三维成像系统 | |
EP3100002B1 (en) | Camera calibration method | |
JP6111617B2 (ja) | Laser radar device | |
CN102183235B (zh) | Ranging device and module, and image capture device using the ranging device or module | |
US9201237B2 (en) | Diffraction-based sensing of mirror position | |
JP2018529102A (ja) | Lidar sensor | |
US6741082B2 (en) | Distance information obtaining apparatus and distance information obtaining method | |
CN108885099B (zh) | Ranging device capable of acquiring images and performing high-accuracy ranging, and moving object | |
JP3594706B2 (ja) | Light source position adjusting device | |
US20180290460A1 (en) | Optical deflection apparatus, head-up display apparatus, optical writing unit, image forming apparatus, and object recognition apparatus | |
EP1391778A1 (en) | Apparatus for detecting the inclination angle of a projection screen and projector comprising the same | |
JP6186863B2 (ja) | Distance measuring device and program | |
JP2008304248A (ja) | Calibration method for in-vehicle stereo camera, in-vehicle distance image generating device, and program | |
JP2018518708A (ja) | Scanning device and scanning method | |
JP2006322853A (ja) | Distance measuring device, distance measuring method, and distance measuring program | |
JP2019074535A (ja) | Calibration method, calibration device, and program | |
JP2008292278A (ja) | Optical deviation detection method for a distance detection device, and distance detection device | |
JP2013050352A (ja) | Stereo camera mounting adjustment method and stereo camera | |
JP4174154B2 (ja) | Imaging device with image stabilization function | |
JP2007170948A (ja) | Width measuring device, edge position detecting device, edge thickness measuring device, and shape measuring device | |
JP2005077391A (ja) | Position and orientation measuring device, and position and orientation measuring method | |
KR101423829B1 (ko) | Three-dimensional shape measuring apparatus and method using the amplitude of a projection grating | |
JP4098194B2 (ja) | Angle detection device and projector equipped with the same | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 09723646 Country of ref document: EP Kind code of ref document: A1 |
WWE | Wipo information: entry into national phase |
Ref document number: 2010505467 Country of ref document: JP |
WWE | Wipo information: entry into national phase |
Ref document number: 12933696 Country of ref document: US |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 09723646 Country of ref document: EP Kind code of ref document: A1 |