WO2018235256A1 - Stereo measurement device and system - Google Patents
- Publication number
- WO2018235256A1 (PCT/JP2017/023159)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- camera
- stereo
- unit
- control
- distance
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C3/00—Measuring distances in line of sight; Optical rangefinders
- G01C3/02—Details
- G01C3/06—Use of electric means to obtain final indication
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B15/00—Special procedures for taking photographs; Apparatus therefor
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B35/00—Stereoscopic photography
- G03B35/08—Stereoscopic photography by simultaneous recording
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
Definitions
- the present invention relates to a stereo measurement device and system that use a plurality of cameras.
- stereo measurement devices and systems that measure an object or an environment using a plurality of cameras have been put to practical use. For example, an in-vehicle stereo camera performs measurements such as detecting an intruder or a dangerous act, and voice notification, danger-avoidance actions, and the like are controlled based on the measurement results.
- in automatic driving, there is an example of controlling a car by measuring the region ahead with a stereo camera.
- there are also application cases of stereo cameras in the surveillance and marketing fields.
- the measurement areas of the two cameras need to be approximately the same.
- the camera parameters handled in the series of processes, such as the angle of view and the parallax depth, depend on the specifications of the two cameras, so the measurement range is fixed.
- for example, in the case of an on-vehicle camera for automatic driving, a range of 2 m to 40 m ahead is said to be the measurement range. There is thus a problem that measurement becomes impossible closer than 2 m or beyond 40 m.
- Non-Patent Document 1 proposes a stereo camera with variable parameters. By controlling the camera zoom, both near and far ranges can be measured with a single unit, expanding the measurement range.
- Patent Document 1 proposes to widen the measurement range by combining a plurality of sets of stereo cameras.
- Patent Document 2 likewise proposes that the measurement range can be widened by combining a plurality of sets of stereo cameras.
- in Patent Document 2, the position of the measurement target is estimated, calibration data prepared in advance is selected for the estimated position, and three-dimensional measurement is performed using image information such as a human face.
- Patent Document 1: JP 2006-033282 A; Patent Document 2: JP 2012-058188 A
- in Non-Patent Document 1, it is difficult to appropriately control the camera and measure according to the measurement target over a wide range from a near area to an ultra-far-distance area.
- Patent Document 1 has the problem that the cost of the measurement system is high. Also, since the specification of the combined cameras is fixed in advance, it cannot be changed later, and versatility is low.
- in Patent Document 2, it is difficult to dynamically control the camera and perform appropriate measurement on an actually moving measurement object. Further, as in Patent Document 1, calibration data must be prepared in advance, so versatility is low. In addition, within the measurement range there are cases where focusing cannot be performed due to light, disturbances, and the like; in such cases the estimated position is unclear and the measurement is affected.
- an object of the invention is therefore to provide a stereo measurement device and system that can appropriately control the camera and measure according to the measurement target over a wide region, from near to far, with a single stereo camera.
- the stereo measurement device processes left and right image information from a stereo camera unit that provides the left and right image information with left and right cameras and includes camera control units for the left and right cameras.
- the stereo measurement device includes a feature point selection unit that selects feature points from the image scenes of the left and right image information, and a calibration calculation unit that calculates calibration data by performing calibration using the selected feature points.
- it further includes a distance calculation unit that calculates the distance to the target object to be sensed using the calibration data and the left and right image information, and a control amount calculation unit that calculates, from the calculated distance and the image information, control amounts with which the camera control units of the left and right cameras of the stereo camera unit are controlled.
- the present invention is a "stereo measurement system configured by a stereo measurement device and a stereo camera unit which gives left and right image information by the left and right cameras and includes camera control units of the left and right cameras.”
- a single stereo camera can measure a wide area, from near to very far distances.
- the system can zoom the camera in and make measurements.
- the zoom factor can be controlled based on the currently measured position.
- when a person comes close, it is possible to zoom out and measure.
- FIG. 2 is a view showing an example of the arrangement of a camera control unit 12;
- FIG. 8 is a view showing an example of the flow of processing of the feature point extraction unit 13;
- FIG. 2 is a view showing an example of the arrangement of a distance calculation unit 16;
- FIG. 6 is a diagram showing a flow of processing in a zoom control unit 21 and a baseline control unit 22.
- FIG. 7 is a diagram showing a flow of processing for performing rotation control using the pan control value and the tilt control value in the pan control unit 24 and the tilt control unit 25.
- FIG. 5 is a diagram showing an example of the configuration of a stereo camera device 100 according to a second embodiment of the present invention.
- FIG. 1 is a view showing a configuration example of a stereo camera device according to Embodiment 1 of the present invention.
- the stereo camera device 100 is roughly divided into a stereo camera unit 101 and a processing unit 102 configured by a computer.
- the computer may be a device having a processing function such as a personal computer, a small built-in computer, or a computer device connectable to a network.
- the stereo camera unit 101 includes an actual image input unit 10 configured by the left and right actual image input units 10R and 10L including a lens and an imaging unit, and an image acquisition unit configured by the left and right image acquisition units 11R and 11L.
- the camera control unit 12 is composed of the left and right camera control units 12R and 12L.
- the stereo camera unit 101 can be said to be a mechanical mechanism unit mainly composed of a camera.
- the real video input unit 10 in the stereo camera unit 101 is configured to include two sets of imaging units and lenses, and inputs video from the left and right cameras of the stereo camera.
- the imaging unit in the real video input unit 10 is a mechanism including an imaging element such as a complementary metal oxide semiconductor (CMOS) or a charge coupled device (CCD), which is an image sensor.
- CMOS complementary metal oxide semiconductor
- CCD charge coupled device
- the lens is a zoomable lens. By zooming, it is possible to image a distant region, and it is also possible to image a near region.
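The trade-off behind the zoomable lens is the usual pinhole-model relation between focal length and angle of view. The sketch below assumes a simple pinhole camera and an illustrative 6.4 mm sensor width; neither value comes from the patent.

```python
import math

def horizontal_fov_deg(focal_length_mm: float, sensor_width_mm: float) -> float:
    """Horizontal angle of view for a pinhole model: 2 * atan(w / 2f)."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# Zooming in (longer focal length) narrows the angle of view,
# trading coverage of the near region for reach into the far region.
wide = horizontal_fov_deg(4.0, 6.4)   # short focal length: wide view
tele = horizontal_fov_deg(40.0, 6.4)  # long focal length: narrow view
print(round(wide, 1), round(tele, 1))
```

This is why changing the focal length changes which region (near or far) the camera can usefully image.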
- the image acquisition unit 11 has the function of converting the electronic signal input from the real video input unit 10 into an image.
- various image formats are possible, such as BMP, JPEG, and PNG; the format is determined by the system.
- the camera control unit 12 can adjust camera parameters such as the angle of view and orientation of the camera in the real video input unit 10 and the distance between the cameras.
- the scene to be photographed is not limited. For a surveillance stereo camera, a moving object, a surveillance area, or the like is the shooting scene; for an in-vehicle stereo camera, the road or the car ahead is the shooting scene. Both complex and simple scenes can be shot, and the shot image is input.
- the camera of the first embodiment may be configured as a camera that can change the focal length, the rotation angle, the distance between the cameras, and the like.
- the stereo camera device 100 can be operated with various devices and various environments.
- FIG. 2 is a view showing a configuration example of the camera control unit 12.
- the camera control unit 12 includes a zoom control unit 21, a baseline control unit 22, a rotation control unit 23, a tilt control unit 24, and a pan control unit 25.
- the zoom control unit 21 controls the lens mechanism using the zoom control value 26.
- the zoom control unit 21 changes the focal length by moving the lens to change the angle of view which is the imaging range of the camera.
- the baseline control unit 22 adjusts the distance between the cameras using the baseline control value 27.
- the tilt control unit 24 controls the angle of the camera using the tilt control value 28.
- the pan control unit 25 uses the pan control value 29 to control the pan angle of the camera.
- the pan control unit 25 and the tilt control unit 24 control the camera angle in the horizontal direction and the vertical direction, respectively.
- the rotation control unit 23 can change the camera posture by rotating the camera.
- the real image input unit 10, the image acquisition unit 11, and the camera control unit 12 are each shown with the left and right camera functions integrated, but they may be separated. When separated, the right image input unit 10R and the left image input unit 10L, the right image acquisition unit 11R and the left image acquisition unit 11L, and the right camera control unit 12R and the left camera control unit 12L are provided individually.
- the processing unit 102 configured by a computer in the stereo camera device 100 includes a feature point selection unit 13, a calibration calculation unit 14, a calibration data database DB1, a distance calculation unit 16, and a control amount calculation unit 17. While the stereo camera unit 101 is a mechanical mechanism unit mainly composed of a camera, the processing unit 102 configured by a computer can be said to be an electronic processing function.
- the feature point selection unit 13 executes a process of selecting feature points for calibration using the image information input from the left and right image acquisition units 11R and 11L.
- feature point selection extracts feature points from a shooting scene and selects effective points from among them; this can be realized by an existing method.
- an example of the processing flow of feature point extraction is shown in FIG. 8.
- feature points are often extracted from one scene captured at one position, but in the first embodiment, an example is shown in which feature point groups are extracted from a plurality of scenes.
- processing 33 selects feature points for the images I1, I2, and I3 in a plurality of scenes, and processing 34 selects valid feature points from the plurality of extracted feature points.
- in process 34, the incorrect feature points must be excluded and the valid feature points Pr selected.
- RANSAC (Random Sample Consensus) can be used for this.
- the point group Px consists of incorrect feature points, and Pr indicates correct feature points showing a predetermined regularity.
- the wrong feature points Px can be removed by RANSAC processing.
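A minimal RANSAC sketch of this Px/Pr separation, using line fitting as the "predetermined regularity" (the patent does not specify the model; the data here are illustrative):

```python
import random

def ransac_line(points, iters=200, tol=0.5, seed=0):
    """Select inliers Pr consistent with a line model; points far from the
    best line are treated as incorrect feature points Px and removed."""
    rng = random.Random(seed)
    best_inliers = []
    for _ in range(iters):
        (x1, y1), (x2, y2) = rng.sample(points, 2)
        # line through the two samples: a*x + b*y + c = 0
        a, b, c = y2 - y1, x1 - x2, x2 * y1 - x1 * y2
        norm = (a * a + b * b) ** 0.5
        if norm == 0:
            continue
        inliers = [p for p in points
                   if abs(a * p[0] + b * p[1] + c) / norm < tol]
        if len(inliers) > len(best_inliers):
            best_inliers = inliers
    return best_inliers

# 8 feature points obeying a regularity (y = x) plus 2 gross outliers Px
pts = [(i, i) for i in range(8)] + [(1.0, 9.0), (6.0, -4.0)]
inliers = ransac_line(pts)
print(len(inliers))
```

The same random-sample-and-score loop applies to any model (homography, fundamental matrix) used to validate calibration feature points.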
- the calibration calculation unit 14 performs camera calibration using the feature points output from the feature point selection unit 13.
- Camera calibration is a process of calculating internal camera parameters and external camera parameters.
- f indicates the focal length
- ⁇ indicates the aspect ratio
- s indicates the skew
- (u0, v0) indicates the image center coordinates.
- R is a rotation matrix indicating the orientation of the camera
- T is a translation matrix indicating the positional relationship between the cameras.
- R, indicating the orientation of the camera, is defined by Euler angles and is represented by three parameters, pan θ, tilt φ, and roll ψ, which are the camera installation angles. Calibration techniques are required to calculate the internal and external parameters. Conventionally, the method described in "Z. ..." can be used.
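Composing R from the three installation angles can be sketched as below. The axis assignment and composition order are one common convention, assumed here for illustration; the patent does not fix them.

```python
import math

def rotation_from_euler(pan, tilt, roll):
    """Compose R (3x3) from camera installation angles in radians:
    pan about y, tilt about x, roll about z. Order assumed: R = Rz Rx Ry."""
    cp, sp = math.cos(pan), math.sin(pan)
    ct, st = math.cos(tilt), math.sin(tilt)
    cr, sr = math.cos(roll), math.sin(roll)
    ry = [[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]]
    rx = [[1, 0, 0], [0, ct, -st], [0, st, ct]]
    rz = [[cr, -sr, 0], [sr, cr, 0], [0, 0, 1]]
    def matmul(a, b):
        return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
                for i in range(3)]
    return matmul(rz, matmul(rx, ry))

R = rotation_from_euler(0.1, 0.05, 0.0)
# sanity check: a proper rotation matrix has determinant +1
det = (R[0][0] * (R[1][1] * R[2][2] - R[1][2] * R[2][1])
     - R[0][1] * (R[1][0] * R[2][2] - R[1][2] * R[2][0])
     + R[0][2] * (R[1][0] * R[2][1] - R[1][1] * R[2][0]))
print(round(det, 6))
```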
- the calculated camera parameters are recorded as calibration data D1 in the calibration data database DB1.
- the feature point selecting unit 13 selects feature points from the re-input scene.
- the calibration calculation unit 14 also calculates the calibration data D1 again, and registers the latest calibration data D1 in the calibration database DB1. Then, calibration data is used in accordance with camera parameters controlled by the camera.
- FIG. 4 is a view showing a configuration example of the distance calculation unit 16.
- the distance calculation unit 16 calculates distance data P1 using the left and right images from the left and right image acquisition units 11R and 11L and the calibration data D1.
- the distance data P1 is obtained by sequentially executing the processing of the distortion correction unit 44, the parallelization processing unit 45, and the distance calculation unit 46.
- K is an internal parameter matrix
- f is a focal length
- a is an aspect ratio
- s is a skew
- (vc, uc) is a center coordinate of image coordinates.
- D is an external parameter matrix
- (r11, r12, r13, r21, r22, r23, r31, r32, r33) indicate the orientation of the camera
- (tx, ty, tz) indicates the positional relationship between the cameras.
- the distortion correction unit 44, performing the first process in the distance calculation unit 16 of FIG. 4, corrects distortion of the camera image using the camera parameters. Then, the parallel processing unit 45 rectifies (parallelizes) the left and right camera images.
- the three-dimensional measurement values of the measurement object, including the distance, can then be calculated by equation (4).
- (xl, yl) and (xr, yr) are pixel values on the left and right camera images.
- f is the focal length
- B is the baseline
- d is the disparity: the difference between the positions at which the same three-dimensional point is projected onto the left and right camera images.
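For rectified cameras, the equation (4) relations reduce to the standard triangulation Z = f·B/d. A minimal sketch with illustrative values (f = 800 px, B = 0.15 m; these numbers are not from the patent):

```python
def triangulate(xl, yl, xr, f, B):
    """Recover world coordinates from a rectified stereo pair.
    d = xl - xr is the disparity; Z = f * B / d as in equation (4)."""
    d = xl - xr
    if d <= 0:
        raise ValueError("disparity must be positive for a point in front")
    Z = f * B / d
    X = xl * B / d
    Y = yl * B / d
    return X, Y, Z

# a 20-pixel disparity with f = 800 px and a 15 cm baseline
X, Y, Z = triangulate(xl=420.0, yl=100.0, xr=400.0, f=800.0, B=0.15)
print(round(Z, 3))
```

Note how Z grows as d shrinks: distant objects have small disparities, which is why accuracy falls with distance and why the embodiments enlarge the baseline B for far targets.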
- the relationship between world coordinates (X, Y, Z) and image coordinates (u, v) is as shown in equation (5).
- the determination of the world coordinates (X, Y, Z) means that the distance data P1 has been acquired.
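The equation (5) relation between world coordinates (X, Y, Z) and image coordinates (u, v) is the projection x ~ K [R | t] X. A sketch under illustrative parameters (f = 800 px, zero skew, image center (640, 360), identity orientation; none of these values come from the patent):

```python
def project(K, R, t, X):
    """Project world point X = (X, Y, Z) to image coordinates (u, v)
    via camera coordinates Xc = R X + t, then the internal matrix K."""
    Xc = [sum(R[i][j] * X[j] for j in range(3)) + t[i] for i in range(3)]
    fx, s, uc = K[0][0], K[0][1], K[0][2]
    fy, vc = K[1][1], K[1][2]
    u = (fx * Xc[0] + s * Xc[1]) / Xc[2] + uc
    v = fy * Xc[1] / Xc[2] + vc
    return u, v

K = [[800, 0, 640], [0, 800, 360], [0, 0, 1]]  # internal parameters
R = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]          # external: orientation
t = [0, 0, 0]                                  # external: position
u, v = project(K, R, t, (1.0, 0.5, 4.0))
print(u, v)
```

Inverting this mapping from two views is exactly what the distance calculation unit does: with K, R, t known from calibration, matched (u, v) pairs determine (X, Y, Z).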
- the control amount calculation unit 17 calculates control values using the calculated distance data P1 and the image data. The camera control unit 12 then performs control using each control value.
- FIG. 5 shows a flowchart regarding control amount calculation and control processing in the zoom control unit 21 and the baseline control unit 22.
- processing step S51 which is the first processing in the zoom control unit 21 and the baseline control unit 22 shown in FIG. 5, images are input from the left and right cameras.
- processing step S52 the distance data P1 measured by the distance calculation unit 16 is stored in the distance database DB2.
- processing step S54 the stability of the distance data P1 is evaluated. For example, if the distance data P1 is stored several times (here five times), it is determined to be stable.
- processing step S55 the average distance is calculated using the distance data determined to be stable, and in processing step S56, the zoom control unit 21 and the baseline control unit 22 calculate zoom and baseline control amounts.
- a reference value can be set in accordance with the zoom of the camera: if the distance is short, the zoom factor a is decreased, and if the distance is long, the zoom factor a is increased.
- the control processing is executed repeatedly until image acquisition ends. When there are no more images, image acquisition ends in processing step S58 and the control processing also ends.
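Steps S52 to S57 can be sketched as a small controller: buffer the distances, declare stability after five samples, then derive the control amounts from the average. The specific distance-to-zoom/baseline mapping below is illustrative, not the patent's formula.

```python
from collections import deque

class ZoomBaselineController:
    """Buffer distance data (S52), evaluate stability over five samples
    (S54), then compute zoom and baseline control amounts (S55-S57)."""
    def __init__(self, window=5):
        self.buffer = deque(maxlen=window)

    def update(self, distance_m):
        self.buffer.append(distance_m)
        if len(self.buffer) < self.buffer.maxlen:
            return None  # not yet judged stable
        avg = sum(self.buffer) / len(self.buffer)
        # near target -> small zoom factor, far target -> large zoom factor
        zoom = max(1.0, min(10.0, avg / 10.0))
        baseline_cm = 15.0 + zoom  # longer baseline for distant targets
        return avg, zoom, baseline_cm

ctrl = ZoomBaselineController()
result = None
for d in [39.0, 40.0, 41.0, 40.0, 40.0]:  # five stable-ish measurements
    result = ctrl.update(d)
print(result)
```

Returning `None` until the window fills mirrors the flow's stability gate: control amounts are only computed from distances judged stable.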
- FIG. 6 is a diagram showing a flow of processing for performing rotation control using the pan control value and the tilt control value in the pan control unit 24 and the tilt control unit 25.
- processing step S61 in the rotation control shown in FIG. 6, the left and right images from the left and right image acquiring units 11R and 11L are acquired, and in processing step S62, the inter-frame difference is calculated. If there is a difference, it indicates that there is movement such as human movement in the image.
- Processing step S63 extracts the motion area. For example, the image area is divided into a plurality of areas, and movement of a person or the like is detected between the divided small areas.
- processing step S64 if there is area movement, the amount of movement is calculated in processing step S65, and rotation control of pan and tilt is performed in processing step S66.
- processing step S67 control processing is repeated until the end of image acquisition. If there is no image, the process proceeds to processing step S68, the image acquisition ends, and the control processing also ends.
- the pan control value p and the tilt control value t can be calculated as angles from the movement distance using the triangular relationship of equation (6).
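The triangular relationship of equation (6) amounts to: a lateral movement dx and a vertical movement dy at range Z subtend angles atan(dx/Z) and atan(dy/Z). A sketch with illustrative numbers (a person 10 m away moving 1 m right and 0.5 m up; not from the patent):

```python
import math

def pan_tilt_from_motion(dx_m, dy_m, distance_m):
    """Equation-(6)-style triangular relation: convert a movement at a
    known range into pan and tilt control angles (degrees)."""
    p = math.degrees(math.atan2(dx_m, distance_m))  # pan control value
    t = math.degrees(math.atan2(dy_m, distance_m))  # tilt control value
    return p, t

p, t = pan_tilt_from_motion(1.0, 0.5, 10.0)
print(round(p, 2), round(t, 2))
```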
- ideally, the stereo cameras are synchronized and the calculated distance is stable.
- in practice, however, the left and right cameras may be asynchronous. In that case, the distance during movement is unstable, and control based on distance is difficult.
- FIG. 7 is a diagram showing the flow of control processing in an asynchronous stereo camera.
- processing step S71 which is the first processing step in FIG. 7, left and right camera images are acquired.
- processing step S72 distortion correction of the camera is performed.
- processing step S73 disparity calculation processing is performed.
- processing step S74 inter-frame difference processing of the left and right cameras is performed.
- if there is no difference in processing step S75 (Y side), the scene is judged to be stationary and the process moves to processing step S710; if there is a difference (N side), it is judged to be moving and the process moves to processing step S76.
- when it is determined that there is a difference (N side) in processing step S75, movement of a person or the like is assumed. In processing step S76, the movement area on the image is calculated. In processing step S77, the current pan and tilt values of the camera are calculated. In processing step S78, the calculated movement area is compared with the current values to calculate a difference and determine the position of the person or the like. In processing step S79, the pan and tilt control values are calculated based on the difference, and rotation control is performed.
- if the scene is stationary (Y side) in processing step S75, distance outliers are rejected in processing step S710, and only normal distance data for five consecutive stationary frames is stored in processing step S711.
- in processing step S712, stability is evaluated. If stable (Y side), the average distance is calculated in processing step S713, and the zoom control value and baseline control value are calculated. Zoom and baseline control are performed in processing step S714 using these control values.
- the control processing is performed repeatedly until image acquisition ends. When there are no more images, image acquisition ends in processing step S716 and the control processing also ends.
- FIG. 8 is a view showing a configuration example of a stereo camera device 100 according to a second embodiment of the present invention.
- the functions from the real video input unit 10 to the control amount calculation unit 17 in the configuration shown in FIG. 1 are the same as in the first embodiment, and thus the description thereof is omitted.
- the stereo camera unit 101, which is a mechanical mechanism unit, and the processing unit 102, which is an electronic processing function, are communicably connected to each other via a network 81.
- calibration is performed on a camera over the network 81.
- the distance of the object measured by the camera is calculated.
- this camera system configuration allows the camera to be controlled remotely via the network 81 using the calculated distance, making it possible to measure remotely regardless of the installation place or the measurement place.
- FIG. 9 is a view showing a stereo camera device 100 according to a third embodiment of the present invention.
- in the third embodiment, the camera is controlled while scanning a wide range so that the accuracy of the measured distance can be improved.
- the accuracy is high at near distances and decreases as the distance increases.
- the camera first scans the near area 94 with the camera spec 91. The camera spec is then changed to 92, and the far area 95 is scanned. Similarly, the camera spec is changed to 93, and the farther area 96 is scanned. By integrating the scanned results, results for the entire area 97 are obtained, and the distance accuracy can be kept high over all areas. In this way, various applications such as map construction and environment scanning for future automatic driving become possible.
- FIG. 10 is a view showing a stereo camera device 100 according to a fourth embodiment of the present invention.
- the fourth embodiment is an embodiment in which a sensing object is tracked and appropriate measurement is performed while controlling a camera.
- a moving object M such as a person or a car is used as a target object to be measured.
- the movement path of the object M to be sensed is denoted g; the object is assumed to move gradually away from a near position.
- the sensing system tracks the target object M while controlling the pan, tilt, zoom, and baseline of the camera. This makes it possible to measure a wide area, from near to very far distances, with one stereo camera, and to sense appropriately according to the movement of the object.
- for example, at the near position the zoom is 1× and the baseline is 15 cm; at the middle-distance position the zoom is 5×, the baseline is 20 cm, and the angle is 20°; at the far position the zoom is 10×, the baseline is 25 cm, and the angle is 10°.
- FIG. 11 is a view showing a stereo camera device 100 according to a fifth embodiment of the present invention.
- Example 5 is an example of a sensing system.
- a stereo camera 101 and a computer 111 are connected by a network 81.
- the image is sent to the computer 111 via the network 81.
- the computer 111 performs the process shown in FIG. 1 using the received image.
- the zoom control unit 115 controls the focal length of the camera.
- the baseline control unit 116 controls the baseline stage 113.
- the rotation control unit 117 controls the rotation stage 114. Those units are connected to the computer 111 and perform control according to the processing result.
- the image acquisition unit 11 of the camera acquires images at, for example, 30 frames per second; suppose the image processing of the second frame is about to be performed.
- the image of the second frame is simultaneously given from the image acquisition unit 11 to the feature point selection unit 13.
- the distance calculation unit 16 detects the distance to the target object from the left and right images and the movement of the target object in the image; when there is movement, the camera is driven to a new position via the control amount calculation unit 17 and the camera control unit 12.
- the distance calculation unit 16 refers to the calibration data database DB1 to obtain the calibration data D2 used for correction when calculating the distance; the latest calibration data D2 has been obtained from the image processing of at least the first frame by the feature point selection unit 13 and the calibration calculation unit 14. That is, the distance calculation unit 16 calculates the distance from the image at the current time and the latest calibration data D2 obtained by processing at a past time.
- the calibration data D2 obtained from the image of the second frame is stored in the calibration data database DB1 and used in the processing of the distance calculation unit 16 for the third frame.
- the calibration data is recalculated at the timing when the camera is controlled. This makes it possible to calculate distance data with high accuracy, and to control the camera appropriately for further measurement according to the present distance of the measurement object.
- the stereo camera device 100 can calibrate the latest situation and reflect it on the distance control.
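The frame timing described above, where the distance at frame k is computed with calibration data from a past frame and the calibration is then refreshed for frame k+1, can be sketched as follows. The scale-factor "calibration" and the frame fields are illustrative stand-ins, not the patent's data structures:

```python
def run_pipeline(frames, initial_calibration):
    """Each frame's distance uses the *latest past* calibration; the
    calibration is then recomputed from that frame for the next one,
    mirroring the Example-5 timing (frame 2 uses frame 1's data, etc.)."""
    calib = initial_calibration
    log = []
    for k, frame in enumerate(frames, start=1):
        distance = frame["raw_range"] * calib  # distance unit: old calib
        log.append((k, distance))
        calib = frame["calib_scale"]           # recalibrate for next frame
    return log

frames = [{"raw_range": 10.0, "calib_scale": 1.5},
          {"raw_range": 10.0, "calib_scale": 1.25}]
log = run_pipeline(frames, initial_calibration=1.0)
print(log)
```

The second frame's distance reflects the calibration computed from the first frame, which is the one-frame lag the embodiment describes.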
Abstract
The invention relates to a stereo measurement system that enables measurement over a wide range, from short to long distance, using a single stereo camera. The invention also relates to a stereo measurement device for processing left and right image information from a stereo camera unit that provides the left and right image information using left and right cameras and that includes a camera control unit for the left and right cameras, the stereo measurement device being characterized in that it comprises: a feature point selection unit that selects feature points from the image scenes of the left and right image information; a calibration calculation unit that performs calibration using the selected feature points, thereby calculating calibration data; a distance calculation unit that uses the calculated calibration data and the left and right image information to calculate the distance to a target object to be sensed; and a control amount calculation unit that calculates, from the calculated distance and the image information, control amounts with which the camera control unit controls the left and right cameras, the control amounts being supplied to the camera control unit of the stereo camera unit for the left and right cameras.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2019524821A JP6734994B2 (ja) | 2017-06-23 | 2017-06-23 | ステレオ計測装置及びシステム |
PCT/JP2017/023159 WO2018235256A1 (fr) | 2017-06-23 | 2017-06-23 | Dispositif et système de mesure stéréo |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2017/023159 WO2018235256A1 (fr) | 2017-06-23 | 2017-06-23 | Dispositif et système de mesure stéréo |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018235256A1 true WO2018235256A1 (fr) | 2018-12-27 |
Family
ID=64737655
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2017/023159 WO2018235256A1 (fr) | 2017-06-23 | 2017-06-23 | Dispositif et système de mesure stéréo |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP6734994B2 (fr) |
WO (1) | WO2018235256A1 (fr) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111915656A (zh) * | 2019-05-09 | 2020-11-10 | Tracking device, information processing method, readable storage medium, and electronic apparatus
CN111915638A (zh) * | 2019-05-09 | 2020-11-10 | Tracking device, information processing method, readable storage medium, and electronic apparatus
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004096488A (ja) * | 2002-08-30 | 2004-03-25 | Fujitsu Ltd | Object detection device, object detection method, and object detection program |
JP2004310595A (ja) * | 2003-04-09 | 2004-11-04 | Ntt Data Corp | Moving object detection device and moving object detection method |
JP2007263669A (ja) * | 2006-03-28 | 2007-10-11 | Denso It Laboratory Inc | Three-dimensional coordinate acquisition device |
JP2010050582A (ja) * | 2008-08-20 | 2010-03-04 | Tokyo Institute Of Technology | Long-range target search camera system |
JP2012103741A (ja) * | 2010-11-05 | 2012-05-31 | Sony Corp | Imaging device, image processing device, image processing method, and program |
JP2013105002A (ja) * | 2011-11-14 | 2013-05-30 | Bi2-Vision株式会社 | 3D video shooting control system, 3D video shooting control method, and program |
JP2016099941A (ja) * | 2014-11-26 | 2016-05-30 | 日本放送協会 | Object position estimation system and program therefor |
JP2017040549A (ja) * | 2015-08-19 | 2017-02-23 | シャープ株式会社 | Image processing device and error determination method |
2017
- 2017-06-23: JP national-phase application JP2019524821A filed; granted as patent JP6734994B2 (status: Active)
- 2017-06-23: PCT application PCT/JP2017/023159 filed, published as WO2018235256A1 (status: Application Filing)
Also Published As
Publication number | Publication date |
---|---|
JPWO2018235256A1 (ja) | 2020-05-21 |
JP6734994B2 (ja) | 2020-08-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10455141B2 (en) | Auto-focus method and apparatus and electronic device | |
CN109922251B (zh) | Rapid snapshot method, device, and system | |
KR102143456B1 (ko) | Depth information acquisition method and apparatus, and image capture device | |
KR102149276B1 (ko) | Image registration method | |
JP4852591B2 (ja) | Stereoscopic image processing device, method, recording medium, and stereoscopic imaging device | |
CN106960454B (zh) | Depth-of-field obstacle avoidance method, device, and unmanned aerial vehicle | |
US20070189750A1 (en) | Method of and apparatus for simultaneously capturing and generating multiple blurred images | |
US8648961B2 (en) | Image capturing apparatus and image capturing method | |
KR101095361B1 (ko) | Imaging device, imaging control method, and recording medium | |
US9619886B2 (en) | Image processing apparatus, imaging apparatus, image processing method and program | |
KR20150050172A (ko) | Apparatus and method for dynamically selecting multiple cameras to track an object of interest | |
US20130162786A1 (en) | Image processing apparatus, imaging apparatus, image processing method, and program | |
JP2002298142A (ja) | Person image detection method, recording medium storing a program for executing the method, person image detection device, and photographing device equipped with the same | |
JP2009077092A (ja) | Multi-camera system | |
JP6694281B2 (ja) | Stereo camera and imaging system | |
JP7378219B2 (ja) | Imaging device, image processing device, control method, and program | |
JP6694234B2 (ja) | Distance measuring device | |
JP6734994B2 (ja) | Stereo measurement device and system | |
JP2019062340A (ja) | Image shake correction device and control method | |
US20130093856A1 (en) | Stereoscopic imaging digital camera and method of controlling operation of same | |
US20130076868A1 (en) | Stereoscopic imaging apparatus, face detection apparatus and methods of controlling operation of same | |
WO2016047220A1 (fr) | Imaging device and imaging method | |
JP2018205008A (ja) | Camera calibration device and camera calibration method | |
CN111080689B (zh) | Method and device for determining a facial depth map | |
WO2019130409A1 (fr) | Stereo measurement device and method |
Legal Events
Date | Code | Title | Description
---|---|---|---
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 17915178; Country of ref document: EP; Kind code of ref document: A1
| ENP | Entry into the national phase | Ref document number: 2019524821; Country of ref document: JP; Kind code of ref document: A
| NENP | Non-entry into the national phase | Ref country code: DE
| 122 | Ep: pct application non-entry in european phase | Ref document number: 17915178; Country of ref document: EP; Kind code of ref document: A1