WO2022249534A1 - Information processing device, information processing method, and program - Google Patents
- Publication number: WO2022249534A1 (PCT/JP2022/000907)
- Authority: WIPO (PCT)
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
Definitions
- the present disclosure relates to an information processing device, an information processing method, and a program.
- Camera parameters include, for example, external parameters that depend on the positions and orientations of multiple cameras. Therefore, when the positions and orientations of multiple cameras deviate due to the influence of disturbances such as environmental temperature changes and vibrations, the accuracy of external parameters may be reduced, and the accuracy of distance measurement value estimation may also be reduced. Therefore, techniques have been developed to detect whether or not there is a deviation in the position or orientation of the camera.
- Patent Literature 1 discloses a technique of estimating camera parameters using movement distance information of a moving body equipped with a plurality of cameras and a plurality of images.
- However, the technique of Patent Literature 1 requires that the moving object be equipped with a plurality of cameras and a sensor capable of acquiring movement distance information. Therefore, it is difficult to apply the technique to a moving object that does not have such a sensor.
- the present disclosure proposes a new and improved information processing device, information processing method, and program that can more simply detect whether or not a deviation related to camera parameters has occurred in a plurality of cameras.
- Provided is an information processing device comprising: a calculation unit that calculates three-dimensional positions of feature points included in each image based on each image obtained by photographing a subject at a first timing by a plurality of cameras and the extrinsic parameters of the plurality of cameras; a first estimation unit that estimates, based on one image included in each image obtained by photographing the subject at a second timing by the plurality of cameras and the three-dimensional positions of the feature points, a first extrinsic parameter, which is an extrinsic parameter at the second timing of the camera that captured the one image among the plurality of cameras; a second estimation unit that estimates, based on the first extrinsic parameter estimated by the first estimation unit, a second extrinsic parameter, which is an extrinsic parameter of any one of the plurality of cameras at the first timing; and a determination unit that determines, based on the estimated second extrinsic parameter and the previously set extrinsic parameter of that camera, whether or not a deviation related to the extrinsic parameters has occurred in the plurality of cameras.
- Also provided is an information processing method including: calculating three-dimensional positions of feature points included in each image based on each image obtained by photographing a subject at a first timing by a plurality of cameras and the extrinsic parameters of the plurality of cameras; estimating, based on one image included in each image obtained by photographing the subject at a second timing by the plurality of cameras and the three-dimensional positions of the feature points, a first extrinsic parameter, which is an extrinsic parameter at the second timing of the camera that captured the one image among the plurality of cameras; estimating, based on the estimated first extrinsic parameter, a second extrinsic parameter, which is an extrinsic parameter of any one of the plurality of cameras at the first timing; and determining, based on the estimated second extrinsic parameter of the one camera and the previously set extrinsic parameter of that camera, whether or not a deviation related to the extrinsic parameters has occurred in the plurality of cameras.
- Further provided is a program causing a computer to realize: a calculation function of calculating three-dimensional positions of feature points included in each image based on each image obtained by photographing a subject at a first timing by a plurality of cameras and the extrinsic parameters of the plurality of cameras; a first estimation function of estimating, based on one image included in each image obtained by photographing the subject at a second timing by the plurality of cameras and the three-dimensional positions of the feature points, a first extrinsic parameter, which is an extrinsic parameter at the second timing of the camera that captured the one image among the plurality of cameras; a second estimation function of estimating, based on the estimated first extrinsic parameter, a second extrinsic parameter, which is an extrinsic parameter of any one of the plurality of cameras at the first timing; and a determination function of determining, based on the estimated second extrinsic parameter of the one camera and the previously set extrinsic parameter of that camera, whether or not a deviation related to the extrinsic parameters has occurred in the plurality of cameras.
- FIG. 1 is an explanatory diagram for explaining an example of an information processing system according to the present disclosure.
- FIG. 2 is an explanatory diagram for explaining a functional configuration example of an information processing device 30 according to the present disclosure.
- FIG. 3 is an explanatory diagram for explaining an example of distance measurement processing using the principle of triangulation.
- FIG. 4 is an explanatory diagram for explaining an example of operation processing of the information processing device 30 according to the present disclosure.
- FIG. 5 is an explanatory diagram for explaining an example of imaging by the stereo camera 10 according to the present disclosure.
- FIG. 6 is an explanatory diagram for explaining an example of operation processing related to image processing according to the present disclosure.
- FIG. 7 is an explanatory diagram for explaining an example of a feature point detection method according to the Harris method.
- FIG. 8 is an explanatory diagram for explaining an example of operation processing related to determination of a set of feature points according to the present disclosure.
- FIG. 9 is an explanatory diagram for explaining a specific example of correlation calculation according to the present disclosure.
- FIG. 10 is an explanatory diagram for explaining an example of operation processing related to determination of whether or not an image group is suitable for deviation determination according to the present disclosure.
- FIG. 11 is an explanatory diagram for explaining a first process example of a provisional determination method as to whether or not a deviation has occurred in the stereo camera 10 according to the present disclosure.
- FIG. 12 is an explanatory diagram for explaining a second process example of a provisional determination method as to whether or not a deviation has occurred in the stereo camera 10 according to the present disclosure.
- FIG. 13 is an explanatory diagram for explaining a third process example of a provisional determination method as to whether or not a deviation has occurred in the stereo camera 10 according to the present disclosure.
- FIG. 14 is an explanatory diagram for explaining an example of a final determination as to whether or not a deviation has occurred according to the present disclosure.
- FIG. 15 is an explanatory diagram for explaining an example of notification information generated by a notification information generation unit 351.
- FIG. 16 is a block diagram showing the hardware configuration of the information processing device 30.
- FIG. 1 is an explanatory diagram for explaining an example of an information processing system according to the present disclosure.
- An information processing system according to the present disclosure includes a network 1, a mobile object 5, a stereo camera 10, an information processing device 30, and an information terminal TB.
- a network 1 is a wired or wireless transmission path for information transmitted from devices connected to the network 1 .
- the network 1 may include a public line network such as the Internet, a telephone line network, a satellite communication network, various LANs (Local Area Networks) including Ethernet (registered trademark), WANs (Wide Area Networks), and the like.
- the network 1 may also include a dedicated line network such as IP-VPN (Internet Protocol-Virtual Private Network).
- the information terminal TB and the information processing device 30 are connected via the network 1 .
- the moving body 5 is a device that moves under autonomous control or user operation.
- the mobile object 5 may be, for example, a drone as shown in FIG. 1. Also, the mobile object 5 may be a car, a ship, or an aircraft.
- the stereo camera 10 acquires an image by photographing a subject. Also, the stereo camera 10 acquires information in the depth direction of the subject by mounting two cameras side by side.
- Hereinafter, the camera mounted on the left side facing the subject will be referred to as the left camera 15A, and the camera mounted on the right side facing the subject will be referred to as the right camera 15B.
- the left camera 15A and the right camera 15B may be collectively referred to as the stereo camera 10 when there is no particular need to distinguish them.
- In the present disclosure, two cameras are described as the plurality of cameras mounted on the moving body 5, but the number of cameras mounted on the moving body 5 is not limited to such an example.
- the number of cameras mounted on the moving body 5 may be three or more.
- the information processing device 30 estimates the extrinsic parameters of the left camera 15A or the right camera 15B based on each image obtained by photographing the subject with the stereo camera 10 at a plurality of timings. Further, the information processing device 30 determines whether or not a deviation related to the external parameters has occurred in the left camera 15A or the right camera 15B based on the estimated external parameters and the external parameters at the time of previous setting. That is, it is determined whether or not there is a discrepancy between the actual installation position or orientation of the left camera 15A or right camera 15B and the installation position or orientation corresponding to the set external parameters.
- the information terminal TB is a terminal used by the user OP.
- the information terminal TB may be, for example, a tablet terminal as shown in FIG. 1, or various devices such as a smart phone and a PC (Personal Computer).
- the display provided in the information terminal TB displays images obtained by shooting with the stereo camera 10 . Further, the information terminal TB performs remote control of the moving body 5 based on the operation by the user OP.
- FIG. 2 is an explanatory diagram for explaining a functional configuration example of the information processing device 30 according to the present disclosure.
- the moving body 5 includes a stereo camera 10, an operating device 20, and an information processing device 30. Since the functional configuration of the stereo camera 10 has been described with reference to FIG. 1, its description is omitted in FIG. 2.
- the operation device 20 is a device that operates under the control of an operation control section 355, which will be described later.
- the operating device 20 includes, for example, an engine and a braking device. A specific example of the operation of the operating device 20 will be described later.
- the information processing device 30 includes a communication unit 310, a storage unit 320, and a control unit 330, as shown in FIG.
- the communication unit 310 performs various communications with the information terminal TB. For example, the communication unit 310 receives operation information of the moving body 5 from the information terminal TB. Also, the communication unit 310 transmits notification information generated by a notification information generating unit 351, which will be described later, to the information terminal TB.
- Storage unit 320 holds software and various data.
- the storage unit 320 stores the provisional determination result determined by the deviation detection unit 347 . Further, when the number of provisional determination results stored exceeds a predetermined number, the storage unit 320 may delete the oldest determination results in order.
- control unit 330 controls overall operations of the information processing device 30 according to the present disclosure.
- the control unit 330 includes an image processing unit 331, a feature point detection unit 335, a pair determination unit 339, an estimation unit 343, a deviation detection unit 347, a notification information generation unit 351, an operation control unit 355, a distance measurement unit 359, and a distance measurement data utilization unit 363.
- the image processing unit 331 performs image processing on each image acquired by the stereo camera 10 .
- the image processing unit 331 performs various image processing such as shading correction and noise reduction on each image.
- the image processing unit 331 performs various processes such as lens distortion removal, parallelization, and cropping.
- the feature point detection unit 335 uses the stereo camera 10 to detect feature points from each image obtained by photographing the subject at a certain timing.
- the feature point detection unit 335 may detect feature points from each image using a known technique such as Harris' method or SIFT (Scale Invariant Feature Transform).
- the pair determination unit 339 is an example of a determination unit, and determines, as a set of feature points, a pair consisting of a feature point included in one image and a feature point included in the other image that have a high degree of correlation, among the images obtained by photographing the subject at a certain timing with the stereo camera 10.
- the estimating unit 343 is an example of a calculating unit, and calculates the three-dimensional positions of the feature points included in each image based on each image obtained by photographing the subject at a certain timing with the stereo camera 10 and the extrinsic parameters of the plurality of cameras.
- the estimating unit 343 is also an example of a first estimating unit, and estimates, as first extrinsic parameters, the extrinsic parameters at another timing of the camera of the stereo camera 10 that captured one image, based on that image obtained by photographing the subject at the other timing and the three-dimensional positions of the feature points.
- the estimating unit 343 is also an example of a second estimating unit, and estimates, as second extrinsic parameters, the extrinsic parameters of the left camera 15A or the right camera 15B at the above-described first timing, based on the estimated first extrinsic parameters.
- the deviation detection unit 347 is an example of a determination unit, and determines whether or not a deviation related to the extrinsic parameters has occurred in the left camera 15A or the right camera 15B, based on the estimated extrinsic parameters of the left camera 15A or the right camera 15B and the previously set extrinsic parameters of that camera.
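As a rough illustration of this determination, the difference between a previously set extrinsic parameter and a newly estimated one can be scored as a rotation angle plus a translation distance and compared against thresholds. The function names and threshold values below are illustrative assumptions, not taken from the disclosure:

```python
import numpy as np

def rotation_angle_deg(r_prev, r_est):
    # Relative rotation between the previously set and newly estimated poses.
    r_rel = r_est @ r_prev.T
    # Rotation angle from the trace, clipped for numerical safety.
    cos_theta = np.clip((np.trace(r_rel) - 1.0) / 2.0, -1.0, 1.0)
    return np.degrees(np.arccos(cos_theta))

def has_deviation(r_prev, t_prev, r_est, t_est,
                  angle_thresh_deg=0.1, trans_thresh=0.001):
    # Thresholds are illustrative; a real system would tune them to the
    # required ranging accuracy.
    angle = rotation_angle_deg(r_prev, r_est)
    shift = np.linalg.norm(t_est - t_prev)
    return angle > angle_thresh_deg or shift > trans_thresh
```

A deviation in either the mounting angle or the mounting position (the two kinds of deviation named in this disclosure) then shows up as an over-threshold angle or translation, respectively.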
- the notification information generation unit 351 is an example of a notification unit, and generates notification information regarding the deviation when the deviation detection unit 347 determines that a deviation related to the extrinsic parameters has occurred in the left camera 15A or the right camera 15B. Also, the notification information generation unit 351 causes the communication unit 310 to transmit the generated notification information to the information terminal TB of the user OP. A specific example of the notification information will be described later.
- the operation control unit 355 controls a predetermined operation of the moving body 5 on which the stereo camera 10 is mounted. For example, when the deviation detection unit 347 determines that a deviation related to the extrinsic parameters has occurred in the left camera 15A or the right camera 15B, the operation control unit 355 may control the operation device 20 to limit the speed of the moving body 5.
- the operation control unit 355 may control the operation device 20 via a control device that controls the overall operation of the moving body 5.
- the operation control unit 355 may also control the operation device 20 based on various information obtained by the distance measurement data utilization unit 363. For example, when the distance measurement data utilization unit 363 determines that the possibility of collision with an object is high, the operation control unit 355 may control the braking device to stop the moving body 5.
- the distance measurement unit 359 executes distance measurement processing for calculating the distance from the stereo camera 10 to the subject based on each image obtained by shooting with the stereo camera 10 and camera parameters.
- the ranging process according to the present disclosure may use known techniques such as the principle of triangulation.
- the ranging data usage unit 363 uses the ranging information calculated by the ranging unit 359 .
- the ranging data utilization unit 363 may determine the possibility of collision between the moving body 5 on which the stereo camera 10 is mounted and an object based on the calculated ranging information.
- Various operation policies, such as autonomous movement and movement by the operation of the user OP, can be applied to the moving object 5.
- Under any operation policy, it is desirable to estimate the distance from the moving body 5 to an object in order to reduce the possibility of the moving body 5 colliding with an object such as an obstacle or an animal while traveling.
- FIG. 3 is an explanatory diagram for explaining an example of distance measurement processing using the principle of triangulation.
- FIG. 3 shows the imaging positions of the subject P on the imaging plane SL of the left camera 15A and the imaging plane SR of the right camera 15B when the subject P is photographed by the left camera 15A and the right camera 15B.
- the imaging position PL is the position where the subject P is captured on the imaging surface SL of the left camera 15A when the left camera 15A captures the subject P.
- the imaging position PR is the position where the subject P is captured on the imaging surface SR of the right camera 15B when the right camera 15B captures the subject P at the same timing as the left camera 15A.
- the difference between the imaging position PL of the subject P in the left camera 15A and the imaging position PR of the subject P in the right camera 15B is referred to as a parallax S. It should be noted that the imaging position PL' on the imaging surface SR in FIG. 3 indicates the position corresponding to the imaging position PL on the imaging surface SL.
- the distance DS is expressed by the following formula (Equation 1) using the baseline length B, the focal length F, and the parallax S: DS = (B × F) / S.
- the parallax S is used for the calculation of the distance DS, as shown in the formula (Equation 1).
- the parallax S decreases as the distance from the stereo camera 10 to the subject P increases, and increases as the distance decreases.
- Since the parallax S is the difference between the imaging position PL of the subject P in the left camera 15A and the imaging position PR of the subject P in the right camera 15B, it depends on the extrinsic parameters set based on the positions and orientations of the left camera 15A and the right camera 15B.
- If a deviation occurs in the position or orientation of the left camera 15A or the right camera 15B, the value of the parallax S is no longer accurate, so the accuracy of estimating the distance DS by the distance measurement unit 359 may also be reduced. Therefore, it is desirable to periodically detect whether or not a deviation related to the extrinsic parameters has occurred in the left camera 15A or the right camera 15B.
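The dependence of the estimated distance on the parallax follows directly from Equation 1. The helper below is an illustrative sketch (names and units are assumptions, not from the disclosure):

```python
def depth_from_disparity(baseline_m, focal_px, disparity_px):
    """Distance DS = B * F / S (Equation 1, triangulation).

    baseline_m: baseline length B between the two cameras, in metres
    focal_px: focal length F, in pixel units
    disparity_px: parallax S, in pixels (must be positive)
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return baseline_m * focal_px / disparity_px
```

For example, with a 0.1 m baseline and a 700 px focal length, a 35 px parallax corresponds to a 2 m distance; halving the distance doubles the parallax, which is why an inaccurate parallax directly degrades the distance estimate.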
- the information processing device 30 determines whether or not a deviation related to the external parameters has occurred in the left camera 15A or the right camera 15B based on the group of images obtained by photographing the subject.
- the deviation related to the external parameter includes, for example, the deviation of the attachment angle and the deviation of the attachment position of the left camera 15A or the right camera 15B.
- FIG. 4 is an explanatory diagram for explaining an operation processing example of the information processing device 30 according to the present disclosure.
- the stereo camera 10 photographs a subject and acquires an image group including a plurality of images (S101).
- the image processing unit 331 performs image processing on each image obtained by shooting with the stereo camera 10 (S105).
- the feature point detection unit 335 detects feature points from each image (S109).
- the pair determination unit 339 determines a set of feature points with a high degree of correlation between each feature point detected from one image and each feature point detected from another image (S113).
- the estimating unit 343 determines whether or not the group of images captured by the stereo camera 10 are images suitable for deviation determination (S117).
- the deviation detection unit 347 performs provisional deviation determination processing using an image group suitable for deviation determination (S121).
- the storage unit 320 stores the provisional determination result of the deviation (S125).
- the control unit 330 determines whether or not the provisional deviation determination process has been executed using a predetermined number of image groups (S129). If the provisional deviation determination process has been performed using the predetermined number of image groups (S129/Yes), the process proceeds to S133; if not (S129/No), the process returns to S101.
- the deviation detection unit 347 makes a final determination as to whether or not a deviation has occurred in the left camera 15A or the right camera 15B based on the stored provisional determination results (S133), and the information processing apparatus 30 according to the present disclosure ends the process.
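The excerpt does not spell out how the provisional results of S121/S125 are aggregated in S133; a simple majority vote over the most recent stored results is one plausible sketch. The required count and ratio below are illustrative assumptions:

```python
def final_deviation_determination(provisional_results, required=10, ratio=0.5):
    """Sketch of S129/S133.

    provisional_results: list of booleans, True meaning "a deviation was
    provisionally detected" for one image group (stored at S125).
    Returns None while fewer than `required` results exist (S129/No),
    otherwise True/False as the final determination (S133).
    """
    if len(provisional_results) < required:
        return None  # keep acquiring image groups (back to S101)
    recent = provisional_results[-required:]
    return sum(recent) / required > ratio
```

Voting over several image groups matches the motivation of S125/S129: a single noisy image group cannot by itself trigger (or suppress) the final deviation determination.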
- FIG. 5 is an explanatory diagram for explaining an example of imaging by the stereo camera 10 according to the present disclosure.
- the stereo camera 10 according to the present disclosure takes pictures at intervals of the time width T1.
- Each image obtained by photographing at intervals of such a time width T1 may be expressed as an image group PG.
- the stereo camera 10 captures images twice at intervals of the time width T1, and the images obtained by the two captures are regarded as one image group PG; however, the stereo camera 10 may shoot three or more times at intervals of the time width T1. In this case, each image obtained according to the number of times is set as one image group PG.
- the stereo camera 10 may acquire the image group PG by shooting at intervals of the time width T2.
- the stereo camera 10 may acquire images while changing the subject at intervals of time width T2 (eg, 10 to 60 seconds).
- Although the moving body 5 has been described as including one stereo camera 10, the moving body 5 may include a plurality of stereo cameras 10.
- the control unit 330 may control the imaging timings of the plurality of stereo cameras 10 by, for example, a round robin method.
- FIG. 6 is an explanatory diagram for explaining an example of operation processing related to image processing according to the present disclosure.
- the image processing unit 331 uses the camera parameters of the stereo camera 10 to remove lens distortion from each image captured by the stereo camera 10 (S201).
- the image processing unit 331 performs parallelization processing on each image obtained by shooting with the stereo camera 10 (S205).
- the parallelization process is a process of matching the imaging position of a subject in the y direction with respect to each image obtained by photographing the subject with the stereo camera 10 .
- the direction of a straight line connecting the centers of the left camera 15A and the right camera 15B is defined as the x direction, and the direction perpendicular to the x direction is defined as the y direction.
- the image processing unit 331 crops the image subjected to the lens distortion removal and parallelization processing to a desired image size by cropping processing (S209), and ends the image processing.
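Parallelization (rectification) aligns corresponding points to the same y coordinate, which reduces the later correspondence search to a one-dimensional search along x. A minimal check of that property, assuming lists of already matched (y, x) points, is sketched below (this helper is hypothetical, not part of the disclosure):

```python
def is_rectified(left_pts, right_pts, tol_px=1.0):
    """True if matched points share the same y coordinate within tol_px.

    left_pts / right_pts: matched (y, x) feature positions from the Left
    and Right images after the parallelization of S205. The x direction is
    the baseline direction, so only x (the parallax) should differ.
    """
    return all(abs(yl - yr) <= tol_px
               for (yl, _), (yr, _) in zip(left_pts, right_pts))
```

A residual y offset in this check is itself a hint that the extrinsic parameters no longer match the actual camera poses.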
- FIG. 7 is an explanatory diagram for explaining an example of a feature point detection method according to the Harris method.
- the feature point detection unit 335 generates an x-direction differential image from each input image (S301).
- the feature point detection unit 335 generates a differential image in the y direction from each input image (S305). Note that the feature point detection unit 335 may generate the differential image in the x direction and the differential image in the y direction by, for example, applying Sobel filters corresponding to the x direction and the y direction to each input image.
- the feature point detection unit 335 calculates the matrix M(x, y) using the pixel values at the same pixel positions of the differential images in each direction and the following formula (Equation 2) (S309): M(x, y) = Σu,v g(u, v) [Ix² IxIy; IxIy Iy²].
- g(u, v) is a weighting factor, and may be, for example, a Gaussian function centered on the pixel (x, y).
- Ix is the pixel value of the differential image in the x direction
- Iy is the pixel value of the differential image in the y direction.
- the feature point detection unit 335 calculates the feature amount R(x, y) of the pixel (x, y) using the matrix M(x, y) and the following formula (Equation 3) (S313): R(x, y) = detM(x, y) − k·(trM(x, y))².
- detM(x, y) is the value of the determinant of the matrix M(x, y), and trM(x, y) is the trace of the matrix M(x, y).
- k is a parameter specified by the user, and is specified in the range of 0.04 to 0.06, for example.
- the feature point detection unit 335 performs the processes of S309 to S313 on all pixels of the input image (S317). Therefore, if the processes of S309 and S313 have not been performed on all pixels of the input image (S317/No), the process returns to S309, and if the processes of S309 and S313 have been performed on all pixels of the input image ( S317/Yes), the process proceeds to S321.
- the feature point detection unit 335 detects feature points based on the feature amounts R(x, y) of all pixels (S321).
- the feature point detection unit 335 detects, as a feature point (for example, a corner point) of the image, a pixel position where the feature amount R(x, y) is a local maximum and is equal to or greater than a threshold specified by the user.
- FIG. 8 is an explanatory diagram for explaining an operation processing example related to determination of a set of feature points according to the present disclosure.
- FIG. 8 illustrates an example of a method by which the pair determination unit 339 determines a set of feature points from each feature point included in each of the two images.
- the image captured by the left camera 15A may be referred to as the Left image
- the image captured by the right camera 15B may be referred to as the Right image.
- the pair determination unit 339 acquires one feature point on the Left image side (S401).
- the pair determination unit 339 acquires a u×v image block centered on the feature point from the Left image (S405).
- the pair determination unit 339 acquires one feature point on the Right image side (S409).
- the pair determination unit 339 acquires a u×v image block centered on the feature point from the Right image (S413).
- the pair determination unit 339 performs correlation calculation between the image block on the Left side and the image block on the Right side (S417).
- for the correlation calculation according to the present disclosure, the pair determination unit 339 may calculate the degree of correlation of each feature point using any one of known calculation methods.
- FIG. 9 is an explanatory diagram for explaining a specific example of correlation calculation according to the present disclosure.
- the pair determination unit 339 acquires the pixel values in a u×v block with a certain feature point in the Left image as the origin.
- the pair determination unit 339 similarly acquires a u×v block with a certain feature point as the origin on the Right image side.
- the pair determination unit 339 applies Equations 4 to 7 described above to each pixel value IL of the Left image and each pixel value IR of the Right image in the u×v range to calculate the degree of correlation of the feature points.
- the pair determination unit 339 retains, as a feature point set candidate, the feature point on the Right image side having the highest degree of correlation with the feature point on the Left image side acquired in S401 (S421).
- the pair determination unit 339 executes the processing of S409 to S421 between the feature point on the Left image side acquired in S401 and all the feature points on the Right image side (S425). If all the feature points on the Right image side have not yet been checked against the feature point on the Left image side acquired in S401 (S425/No), the process returns to S409; if all of them have been checked (S425/Yes), the process proceeds to S429.
- the pair determination unit 339 determines whether or not the correlation value between the feature point on the Left image side acquired in S401 and the feature point on the Right image side finally remaining in S421 is equal to or greater than a predetermined value (S429). If the correlation value is less than the predetermined value (S429/No), the process proceeds to S437; if it is equal to or greater than the predetermined value (S429/Yes), the process proceeds to S433.
- the pair determination unit 339 determines that there is no feature point on the Right image side that forms a pair with the feature point on the Left image side acquired in S401 (S437).
- the pair determination unit 339 determines the feature point on the Left image side acquired in S401 and the feature point on the Right image side finally remaining in S421 as a pair of feature points (S433).
- the pair determination unit 339 executes the processing related to the determination of feature point pairs in S401 to S437 for all feature points on the Left image side (S441). If the processing has not yet been executed for all feature points on the Left image side (S441/No), the process returns to S401; if it has been executed for all of them (S441/Yes), the pair determination unit 339 according to the present disclosure ends the processing.
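The S401 to S441 flow can be sketched as a best-match search; the function names, the correlation callback, and the threshold below are illustrative assumptions rather than the patent's exact interfaces.

```python
import numpy as np

def match_feature_pairs(feats_left, feats_right, correlate, min_corr=0.8):
    """For each Left-side feature, find the Right-side feature with the
    highest correlation (S409-S425) and accept the pair only if that best
    score reaches a threshold (S429-S437)."""
    pairs = []
    for i, fl in enumerate(feats_left):           # S401 / S441 outer loop
        best_j, best_score = None, -np.inf
        for j, fr in enumerate(feats_right):      # S409 / S425 inner loop
            score = correlate(fl, fr)             # S417: correlation calc
            if score > best_score:                # S421: keep best candidate
                best_j, best_score = j, score
        if best_j is not None and best_score >= min_corr:   # S429
            pairs.append((i, best_j))             # S433: accept the pair
        # otherwise no Right-side counterpart exists for this feature (S437)
    return pairs
```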
- FIG. 10 is an explanatory diagram for explaining an example of operation processing related to determination of whether or not an image group is suitable for deviation determination according to the present disclosure.
- the pair determination unit 339 does not always determine feature points at the same target position as a pair of feature points. Therefore, the larger the number of feature points detected by the feature point detection unit 335 and the number of feature point pairs determined by the pair determination unit 339 (that is, the larger the number of samples), the more the influence of erroneous pair determination can be reduced.
- the feature point detection unit 335 detects feature points from each image (S109).
- the estimation unit 343 determines whether or not the number of detected feature points is equal to or greater than a predetermined value (S501). If the number of feature points is greater than or equal to the predetermined value (S501/Yes), the process proceeds to S113, and if the number of feature points is less than the predetermined value (S501/No), the process proceeds to S525.
- the pair determination unit 339 determines pairs of feature points from each image (S113).
- the estimation unit 343 determines whether or not the number of pairs of feature points is equal to or greater than a predetermined value (S509). If the number of pairs of feature points is equal to or greater than the predetermined value (S509/Yes), the process proceeds to S513; if the number of pairs of feature points is less than the predetermined value (S509/No), the process proceeds to S525.
- the estimation unit 343 calculates the amount of change in feature point positions between images (S513).
- the estimation unit 343 determines whether or not the calculated amount of change in the feature point positions is equal to or greater than a predetermined value (S517). If the amount of change in the feature point positions is equal to or greater than the predetermined value (S517/Yes), the process proceeds to S521; if it is less than the predetermined value (S517/No), the process proceeds to S525. If the moving body 5 has a sensor that acquires the amount of motion, the amount of change in the feature point positions may be estimated based on the motion information of the moving body 5 acquired by the sensor.
- the estimating unit 343 determines that the image group is suitable for the deviation determination process (S521), and the information processing apparatus 30 according to the present disclosure ends the process.
- the estimation unit 343 determines that the image group is not suitable for the shift determination process (S525), and the information processing apparatus 30 according to the present disclosure ends the process.
- the estimation unit 343 may determine whether or not the image group is suitable for the deviation determination process by combining only one or two of the processes of S501, S509, and S517, rather than executing all of them.
- the estimation unit 343 may divide each image (for example, divide it into four) and perform the processing of S501, S509, or S517 on each of the divided areas.
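The three checks of S501, S509, and S517 can be folded into one predicate, as sketched below; the threshold values are illustrative placeholders, not values from the patent.

```python
def image_group_suitable(n_features: int, n_pairs: int, mean_motion_px: float,
                         min_features: int = 100, min_pairs: int = 30,
                         min_motion_px: float = 2.0) -> bool:
    """Decide whether an image group is suitable for deviation determination.
    Any subset of the three checks may be used, as noted in the text."""
    if n_features < min_features:        # S501: enough feature points?
        return False
    if n_pairs < min_pairs:              # S509: enough feature-point pairs?
        return False
    if mean_motion_px < min_motion_px:   # S517: enough inter-frame motion?
        return False
    return True                          # S521: suitable
```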
- the deviation determination process includes two stages: a provisional determination process and a final determination process. The details of the provisional determination and the final determination will be described in order below.
- FIG. 11A is an explanatory diagram for explaining a first process example of a provisional determination method for determining whether or not a shift has occurred in the stereo camera 10 according to the present disclosure.
- the estimating unit 343 calculates the three-dimensional positions of the feature points included in each image, based on the images captured by the left camera 15A and the right camera 15B at time T, the extrinsic parameter P1 of the left camera 15A, and the extrinsic parameter P2 of the right camera 15B.
- the feature points for which the three-dimensional positions are calculated are the feature points determined by the pair determination unit 339 to be a set of feature points.
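The 3-D position of each matched feature point can be recovered from the two views and the cameras' parameters by triangulation. The patent does not name an algorithm; the linear (DLT) triangulation below is one standard choice and is illustrative only.

```python
import numpy as np

def triangulate(P_left: np.ndarray, P_right: np.ndarray,
                x_left: np.ndarray, x_right: np.ndarray) -> np.ndarray:
    """Linear (DLT) triangulation of one feature-point pair.

    P_left, P_right : 3x4 projection matrices (intrinsics times extrinsics,
                      e.g. K [R|t] built from parameters such as P1 and P2).
    x_left, x_right : (x, y) pixel positions of the matched feature points.
    Returns the estimated 3-D position in the world frame.
    """
    A = np.vstack([
        x_left[0] * P_left[2] - P_left[0],
        x_left[1] * P_left[2] - P_left[1],
        x_right[0] * P_right[2] - P_right[0],
        x_right[1] * P_right[2] - P_right[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]          # null vector of A: the homogeneous 3-D point
    return X[:3] / X[3]
```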
- the estimation unit 343 estimates the extrinsic parameter P4' of the right camera 15B at time T-T1 as the first extrinsic parameter.
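Estimating an extrinsic parameter such as P4' from the 3-D points triangulated at time T and the camera's image at time T-T1 amounts to a perspective-n-point (PnP) problem. The plain DLT solution below, with known intrinsics K, is an illustrative sketch rather than the patent's method; real systems usually refine such an estimate with a nonlinear solver.

```python
import numpy as np

def pose_from_points(K: np.ndarray, pts3d, pts2d):
    """Linearly estimate a camera extrinsic [R | t] from six or more known
    3-D points and their pixel projections (a DLT solution to PnP)."""
    Kinv = np.linalg.inv(K)
    rows = []
    for X, x in zip(pts3d, pts2d):
        u, v, _ = Kinv @ np.array([x[0], x[1], 1.0])   # normalized coords
        Xh = np.append(np.asarray(X, dtype=float), 1.0)
        rows.append(np.concatenate([Xh, np.zeros(4), -u * Xh]))
        rows.append(np.concatenate([np.zeros(4), Xh, -v * Xh]))
    _, _, vt = np.linalg.svd(np.asarray(rows))
    P = vt[-1].reshape(3, 4)
    Xh0 = np.append(np.asarray(pts3d[0], dtype=float), 1.0)
    if (P @ Xh0)[2] < 0:          # enforce positive depth for a test point
        P = -P
    U, S, Vt = np.linalg.svd(P[:, :3])
    R = U @ Vt                    # project onto the rotation group
    t = P[:, 3] / S.mean()        # undo the arbitrary DLT scale
    return R, t
```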
- FIG. 11B is an explanatory diagram for explaining a second process example of a provisional determination method as to whether or not there is a deviation in the stereo camera 10 according to the present disclosure.
- continuing from the first step, the estimation unit 343 calculates the three-dimensional positions of the feature points included in each image, based on the image obtained by the left camera 15A at time T, the image obtained by the right camera 15B at time T-T1, the extrinsic parameter P1 of the left camera 15A, and the extrinsic parameter P4' of the right camera 15B.
- the estimation unit 343 estimates the extrinsic parameter P3' of the left camera 15A at time T-T1 as the first extrinsic parameter.
- FIG. 11C is an explanatory diagram for explaining a third process example of a provisional determination method as to whether or not a shift has occurred in the stereo camera 10 according to the present disclosure.
- continuing from the second step, the estimation unit 343 calculates the three-dimensional positions of the feature points included in each image, based on the images captured by the left camera 15A and the right camera 15B at time T-T1 and the extrinsic parameters P3' and P4' estimated as the first extrinsic parameters.
- the estimating unit 343 estimates the extrinsic parameter P1' of the left camera 15A at time T as the second extrinsic parameter, based on the calculated three-dimensional positions of the feature points and the feature points of the image captured by the left camera 15A at time T.
- based on the extrinsic parameter P1' obtained in the third step and the extrinsic parameter P1 at the time of the previous setting, the deviation detection unit 347 provisionally determines whether or not a deviation related to the extrinsic parameters has occurred in either the left camera 15A or the right camera 15B. For example, when the difference between the extrinsic parameter P1' and the extrinsic parameter P1 is equal to or greater than a predetermined value, the deviation detection unit 347 provisionally determines that a deviation related to the extrinsic parameters has occurred in either the left camera 15A or the right camera 15B.
- the deviation related to the extrinsic parameters includes, for example, a deviation in the mounting angle or the mounting position of either the left camera 15A or the right camera 15B.
- the estimation unit 343 estimates the extrinsic parameter P4' in the first step described above. Subsequently, the estimator 343 may estimate the extrinsic parameter P1' based on the extrinsic parameter P2 and the extrinsic parameter P4'. This makes it possible to omit the second step, thereby simplifying the process.
- the estimating unit 343 estimates the extrinsic parameter P3' and the extrinsic parameter P4' through the first step and the second step. Subsequently, the deviation detection unit 347 may compare the extrinsic parameters P1 and P2 at the time of the previous setting with the extrinsic parameters P3' and P4' estimated by the estimation unit 343, and determine whether or not a deviation related to the extrinsic parameters has occurred in either the left camera 15A or the right camera 15B. This makes it possible to omit the third step, thereby simplifying the process.
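The comparison between previously set extrinsics and newly estimated ones can be sketched as below. The patent only says the difference is compared with a predetermined value; the rotation-angle/translation decomposition and the thresholds here are assumptions.

```python
import numpy as np

def extrinsic_deviation(R_prev, t_prev, R_est, t_est):
    """Return (relative rotation angle in radians, translation difference)
    between a previously set extrinsic and a newly estimated one."""
    dR = R_est @ R_prev.T
    cos_a = np.clip((np.trace(dR) - 1.0) / 2.0, -1.0, 1.0)  # numeric safety
    trans = np.linalg.norm(np.asarray(t_est) - np.asarray(t_prev))
    return float(np.arccos(cos_a)), float(trans)

def provisionally_deviated(R_prev, t_prev, R_est, t_est,
                           max_angle_rad=0.005, max_trans=0.002):
    """Provisional judgment: a deviation is flagged if either difference
    reaches its predetermined value (thresholds are illustrative)."""
    ang, trans = extrinsic_deviation(R_prev, t_prev, R_est, t_est)
    return ang >= max_angle_rad or trans >= max_trans
```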
- FIGS. 11A to 11C illustrate an example in which the estimating unit 343 estimates the extrinsic parameter P1' of the left camera 15A through the first to third steps, but the extrinsic parameter of the right camera 15B at time T may be estimated in the same manner. In that case as well, the deviation detection unit 347 may provisionally determine whether or not a deviation related to the extrinsic parameters has occurred in either the left camera 15A or the right camera 15B.
- the deviation detection unit 347 may, instead of making a binary determination as to whether or not a deviation has occurred, make a determination in a plurality of stages, such as a degree of risk, or express the determination regarding the deviation as a continuous value, such as the probability that a deviation has occurred.
- the notification information generation unit 351 and the operation control unit 355 may generate notification information and control the operating device 20 according to the degree of risk.
- the provisional deviation determination process may be performed on a group of images obtained by photographing with the stereo camera 10 at three or more timings.
- the deviation detection unit 347 may use the result of one provisional determination process as the final determination result, or may derive the final determination result based on the results of a plurality of provisional determination processes. Next, an example of the final determination as to whether or not a deviation according to the present disclosure has occurred will be described.
- FIG. 12 is an explanatory diagram for explaining an example of final determination as to whether or not a deviation has occurred according to the present disclosure.
- the deviation detection unit 347 determines whether or not determination results indicating a deviation constitute a majority of the provisional deviation determination results performed over the predetermined number of times (S601). If the results indicating a deviation are not a majority (S601/No), the process proceeds to S605; if they are a majority (S601/Yes), the process proceeds to S609.
- the deviation detection unit 347 determines that there is no deviation related to the extrinsic parameters of the stereo camera 10 (S605), and the information processing apparatus 30 according to the present disclosure ends the process.
- the deviation detection unit 347 determines that there is a deviation related to the external parameters of the stereo camera 10 (S609).
- the notification information generation unit 351 generates notification information related to the discrepancy, and causes the communication unit 310 to transmit it to the information terminal TB (S613).
- the operation control unit 355 executes control related to a predetermined operation of the moving body 5 (for example, limiting the speed of the moving body 5) (S617), and the information processing device 30 according to the present disclosure ends the processing.
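The S601/S605/S609 majority vote can be sketched as follows; the Boolean vote representation is an assumption.

```python
def final_judgment(provisional_results):
    """Final determination from provisional results: 'deviated' only when
    deviation votes form a majority (S601 -> S609), otherwise 'ok' (S605).
    Each entry is True when that provisional judgment found a deviation."""
    n_deviated = sum(1 for r in provisional_results if r)
    if n_deviated > len(provisional_results) / 2:
        return 'deviated'
    return 'ok'
```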
- FIG. 13 is an explanatory diagram for explaining an example of notification information generated by the notification information generation unit 351.
- the display D of the information terminal TB may display notification information N related to the deviation in addition to the image acquired by the stereo camera, as shown in FIG. 13, for example.
- notification information N may be video notification information as shown in FIG. 13, or may be audio notification information.
- the notification information generation unit 351 may generate notification information for obtaining permission as to whether or not the operation control unit 355 may execute control regarding a predetermined operation. For example, when the user selects to execute control regarding a predetermined operation, the operation control unit 355 may execute control regarding a predetermined operation of the moving body 5 .
- the deviation detection unit 347 may count an image group determined in S525 of FIG. 10 to be unsuitable for the deviation determination process as an "undeterminable" vote. For example, if "undeterminable" votes are the most numerous among the provisional determination results performed over the predetermined number of times, the deviation detection unit 347 may determine that whether or not there is a deviation is "undeterminable".
- the estimation unit 343 determines whether or not the image group is suitable for deviation determination. This makes it possible to exclude a group of images inappropriate for deviation determination from the determination process, and the deviation detection unit 347 can detect deviation occurring in the left camera 15A or the right camera 15B with higher accuracy.
- the deviation detection unit 347 provisionally determines whether or not a deviation has occurred a plurality of times, and when the number of times it is provisionally determined that a deviation has occurred satisfies a predetermined criterion, determines that a deviation related to the extrinsic parameters has occurred in the left camera 15A or the right camera 15B. As a result, the deviation detection unit 347 can reduce the influence of an erroneous determination that can occur with a single determination result, and can determine whether or not a deviation has occurred with higher accuracy.
- FIG. 14 is a block diagram showing the hardware configuration of the information processing device 30.
- the information processing device 30 includes a CPU (Central Processing Unit) 3001 , a ROM (Read Only Memory) 3002 , a RAM (Random Access Memory) 3003 and a host bus 3004 .
- the information processing device 30 also includes a bridge 3005 , an external bus 3006 , an interface 3007 , an input device 3008 , an output device 3010 , a storage device (HDD) 3011 , a drive 3012 and a communication device 3015 .
- the CPU 3001 functions as an arithmetic processing device and a control device, and controls general operations within the information processing device 30 according to various programs.
- the CPU 3001 may be a microprocessor.
- a ROM 3002 stores programs, calculation parameters, and the like used by the CPU 3001 .
- a RAM 3003 temporarily stores programs used in the execution of the CPU 3001, parameters that change as appropriate during the execution, and the like. These are interconnected by a host bus 3004 comprising a CPU bus or the like. The functions such as the estimating unit 343 and the deviation detecting unit 347 described with reference to FIG. 2 can be realized by cooperation between the CPU 3001, the ROM 3002, the RAM 3003, and software.
- the host bus 3004 is connected via a bridge 3005 to an external bus 3006 such as a PCI (Peripheral Component Interconnect/Interface) bus.
- bridge 3005 and external bus 3006 do not necessarily have to be configured separately, and these functions may be implemented in one bus.
- the input device 3008 includes input means for the user to input information, such as a mouse, keyboard, touch panel, buttons, microphone, switches, and levers, and an input control circuit that generates an input signal based on the user's input and outputs it to the CPU 3001.
- the user of the information processing device 30 can input various data to the information processing device 30 and instruct processing operations.
- the output device 3010 includes display devices such as liquid crystal display devices, OLED devices, and lamps, for example.
- output device 3010 includes audio output devices such as speakers and headphones.
- the output device 3010 outputs reproduced content, for example.
- the display device displays various information such as reproduced video data as text or images.
- the audio output device converts reproduced audio data and the like into audio and outputs the audio.
- the storage device 3011 is a device for storing data.
- the storage device 3011 may include a storage medium, a recording device that records data on the storage medium, a reading device that reads data from the storage medium, a deletion device that deletes data recorded on the storage medium, and the like.
- the storage device 3011 is composed of, for example, an HDD (Hard Disk Drive).
- the storage device 3011 drives a hard disk and stores programs executed by the CPU 3001 and various data.
- the drive 3012 is a storage medium reader/writer, and is built in or externally attached to the information processing apparatus 30 .
- the drive 3012 reads out information recorded in the attached removable storage medium 35 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, and outputs the information to the RAM 3003 .
- Drive 3012 can also write information to removable storage medium 35 .
- the communication device 3015 is, for example, a communication interface including a communication device for connecting to the network 1. The communication device 3015 may be a wireless-LAN-compatible communication device, an LTE (Long Term Evolution)-compatible communication device, or a wired communication device that performs wired communication.
- the functions of the information processing device 30 may be realized by the information terminal TB.
- the stereo camera 10 transmits an image group obtained by photographing a subject to the information terminal TB. Based on the received image group, the information terminal TB may perform various processes related to determining whether or not there is a deviation related to the external parameter in the left camera 15A or the right camera 15B.
- the feature point detection unit 335 may detect the blur amount from the image for which feature points are to be detected, and detect feature points from images whose blur amount is less than a predetermined value. Also, if the moving body 5 is equipped with a sensor that acquires motion information, such as an IMU (Inertial Measurement Unit), the feature point detection unit 335 may estimate the blur amount based on the motion information acquired by the sensor.
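One common way to compute a blur amount from an image (the patent does not specify a measure) is the variance of the Laplacian: sharp images have strong second derivatives, blurred ones do not. The sketch below is an illustrative stand-in.

```python
import numpy as np

def laplacian_blur_score(img: np.ndarray) -> float:
    """Variance-of-Laplacian sharpness score for a 2-D grayscale array:
    lower values mean blurrier."""
    k = np.array([[0, 1, 0],
                  [1, -4, 1],
                  [0, 1, 0]], dtype=np.float64)
    h, w = img.shape
    out = np.zeros((h - 2, w - 2))
    for dy in range(3):                  # 'valid' 3x3 convolution
        for dx in range(3):
            out += k[dy, dx] * img[dy:dy + h - 2, dx:dx + w - 2]
    return float(out.var())
```

A frame would then be passed to feature detection only when its score is at or above a predetermined value.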
- each step in the processing of the information processing apparatus 30 in this specification does not necessarily have to be processed in chronological order according to the order described as the flowchart.
- each step in the processing of the information processing device 30 may be processed in an order different from the order described in the flowchart or in parallel.
- a computer program for causing hardware such as the CPU, ROM, and RAM incorporated in the information processing device 30 and the information terminal TB to exhibit functions equivalent to the respective components of the information processing device 30 and the information terminal TB described above can also be created.
- a storage medium storing the computer program is also provided.
- (1) An information processing device comprising: a calculation unit that calculates three-dimensional positions of feature points included in each image, based on each image obtained by a plurality of cameras photographing a subject at a first timing and extrinsic parameters of the plurality of cameras; a first estimating unit that estimates a first extrinsic parameter, which is an extrinsic parameter, at a second timing, of the camera among the plurality of cameras that captured one image included in each image obtained by photographing the subject at the second timing, based on the one image and the three-dimensional positions of the feature points; a second estimating unit that estimates a second extrinsic parameter, which is an extrinsic parameter of any one of the plurality of cameras at the first timing, based on the first extrinsic parameter estimated by the first estimating unit; and a determination unit that determines whether or not a deviation related to the extrinsic parameters has occurred in the plurality of cameras, based on the second extrinsic parameter of the one camera estimated by the second estimating unit and the previous extrinsic parameter of that camera.
- (2) The information processing device according to (1), wherein the calculation unit calculates the three-dimensional positions of the feature points included in each image, based on each image obtained by photographing the subject at the first timing and the extrinsic parameters of the plurality of cameras, when the number of feature points included in the image group obtained by the plurality of cameras photographing the subject at the first timing and the second timing is equal to or greater than a predetermined number.
- (3) The information processing device according to (1) or (2), further comprising a discrimination unit that discriminates, for a feature point included in one of the images, the feature point included in the other image having the highest degree of correlation as a pair of feature points, wherein the calculation unit calculates the three-dimensional positions of the feature points included in each image, based on the feature points discriminated as pairs among the feature points included in each image and the extrinsic parameters of the plurality of cameras.
- (4) The information processing device according to (3), wherein the calculation unit calculates the three-dimensional positions of the feature points included in each image, based on the feature points discriminated as pairs and the extrinsic parameters of the plurality of cameras, when the number of pairs of feature points included in each image obtained by the plurality of cameras photographing the subject at the first timing satisfies a predetermined condition.
- (5) The information processing device according to (4), wherein the predetermined condition includes a case where the number of pairs of feature points is equal to or greater than a predetermined number.
- (6) The information processing device according to any one of (1) to (5), wherein the calculation unit calculates the three-dimensional positions of the feature points included in each image when the amount of change in the imaging positions of the feature points between the first timing and the second timing is equal to or greater than a predetermined value.
- (7) The information processing device according to any one of (1) to (6), wherein the first estimating unit estimates the extrinsic parameter, at the second timing, of the camera among the plurality of cameras that captured another image different from the one image, based on the other image included in each image obtained by the plurality of cameras photographing the subject at the second timing and the three-dimensional positions of the feature points.
- (8) The information processing device according to (7), wherein the second estimating unit estimates the second extrinsic parameter of any one of the plurality of cameras at the first timing, based on the first extrinsic parameters of the plurality of cameras at the second timing estimated by the first estimating unit.
- (9) The information processing device according to any one of (1) to (8), further comprising a notification unit that notifies a user who uses the plurality of cameras of a deviation when the determination unit determines that a deviation related to the extrinsic parameters has occurred in the plurality of cameras.
- (10) The information processing device according to any one of (1) to (9), further comprising an operation control unit that executes control related to a predetermined operation of a moving body equipped with the plurality of cameras when the determination unit determines that a deviation related to the extrinsic parameters has occurred in the plurality of cameras.
- (11) The information processing device according to any one of (1) to (10), wherein the determination unit provisionally determines, over a plurality of times using a plurality of image groups, whether or not a deviation related to the extrinsic parameters has occurred in the plurality of cameras, and determines that a deviation related to the extrinsic parameters has occurred in the plurality of cameras when the number of times it is provisionally determined that such a deviation has occurred satisfies a predetermined criterion.
- (12) The information processing device according to (11), wherein the predetermined criterion includes a case where the number of times it is provisionally determined that a deviation related to the posture information of the plurality of cameras has occurred is equal to or greater than the number of times it is provisionally determined that no deviation has occurred.
- (13) A computer-implemented information processing method comprising: calculating three-dimensional positions of feature points included in each image, based on each image obtained by a plurality of cameras photographing a subject at a first timing and extrinsic parameters of the plurality of cameras; estimating a first extrinsic parameter, which is an extrinsic parameter, at a second timing, of the camera that captured one image included in each image obtained by photographing the subject at the second timing, based on the one image and the three-dimensional positions of the feature points; estimating a second extrinsic parameter, which is an extrinsic parameter of any one of the plurality of cameras at the first timing, based on the estimated first extrinsic parameter; and determining whether or not a deviation related to the extrinsic parameters has occurred in the plurality of cameras, based on the estimated second extrinsic parameter of the one camera and the previous extrinsic parameter of that camera.
- (14) A program for causing a computer to realize: a calculation function that calculates three-dimensional positions of feature points included in each image, based on each image obtained by a plurality of cameras photographing a subject at a first timing and extrinsic parameters of the plurality of cameras; a first estimating function that estimates a first extrinsic parameter, which is an extrinsic parameter, at a second timing, of the camera among the plurality of cameras that captured one image included in each image obtained by photographing the subject at the second timing, based on the one image and the three-dimensional positions of the feature points; a second estimating function that estimates a second extrinsic parameter, which is an extrinsic parameter of any one of the plurality of cameras at the first timing, based on the first extrinsic parameter estimated by the first estimating function; and a determination function that determines whether or not a deviation related to the extrinsic parameters has occurred in the plurality of cameras, based on the second extrinsic parameter of any one of the plurality of cameras estimated by the second estimating function and the previous extrinsic parameter of that camera.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Software Systems (AREA)
- Computer Hardware Design (AREA)
- Computer Graphics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Image Analysis (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
Description
1. Overview of the information processing system
2. Functional configuration example of the information processing device 30
3. Organization of issues
4. Operation processing examples
4.1. Overall operation
4.2. Imaging example
4.3. Image processing example
4.4. Feature point detection by the Harris method
4.5. Determining pairs of feature points
4.6. Determining whether an image group is suitable for the determination process
4.7. Determination of camera deviation
5. Examples of effects
6. Hardware configuration example
7. Supplement
As an embodiment of the present disclosure, a mechanism will be described for detecting, based on images obtained by photographing with a plurality of cameras, whether or not a deviation related to the camera parameters has occurred in the plurality of cameras.
The network 1 is a wired or wireless transmission path for information transmitted from devices connected to the network 1. For example, the network 1 may include public networks such as the Internet, telephone networks, and satellite communication networks, as well as various LANs (Local Area Networks) including Ethernet (registered trademark) and WANs (Wide Area Networks). The network 1 may also include dedicated networks such as an IP-VPN (Internet Protocol-Virtual Private Network).
The moving body 5 is a device that moves under autonomous control or user operation. The moving body 5 may be, for example, a drone as shown in FIG. 1, or may be a car, a ship, or an aircraft.
The stereo camera 10 photographs a subject to acquire images, and obtains information in the depth direction of the subject by mounting two cameras side by side. In the following description, of the two cameras mounted on the stereo camera 10, the camera mounted on the left side facing the subject is referred to as the left camera 15A, and the camera mounted on the right side facing the subject is referred to as the right camera 15B. When no particular distinction is necessary, the left camera 15A and the right camera 15B may be collectively referred to as the stereo camera 10.
The information processing device 30 estimates the extrinsic parameters of the left camera 15A or the right camera 15B based on images obtained by photographing a subject with the stereo camera 10 at a plurality of timings. The information processing device 30 also determines, based on the estimated extrinsic parameters and the extrinsic parameters at the time of the previous setting, whether or not a deviation related to the extrinsic parameters has occurred in the left camera 15A or the right camera 15B. That is, it determines whether or not there is a deviation between the actual installation position or posture of the left camera 15A or the right camera 15B and the installation position or posture corresponding to the set extrinsic parameters.
The information terminal TB is a terminal used by the user OP. The information terminal TB may be, for example, a tablet terminal as shown in FIG. 1, or may be any of various devices such as a smartphone or a PC (Personal Computer).
FIG. 2 is an explanatory diagram for explaining a functional configuration example of the information processing device 30 according to the present disclosure. As shown in FIG. 2, the moving body 5 includes the stereo camera 10, an operating device 20, and the information processing device 30. Since the functional configuration of the stereo camera 10 has already been described with reference to FIG. 1, its description is omitted in FIG. 2.
The operating device 20 is a device that operates under the control of the operation control unit 355 described later. The operating device 20 includes, for example, an engine and a braking device. Specific examples of the operation of the operating device 20 will be described later.
As shown in FIG. 2, the information processing device 30 according to the present disclosure includes a communication unit 310, a storage unit 320, and a control unit 330.
The communication unit 310 performs various kinds of communication with the information terminal TB. For example, the communication unit 310 receives operation information for the moving body 5 from the information terminal TB. The communication unit 310 also transmits notification information generated by the notification information generation unit 351, described later, to the information terminal TB.
The storage unit 320 holds software and various data. For example, the storage unit 320 stores the provisional determination results produced by the deviation detection unit 347. When the number of stored provisional determination results exceeds a predetermined number, the storage unit 320 may delete them in order from the oldest.
The control unit 330 controls the overall operation of the information processing device 30 according to the present disclosure. As shown in FIG. 2, the control unit 330 includes an image processing unit 331, a feature point detection unit 335, a pair determination unit 339, an estimation unit 343, a deviation detection unit 347, a notification information generation unit 351, an operation control unit 355, a distance measurement unit 359, and a distance measurement data utilization unit 363.
Various operation policies can be applied to the moving body 5, such as autonomous movement or movement by operation of the user OP. Under any operation policy, it is desirable to estimate the distance from the moving body 5 to objects in order to reduce the possibility that the moving body 5 collides with an obstacle, an animal, or another object while traveling.
<4.1. Overall operation>
FIG. 4 is an explanatory diagram for explaining an operation processing example of the information processing device 30 according to the present disclosure. First, the stereo camera 10 photographs a subject and acquires an image group including a plurality of images (S101).
FIG. 5 is an explanatory diagram for explaining an imaging example of the stereo camera 10 according to the present disclosure. The stereo camera 10 according to the present disclosure captures images at intervals of a time width T1. For example, the time width T1 may be set so that the amount of motion of the feature points in each image can be detected while the subjects still overlap (for example, T1 = 0.2 seconds). The images obtained by capturing at intervals of the time width T1 may be expressed as an image group PG. In FIG. 5, the stereo camera 10 captures images twice at an interval of the time width T1, and the images obtained by the two captures form one image group PG; however, the stereo camera 10 may capture images three or more times at intervals of the time width T1. In that case, the images obtained over that number of captures form one image group PG.
An operation processing example related to image processing according to the present disclosure will now be described.
FIG. 6 is an explanatory diagram for explaining an operation processing example related to image processing according to the present disclosure. First, the image processing unit 331 uses the camera parameters of the stereo camera 10 to remove lens distortion from each image obtained by the stereo camera 10 (S201).
FIG. 7 is an explanatory diagram for explaining an example of a feature point detection method based on the Harris method. First, the feature point detection unit 335 generates a differential image in the x direction from each input image (S301).
FIG. 8 is an explanatory diagram for explaining an operation processing example related to the determination of pairs of feature points according to the present disclosure. FIG. 8 illustrates an example of a method by which the pair determination unit 339 determines pairs of feature points from the feature points included in each of two images. In the following description, of the two images, the image obtained by the left camera 15A may be expressed as the Left image, and the image obtained by the right camera 15B as the Right image.
FIG. 10 is an explanatory diagram for explaining an operation processing example related to determining whether or not an image group is suitable for deviation determination according to the present disclosure. To improve the accuracy of the deviation determination process, the pairs of feature points determined from each image need to indicate the same target position. However, the pair determination unit 339 does not always determine feature points at the same target position as a pair. Therefore, the larger the number of feature points detected by the feature point detection unit 335 and the number of feature point pairs determined by the pair determination unit 339 (that is, the larger the number of samples), the more the influence of erroneous pair determination can be reduced.
The deviation determination process according to the present disclosure has two stages: a provisional determination process and a final determination process. The details of the provisional determination and the final determination will be described in order below.
An example of a provisional determination method as to whether or not a deviation according to the present disclosure has occurred will be described with reference to FIGS. 11A to 11C.
FIG. 11A is an explanatory diagram for explaining a first step example of the provisional determination method as to whether or not a deviation has occurred in the stereo camera 10 according to the present disclosure. First, the estimation unit 343 calculates the three-dimensional positions of the feature points included in each image, based on the images captured by the left camera 15A and the right camera 15B at time T, the extrinsic parameter P1 of the left camera 15A, and the extrinsic parameter P2 of the right camera 15B. The feature points whose three-dimensional positions are calculated are the feature points determined by the pair determination unit 339 to form pairs.
FIG. 11B is an explanatory diagram for explaining a second step example of the provisional determination method as to whether or not a deviation has occurred in the stereo camera 10 according to the present disclosure. Continuing from the first step, the estimation unit 343 calculates the three-dimensional positions of the feature points included in each image, based on the image obtained by the left camera 15A at time T, the image obtained by the right camera 15B at time T-T1, the extrinsic parameter P1 of the left camera 15A, and the extrinsic parameter P4' of the right camera 15B.
FIG. 11C is an explanatory diagram for explaining a third step example of the provisional determination method as to whether or not a deviation has occurred in the stereo camera 10 according to the present disclosure. Continuing from the second step, the estimation unit 343 calculates the three-dimensional positions of the feature points included in each image, based on the images captured by the left camera 15A and the right camera 15B at time T-T1 and the extrinsic parameters P3' and P4' estimated as the first extrinsic parameters.
The deviation detection unit 347 provisionally determines whether or not a deviation related to the extrinsic parameters has occurred in either the left camera 15A or the right camera 15B, based on the extrinsic parameter P1' obtained in the third step and the extrinsic parameter P1 at the time of the previous setting. For example, when the difference between the extrinsic parameter P1' and the extrinsic parameter P1 is equal to or greater than a predetermined value, the deviation detection unit 347 provisionally determines that a deviation related to the extrinsic parameters has occurred in either the left camera 15A or the right camera 15B. The deviation related to the extrinsic parameters includes, for example, a deviation in the mounting angle or the mounting position of either the left camera 15A or the right camera 15B.
FIG. 12 is an explanatory diagram for explaining an example of the final determination as to whether or not a deviation according to the present disclosure has occurred. First, the deviation detection unit 347 determines whether or not determination results indicating a deviation constitute a majority of the provisional deviation determination results executed over the predetermined number of times (S601). If the results indicating a deviation are not a majority (S601/No), the process proceeds to S605; if they are a majority (S601/Yes), the process proceeds to S609.
According to the present disclosure described above, various effects can be obtained. For example, it is possible to determine, based on the image groups obtained by the imaging of the stereo camera 10, whether a misalignment related to the extrinsic parameters has occurred in the left camera 15A or the right camera 15B. Therefore, without using sensing information from other sensors, it is possible to detect a misalignment of the left camera 15A or the right camera 15B not only in the vertical direction of the image plane (the x direction described above) but also in the horizontal direction of the image plane (the y direction described above).
The embodiments according to the present disclosure have been described above. The various information processing described above is realized through the cooperation of software and the hardware of the information processing device 30 described below. The hardware configuration described below is also applicable to the information terminal TB.
Although preferred embodiments of the present disclosure have been described above in detail with reference to the accompanying drawings, the present disclosure is not limited to these examples. It is clear that a person having ordinary knowledge in the technical field to which the present disclosure belongs can conceive of various alterations or modifications within the scope of the technical ideas described in the claims, and it is understood that these also naturally belong to the technical scope of the present disclosure.
(1)
An information processing device comprising:
a calculation unit configured to calculate three-dimensional positions of feature points included in images obtained by a plurality of cameras imaging a subject at a first timing, based on the images and extrinsic parameters of the plurality of cameras;
a first estimation unit configured to estimate a first extrinsic parameter, which is an extrinsic parameter, at a second timing, of the camera among the plurality of cameras that captured one image included in images obtained by the plurality of cameras imaging the subject at the second timing, based on the one image and the three-dimensional positions of the feature points;
a second estimation unit configured to estimate a second extrinsic parameter, which is an extrinsic parameter of any one of the plurality of cameras at the first timing, based on the first extrinsic parameter estimated by the first estimation unit; and
a determination unit configured to determine whether a misalignment related to the extrinsic parameters has occurred in the plurality of cameras, based on the second extrinsic parameter of any one of the plurality of cameras estimated by the second estimation unit and the previous extrinsic parameter of that camera.
(2)
The information processing device according to (1), wherein
the calculation unit calculates the three-dimensional positions of the feature points included in the images obtained by imaging the subject at the first timing, based on those images and the extrinsic parameters of the plurality of cameras, when the number of feature points included in the image groups obtained by the plurality of cameras imaging the subject at the first timing and the second timing is equal to or greater than a predetermined number.
(3)
The information processing device according to (1) or (2), further comprising
a discrimination unit configured to discriminate, for a feature point included in one of the images, the feature point with the highest degree of correlation among the feature points included in the other image as a feature point pair, wherein
the calculation unit calculates the three-dimensional positions of the feature points included in the images, based on the feature points discriminated as feature point pairs among the feature points included in the images and the extrinsic parameters of the plurality of cameras.
(4)
The information processing device according to (3), wherein
the calculation unit calculates the three-dimensional positions of the feature points included in the images, based on the feature points discriminated as feature point pairs and the extrinsic parameters of the plurality of cameras, when the number of feature point pairs included in the images obtained by the plurality of cameras imaging the subject at the first timing satisfies a predetermined condition.
(5)
The information processing device according to (4), wherein
the predetermined condition includes a case where the number of feature point pairs is equal to or greater than a predetermined number.
(6)
The information processing device according to any one of (1) to (5), wherein
the calculation unit calculates the three-dimensional positions of the feature points included in the images when the amount of change in the imaging positions of the feature points between the first timing and the second timing is equal to or greater than a predetermined value.
(7)
The information processing device according to any one of (1) to (6), wherein
the first estimation unit estimates the extrinsic parameter, at the second timing, of the camera among the plurality of cameras that captured another image, different from the one image, included in the images obtained by the plurality of cameras imaging the subject at the second timing, based on the other image and the three-dimensional positions of the feature points.
(8)
The information processing device according to (7), wherein
the second estimation unit estimates the second extrinsic parameter of any one of the plurality of cameras at the first timing, based on the first extrinsic parameters of the plurality of cameras at the second timing estimated by the first estimation unit.
(9)
The information processing device according to any one of (1) to (8), further comprising
a notification unit configured to notify a user of the plurality of cameras of the misalignment when the determination unit determines that a misalignment related to the extrinsic parameters has occurred in the plurality of cameras.
(10)
The information processing device according to any one of (1) to (9), further comprising
an operation control unit configured to execute control related to a predetermined operation of a mobile object on which the plurality of cameras are mounted, when the determination unit determines that a misalignment related to the extrinsic parameters has occurred in the plurality of cameras.
(11)
The information processing device according to (9) or (10), wherein
the determination unit provisionally determines, a plurality of times using a plurality of image groups, whether a misalignment related to the extrinsic parameters has occurred in the plurality of cameras, and determines that a misalignment related to the extrinsic parameters has occurred in the plurality of cameras when the number of times the misalignment was provisionally determined to have occurred satisfies a predetermined criterion.
(12)
The information processing device according to (11), wherein
the predetermined criterion includes a case where the number of times a misalignment related to the attitude information of the plurality of cameras was provisionally determined to have occurred is equal to or greater than the number of times it was provisionally determined not to have occurred.
(13)
An information processing method executed by a computer, comprising:
calculating three-dimensional positions of feature points included in images obtained by a plurality of cameras imaging a subject at a first timing, based on the images and extrinsic parameters of the plurality of cameras;
estimating a first extrinsic parameter, which is an extrinsic parameter, at a second timing, of the camera among the plurality of cameras that captured one image included in images obtained by the plurality of cameras imaging the subject at the second timing, based on the one image and the three-dimensional positions of the feature points;
estimating a second extrinsic parameter, which is an extrinsic parameter of any one of the plurality of cameras at the first timing, based on the estimated first extrinsic parameter; and
determining whether a misalignment related to the extrinsic parameters has occurred in the plurality of cameras, based on the estimated second extrinsic parameter of any one of the plurality of cameras and the previous extrinsic parameter of that camera.
(14)
A program causing a computer to realize:
a calculation function of calculating three-dimensional positions of feature points included in images obtained by a plurality of cameras imaging a subject at a first timing, based on the images and extrinsic parameters of the plurality of cameras;
a first estimation function of estimating a first extrinsic parameter, which is an extrinsic parameter, at a second timing, of the camera among the plurality of cameras that captured one image included in images obtained by the plurality of cameras imaging the subject at the second timing, based on the one image and the three-dimensional positions of the feature points;
a second estimation function of estimating a second extrinsic parameter, which is an extrinsic parameter of any one of the plurality of cameras at the first timing, based on the first extrinsic parameter estimated by the first estimation function; and
a determination function of determining whether a misalignment related to the extrinsic parameters has occurred in the plurality of cameras, based on the second extrinsic parameter of any one of the plurality of cameras estimated by the second estimation function and the previous extrinsic parameter of that camera.
5 Mobile object
10 Stereo camera
15A Left camera
15B Right camera
20 Operation device
30 Information processing device
310 Communication unit
320 Storage unit
330 Control unit
331 Image processing unit
335 Feature point detection unit
339 Pair discrimination unit
343 Estimation unit
347 Misalignment detection unit
351 Notification information generation unit
355 Operation control unit
359 Distance measurement unit
363 Distance measurement data utilization unit
Claims (14)
- An information processing device comprising:
a calculation unit configured to calculate three-dimensional positions of feature points included in images obtained by a plurality of cameras imaging a subject at a first timing, based on the images and extrinsic parameters of the plurality of cameras;
a first estimation unit configured to estimate a first extrinsic parameter, which is an extrinsic parameter, at a second timing, of the camera among the plurality of cameras that captured one image included in images obtained by the plurality of cameras imaging the subject at the second timing, based on the one image and the three-dimensional positions of the feature points;
a second estimation unit configured to estimate a second extrinsic parameter, which is an extrinsic parameter of any one of the plurality of cameras at the first timing, based on the first extrinsic parameter estimated by the first estimation unit; and
a determination unit configured to determine whether a misalignment related to the extrinsic parameters has occurred in the plurality of cameras, based on the second extrinsic parameter of any one of the plurality of cameras estimated by the second estimation unit and the previous extrinsic parameter of that camera.
- The information processing device according to claim 1, wherein
the calculation unit calculates the three-dimensional positions of the feature points included in the images obtained by imaging the subject at the first timing, based on those images and the extrinsic parameters of the plurality of cameras, when the number of feature points included in the image groups obtained by the plurality of cameras imaging the subject at the first timing and the second timing is equal to or greater than a predetermined number.
- The information processing device according to claim 2, further comprising
a discrimination unit configured to discriminate, for a feature point included in one of the images, the feature point with the highest degree of correlation among the feature points included in the other image as a feature point pair, wherein
the calculation unit calculates the three-dimensional positions of the feature points included in the images, based on the feature points discriminated as feature point pairs among the feature points included in the images and the extrinsic parameters of the plurality of cameras.
- The information processing device according to claim 3, wherein
the calculation unit calculates the three-dimensional positions of the feature points included in the images, based on the feature points discriminated as feature point pairs and the extrinsic parameters of the plurality of cameras, when the number of feature point pairs included in the images obtained by the plurality of cameras imaging the subject at the first timing satisfies a predetermined condition.
- The information processing device according to claim 4, wherein
the predetermined condition includes a case where the number of feature point pairs is equal to or greater than a predetermined number.
- The information processing device according to claim 5, wherein
the calculation unit calculates the three-dimensional positions of the feature points included in the images when the amount of change in the imaging positions of the feature points between the first timing and the second timing is equal to or greater than a predetermined value.
- The information processing device according to claim 6, wherein
the first estimation unit estimates the extrinsic parameter, at the second timing, of the camera among the plurality of cameras that captured another image, different from the one image, included in the images obtained by the plurality of cameras imaging the subject at the second timing, based on the other image and the three-dimensional positions of the feature points.
- The information processing device according to claim 7, wherein
the second estimation unit estimates the second extrinsic parameter of any one of the plurality of cameras at the first timing, based on the first extrinsic parameters of the plurality of cameras at the second timing estimated by the first estimation unit.
- The information processing device according to claim 8, further comprising
a notification unit configured to notify a user of the plurality of cameras of the misalignment when the determination unit determines that a misalignment related to the extrinsic parameters has occurred in the plurality of cameras.
- The information processing device according to claim 9, further comprising
an operation control unit configured to execute control related to a predetermined operation of a mobile object on which the plurality of cameras are mounted, when the determination unit determines that a misalignment related to the extrinsic parameters has occurred in the plurality of cameras.
- The information processing device according to claim 10, wherein
the determination unit provisionally determines, a plurality of times using a plurality of image groups, whether a misalignment related to the extrinsic parameters has occurred in the plurality of cameras, and determines that a misalignment related to the extrinsic parameters has occurred in the plurality of cameras when the number of times the misalignment was provisionally determined to have occurred satisfies a predetermined criterion.
- The information processing device according to claim 11, wherein
the predetermined criterion includes a case where the number of times a misalignment related to the attitude information of the plurality of cameras was provisionally determined to have occurred is equal to or greater than the number of times it was provisionally determined not to have occurred.
- An information processing method executed by a computer, comprising:
calculating three-dimensional positions of feature points included in images obtained by a plurality of cameras imaging a subject at a first timing, based on the images and extrinsic parameters of the plurality of cameras;
estimating a first extrinsic parameter, which is an extrinsic parameter, at a second timing, of the camera among the plurality of cameras that captured one image included in images obtained by the plurality of cameras imaging the subject at the second timing, based on the one image and the three-dimensional positions of the feature points;
estimating a second extrinsic parameter, which is an extrinsic parameter of any one of the plurality of cameras at the first timing, based on the estimated first extrinsic parameter; and
determining whether a misalignment related to the extrinsic parameters has occurred in the plurality of cameras, based on the estimated second extrinsic parameter of any one of the plurality of cameras and the previous extrinsic parameter of that camera.
- A program causing a computer to realize:
a calculation function of calculating three-dimensional positions of feature points included in images obtained by a plurality of cameras imaging a subject at a first timing, based on the images and extrinsic parameters of the plurality of cameras;
a first estimation function of estimating a first extrinsic parameter, which is an extrinsic parameter, at a second timing, of the camera among the plurality of cameras that captured one image included in images obtained by the plurality of cameras imaging the subject at the second timing, based on the one image and the three-dimensional positions of the feature points;
a second estimation function of estimating a second extrinsic parameter, which is an extrinsic parameter of any one of the plurality of cameras at the first timing, based on the first extrinsic parameter estimated by the first estimation function; and
a determination function of determining whether a misalignment related to the extrinsic parameters has occurred in the plurality of cameras, based on the second extrinsic parameter of any one of the plurality of cameras estimated by the second estimation function and the previous extrinsic parameter of that camera.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202280036187.3A CN117356101A (zh) | 2021-05-26 | 2022-01-13 | 信息处理装置、信息处理方法和程序 |
JP2023523963A JPWO2022249534A1 (ja) | 2021-05-26 | 2022-01-13 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021-088588 | 2021-05-26 | ||
JP2021088588 | 2021-05-26 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022249534A1 true WO2022249534A1 (ja) | 2022-12-01 |
Family
ID=84229679
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/000907 WO2022249534A1 (ja) | 2021-05-26 | 2022-01-13 | 情報処理装置、情報処理方法およびプログラム |
Country Status (3)
Country | Link |
---|---|
JP (1) | JPWO2022249534A1 (ja) |
CN (1) | CN117356101A (ja) |
WO (1) | WO2022249534A1 (ja) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2013123123A (ja) * | 2011-12-09 | 2013-06-20 | Fujitsu Ltd | Stereo image generating device, stereo image generating method, and computer program for stereo image generation |
JP2017533052A (ja) * | 2014-11-10 | 2017-11-09 | Vision RT Limited | Method for calibrating a patient monitoring system for use with a radiotherapy apparatus |
WO2018181249A1 (ja) * | 2017-03-31 | 2018-10-04 | Panasonic IP Management Co., Ltd. | Imaging system and calibration method |
Also Published As
Publication number | Publication date |
---|---|
CN117356101A (zh) | 2024-01-05 |
JPWO2022249534A1 (ja) | 2022-12-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6355052B2 (ja) | Imaging device, image processing device, imaging method, and recording medium | |
JP3801137B2 (ja) | Intruding object detection device | |
JP6141079B2 (ja) | Image processing system, image processing device, control method therefor, and program | |
JP7272024B2 (ja) | Object tracking device, monitoring system, and object tracking method | |
US20180365849A1 (en) | Processing device and processing system | |
EP3314883B1 (en) | Video frame processing | |
EP2960859B1 (en) | Constructing a 3d structure | |
WO2020171379A1 (en) | Capturing a photo using a mobile device | |
KR101202642B1 (ko) | Method and apparatus for global motion estimation using feature points of the background | |
JP2018081402A (ja) | Image processing device, image processing method, and program | |
JP7255173B2 (ja) | Human detection device and human detection method | |
JP7384158B2 (ja) | Image processing device, mobile device, method, and program | |
WO2021059765A1 (ja) | Imaging device, image processing system, image processing method, and program | |
JP7264308B2 (ja) | Systems and methods for adaptively constructing a three-dimensional face model based on two or more inputs of two-dimensional face images | |
WO2022249534A1 (ja) | Information processing device, information processing method, and program | |
US9176221B2 (en) | Distance estimation in camera-based systems utilizing motion measurement and compression attributes | |
JP2018201146A (ja) | Image correction device, image correction method, attention point recognition device, attention point recognition method, and anomaly detection system | |
JP5539565B2 (ja) | Imaging device and subject tracking method | |
JP2001116527A (ja) | Three-dimensional object detection method and device | |
CN113936042B (zh) | Target tracking method, device, and computer-readable storage medium | |
CN111614834B (zh) | Electronic device control method and apparatus, electronic device, and storage medium | |
JP2002190027A (ja) | Speed measurement system and speed measurement method using image recognition | |
JP5247419B2 (ja) | Imaging device and subject tracking method | |
KR20210153989A (ko) | Object detection device with a customized object detection model | |
CN111294507A (zh) | Imaging control device, imaging system, and imaging control method | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 22810820, Country of ref document: EP, Kind code of ref document: A1 |
WWE | Wipo information: entry into national phase | Ref document number: 2023523963, Country of ref document: JP |
WWE | Wipo information: entry into national phase | Ref document number: 18287962, Country of ref document: US |
WWE | Wipo information: entry into national phase | Ref document number: 202280036187.3, Country of ref document: CN |
NENP | Non-entry into the national phase | Ref country code: DE |
122 | Ep: pct application non-entry in european phase | Ref document number: 22810820, Country of ref document: EP, Kind code of ref document: A1 |