WO2021008052A1 - Method, device and equipment for calibrating the lens accuracy of a 3D camera module - Google Patents

Method, device and equipment for calibrating the lens accuracy of a 3D camera module

Info

Publication number
WO2021008052A1
WO2021008052A1 (PCT/CN2019/120172, CN2019120172W)
Authority
WO
WIPO (PCT)
Prior art keywords
light domain
image
central
calibration
brightness value
Prior art date
Application number
PCT/CN2019/120172
Other languages
English (en)
French (fr)
Inventor
杨书钊
Original Assignee
南昌欧菲生物识别技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 南昌欧菲生物识别技术有限公司
Publication of WO2021008052A1

Links

Images

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/20: Image signal generators
    • H04N 13/204: Image signal generators using stereoscopic image cameras
    • H04N 13/246: Calibration of cameras
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/20: Image signal generators
    • H04N 13/257: Colour aspects

Definitions

  • This application relates to the field of imaging technology, in particular to a method, device and equipment for calibrating the lens accuracy of a 3D camera module.
  • 3D vision technology has brought a new development direction to mobile terminals; for example, 3D face recognition can now be performed through a mobile phone camera.
  • The 3D camera module used in existing 3D vision technology, such as a 3D TOF (Time of Flight) module, transmits light to the target through a transmitter and then receives the light reflected back from the target, using the round-trip time of the light to calculate the distance of each part of the target and form a 3D image.
  • With this method, when the light passes through the lens of the transmitter and illuminates the target, the brightness is concentrated in a central area while the edge area has low brightness, which makes the final target image blurred or incomplete. At long distances in particular, the brightness energy of the light attenuates, and the 3D camera module may eventually fail because the target image cannot be acquired at all.
  • To guarantee the accuracy of the 3D camera module and improve the quality of the collected target image, the traditional 3D camera module is calibrated before it captures the target.
  • Limited by the optical principle of the lens itself, the existing way of dealing with a bright central area and a dim edge area is basically to add light sources around the target to fill in light, so that every part of the target is evenly lit. This approach has large errors and cannot effectively improve the accuracy of the 3D camera module.
  • According to various embodiments of the present application, a method, a device, and equipment for calibrating the lens accuracy of a 3D camera module are provided.
  • A method for calibrating the lens accuracy of a 3D camera module comprises the steps of: controlling the 3D camera module to collect a target sample to obtain a sample image, the sample image including a bright image and a dark image; extracting the central light domain and the edge light domain in the bright image, and obtaining the brightness value of the central light domain and the brightness value of the edge light domain; comparing the brightness values of the central light domain and the edge light domain of the bright image with the brightness values of the corresponding areas of the dark image, to obtain the brightness difference of the central light domain and the brightness difference of the edge light domain; taking the ratio of the brightness difference of the edge light domain to the brightness difference of the central light domain as a calibration coefficient; and performing subsequent calibration processing on the 3D camera module according to the calibration coefficient, the subsequent calibration processing including at least one of lens compensation coefficient calibration, temperature compensation coefficient calibration, intrinsic and extrinsic parameter calibration, and distortion compensation parameter calibration.
  • A device for calibrating the lens accuracy of a 3D camera module includes: an image acquisition module for controlling the 3D camera module to collect a target sample to obtain a sample image, the sample image including a bright image and a dark image; a light domain extraction module for extracting the central light domain and the edge light domain in the sample image and obtaining the brightness value of the central light domain and the brightness value of the edge light domain; a brightness difference acquisition module for comparing the brightness values of the central light domain and the edge light domain of the bright image with the brightness values of the corresponding areas of the dark image, to obtain the brightness difference of the central light domain and the brightness difference of the edge light domain; a coefficient acquisition module for obtaining the ratio of the brightness difference of the edge light domain to the brightness difference of the central light domain as a calibration coefficient; and a calibration module for performing subsequent calibration processing on the 3D camera module according to the calibration coefficient.
  • The subsequent calibration processing includes at least one of lens compensation coefficient calibration, temperature compensation coefficient calibration, intrinsic and extrinsic parameter calibration, and distortion compensation parameter calibration.
  • An equipment for calibrating the lens accuracy of a 3D camera module includes a target template, a connector, and a processor. The target template is used for the 3D camera module to collect sample images; the processor is connected to the 3D camera module through the connector, and the processor calibrates the 3D camera module according to the above method for calibrating the lens accuracy of a 3D camera module.
  • FIG. 1 is a flowchart of a method for calibrating the lens accuracy of a 3D camera module in an embodiment;
  • FIG. 2 is a flowchart of a method for calibrating the lens accuracy of a 3D camera module in another embodiment;
  • FIG. 3 is a flowchart of a method for calibrating the lens accuracy of a 3D camera module in an embodiment;
  • FIG. 4 is a flowchart of a method for calibrating the lens accuracy of a 3D camera module in an embodiment;
  • FIG. 5 is a structural block diagram of a calibration system for the lens accuracy of a 3D camera module in an embodiment;
  • FIG. 6 is a structural block diagram of a calibration system for the lens accuracy of a 3D camera module in an embodiment;
  • FIG. 7 is a schematic flowchart of the steps of subsequent calibration processing of a 3D camera module in an embodiment;
  • FIG. 8 is a brightness comparison diagram before and after calibration provided in an embodiment;
  • FIG. 9 is a system block diagram of a device for calibrating the lens accuracy of a 3D camera module in another embodiment;
  • FIG. 10 is a system block diagram of a device for calibrating the lens accuracy of a 3D camera module in yet another embodiment;
  • FIG. 11 is a system block diagram of equipment for calibrating the lens accuracy of a 3D camera module in an embodiment;
  • FIG. 12 is a system block diagram of equipment for calibrating the lens accuracy of a 3D camera module in an embodiment.
  • In one embodiment, a method for calibrating the lens accuracy of a 3D camera module is provided. It is suitable for solving the problem that a high-brightness central light domain and a low-brightness edge light domain exist on the target, so that the image data collected by the 3D camera module is affected by the uneven brightness and is of low quality. The 3D camera module may be a 3D TOF module, a structured light module, or the like. As shown in FIG. 1, the method includes the following steps:
  • Step S100: Control the 3D camera module to collect the target sample to obtain a sample image. The sample image includes a bright image and a dark image. Specifically, the bright image is an image taken by the 3D camera module in an environment where an infrared light source is provided, and the dark image is an image taken in an environment where no infrared light source is provided and the lens of the 3D camera module is covered.
  • The specific type of the 3D camera module is not unique; it may be a 3D structured light module or a 3D TOF module. Taking the calibration of a 3D TOF module as an example, the 3D TOF module includes a transmitter and a receiver. When image data is collected, the transmitter continuously emits light. A lens is mounted on the transmitter, so the emitted light first passes through the lens and is then reflected by the target object. The reflected light is received by the receiver, and the processor connected to the 3D camera module uses information such as the flight time the light experienced on its way out and back to finally obtain the overall outline of the target.
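  • To make the round-trip computation concrete: the distance to each part of the target is half the round-trip flight time multiplied by the speed of light. The following minimal Python sketch illustrates this idea (it is not code from the patent; the function and array names are illustrative):

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def tof_to_distance(flight_time_s: np.ndarray) -> np.ndarray:
    """Convert per-pixel round-trip flight times (s) to distances (m).

    The light travels out to the target and back, so the one-way
    distance is half of (speed of light x round-trip time).
    """
    return C * flight_time_s / 2.0

# A 10 ns round trip corresponds to roughly 1.5 m.
print(tof_to_distance(np.array([10e-9])))  # -> [1.49896229]
```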
  • To ensure that the 3D camera module obtains a high-quality overall outline of the target, the 3D camera module needs to be calibrated first. That is, the module first collects a sample image through step S100; this sample image is collected before the module is calibrated, and the subsequent step S200 is performed on the basis of it.
  • Specifically, the processor may control the 3D camera module to project toward the target template and obtain the sample image that the module collects from the image on the target template.
  • Step S200: Extract the central light domain and the edge light domain of the bright image in the sample image, and obtain the brightness value of the central light domain and the brightness value of the edge light domain.
  • Specifically, according to its brightness values, the bright image can be divided into a central light domain and an edge light domain. As mentioned in step S100, the light emitted by the transmitter passes through the lens before it irradiates the target sample. Owing to optical principles, the lens forms a higher-brightness area and a relatively lower-brightness area on the target sample. Normally, the higher-brightness area is concentrated at the center of the target sample, while the relatively low-brightness areas lie around the edges of that center.
  • In the bright image this appears as a higher brightness value at the concentrated center and lower brightness values in the surrounding areas. The central light domain in step S200 is therefore the area of the bright image with the higher brightness values, and the edge light domain is the area with the relatively low brightness values.
  • After the bright image is obtained, the processor can identify the central light domain and the edge light domain according to the brightness values of the different areas in the bright image, and extract the brightness value of the central light domain and the brightness value of the edge light domain.
  • It should be noted that the central light domain and the edge light domain are range areas, so their sizes can be set according to actual needs, for example according to how strongly the light emitted by the transmitter diverges or converges after passing through the lens. Different lenses concentrate and diverge the light to different degrees; for example, when the lens in use has a better light-condensing effect, the processor can shrink the range of the central light domain in the sample image and enlarge the range of the edge light domain.
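  • The patent does not fix a particular segmentation rule for the two domains. One simple way to realize this step, shown in the hedged sketch below, is to treat a centered region covering a chosen fraction of the image area as the central light domain and everything else as the edge light domain (the fraction and the square shape are assumptions for illustration only):

```python
import numpy as np

def split_light_domains(bright: np.ndarray, center_frac: float = 0.1):
    """Split a bright image into central and edge light domains.

    Assumption (not from the patent): the central light domain is a
    centered rectangle covering `center_frac` of the image area; the
    rest of the image is the edge light domain.
    """
    h, w = bright.shape
    ch, cw = int(h * center_frac ** 0.5), int(w * center_frac ** 0.5)
    top, left = (h - ch) // 2, (w - cw) // 2
    center_mask = np.zeros_like(bright, dtype=bool)
    center_mask[top:top + ch, left:left + cw] = True
    center_value = bright[center_mask].mean()   # central domain brightness
    edge_value = bright[~center_mask].mean()    # edge domain brightness
    return center_mask, center_value, edge_value
```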
  • Step S300: Compare the brightness values of the central light domain and the edge light domain of the bright image with the brightness values of the corresponding areas of the dark image to obtain the brightness difference of the central light domain and the brightness difference of the edge light domain.
  • It will be understood that, among images taken by the same 3D camera module, the size of the dark image is the same as the size of the bright image, so after the central and edge light domains of the bright image are located in step S200, the corresponding positions can be found in the dark image; for example, the center point of the bright image corresponds to the center point of the dark image. In this way the brightness value of every location area in the bright image can be compared with the brightness value of the corresponding location area in the dark image to obtain a brightness difference.
  • By using brightness differences, the influence of the ambient brightness on the 3D camera module can be reduced, and the error introduced into subsequent calculations based on brightness values can be reduced.
  • Further, in one embodiment, step S300 specifically includes: calculating the brightness difference of the central light domain and the brightness difference of the edge light domain according to a preset brightness difference formula:
  • Img[m,n] = Img1[m,n] - Img2[m,n];
  • where m is the width of the bright image and the dark image, n is the height of the bright image and the dark image, Img1[m,n] is the brightness value matrix of the bright image, Img2[m,n] is the brightness value matrix of the dark image, and Img[m,n] is the brightness difference matrix.
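  • This formula is an element-wise dark-frame subtraction. A minimal NumPy sketch of it (the dtype handling is an illustrative assumption, not specified by the patent):

```python
import numpy as np

def brightness_difference(img1: np.ndarray, img2: np.ndarray) -> np.ndarray:
    """Img[m,n] = Img1[m,n] - Img2[m,n].

    img1: brightness value matrix of the bright image.
    img2: brightness value matrix of the dark image (same shape).
    A signed type avoids underflow if a dark pixel happens
    to read brighter than the corresponding bright pixel.
    """
    assert img1.shape == img2.shape, "bright and dark images must match"
    return img1.astype(np.int32) - img2.astype(np.int32)
```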
  • Step S400: Obtain the ratio of the brightness difference of the edge light domain to the brightness difference of the central light domain as the calibration coefficient. Specifically, in the bright image the brightness difference of the edge light domain will be smaller than the brightness difference of the central light domain, and the processor compares the two to obtain the calibration coefficient.
  • The brightness difference may be an average brightness value or a point value, and there may be multiple calibration coefficients. For example, when the range of the central light domain is set to a single pixel (call it the first pixel), the brightness difference of the central light domain is the difference between the brightness value of the first pixel in the bright image and the brightness value of the first pixel in the dark image; in that case the brightness difference of the central light domain is a point value. Similarly, the edge light domain can be composed of several second pixels, so the processor can compare the brightness difference of each second pixel in the edge light domain with the brightness difference of the first pixel in the central light domain to obtain several calibration coefficients with different values.
  • Furthermore, in one embodiment, the edge light domain may be the areas where the concentric circles in the bright image are located, and the central light domain may be the area where the common center of those circles is located. For some lenses, such as convex lenses, the light is projected onto the target evenly after passing through the lens, so the resulting bright image contains several concentric circles: the brightness value of the pixels on any one circle is the same, and the brightness values on circles of different diameters decrease or increase uniformly.
  • This can simplify the number of pixels to collect: it is not necessary to collect all the pixels in the bright image; one pixel collected from each concentric circle represents the whole circle, and the brightness value of that collected pixel stands for the brightness value of all pixels on the circle. Comparing the brightness difference of the pixel collected on a circle with the brightness difference of the pixel at the common center yields the calibration coefficient.
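  • Under this concentric-circle assumption, per-pixel sampling collapses to one sample per radius. A short sketch of that idea (illustrative only; it averages within each integer-radius ring, which is equivalent to "one pixel per circle" when the rings really are uniform):

```python
import numpy as np

def radial_profile(img: np.ndarray) -> np.ndarray:
    """One representative brightness difference per concentric ring.

    Pixels are grouped by their integer distance from the image
    center; index = radius, value = ring brightness difference.
    """
    h, w = img.shape
    yy, xx = np.indices((h, w))
    r = np.hypot(yy - (h - 1) / 2, xx - (w - 1) / 2).astype(int)
    sums = np.bincount(r.ravel(), weights=img.ravel().astype(float))
    counts = np.bincount(r.ravel())
    return sums / np.maximum(counts, 1)  # guard empty rings

# The ratio of each ring to the center ring gives one coefficient
# per radius: profile = radial_profile(diff); coeffs = profile / profile[0]
```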
  • Step S500: Perform subsequent calibration processing on the 3D camera module according to the calibration coefficient. The subsequent calibration processing includes at least one of lens compensation coefficient calibration, temperature compensation coefficient calibration, intrinsic and extrinsic parameter calibration, and distortion compensation parameter calibration, where the lens compensation coefficient calibration is used to calibrate the lens of the 3D camera module according to the calibration coefficient.
  • Specifically, the processor performs the subsequent calibration processing on the 3D camera module using the calibration coefficient. The lens compensation coefficient calibration calibrates the lens according to the calibration coefficient so as to increase the brightness of the edge light domain, making the brightness value of the edge light domain comparable to that of the central light domain and improving the uniformity of the image quality across the whole sample image.
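  • Since each coefficient is the edge difference divided by the center difference, the coefficients fall toward values below 1 near the edges. One plausible way to apply them, sketched below, is to divide a captured image by the coefficient map so that edge brightness is lifted toward the center level (the 8-bit clipping range is an assumption; the patent does not specify the data format):

```python
import numpy as np

def apply_lens_compensation(image: np.ndarray, coeffs: np.ndarray,
                            eps: float = 1e-6) -> np.ndarray:
    """Brighten edge regions using per-pixel calibration coefficients.

    coeffs[m, n] = Img[m, n] / center_difference, so values near the
    edge are < 1; dividing by them raises edge brightness toward the
    center level. eps guards against division by zero.
    """
    compensated = image.astype(float) / np.maximum(coeffs, eps)
    return np.clip(compensated, 0, 255).astype(np.uint8)  # assumes 8-bit
```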
  • Taking PQTools as the calibration software tool for calibrating the camera lens as an example: because it relies on Matlab, the MCR (Matlab Compiler Runtime) must be installed before the tool is used. The first step is to determine the black level of the camera. It includes step a: set the exposure to manual, so that the camera preview is completely black; step b: turn off Shading Disable and grab the light-source data; step c: set the Hi Capture Tool parameters, for example setting RAW bits to 12 bits when the camera is 12-bit; and step d: cover the camera, grab the raw data, and record the red gain (Rgain) and blue gain (Bgain).
  • The second step is to grab the raw data of three light sources. It includes the step: open RAW Analyzer to check whether the brightness is sufficient, for example whether it reaches roughly 500; if not, increase the exposure time, Again, Dgain, and so on.
  • The last step, the third step, is the calibration itself. It includes step A: open the ISP Calibration Tools, open the black-level raw data, and select Black for the RAW Scene; step B: check the black-level RAW file and click Black Level Calibration in the upper right corner to determine the black level value; step C: after the black level value is determined, open the three previously grabbed RAW light sources and select Flat Field; step D: fill in the Rgain and Bgain noted earlier, click Calibrate, and click Export Result to export the header file; step E: open the header file and copy the contents of the array g_stCmosLscTable into the g_stCmosLscTable array of ov9732_cmos.c; and step F: compile the SDK, burn the firmware, and check the calibration effect.
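  • The GUI steps above ultimately rest on simple arithmetic: the black level is the sensor's output with no incident light, and it is subtracted from subsequent captures. The sketch below shows only that underlying computation, not the PQTools workflow itself (the function names are illustrative):

```python
import numpy as np

def estimate_black_level(covered_raw: np.ndarray) -> float:
    """Estimate the black level as the mean raw value of a frame
    captured with the lens covered (cf. step d and steps A-B above)."""
    return float(covered_raw.mean())

def subtract_black_level(raw: np.ndarray, black_level: float) -> np.ndarray:
    """Remove the black-level offset before further calibration."""
    return np.clip(raw.astype(float) - black_level, 0, None)
```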
  • The temperature compensation coefficient calibration applies temperature compensation to the 3D camera module to reduce the influence of the external ambient temperature on it. The intrinsic and extrinsic parameter calibration covers both kinds of parameters: the intrinsic parameters are those related to the characteristics of the 3D camera module itself, such as focal length and pixel size, while the extrinsic parameters are parameters in the world coordinate system, such as the position and rotation of the 3D camera module.
  • The distortion compensation parameter calibration includes radial distortion correction and tangential distortion correction. Radial distortion is a position deviation along the radial direction centered on the distortion center, which deforms the image; it includes barrel distortion and pincushion distortion. Tangential distortion arises from manufacturing defects that leave the lens itself not parallel to the image plane.
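  • For reference, radial and tangential distortion are commonly modeled together with the Brown-Conrady polynomial. The sketch below applies that standard model to normalized image coordinates; the model choice is a common convention, not something the patent specifies, and correction inverts this mapping:

```python
import numpy as np

def distort(x: np.ndarray, y: np.ndarray,
            k1: float, k2: float, p1: float, p2: float):
    """Brown-Conrady model on normalized coordinates (x, y).

    k1, k2: radial terms (barrel if negative, pincushion if positive).
    p1, p2: tangential terms from lens/sensor misalignment.
    """
    r2 = x * x + y * y
    radial = 1 + k1 * r2 + k2 * r2 * r2
    x_d = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    y_d = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    return x_d, y_d
```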
  • Further, the subsequent calibration processing may also include periodic error compensation calibration. It should be noted that temperature compensation coefficient calibration, intrinsic and extrinsic parameter calibration, distortion compensation parameter calibration, and periodic error compensation calibration are existing techniques, and this embodiment does not describe them in detail.
  • After the calibration coefficients in the sample image collected by the 3D camera module are determined, the calibration coefficients can be used each time the 3D camera module acquires an image during the subsequent calibration processing. For example, during lens compensation coefficient calibration, the processor can calibrate the 3D camera module with the calibration coefficients so that the brightness of the edge light domain in the collected image is raised and the image quality becomes uniform. Correspondingly, the processor can also perform temperature compensation coefficient calibration and so on, so that the accuracy of every image collected by the 3D camera module improves, and with it the accuracy of the whole module.
  • It will be understood that, when calibrating the 3D camera module with the calibration coefficient, multiple calibration methods can be combined to improve the subsequent accuracy of the module. For example, lens compensation coefficient calibration, temperature compensation coefficient calibration, intrinsic and extrinsic parameter calibration, distortion compensation parameter calibration, and periodic error calibration can be performed in sequence for the best calibration effect; steps such as the temperature compensation coefficient calibration can also be removed or reordered. This can be selected and combined according to actual needs.
  • In the above method for calibrating the lens accuracy of a 3D camera module, the edge light domain and the central light domain are determined, the ratio of the brightness difference of the edge light domain to the brightness difference of the central light domain is taken as the calibration coefficient, and the lens of the 3D camera module is then calibrated with that coefficient. During the subsequent calibration processing this raises the brightness at the edges of the images the module collects, makes the image quality of the whole image more uniform, guarantees the accuracy of the 3D camera module, improves the consistency of calibration after the 3D camera module is fused with the RGB image, and improves the consistency of module accuracy.
  • In one embodiment, as shown in FIG. 2, step S200 specifically includes step S210 and step S220.
  • Step S210: Collect the central pixel in the bright image, take the area where the central pixel is located as the central light domain, and obtain the brightness value of that area as the brightness value of the central light domain.
  • Specifically, the bright image taken by the 3D camera module is composed of a number of pixels, among which there is a central pixel: it sits at the center of all the pixels, hence the name. The area where this central pixel is located is the central light domain, and its brightness value can likewise be collected by software.
  • Step S220: Collect the areas where all pixels other than the central pixel in the bright image are located as the edge light domain, and obtain the brightness value of each of those pixels as the brightness values of the edge light domain.
  • Specifically, besides the central pixel, the bright image contains other pixels, and the areas where they are located form the edge light domain. Their brightness values can likewise be collected by software. It should be noted that there are many such pixels, and the brightness value of the area where each of them is located may differ.
  • By dividing the bright image into pixels and collecting the central pixel, the bright image can be subdivided finely. When the calibration coefficients are computed in the subsequent steps, the edge light domain contains many pixels, each of which may have a different brightness value, so the brightness difference between each pixel in the bright image and the corresponding pixel in the dark image may also differ. The ratio of each edge pixel's brightness difference to the central pixel's brightness difference can each serve as a calibration coefficient, yielding multiple calibration coefficients that calibrate every pixel, improving the precision of the subsequent calibration and making the brightness of images later shot by the 3D camera module more uniform.
  • Further, in one embodiment, once the central and edge light domains have been determined through steps S210 and S220, step S400 may specifically include: calculating the ratio of the edge light domain brightness difference to the central light domain brightness difference according to the pixel calibration coefficient formula:
  • C[m,n] = Img[m,n] / Center point;
  • where m is the width of the bright image and the dark image, n is the height of the bright image and the dark image, C[m,n] is the calibration coefficient corresponding to the area of each pixel in the bright image, Img[m,n] is the brightness difference matrix, and Center point is the brightness value of the area where the central pixel is located.
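  • A direct NumPy rendering of this per-pixel formula, reusing the brightness difference matrix from step S300 (a minimal sketch; taking the center at the geometric middle of the matrix is an assumption):

```python
import numpy as np

def pixel_calibration_coefficients(diff: np.ndarray) -> np.ndarray:
    """C[m,n] = Img[m,n] / Center point.

    diff: brightness difference matrix Img from step S300.
    The central pixel's value is read at the geometric center
    of the matrix (an assumption made for this sketch).
    """
    center = diff[diff.shape[0] // 2, diff.shape[1] // 2]
    if center == 0:
        raise ValueError("central brightness difference must be non-zero")
    return diff.astype(float) / float(center)
```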
  • In one embodiment, as shown in FIG. 3, step S200 specifically includes step S211, step S221, and step S222.
  • Step S211: Divide the bright image into different blocks according to a preset size.
  • Specifically, the preset size can be chosen as needed; for example, with a preset size of 3*3, the processor divides the bright image into multiple 3*3 blocks. It should be noted that each block contains several pixels.
  • Step S221: Extract the block located at the center of the bright image to obtain the central area, take the central area as the central light domain, and obtain the average brightness value of the central area as the brightness value of the central light domain.
  • Specifically, the bright image has a center point, and among the blocks into which the image was divided, the block containing this center point is the central area. The processor recognizes this block and takes it as the central light domain; the brightness value of each pixel in the block can also be obtained, and their average is computed as the brightness value of the central light domain.
  • Step S222: Extract the blocks of the bright image outside the central area to obtain the edge areas, take each edge area as an edge light domain, and obtain the average brightness value of each edge area as the brightness value of that edge light domain.
  • In the same way, once the central block has been identified, the remaining blocks constitute the edge light domains; note that the average brightness value may differ from block to block. Dividing the sample image into blocks of a preset size lets the processor handle it quickly: comparing the brightness values of the edge blocks with that of the central block rapidly yields a set of calibration coefficients with far less computation, which makes this variant well suited to scenarios that do not demand high calibration precision.
  • Further, in one embodiment, once the central and edge light domains have been determined through steps S211, S221, and S222, step S400 may specifically include: calculating the ratio of the edge light domain brightness difference to the central light domain brightness difference according to the block calibration coefficient formula:
  • C[x,y] = Aver[x,y] / Center Block average;
  • where x and y are the width and height of the preset size, Aver[x,y] is the average brightness value of each preset-size block in the bright image, Center Block average is the average brightness value of the central block, and C[x,y] is the calibration coefficient of each preset-size block in the bright image.
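  • A compact sketch of the block variant: average the bright image over a grid of fixed-size tiles, then divide every tile average by the central tile's average (the tile size and the cropping of partial edge tiles are illustrative choices):

```python
import numpy as np

def block_calibration_coefficients(bright: np.ndarray, size: int = 3) -> np.ndarray:
    """C[x,y] = Aver[x,y] / Center Block average.

    Crops the image to a multiple of `size`, averages each
    size x size block, and normalizes by the central block.
    """
    h, w = (bright.shape[0] // size) * size, (bright.shape[1] // size) * size
    tiles = bright[:h, :w].reshape(h // size, size, w // size, size)
    aver = tiles.mean(axis=(1, 3))                       # Aver[x, y]
    center = aver[aver.shape[0] // 2, aver.shape[1] // 2]
    return aver / center
```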
  • In one embodiment, as shown in FIG. 4, after step S500 the method further includes step S600 and step S700.
  • Step S600: Collect image data through the 3D camera module to obtain a depth image.
  • Step S700: Convert the depth image into a point cloud, and perform data evaluation based on the point cloud.
  • The image data collected by the calibrated 3D camera module is processed by the processor to obtain a depth image, and the processor then converts the depth image into a point cloud according to a preset formula so that the data can be evaluated conveniently. The preset formula may be an intrinsic/extrinsic parameter matrix transformation. Converting the depth image into a point cloud in this way makes it convenient to evaluate the image data collected by the 3D camera module and thus to understand its calibration accuracy.
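  • The standard intrinsic-matrix back-projection turns each depth pixel into a 3D point. The sketch below uses the usual pinhole parameters fx, fy, cx, cy; the patent only says "intrinsic/extrinsic matrix transformation", so this specific form is an assumption:

```python
import numpy as np

def depth_to_point_cloud(depth: np.ndarray, fx: float, fy: float,
                         cx: float, cy: float) -> np.ndarray:
    """Back-project a depth image into an (N, 3) point cloud.

    Pinhole model: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy, Z = depth.
    Zero-depth pixels (no return) are dropped.
    """
    v, u = np.indices(depth.shape)
    z = depth.astype(float)
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]
```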
  • In one embodiment, FIG. 5 shows a conventional accuracy calibration method for a 3D camera module: during calibration the conventional setup includes an infrared transmitting end 51, a calibration board 52, and an infrared receiving end 53. The infrared transmitting end 51 emits infrared light onto the calibration board 52, and the infrared receiving end 53 then receives the light reflected by the calibration board 52 to compute the accuracy.
  • FIG. 6 shows an example applying the present method for calibrating the lens accuracy of a 3D camera module, which includes an infrared receiving end 61 and an infrared light source board 62. The infrared receiving end 61 receives infrared light from the infrared light source board 62 to form a sample image, from which the calibration coefficients are finally obtained to calibrate the 3D camera module. FIG. 7 shows the steps used in this calibration; after the series of calibration steps in FIG. 7, the 3D camera module finally obtains a depth map, which can be evaluated once converted into a point cloud. FIG. 8 shows the brightness of the edge light domain before calibration and its change after calibration.
  • In one embodiment, a device for calibrating the lens accuracy of a 3D camera module is also provided. It is suitable for solving the problem that a higher-brightness central light domain and a lower-brightness edge light domain exist on the target, so that the image data captured by the 3D camera module is affected by the uneven brightness and is of low quality. As shown in FIG. 9, the device includes an image acquisition module 100, a light domain extraction module 200, a brightness difference acquisition module 300, a coefficient acquisition module 400, and a calibration module 500.
  • The image acquisition module 100 is used to control the 3D camera module to collect the target sample to obtain a sample image. The sample image includes a bright image and a dark image.
  • The light domain extraction module 200 is used to extract the central light domain and the edge light domain of the bright image in the sample image, and to obtain the brightness value of the central light domain and the brightness value of the edge light domain.
  • The brightness difference acquisition module 300 is used to compare the brightness values of the central light domain and the edge light domain of the bright image with the brightness values of the corresponding areas of the dark image, to obtain the brightness difference of the central light domain and the brightness difference of the edge light domain.
  • The coefficient acquisition module 400 is used to obtain the ratio of the brightness difference of the edge light domain to the brightness difference of the central light domain as the calibration coefficient.
  • The calibration module 500 is used to perform subsequent calibration processing on the 3D camera module according to the calibration coefficient. The subsequent calibration processing includes at least one of lens compensation coefficient calibration, temperature compensation coefficient calibration, intrinsic and extrinsic parameter calibration, and distortion compensation parameter calibration, where the lens compensation coefficient calibration is used to calibrate the lens of the 3D camera module according to the calibration coefficient, so as to increase the brightness of the edge light domain in the images collected by the 3D camera module.
  • In one embodiment, as shown in FIG. 10, the light domain extraction module 200 includes a central pixel collection unit 210 and an edge pixel collection unit 220.
  • The central pixel collection unit 210 is used to collect the central pixel in the bright image, take the area where the central pixel is located as the central light domain, and obtain the brightness value of that area as the brightness value of the central light domain.
  • The edge pixel collection unit 220 is used to collect the areas where all pixels other than the central pixel of the bright image are located as the edge light domain, and to obtain the brightness value of each of those pixels as the brightness values of the edge light domain.
  • In one embodiment, as shown in FIG. 11, the light domain extraction module 200 includes a block division unit 211, a central block extraction unit 221, and an edge block extraction unit 222.
  • The block division unit 211 is used to evenly divide the bright image into blocks of a preset size.
  • The central block extraction unit 221 is used to extract the central block in the bright image to obtain the central area, take the central area as the central light domain, and obtain the average brightness value of the central area as the brightness value of the central light domain.
  • The edge block extraction unit 222 is used to extract the blocks outside the central area in the bright image to obtain the edge areas, take each edge area as an edge light domain, and obtain the average brightness value of each edge area as the brightness value of that edge light domain.
  • In one embodiment, as shown in FIG. 12, the device further includes a depth image acquisition module 600 and a data evaluation module 700.
  • The depth image acquisition module 600 is used to collect image data through the 3D camera module after the calibration module 500 has performed the subsequent calibration processing according to the calibration coefficients, to obtain a depth image.
  • The data evaluation module 700 is used to convert the depth image into a point cloud and perform data evaluation based on the point cloud.
  • The image data collected by the 3D camera module after the subsequent calibration processing is processed by the processor to obtain a depth image, and the processor converts the depth image into a point cloud according to a preset formula, which makes it convenient to evaluate the data and understand the final calibration accuracy of the 3D camera module.
  • For the specific limitations of the device for calibrating the lens accuracy of a 3D camera module, refer to the limitations of the calibration method above; they are not repeated here. Each module in the device can be implemented in whole or in part by software, by hardware, or by a combination of the two.
  • The above modules may be embedded in, or independent of, the processor of a computer device in hardware form, or stored in the memory of the computer device in software form, so that the processor can call and execute the operations corresponding to each module.
  • The above device for calibrating the lens accuracy of a 3D camera module determines the edge light domain and the central light domain and takes the ratio of the brightness difference of the edge light domain to the brightness difference of the central light domain as the calibration coefficient, then calibrates the lens of the 3D camera module with that coefficient. During the subsequent calibration processing this raises the brightness at the edges of the images the module collects and makes the image quality of the whole image more uniform; after the subsequent calibration processing, the accuracy of the whole 3D camera module is improved, the consistency of calibration after the 3D camera module is fused with the RGB image is improved, and the consistency of module accuracy is improved.
  • In one embodiment, an equipment for calibrating the lens accuracy of a 3D camera module is also provided, which specifically includes a target template, a connector, and a processor. The target template is used for the 3D camera module to collect sample images. The processor is used to control the 3D camera module to collect the target sample to obtain a sample image, where the sample image includes a bright image and a dark image; to extract the central light domain and the edge light domain in the bright image and obtain the brightness value of the central light domain and the brightness value of the edge light domain; to compare the brightness values of the central light domain and the edge light domain of the bright image with the brightness values of the corresponding areas of the dark image to obtain the brightness difference of the central light domain and the brightness difference of the edge light domain, and to take the ratio of the brightness difference of the edge light domain to the brightness difference of the central light domain as the calibration coefficient; and to perform subsequent calibration processing on the 3D camera module according to the calibration coefficient, where the subsequent calibration processing includes at least one of lens compensation coefficient calibration, temperature compensation coefficient calibration, intrinsic and extrinsic parameter calibration, and distortion compensation parameter calibration.
  • Specifically, the target template may be a white-chart calibration board. The processor can control the 3D camera module to emit infrared light to the target template; the target template then reflects the infrared light back to the 3D camera module, which collects the image information on the target template to form a sample image that the processor can finally obtain.
  • The above calibration equipment determines the edge light domain and the central light domain, takes the ratio of the brightness difference of the edge light domain to the brightness difference of the central light domain as the calibration coefficient, and then calibrates the lens of the 3D camera module with that coefficient. During the subsequent calibration processing the 3D camera module can raise the brightness at the edges of the images it collects, making the image quality of the whole image more uniform; after the subsequent calibration processing, the accuracy of the whole 3D camera module is improved, the consistency of calibration after the module is fused with the RGB image is improved, and the consistency of module accuracy is improved.
  • In one embodiment, the processor is also used to collect the central pixel in the bright image, take the area where the central pixel is located as the central light domain, and obtain the brightness value of that area as the brightness value of the central light domain; and to collect the areas where all pixels other than the central pixel are located as the edge light domain, obtaining the brightness value of each of those areas as the brightness values of the edge light domain.
  • In one embodiment, the processor is further used to divide the bright image into different blocks according to a preset size; to extract the block located at the center of the bright image to obtain the central area, take the central area as the central light domain, and obtain the average brightness value of the central area as the brightness value of the central light domain; and to extract the blocks of the bright image outside the central area to obtain the edge areas, take each edge area as an edge light domain, and obtain the average brightness value of each edge area as the brightness value of that edge light domain.
  • In one embodiment, the processor is further configured to calculate the brightness difference of the central light domain and the brightness difference of the edge light domain according to the preset brightness difference formula:
  • Img[m,n] = Img1[m,n] - Img2[m,n];
  • where m is the width of the bright image and the dark image, n is the height of the bright image and the dark image, Img1[m,n] is the brightness value matrix of the bright image, Img2[m,n] is the brightness value matrix of the dark image, and Img[m,n] is the brightness difference matrix.
  • In one embodiment, the processor is further configured to calculate the ratio of the edge light domain brightness difference to the central light domain brightness difference according to the pixel calibration coefficient formula:
  • C[m,n] = Img[m,n] / Center point;
  • where m is the width of the bright image and the dark image, n is the height of the bright image and the dark image, C[m,n] is the calibration coefficient corresponding to the area where each pixel in the bright image is located, Img[m,n] is the brightness difference matrix, and Center point is the brightness value of the area where the central pixel is located.
  • In one embodiment, the calibration equipment further includes a light-emitting device and a cover plate. The light-emitting device is used to emit light to provide a light source when the 3D camera module collects the bright image; the cover plate is used to cover the lens of the 3D camera module when the module collects the dark image.
  • Further, in one embodiment, the light-emitting device is an infrared emitter, which is used to provide an infrared light source when the 3D camera module collects the bright image.
  • In one embodiment, the calibration equipment further includes a driving component connected to the processor. The driving component is used to receive a driving signal from the processor and drive the cover plate according to the driving signal, so that the cover plate covers the lens of the 3D camera module.
  • In one embodiment, the calibration equipment further includes a dark box for placing the 3D camera module, and the target sample is set in the dark box.
  • In one embodiment, the calibration equipment further includes an interactive device connected to the processor. Further, the interactive device includes a display screen and a keyboard connected to the processor, where the keyboard may be a physical keyboard or a virtual touch keyboard, and the display screen may be a liquid crystal display.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Image Processing (AREA)
  • Studio Devices (AREA)

Abstract

This application relates to a method, a device, and equipment for calibrating the lens accuracy of a 3D camera module. The method includes: controlling the 3D camera module to collect a target sample to obtain a sample image, the sample image including a bright image and a dark image; extracting the central light domain and the edge light domain in the bright image, and obtaining the brightness value of the central light domain and the brightness value of the edge light domain; comparing the brightness values of the central light domain and the edge light domain of the bright image with the brightness values of the corresponding areas of the dark image to obtain the brightness difference of the central light domain and the brightness difference of the edge light domain; obtaining the ratio of the brightness difference of the edge light domain to the brightness difference of the central light domain as a calibration coefficient; and performing subsequent calibration processing on the 3D camera module according to the calibration coefficient, the subsequent calibration processing including at least one of lens compensation coefficient calibration, temperature compensation coefficient calibration, intrinsic and extrinsic parameter calibration, and distortion compensation parameter calibration.

Description

Method, device and equipment for calibrating the lens accuracy of a 3D camera module
This application claims priority to the Chinese patent application filed with the China Patent Office on July 17, 2019, with application number 201910645385X and entitled "Method, device and equipment for calibrating the lens accuracy of a 3D camera module", the entire contents of which are incorporated herein by reference.
Technical Field
This application relates to the field of imaging technology, and in particular to a method, device, and equipment for calibrating the lens accuracy of a 3D camera module.
Background
Current mobile terminals such as mobile phones and tablets are undergoing a new round of technological change, and the development of 3D vision technology has brought them a new development direction; for example, 3D face recognition can be performed through a mobile phone camera. The 3D camera module used in existing 3D vision technology, such as a 3D TOF (Time of Flight) module, transmits light to the target through a transmitter, then receives the light reflected back from the target and uses the round-trip time of the light to calculate the distance of each part of the target, thereby forming a 3D image. With this method, when the light passes through the lens of the transmitter and falls on the target, there is a central area of relatively concentrated brightness and an edge area of lower brightness, which makes the final target image blurred or incomplete. At long distances in particular, the brightness energy of the light attenuates, and the 3D camera module may eventually fail because it cannot acquire the target image.
To guarantee the accuracy of the 3D camera module and improve the quality of the collected target image, the traditional 3D camera module calibrates the target before capturing it. However, limited by the optical principle of the lens itself, the existing way of dealing with a bright central area and a dim edge area is basically to add light sources around the target to fill in light so that every part of the target is evenly lit. This approach has large errors and cannot effectively improve the accuracy of the 3D camera module.
Summary
According to various embodiments of the present application, a method, a device, and equipment for calibrating the lens accuracy of a 3D camera module are provided.
A method for calibrating the lens accuracy of a 3D camera module includes the steps of: controlling the 3D camera module to collect a target sample to obtain a sample image, the sample image including a bright image and a dark image; extracting the central light domain and the edge light domain in the bright image, and obtaining the brightness value of the central light domain and the brightness value of the edge light domain; comparing the brightness values of the central light domain and the edge light domain of the bright image with the brightness values of the corresponding areas of the dark image to obtain the brightness difference of the central light domain and the brightness difference of the edge light domain; obtaining the ratio of the brightness difference of the edge light domain to the brightness difference of the central light domain as a calibration coefficient; and performing subsequent calibration processing on the 3D camera module according to the calibration coefficient, the subsequent calibration processing including at least one of lens compensation coefficient calibration, temperature compensation coefficient calibration, intrinsic and extrinsic parameter calibration, and distortion compensation parameter calibration.
A device for calibrating the lens accuracy of a 3D camera module includes: an image acquisition module for controlling the 3D camera module to collect a target sample to obtain a sample image, the sample image including a bright image and a dark image; a light domain extraction module for extracting the central light domain and the edge light domain in the sample image and obtaining the brightness value of the central light domain and the brightness value of the edge light domain; a brightness difference acquisition module for comparing the brightness values of the central light domain and the edge light domain of the bright image with the brightness values of the corresponding areas of the dark image to obtain the brightness difference of the central light domain and the brightness difference of the edge light domain; a coefficient acquisition module for obtaining the ratio of the brightness difference of the edge light domain to the brightness difference of the central light domain as a calibration coefficient; and a calibration module for performing subsequent calibration processing on the 3D camera module according to the calibration coefficient, the subsequent calibration processing including at least one of lens compensation coefficient calibration, temperature compensation coefficient calibration, intrinsic and extrinsic parameter calibration, and distortion compensation parameter calibration.
An equipment for calibrating the lens accuracy of a 3D camera module includes a target template, a connector, and a processor. The target template is used for the 3D camera module to collect sample images; the processor is connected to the 3D camera module through the connector, and the processor is used to calibrate the 3D camera module according to the method for calibrating the lens accuracy of a 3D camera module.
The details of one or more embodiments of this application are set forth in the drawings and description below. Other features, objects, and advantages of this application will become apparent from the specification, the drawings, and the claims.
Brief Description of the Drawings
To describe the technical solutions in the embodiments of this application or in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only some embodiments of this application; for a person of ordinary skill in the art, other drawings can be obtained from them without creative work.
FIG. 1 is a flowchart of a method for calibrating the lens accuracy of a 3D camera module in an embodiment;
FIG. 2 is a flowchart of a method for calibrating the lens accuracy of a 3D camera module in another embodiment;
FIG. 3 is a flowchart of a method for calibrating the lens accuracy of a 3D camera module in an embodiment;
FIG. 4 is a flowchart of a method for calibrating the lens accuracy of a 3D camera module in an embodiment;
FIG. 5 is a structural block diagram of a calibration system for the lens accuracy of a 3D camera module in an embodiment;
FIG. 6 is a structural block diagram of a calibration system for the lens accuracy of a 3D camera module in an embodiment;
FIG. 7 is a schematic flowchart of the steps of subsequent calibration processing of a 3D camera module in an embodiment;
FIG. 8 is a brightness comparison diagram before and after calibration provided in an embodiment;
FIG. 9 is a system block diagram of a device for calibrating the lens accuracy of a 3D camera module in another embodiment;
FIG. 10 is a system block diagram of a device for calibrating the lens accuracy of a 3D camera module in yet another embodiment;
FIG. 11 is a system block diagram of equipment for calibrating the lens accuracy of a 3D camera module in an embodiment;
FIG. 12 is a system block diagram of equipment for calibrating the lens accuracy of a 3D camera module in an embodiment.
Detailed Description
The technical solutions in the embodiments of this application will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some of the embodiments of this application rather than all of them. Based on the embodiments of this application, all other embodiments obtained by a person of ordinary skill in the art without creative work fall within the protection scope of this application.
In one embodiment, a method for calibrating the lens accuracy of a 3D camera module is provided. It is suitable for solving the problem that a higher-brightness central light domain and a lower-brightness edge light domain exist on the target, so that the image data collected by the 3D camera module is affected by the uneven brightness and is of low quality. The 3D camera module may be a 3D TOF module, a structured light module, or the like. As shown in FIG. 1, the method includes the following steps:
Step S100: Control the 3D camera module to collect the target sample to obtain a sample image. The sample image includes a bright image and a dark image. Specifically, the bright image is an image taken by the 3D camera module in an environment where an infrared light source is provided, and the dark image is an image taken in an environment without an infrared light source and with the lens of the 3D camera module covered. The specific type of the 3D camera module is not unique; it may be a 3D structured light module or a 3D TOF module. Taking the calibration of a 3D TOF module as an example, the 3D TOF module includes a transmitter and a receiver. When image data is collected, the transmitter continuously emits light; a lens is mounted on the transmitter, so the emitted light first passes through the lens, is then reflected by the target object, and the reflected light is received by the receiver. The processor connected to the 3D camera module uses information such as the flight time the light experienced on its way out and back to finally obtain the overall outline of the target. To ensure that the 3D camera module obtains a high-quality overall outline of the target, the module must be calibrated first: the 3D camera module first collects a sample image through step S100, this sample image being collected before the module is calibrated, and the subsequent step S200 is performed on the basis of it. Specifically, the processor may control the 3D camera module to project toward the target template and obtain the sample image that the module collects from the image on the target template.
Step S200: Extract the central light domain and the edge light domain of the bright image in the sample image, and obtain the brightness value of the central light domain and the brightness value of the edge light domain. Specifically, according to its brightness values the bright image can be divided into a central light domain and an edge light domain. As mentioned in step S100, the light emitted by the transmitter passes through the lens before it irradiates the target sample; owing to optical principles, the lens forms a higher-brightness area and a relatively lower-brightness area on the target sample. Normally the higher-brightness area is concentrated at the center of the target sample, while the relatively low-brightness areas lie around the edges of that center. In the bright image this appears as higher brightness values at the concentrated center and lower brightness values in the surrounding areas; the central light domain in step S200 is thus the area of the bright image with the higher brightness values, and the edge light domain the area with the relatively low brightness values. After the bright image is obtained, the processor can identify the central and edge light domains from the brightness values of the different areas of the bright image and extract the brightness value of the central light domain and the brightness value of the edge light domain.
It should be noted that the central light domain and the edge light domain are range areas, so their sizes can be set according to actual needs, for example according to how strongly the light emitted by the transmitter diverges or converges after passing through the lens. Different lenses concentrate and diverge the light to different degrees; for example, when the lens in use condenses light well, the processor can shrink the range of the central light domain in the sample image and enlarge the range of the edge light domain.
Step S300: Compare the brightness values of the central light domain and the edge light domain of the bright image with the brightness values of the corresponding areas of the dark image to obtain the brightness difference of the central light domain and the brightness difference of the edge light domain. It will be understood that, among images taken by the same 3D camera module, the dark image has the same size as the bright image, so after the central and edge light domains of the bright image are located in step S200, the corresponding positions can be found in the dark image; every location area of the bright image has a counterpart in the dark image, for example the center point of the bright image corresponds to the center point of the dark image. The brightness value of every location area of the bright image can therefore be compared with that of the corresponding location area of the dark image to obtain a brightness difference. Using brightness differences reduces the influence of ambient brightness on the 3D camera module and the error of subsequent calculations based on brightness values.
Further, in one embodiment, step S300 specifically includes: calculating the brightness difference of the central light domain and the brightness difference of the edge light domain according to a preset brightness difference formula, the preset brightness difference formula being Img[m,n] = Img1[m,n] - Img2[m,n];
where m is the width of the bright image and the dark image, n is the height of the bright image and the dark image, Img1[m,n] is the brightness value matrix of the bright image, Img2[m,n] is the brightness value matrix of the dark image, and Img[m,n] is the brightness difference matrix.
Step S400: Obtain the ratio of the brightness difference of the edge light domain to the brightness difference of the central light domain as a calibration coefficient. Specifically, in the bright image the brightness difference of the edge light domain will be smaller than that of the central light domain, and the processor compares the two to obtain the calibration coefficient. The brightness difference may be an average brightness value or a point value, and there may be multiple calibration coefficients. For example, when the range of the central light domain is set to one pixel (called the first pixel), the brightness difference of the central light domain is the difference between the brightness value of the first pixel in the bright image and that of the first pixel in the dark image, i.e., a point value; similarly, the edge light domain may consist of several second pixels, so the processor can compare the brightness difference of each second pixel of the edge light domain with that of the first pixel of the central light domain to obtain several calibration coefficients with different values.
Furthermore, in one embodiment, the edge light domain may be the areas where the concentric circles in the bright image are located, and the central light domain the area where their common center is located. For some lenses, such as convex lenses, the light is projected evenly onto the target after passing through the lens, so the resulting bright image has several concentric circles: the brightness values of the pixels on any one circle are the same, and the brightness values on circles of different diameters decrease or increase uniformly. This simplifies the number of pixels collected: it is not necessary to collect all pixels of the bright image; one pixel collected from each circle represents all pixels on the whole circle, its brightness value standing for theirs, and comparing the brightness difference of that collected pixel with the brightness difference of the pixel at the circles' center yields the calibration coefficient.
Step S500: Perform subsequent calibration processing on the 3D camera module according to the calibration coefficient. The subsequent calibration processing includes at least one of lens compensation coefficient calibration, temperature compensation coefficient calibration, intrinsic and extrinsic parameter calibration, and distortion compensation parameter calibration, where the lens compensation coefficient calibration is used to calibrate the lens of the 3D camera module according to the calibration coefficient so as to increase the brightness of the edge light domain in the images the module collects. Specifically, the processor performs the subsequent calibration processing using the calibration coefficient; the lens compensation coefficient calibration calibrates the lens according to the calibration coefficient, increasing the brightness of the edge light domain so that it is comparable to that of the central light domain and improving the uniformity of the image quality of the whole sample image. Taking PQTools as the calibration software tool for the camera lens as an example: because it relies on Matlab, the MCR (Matlab Compiler Runtime) must be installed before the tool is used. The first step is to determine the black level of the camera, including step a: set the exposure to manual so that the camera preview is completely black; step b: turn off Shading Disable and grab the light-source data; step c: set the Hi Capture Tool parameters, for example setting RAW bits to 12 bits when the camera is 12-bit; step d: cover the camera, grab the raw data, and record the red gain (Rgain) and blue gain (Bgain). The second step is to grab the raw data of three light sources, including the step: open RAW Analyzer and check whether the brightness is sufficient, for example roughly 500; if not, increase the exposure time, Again, Dgain, and so on. The third and last step is the calibration itself, including step A: open the ISP Calibration Tools, open the black-level raw data, and select Black for RAW Scene; step B: check the black-level RAW file and click Black Level Calibration in the upper right corner to determine the black level value; step C: after the black level value is determined, open the three previously grabbed RAW light sources and select Flat Field; step D: fill in the Rgain and Bgain noted earlier, click Calibrate, then click Export Result to export the header file; step E: open the header file and copy the contents of the array g_stCmosLscTable into the g_stCmosLscTable array of ov9732_cmos.c; step F: compile the SDK, burn the firmware, and check the calibration effect.
The temperature compensation coefficient calibration applies temperature compensation to the 3D camera module to reduce the influence of the external ambient temperature. The intrinsic and extrinsic parameter calibration covers both kinds of parameters: the intrinsic parameters relate to the module's own characteristics, such as focal length and pixel size, while the extrinsic parameters are parameters in the world coordinate system, such as the position and rotation of the 3D camera module. The distortion compensation parameter calibration includes radial distortion correction and tangential distortion correction: radial distortion is a position deviation along the radial direction centered on the distortion center, which deforms the image and includes barrel and pincushion distortion; tangential distortion arises from manufacturing defects that leave the lens not parallel to the image plane. Further, the subsequent calibration processing may also include periodic error compensation calibration; it should be noted that temperature compensation coefficient calibration, intrinsic and extrinsic parameter calibration, distortion compensation parameter calibration, and periodic error compensation calibration are existing techniques and are not detailed in this embodiment. After the calibration coefficients in the sample image are determined, they can be used for calibration each time the 3D camera module acquires an image during the subsequent calibration processing; for example, during lens compensation coefficient calibration the processor can calibrate the module with the calibration coefficients so that the brightness of the edge light domain of the collected images is raised and the image quality becomes uniform. Correspondingly, the processor can also perform temperature compensation coefficient calibration and so on, so that the accuracy of every image collected improves and with it the accuracy of the whole module.
It will be understood that when calibrating the 3D camera module with the calibration coefficient, multiple calibration methods can be combined to improve the module's subsequent accuracy; for example, lens compensation coefficient calibration, temperature compensation coefficient calibration, intrinsic and extrinsic parameter calibration, distortion compensation parameter calibration, and periodic error calibration can be performed in sequence for the best effect, or steps such as the temperature compensation coefficient calibration can be removed or reordered, selected and combined according to actual needs.
In the above method for calibrating the lens accuracy of a 3D camera module, the edge light domain and the central light domain are determined, the ratio of the edge light domain's brightness difference to the central light domain's brightness difference is taken as the calibration coefficient, and the lens of the 3D camera module is calibrated with that coefficient. During subsequent calibration this raises the edge brightness of the images the module collects, makes the image quality of the whole image more uniform, guarantees the module's accuracy, improves the consistency of calibration after the module is fused with the RGB image, and improves the consistency of module accuracy.
In one embodiment, as shown in FIG. 2, step S200 specifically includes step S210 and step S220.
Step S210: Collect the central pixel in the bright image, take the area where the central pixel is located as the central light domain, and obtain the brightness value of that area as the brightness value of the central light domain. Specifically, the bright image taken by the 3D camera module is composed of a number of pixels, among which there is a central pixel located at the center of all the pixels, hence the name; the area where it is located is the central light domain, and its brightness value can likewise be collected by software.
Step S220: Collect the areas where the pixels other than the central pixel in the bright image are located as the edge light domain, and obtain the brightness value of each of those pixels as the brightness values of the edge light domain. Specifically, apart from the central pixel the bright image contains other pixels, whose areas form the edge light domain; their brightness values can likewise be collected by software. Note that there are many such pixels, and the brightness value of each one's area may differ.
Dividing the bright image into pixels and collecting the central pixel subdivides the bright image finely. When the calibration coefficients are computed in the subsequent steps, the edge light domain contains many pixels, each possibly with a different brightness value, so the brightness difference between each pixel of the bright image and the corresponding pixel of the dark image may also differ; the ratio of each edge pixel's brightness difference to the central pixel's brightness difference can each serve as a calibration coefficient. Multiple calibration coefficients are thus obtained to calibrate every pixel, improving the precision of subsequent calibration and making the brightness of images later shot by the 3D camera module more uniform.
Further, in one embodiment, after the central and edge light domains are determined through steps S210 and S220, step S400 may specifically include: calculating the ratio of the edge light domain brightness difference to the central light domain brightness difference according to the pixel calibration coefficient formula:
C[m,n] = Img[m,n] / Center point;
where m is the width of the bright image and the dark image, n is the height of the bright image and the dark image, C[m,n] is the calibration coefficient corresponding to the area of each pixel in the bright image, Img[m,n] is the brightness difference matrix, and Center point is the brightness value of the area where the central pixel is located.
In one embodiment, as shown in FIG. 3, step S200 specifically includes step S211, step S221, and step S222.
Step S211: Divide the bright image into different blocks according to a preset size. Specifically, the preset size can be chosen as needed; for example, with a 3*3 preset size the processor divides the bright image into multiple 3*3 blocks, each containing several pixels.
Step S221: Extract the block at the center of the bright image to obtain the central area, take the central area as the central light domain, and obtain the average brightness value of the central area as the brightness value of the central light domain. Specifically, the bright image has a center point, and among the divided blocks the block containing this center point is the central area; the processor recognizes this block, takes it as the central light domain, obtains the brightness value of each pixel in it, and computes the average as the brightness value of the central light domain.
Step S222: Extract the blocks of the bright image outside the central area to obtain the edge areas, take each edge area as an edge light domain, and obtain the average brightness value of each edge area as the brightness value of that edge light domain. Similarly, once the central block is identified, the remaining blocks constitute the edge light domains; note that different remaining blocks may have different average brightness values. Dividing the sample image into blocks of a preset size lets the processor handle it quickly, comparing the brightness values of the edge blocks with that of the central block to rapidly obtain a set of calibration coefficients with far less computation; this approach is effective for scenarios that do not demand high calibration precision.
Further, in one embodiment, after the central and edge light domains are determined through steps S211, S221, and S222, step S400 may specifically include: calculating the ratio of the edge light domain brightness difference to the central light domain brightness difference according to the block calibration coefficient formula: C[x,y] = Aver[x,y] / Center Block average;
where x and y are the width and height of the preset size, Aver[x,y] is the average brightness value of each preset-size block in the bright image, Center Block average is the average brightness value of the central block, and C[x,y] is the calibration coefficient of each preset-size block in the bright image.
In one embodiment, as shown in FIG. 4, after step S500 the method further includes step S600 and step S700. Step S600: Collect image data through the 3D camera module to obtain a depth image.
Step S700: Convert the depth image into a point cloud and perform data evaluation based on the point cloud. The image data collected by the calibrated 3D camera module is processed by the processor to obtain a depth image, which the processor then converts into a point cloud according to a preset formula to facilitate data evaluation; the preset formula may be an intrinsic/extrinsic parameter matrix transformation. Converting the depth image into a point cloud through the preset formula makes it convenient to evaluate the image data collected by the module and understand its calibration accuracy.
In one embodiment, FIG. 5 shows a conventional accuracy calibration method for a 3D camera module: during calibration the conventional setup includes an infrared transmitting end 51, a calibration board 52, and an infrared receiving end 53; the infrared transmitting end 51 emits infrared light onto the calibration board 52, and the infrared receiving end 53 receives the light reflected by the calibration board 52 to compute the accuracy. FIG. 6 shows an example applying the present method, including an infrared receiving end 61 and an infrared light source board 62: the infrared receiving end 61 receives the infrared light of the infrared light source board 62 to form a sample image, from which the calibration coefficients are finally obtained to calibrate the 3D camera module. Referring to FIG. 7, which shows the steps used in the calibration: after the series of calibration steps in FIG. 7, the 3D camera module finally obtains a depth map, which can be evaluated once converted into a point cloud. Referring to FIG. 8, it shows the brightness of the edge light domain before calibration and its change after calibration.
In one embodiment, a device for calibrating the lens accuracy of a 3D camera module is also provided, suitable for solving the problem that a higher-brightness central light domain and a lower-brightness edge light domain exist on the target, so that the image data collected by the 3D camera module is affected by the uneven brightness and is of low quality. As shown in FIG. 9, the device includes an image acquisition module 100, a light domain extraction module 200, a brightness difference acquisition module 300, a coefficient acquisition module 400, and a calibration module 500. The image acquisition module 100 is used to control the 3D camera module to collect the target sample to obtain a sample image, the sample image including a bright image and a dark image. The light domain extraction module 200 is used to extract the central and edge light domains of the bright image in the sample image and obtain their brightness values. The brightness difference acquisition module 300 is used to compare the brightness values of the central and edge light domains of the bright image with those of the corresponding areas of the dark image to obtain the brightness differences of the central and edge light domains. The coefficient acquisition module 400 is used to obtain the ratio of the edge light domain's brightness difference to the central light domain's brightness difference as the calibration coefficient. The calibration module 500 is used to perform subsequent calibration processing on the 3D camera module according to the calibration coefficient.
The subsequent calibration processing includes at least one of lens compensation coefficient calibration, temperature compensation coefficient calibration, intrinsic and extrinsic parameter calibration, and distortion compensation parameter calibration, where the lens compensation coefficient calibration is used to calibrate the lens of the 3D camera module according to the calibration coefficient so as to increase the brightness of the edge light domain in the images the module collects.
In one embodiment, as shown in FIG. 10, the light domain extraction module 200 includes a central pixel collection unit 210 and an edge pixel collection unit 220.
The central pixel collection unit 210 is used to collect the central pixel in the bright image, take the area where it is located as the central light domain, and obtain the brightness value of that area as the brightness value of the central light domain. The edge pixel collection unit 220 is used to collect the areas of the pixels other than the central pixel of the bright image as the edge light domain and obtain the brightness value of each such area as the brightness values of the edge light domain.
In one embodiment, as shown in FIG. 11, the light domain extraction module 200 includes a block division unit 211, a central block extraction unit 221, and an edge block extraction unit 222. The block division unit 211 is used to evenly divide the bright image into blocks of a preset size. The central block extraction unit 221 is used to extract the central block of the bright image to obtain the central area, take it as the central light domain, and obtain its average brightness value as the brightness value of the central light domain. The edge block extraction unit 222 is used to extract the blocks outside the central area to obtain the edge areas, take each edge area as an edge light domain, and obtain the average brightness value of each edge area as the brightness value of that edge light domain.
In one embodiment, as shown in FIG. 12, the device further includes a depth image acquisition module 600 and a data evaluation module 700. The depth image acquisition module 600 is used, after the calibration module 500 performs the subsequent calibration processing on the 3D camera module according to the calibration coefficients, to collect image data through the 3D camera module to obtain a depth image.
The data evaluation module 700 is used to convert the depth image into a point cloud and perform data evaluation based on it. The image data collected by the 3D camera module after the subsequent calibration processing is processed by the processor to obtain a depth image, which the processor converts into a point cloud according to a preset formula, facilitating data evaluation and an understanding of the module's final calibration accuracy.
For the specific limitations of the device for calibrating the lens accuracy of a 3D camera module, refer to the limitations of the method above, which are not repeated here. Each module of the device can be implemented in whole or in part by software, hardware, or a combination of the two; the modules may be embedded in or independent of the processor of a computer device in hardware form, or stored in the memory of the computer device in software form, so that the processor can call and execute the operations corresponding to each module.
The above device determines the edge and central light domains, takes the ratio of the edge light domain's brightness difference to the central light domain's brightness difference as the calibration coefficient, and calibrates the lens of the 3D camera module with it; during subsequent calibration this raises the edge brightness of collected images and makes the image quality of the whole image more uniform, and after the subsequent calibration processing the accuracy of the whole module, the consistency of calibration after fusion with the RGB image, and the consistency of module accuracy are all improved.
In one embodiment, an equipment for calibrating the lens accuracy of a 3D camera module is also provided, specifically including a target template, a connector, and a processor. The target template is used for the 3D camera module to collect sample images. The processor is used to control the 3D camera module to collect the target sample to obtain a sample image, the sample image including a bright image and a dark image; to extract the central and edge light domains in the bright image and obtain their brightness values; to compare the brightness values of the central and edge light domains of the bright image with those of the corresponding areas of the dark image to obtain the brightness differences of the central and edge light domains, and to take the ratio of the edge light domain's brightness difference to the central light domain's brightness difference as the calibration coefficient; and to perform subsequent calibration processing on the 3D camera module according to the calibration coefficient, the subsequent calibration processing including at least one of lens compensation coefficient calibration, temperature compensation coefficient calibration, intrinsic and extrinsic parameter calibration, and distortion compensation parameter calibration. Specifically, the target template may be a white-chart calibration board; the processor can control the 3D camera module to emit infrared light onto the target template, the target template reflects the infrared light back to the module, the module collects the image information on the target template to form a sample image, and the processor finally obtains the sample image.
The above calibration equipment determines the edge and central light domains, takes the ratio of the edge light domain's brightness difference to the central light domain's brightness difference as the calibration coefficient, and calibrates the lens of the 3D camera module with it; during subsequent calibration this raises the edge brightness of collected images and makes the image quality of the whole image more uniform, and after the subsequent calibration processing the accuracy of the whole module, the consistency of calibration after fusion with the RGB image, and the consistency of module accuracy are all improved.
In one embodiment, the processor is also used to collect the central pixel in the bright image, take the area where it is located as the central light domain, and obtain the brightness value of that area as the brightness value of the central light domain; and to collect the areas of the pixels other than the central pixel as the edge light domain, obtaining the brightness value of each such area as the brightness values of the edge light domain.
In one embodiment, the processor is further used to divide the bright image into different blocks according to a preset size; to extract the central block to obtain the central area, take it as the central light domain, and obtain its average brightness value as the brightness value of the central light domain; and to extract the blocks outside the central area to obtain the edge areas, take each as an edge light domain, and obtain the average brightness value of each edge area as the brightness value of that edge light domain.
In one embodiment, the processor is further used to calculate the brightness difference of the central light domain and the brightness difference of the edge light domain according to the preset brightness difference formula Img[m,n] = Img1[m,n] - Img2[m,n], where m is the width of the bright image and the dark image, n is their height, Img1[m,n] is the brightness value matrix of the bright image, Img2[m,n] is the brightness value matrix of the dark image, and Img[m,n] is the brightness difference matrix.
In one embodiment, the processor is further used to calculate the ratio of the edge light domain brightness difference to the central light domain brightness difference according to the pixel calibration coefficient formula C[m,n] = Img[m,n] / Center point, where m is the width of the bright image and the dark image, n is their height, C[m,n] is the calibration coefficient corresponding to the area of each pixel in the bright image, Img[m,n] is the brightness difference matrix, and Center point is the brightness value of the area where the central pixel is located.
In one embodiment, although not illustrated, the calibration equipment further includes a light-emitting device and a cover plate: the light-emitting device emits light to provide a light source when the 3D camera module collects the bright image, and the cover plate covers the lens of the 3D camera module when the module collects the dark image. Further, in one embodiment, the light-emitting device is an infrared emitter, which provides an infrared light source when the module collects the bright image.
In one embodiment, although not illustrated, the calibration equipment further includes a driving component connected to the processor, which receives a driving signal from the processor and drives the cover plate according to it, so that the cover plate covers the lens of the 3D camera module.
In one embodiment, although not illustrated, the calibration equipment further includes a dark box for placing the 3D camera module, with the target sample set in the dark box.
In one embodiment, although not illustrated, the calibration equipment further includes an interactive device connected to the processor. Further, in one embodiment, the interactive device includes a display screen and a keyboard connected to the processor, where the keyboard may be a physical keyboard or a virtual touch keyboard and the display screen may be a liquid crystal display.
The technical features of the above embodiments can be combined arbitrarily. For brevity, not every possible combination of the technical features in the above embodiments has been described; however, as long as a combination of these technical features contains no contradiction, it should be considered within the scope of this specification.
The above embodiments express only several implementations of this application, and their description is relatively specific and detailed, but they should not therefore be construed as limiting the scope of the patent. It should be pointed out that a person of ordinary skill in the art can make several variations and improvements without departing from the concept of this application, and these all fall within the protection scope of this application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (20)

  1. A method for calibrating the lens accuracy of a 3D camera module, characterized by comprising the following steps:
    controlling the 3D camera module to collect a target sample to obtain a sample image, the sample image comprising a bright image and a dark image;
    extracting the central light domain and the edge light domain in the bright image, and obtaining the brightness value of the central light domain and the brightness value of the edge light domain;
    comparing the brightness value of the central light domain and the brightness value of the edge light domain of the bright image with the brightness values of the corresponding areas of the dark image, to obtain the brightness difference of the central light domain and the brightness difference of the edge light domain;
    obtaining the ratio of the brightness difference of the edge light domain to the brightness difference of the central light domain as a calibration coefficient;
    performing subsequent calibration processing on the 3D camera module according to the calibration coefficient, the subsequent calibration processing comprising at least one of lens compensation coefficient calibration, temperature compensation coefficient calibration, intrinsic and extrinsic parameter calibration, and distortion compensation parameter calibration.
  2. The method according to claim 1, characterized in that extracting the central light domain and the edge light domain of the bright image and obtaining the brightness value of the central light domain and the brightness value of the edge light domain specifically comprises the following steps:
    collecting the central pixel in the bright image, taking the area where the central pixel is located as the central light domain, and obtaining the brightness value of the area where the central pixel is located as the brightness value of the central light domain;
    collecting the areas where the pixels other than the central pixel in the bright image are located as the edge light domain, and obtaining the brightness value of the area where each pixel other than the central pixel is located as the brightness value of the edge light domain.
  3. The method according to claim 1, characterized in that extracting the central light domain and the edge light domain of the bright image in the sample image specifically comprises the following steps:
    dividing the bright image into different blocks according to a preset size;
    extracting the block located at the center of the bright image to obtain a central area, taking the central area as the central light domain, and obtaining the average brightness value of the central area as the brightness value of the central light domain;
    extracting the blocks of the bright image located outside the central area to obtain edge areas, taking each edge area as an edge light domain, and obtaining the average brightness value of each edge area as the brightness value of each edge light domain.
  4. The method according to claim 1, characterized in that comparing the brightness value of the central light domain and the brightness value of the edge light domain of the bright image with the brightness values of the corresponding areas of the dark image to obtain the brightness difference of the central light domain and the brightness difference of the edge light domain specifically comprises the step of:
    calculating the brightness difference of the central light domain and the brightness difference of the edge light domain according to a preset brightness difference formula, the preset brightness difference formula being:
    Img[m,n] = Img1[m,n] - Img2[m,n];
    where m is the width of the bright image and the dark image, n is the height of the bright image and the dark image, Img1[m,n] is the brightness value matrix of the bright image, Img2[m,n] is the brightness value matrix of the dark image, and Img[m,n] is the brightness difference matrix.
  5. The method according to claim 2, characterized in that obtaining the ratio of the brightness difference of the edge light domain to the brightness difference of the central light domain as the calibration coefficient specifically comprises the step of:
    calculating the ratio of the brightness difference of the edge light domain to the brightness difference of the central light domain according to a pixel calibration coefficient formula, the pixel calibration coefficient formula being:
    C[m,n] = Img[m,n] / Center point;
    where m is the width of the bright image and the dark image, n is the height of the bright image and the dark image, C[m,n] is the calibration coefficient corresponding to the area where each pixel of the bright image is located, Img[m,n] is the brightness difference matrix, and Center point is the brightness value of the area where the central pixel is located.
  6. The method according to claim 3, characterized in that obtaining the ratio of the brightness difference of the edge light domain to the brightness difference of the central light domain as the calibration coefficient specifically comprises the step of:
    calculating the ratio of the brightness difference of the edge light domain to the brightness difference of the central light domain according to a block calibration coefficient formula, the block calibration coefficient formula being:
    C[x,y] = Aver[x,y] / Center Block average;
    where x and y are the width and height of the preset size, Aver[x,y] is the average brightness value of each preset-size block in the bright image, Center Block average is the average brightness value of the central block, and C[x,y] is the calibration coefficient of each preset-size block in the bright image.
  7. A device for calibrating the lens accuracy of a 3D camera module, characterized by comprising:
    an image acquisition module, for controlling the 3D camera module to collect a target sample to obtain a sample image, the sample image comprising a bright image and a dark image;
    a light domain extraction module, for extracting the central light domain and the edge light domain in the bright image, and obtaining the brightness value of the central light domain and the brightness value of the edge light domain;
    a brightness difference acquisition module, for comparing the brightness value of the central light domain and the brightness value of the edge light domain of the bright image with the brightness values of the corresponding areas of the dark image, to obtain the brightness difference of the central light domain and the brightness difference of the edge light domain;
    a coefficient acquisition module, for obtaining the ratio of the brightness difference of the edge light domain to the brightness difference of the central light domain as a calibration coefficient;
    a calibration module, for performing subsequent calibration processing on the 3D camera module according to the calibration coefficient, the subsequent calibration processing comprising at least one of lens compensation coefficient calibration, temperature compensation coefficient calibration, intrinsic and extrinsic parameter calibration, and distortion compensation parameter calibration.
  8. The device according to claim 7, characterized in that the light domain extraction module comprises:
    a central pixel collection unit, for collecting the central pixel in the bright image, taking the area where the central pixel is located as the central light domain, and obtaining the brightness value of the area where the central pixel is located as the brightness value of the central light domain;
    an edge pixel collection unit, for collecting the areas where the pixels other than the central pixel in the bright image are located as the edge light domain, and obtaining the brightness value of the area where each pixel other than the central pixel is located as the brightness value of the edge light domain.
  9. The device according to claim 7, characterized in that the light domain extraction module comprises:
    a block division unit, for evenly dividing the bright image into blocks of a preset size;
    a central block extraction unit, for extracting the block located at the center of the bright image to obtain a central area, taking the central area as the central light domain, and obtaining the average brightness value of the central area as the brightness value of the central light domain;
    an edge block extraction unit, for extracting the blocks of the bright image located outside the central area to obtain edge areas, taking each edge area as an edge light domain, and obtaining the average brightness value of each edge area as the brightness value of each edge light domain.
  10. An equipment for calibrating the lens accuracy of a 3D camera module, characterized by comprising a target template, a connector, and a processor, wherein the target template is used for the 3D camera module to collect sample images, the processor is connected to the 3D camera module through the connector, and the processor is used to control the 3D camera module to collect a target sample to obtain a sample image, the sample image comprising a bright image and a dark image, and to extract the central light domain and the edge light domain in the bright image; the processor obtains the brightness value of the central light domain and the brightness value of the edge light domain, compares the brightness values of the central light domain and the edge light domain of the bright image with the brightness values of the corresponding areas of the dark image to obtain the brightness difference of the central light domain and the brightness difference of the edge light domain, and obtains the ratio of the brightness difference of the edge light domain to the brightness difference of the central light domain as a calibration coefficient; and the processor performs subsequent calibration processing on the 3D camera module according to the calibration coefficient, the subsequent calibration processing comprising at least one of lens compensation coefficient calibration, temperature compensation coefficient calibration, intrinsic and extrinsic parameter calibration, and distortion compensation parameter calibration.
  11. The equipment according to claim 10, characterized in that the processor is further used to collect the central pixel in the bright image, take the area where the central pixel is located as the central light domain, and obtain the brightness value of the area where the central pixel is located as the brightness value of the central light domain; and to collect the areas where the pixels other than the central pixel in the bright image are located as the edge light domain, and obtain the brightness value of the area where each pixel other than the central pixel is located as the brightness value of the edge light domain.
  12. The equipment according to claim 10, characterized in that the processor is further used to divide the bright image into different blocks according to a preset size; extract the block located at the center of the bright image to obtain a central area, take the central area as the central light domain, and obtain the average brightness value of the central area as the brightness value of the central light domain; and extract the blocks of the bright image located outside the central area to obtain edge areas, take each edge area as an edge light domain, and obtain the average brightness value of each edge area as the brightness value of each edge light domain.
  13. The equipment according to claim 10, characterized in that the processor is further used to calculate the brightness difference of the central light domain and the brightness difference of the edge light domain according to a preset brightness difference formula, the preset brightness difference formula being:
    Img[m,n] = Img1[m,n] - Img2[m,n];
    where m is the width of the bright image and the dark image, n is the height of the bright image and the dark image, Img1[m,n] is the brightness value matrix of the bright image, Img2[m,n] is the brightness value matrix of the dark image, and Img[m,n] is the brightness difference matrix.
  14. The equipment according to claim 10, characterized in that the processor is further used to calculate the ratio of the brightness difference of the edge light domain to the brightness difference of the central light domain according to a pixel calibration coefficient formula, the pixel calibration coefficient formula being:
    C[m,n] = Img[m,n] / Center point;
    where m is the width of the bright image and the dark image, n is the height of the bright image and the dark image, C[m,n] is the calibration coefficient corresponding to the area where each pixel of the bright image is located, Img[m,n] is the brightness difference matrix, and Center point is the brightness value of the area where the central pixel is located.
  15. The equipment according to claim 10, characterized in that the equipment further comprises a light-emitting device and a cover plate, the light-emitting device being used to emit light to provide a light source when the 3D camera module collects the bright image, and the cover plate being used to cover the lens of the 3D camera module when the 3D camera module collects the dark image.
  16. The equipment according to claim 15, characterized in that the light-emitting device is an infrared emitter, the infrared emitter being used to provide an infrared light source when the 3D camera module collects the bright image.
  17. The equipment according to claim 15, characterized in that the equipment further comprises a driving component connected to the processor, the driving component being used to receive a driving signal from the processor and drive the cover plate according to the driving signal, so that the cover plate covers the lens of the 3D camera module.
  18. The equipment according to claim 10, characterized in that the equipment further comprises a dark box for placing the 3D camera module, the target sample being set in the dark box.
  19. The equipment according to claim 10, characterized in that the equipment further comprises an interactive device connected to the processor.
  20. The equipment according to claim 19, characterized in that the interactive device comprises a display screen and a keyboard connected to the processor.
PCT/CN2019/120172 2019-07-17 2019-11-22 Method, device and equipment for calibrating the lens accuracy of a 3D camera module WO2021008052A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910645385.XA CN112243120A (zh) 2019-07-17 2019-07-17 Method, device and equipment for calibrating the lens accuracy of a 3D camera module
CN201910645385.X 2019-07-17

Publications (1)

Publication Number Publication Date
WO2021008052A1 true WO2021008052A1 (zh) 2021-01-21

Family

ID=74167370

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/120172 WO2021008052A1 (zh) 2019-07-17 2019-11-22 Method, device and equipment for calibrating the lens accuracy of a 3D camera module

Country Status (2)

Country Link
CN (1) CN112243120A (zh)
WO (1) WO2021008052A1 (zh)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113643377A (zh) * 2021-07-12 2021-11-12 杭州易现先进科技有限公司 Method and system for single-lens consistency error analysis based on multiple calibrations
CN114615438A (zh) * 2022-03-07 2022-06-10 江西合力泰科技有限公司 Black spot compensation method for a camera chip surface
CN117516726A (zh) * 2023-12-29 2024-02-06 深圳市英博伟业科技有限公司 Infrared-technology-based temperature measurement method and terminal device

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116413730B (zh) * 2021-12-29 2024-05-31 深圳市速腾聚创科技有限公司 Ranging method and device, storage medium, and lidar
CN114900617B (zh) * 2022-03-28 2023-08-08 北京京东乾石科技有限公司 Fill-light method, device, equipment, storage medium, and fill-light brightness adjustment device
CN116667155A (zh) * 2023-07-24 2023-08-29 深圳市速腾聚创科技有限公司 Transmitting module, laser transmitting module, and lidar device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106970024A (zh) * 2017-03-16 2017-07-21 中南大学 Clearance detection and ranging method and system based on a camera and a controllable strobe light source
CN108234824A (zh) * 2018-03-26 2018-06-29 上海小蚁科技有限公司 Shading correction detection parameter determination, shading correction detection method and device, storage medium, and fisheye camera
CN108270952A (zh) * 2017-11-21 2018-07-10 深圳市芯舞时代科技有限公司 Method and system for correcting chromatic aberration of binocular camera images
CN108604042A (zh) * 2016-01-20 2018-09-28 亮锐控股有限公司 Driver for an adaptive light source

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108604042A (zh) * 2016-01-20 2018-09-28 亮锐控股有限公司 Driver for an adaptive light source
CN106970024A (zh) * 2017-03-16 2017-07-21 中南大学 Clearance detection and ranging method and system based on a camera and a controllable strobe light source
CN108270952A (zh) * 2017-11-21 2018-07-10 深圳市芯舞时代科技有限公司 Method and system for correcting chromatic aberration of binocular camera images
CN108234824A (zh) * 2018-03-26 2018-06-29 上海小蚁科技有限公司 Shading correction detection parameter determination, shading correction detection method and device, storage medium, and fisheye camera

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113643377A (zh) * 2021-07-12 2021-11-12 杭州易现先进科技有限公司 Method and system for single-lens consistency error analysis based on multiple calibrations
CN113643377B (zh) * 2021-07-12 2024-03-19 杭州易现先进科技有限公司 Method and system for single-lens consistency error analysis based on multiple calibrations
CN114615438A (zh) * 2022-03-07 2022-06-10 江西合力泰科技有限公司 Black spot compensation method for a camera chip surface
CN114615438B (zh) * 2022-03-07 2023-09-15 江西合力泰科技有限公司 Black spot compensation method for a camera chip surface
CN117516726A (zh) * 2023-12-29 2024-02-06 深圳市英博伟业科技有限公司 Infrared-technology-based temperature measurement method and terminal device
CN117516726B (zh) * 2023-12-29 2024-03-15 深圳市英博伟业科技有限公司 Infrared-technology-based temperature measurement method and terminal device

Also Published As

Publication number Publication date
CN112243120A (zh) 2021-01-19

Similar Documents

Publication Publication Date Title
WO2021008052A1 (zh) Method, device and equipment for calibrating the lens accuracy of a 3D camera module
US10997696B2 (en) Image processing method, apparatus and device
US10205896B2 (en) Automatic lens flare detection and correction for light-field images
US9420276B2 (en) Calibration of light-field camera geometry via robust fitting
US10013764B2 (en) Local adaptive histogram equalization
US7590305B2 (en) Digital camera with built-in lens calibration table
CN109685853B (zh) Image processing method and apparatus, electronic device, and computer-readable storage medium
CN108012078A (zh) Image brightness processing method and device, storage medium, and electronic device
CN108600740A (zh) Optical element detection method and device, electronic device, and storage medium
CN108716982A (zh) Optical element detection method and device, electronic device, and storage medium
WO2022036539A1 (zh) Multi-camera color consistency correction method and device
CN107911683A (zh) Image white balance processing method and device, storage medium, and electronic device
JP5740147B2 (ja) Light source estimation device and light source estimation method
US20230042544A1 (en) Electronic device for detecting defect in image on basis of difference among sub-images acquired by multiple photodiode sensors, and operation method thereof
US8723938B2 (en) Immunoassay apparatus and method of determining brightness value of target area on optical image using the same
JP7321772B2 (ja) Image processing apparatus, image processing method, and program
US20100254624A1 (en) Method of correcting image distortion
JP2022130308A (ja) Method for determining tooth color
CN114494080A (zh) Image generation method and device, electronic device, and storage medium
WO2016145872A1 (zh) Ranging and focusing processing method and device
CN112461762A (zh) HSV-model-based solution turbidity detection method, medium, and image processing system
WO2024134935A1 (ja) Three-dimensional information correction device and three-dimensional information correction method
JP6561479B2 (ja) Imaging device capable of color shading correction
JP2005216191A (ja) Stereo image processing device and stereo image processing method
WO2023195403A1 (ja) Image processing device and image processing method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19937833

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19937833

Country of ref document: EP

Kind code of ref document: A1