CN111372005B - Automatic exposure compensation method and system for TOF camera module - Google Patents

Info

Publication number
CN111372005B
CN111372005B (Application No. CN201811587701.4A)
Authority
CN
China
Prior art keywords
data
exposure time
depth
compensation
recommended
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811587701.4A
Other languages
Chinese (zh)
Other versions
CN111372005A (en
Inventor
周劲蕾
陈立刚
苏子凡
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Sunny Optical Intelligent Technology Co Ltd
Original Assignee
Zhejiang Sunny Optical Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Sunny Optical Intelligent Technology Co Ltd filed Critical Zhejiang Sunny Optical Intelligent Technology Co Ltd
Priority to CN201811587701.4A priority Critical patent/CN111372005B/en
Publication of CN111372005A publication Critical patent/CN111372005A/en
Application granted granted Critical
Publication of CN111372005B publication Critical patent/CN111372005B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/73Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/71Circuitry for evaluating the brightness variation

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The automatic exposure compensation method for the TOF camera module comprises: acquiring first depth data and first gray scale data of a target object with the TOF camera module within a first preset exposure time; collecting specific data of the first gray scale data to generate first measured brightness data; generating a first automatic exposure gain based on the first measured brightness data and first reference brightness data; generating a first recommended exposure time based on the first automatic exposure gain and the first preset exposure time; acquiring second depth data and second gray scale data of the target object with the TOF camera module according to the first recommended exposure time; acquiring a first depth compensation value corresponding to the first recommended exposure time based on the first recommended exposure time; and compensating the second depth data based on the first depth compensation value to generate second compensated depth data corresponding to the second depth data.

Description

Automatic exposure compensation method and system for TOF camera module
Technical Field
The invention relates to the field of camera modules, in particular to an automatic exposure compensation method and system of a TOF camera module.
Background
With the development of science and technology, TOF (Time of Flight) camera modules have developed rapidly and, owing to their unique performance, are widely applied in various fields. A TOF camera module emits light pulses toward a target object and uses a sensor to receive the light reflected back from it; the distance to the target object is obtained from the round-trip time of flight of these pulses. In other words, the TOF camera module can acquire not only a two-dimensional gray scale image of the target object but also a depth image of the target object.
In addition, the TOF camera module has the advantages of small volume, small error, capability of directly outputting depth data, strong anti-interference performance and the like. In recent years, with the development and maturity of TOF camera module technology, TOF camera modules are widely applied to the fields of smart phones, AR (Augmented Reality), VR (Virtual Reality), computer three-dimensional modeling, industrial automation and the like.
On the other hand, automatic exposure is a common function of conventional camera modules. A camera module with an automatic exposure function can adjust its exposure time according to variations in the intensity of ambient light, thereby changing the time during which light acts on the photosensitive chip, so that the module can adapt to different lighting environments and obtain higher image quality in each of them.
However, it should be pointed out that most TOF camera modules lack an automatic exposure function. According to the imaging principle of TOF and its error sources, changing the exposure time of a TOF camera module causes the depth values obtained before and after the change to be inconsistent, which affects the accuracy of the depth data of the target object acquired by the TOF camera module and leads to a large difference between the acquired depth data and the actual distance.
That is to say, on the one hand, a conventional TOF camera module has no automatic exposure function and cannot adjust its exposure automatically for different usage environments; when the light intensity in the usage environment changes greatly, its imaging quality and practical usability are strongly affected.
On the other hand, once a TOF camera module is given an automatic exposure function, a change of exposure time causes the module to obtain different depth data for the same target object before and after the change, which affects the accuracy of the depth data acquired by the TOF camera module.
In summary, as the range of applications of the TOF camera module keeps growing, equipping it with an automatic exposure function is inevitable; how to eliminate the effect of exposure time changes on the acquired depth data has therefore become the problem that determines whether the TOF camera module can have an automatic exposure function.
Disclosure of Invention
The invention aims to provide an automatic exposure compensation method and system for a TOF camera module, wherein the exposure compensation method can adaptively adjust the exposure time during shooting and improve the imaging quality of the camera module.
Another objective of the present invention is to provide an automatic exposure compensation method and system for a TOF camera module, wherein the exposure compensation method can compensate the acquired depth data of the target object during shooting, so as to improve the accuracy of the acquired depth data.
Another objective of the present invention is to provide an automatic exposure compensation method and system for a TOF camera module, wherein the exposure compensation method can reduce the depth error caused by exposure, which is favorable for improving the ranging accuracy of the TOF camera module.
Another objective of the present invention is to provide an automatic exposure compensation method and system for a TOF camera module, wherein the exposure compensation method can calculate the exposure time required for capturing the next frame of depth data during shooting, and acquire the next frame of depth data with that exposure time.
Another objective of the present invention is to provide an automatic exposure compensation method and system for a TOF camera module, wherein each exposure time in the exposure compensation method corresponds to a depth compensation value used to compensate the acquired depth data.
Another objective of the present invention is to provide an automatic exposure compensation method and system for a TOF camera module, wherein the compensation method is simple to operate and accurate in its measurement results.
Correspondingly, in order to achieve at least one of the above objects, the present invention provides an exposure compensation method for a TOF camera module, including:
acquiring first depth data and first gray scale data of a target object with a TOF camera module within a first preset exposure time;
collecting specific data of the first gray scale data to generate first measured brightness data;
generating a first automatic exposure gain based on the first measured brightness data and first reference brightness data;
generating a first recommended exposure time based on the first automatic exposure gain and the first preset exposure time;
acquiring second depth data and second gray scale data of the target object with the TOF camera module according to the first recommended exposure time;
acquiring a first depth compensation value corresponding to the first recommended exposure time based on the first recommended exposure time; and
compensating the second depth data based on the first depth compensation value to generate second compensated depth data corresponding to the second depth data.
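The claimed loop can be sketched in code. The sketch below is illustrative only: function and variable names are assumptions, and the gain convention (reference brightness divided by measured brightness, the usual auto-exposure convention) is likewise an assumption, since the claims do not fix the formula.

```python
def auto_exposure_step(measured_brightness, reference_brightness,
                       preset_exposure_us, compensation_lut, depth_frame_mm):
    # Assumed gain convention: reference / measured, so a too-dark frame
    # (measured < reference) yields a gain > 1 and a longer exposure.
    gain = reference_brightness / measured_brightness
    recommended_exposure_us = gain * preset_exposure_us
    # The next frame is captured with the recommended exposure time; its
    # depth values are then compensated by the value that the calibrated
    # correspondence (here a hypothetical lookup callable) assigns to
    # that exposure time.
    compensation_mm = compensation_lut(recommended_exposure_us)
    compensated_depth_mm = [d + compensation_mm for d in depth_frame_mm]
    return recommended_exposure_us, compensated_depth_mm
```

For example, with a measured brightness of 64 against a reference of 128 and a 1000 µs preset, the recommended exposure under this convention would be 2000 µs.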
According to an embodiment of the present invention, the automatic exposure compensation method further includes:
collecting specific data of the second gray scale data to generate second measured brightness data;
generating a second automatic exposure gain based on the second measured brightness data and a second reference brightness data;
generating a second recommended exposure time based on the second automatic exposure gain and the first recommended exposure time;
acquiring third depth data and third gray scale data of the target object with the TOF camera module according to the second recommended exposure time;
acquiring a second depth compensation value corresponding to the second recommended exposure time based on the second recommended exposure time; and
compensating the third depth data based on the second depth compensation value to generate third compensated depth data corresponding to the third depth data.
According to an embodiment of the present invention, the automatic exposure compensation method further includes:
judging whether the first recommended exposure time is consistent with a second preset exposure time, and collecting the second depth data and the second gray scale data of the target object with the second preset exposure time when the two are consistent;
and when the first recommended exposure time is inconsistent with the second preset exposure time, collecting the second depth data and the second gray scale data of the target object with the first recommended exposure time.
According to an embodiment of the present invention, the automatic exposure compensation method further includes:
judging whether the second recommended exposure time is consistent with a third preset exposure time, and collecting the third depth data and the third gray scale data of the target object with the third preset exposure time when the two are consistent;
and when the second recommended exposure time is inconsistent with the third preset exposure time, collecting the third depth data and the third gray scale data of the target object with the second recommended exposure time.
According to an embodiment of the present invention, the specific data of the first gray scale data collected in the step of collecting the specific data of the first gray scale data to generate a first measured luminance data is the gray scale data corresponding to the central area of the gray scale image corresponding to the first gray scale data.
According to an embodiment of the present invention, in the step of obtaining a first depth compensation value corresponding to the first proposed exposure time based on the first proposed exposure time, the first depth compensation value corresponding to the first proposed exposure time is obtained through a depth compensation correspondence relationship, wherein the obtaining of the depth compensation correspondence relationship includes:
determining a first optimal exposure time and a second optimal exposure time corresponding respectively to a first distance and a second distance between a depth camera and a target to be measured;
acquiring depth data of the target to be measured based on the first distance and the first optimal exposure time to generate first measurement depth data;
measuring the actual distance of the first distance to generate first actual distance data;
generating a first depth error based on the first measured depth data and the first actual distance data;
establishing a corresponding relation between the first optimal exposure time and the first depth error;
acquiring depth data of the target to be measured based on the second distance and the second optimal exposure time to generate second measurement depth data;
measuring the actual distance of the second distance to generate second actual distance data;
generating a second depth error based on the second measured depth data and the second actual distance data;
establishing a corresponding relation between the second optimal exposure time and the second depth error; and
performing linear interpolation on values between the first optimal exposure time and the second optimal exposure time to obtain the complete depth compensation corresponding relation.
According to another aspect of the present invention, the present invention further provides an automatic exposure compensation system for a TOF camera module, comprising:
a data acquisition unit capable of acquiring a first depth data and a first gray data acquired at a first preset exposure time, and a second depth data and a second gray data acquired at a first recommended exposure time;
the brightness measuring unit can acquire specific data of the first gray scale data to generate first measured brightness data and acquire specific data of the second gray scale data to generate second measured brightness data;
an exposure time suggesting unit capable of generating the first suggested exposure time based on the first preset exposure time, the first measured brightness data and a first reference brightness data; and
a depth compensation unit capable of obtaining a first depth compensation value corresponding to the first proposed exposure time and compensating the second depth data based on the first depth compensation value to generate a second compensated depth data.
According to an embodiment of the present invention, the automatic exposure compensation system of the TOF camera module further includes a determining unit, wherein the determining unit is capable of determining whether the first recommended exposure time is consistent with a second preset exposure time, and when the first recommended exposure time is consistent with the second preset exposure time, acquiring the second depth data and the second gray scale data with the second preset exposure time; and when the first recommended exposure time is inconsistent with the second preset exposure time, acquiring the second depth data and the second gray scale data with the first recommended exposure time.
According to an embodiment of the present invention, the depth compensation unit further includes a compensation index module and a depth compensation module communicatively connected to the compensation index module, the compensation index module is capable of obtaining the first recommended exposure time and obtaining a corresponding first depth compensation value based on the first recommended exposure time, and the depth compensation module is capable of obtaining the first depth compensation value and compensating the second depth data for the first depth compensation value to generate the second compensated depth data.
Drawings
Fig. 1 is a block diagram illustrating an automatic exposure compensation method for a TOF camera module according to a preferred embodiment of the present invention.
Fig. 2 is a block diagram illustrating an automatic exposure compensation method for a TOF camera module according to a preferred embodiment of the present invention.
Fig. 3 is a block diagram illustrating an automatic exposure compensation method of a TOF camera module according to a preferred embodiment of the present invention.
Fig. 4 is a flowchart illustrating an automatic exposure compensation method for a TOF camera module according to a preferred embodiment of the present invention.
Fig. 5 is a schematic diagram of the calculation process of the exposure time of the automatic exposure compensation method for the TOF camera module according to a preferred embodiment of the invention.
Fig. 6 is a schematic diagram of an exposure compensation corresponding relationship establishing process of an automatic exposure compensation method for a TOF camera module according to a preferred embodiment of the invention.
Fig. 7 is a schematic view of a depth compensation process of an automatic exposure compensation method of a TOF camera module according to a preferred embodiment of the invention.
Fig. 8 is a block diagram of a depth data compensation system of a TOF camera module according to a preferred embodiment of the invention.
Detailed Description
The following description is presented to disclose the invention so as to enable any person skilled in the art to practice the invention. The preferred embodiments in the following description are given by way of example only, and other obvious variations will occur to those skilled in the art. The basic principles of the invention, as defined in the following description, may be applied to other embodiments, variations, modifications, equivalents, and other technical solutions without departing from the spirit and scope of the invention.
It will be understood by those skilled in the art that in the present disclosure, the terms "longitudinal," "lateral," "upper," "lower," "front," "rear," "left," "right," "vertical," "horizontal," "top," "bottom," "inner," "outer," and the like are used in an orientation or positional relationship indicated in the drawings for ease of description and simplicity of description, and do not indicate or imply that the referenced devices or components must be in a particular orientation, constructed and operated in a particular orientation, and thus the above terms are not to be construed as limiting the present invention.
It is understood that the terms "a" and "an" are to be interpreted as "at least one" or "one or more"; that is, in one embodiment the number of an element may be one, while in another embodiment the number of that element may be plural, and the terms "a" and "an" are not to be interpreted as limiting the number.
Referring to fig. 1 and 3, the automatic exposure compensation method of the TOF camera module according to the present invention is illustrated. The automatic exposure compensation method of the TOF camera module can enable the TOF camera module to have an automatic exposure function, and can automatically compensate depth data and calculate exposure time of the next frame in the automatic exposure process of the camera module, so that the depth data acquired by the TOF camera module can be more accurate. The automatic exposure compensation method of the TOF camera module comprises the following steps:
101: acquiring a first depth data 12 and a first gray scale data 13 of a target object 200 by a TOF camera module 10 with a first preset exposure time 11;
102: collecting specific data of the first gray data 13 to generate a first measured brightness data 14;
103: generating a first automatic exposure gain 16 based on the first measured brightness data 14 and a first reference brightness data 15;
104: generating a first recommended exposure time 17 based on the first automatic exposure gain 16 and the first preset exposure time 11;
105: acquiring a second depth data 21 and a second gray data 22 of the target object 200 by the TOF camera module 10 according to the first suggested exposure time 17;
106: acquiring a first depth compensation value 18 corresponding to the first recommended exposure time 17 based on the first recommended exposure time 17; and
107: based on the first depth compensation value 18 and the second depth data 21, a second compensated depth data 19 corresponding to the second depth data 21 is generated.
According to an embodiment of the present invention, the method for compensating depth data of the TOF camera module further includes:
201: collecting specific data of the second gray data 22 to generate a second measured brightness data 23;
202: generating a second automatic exposure gain 25 based on the second measured brightness data 23 and a second reference brightness data 24;
203: generating a second recommended exposure time 26 based on the second automatic exposure gain 25 and the first recommended exposure time 17;
204: acquiring a third depth data 27 and a third gray scale data 28 of the target object 200 by the TOF camera module 10 with the second suggested exposure time 26;
205: acquiring a second depth compensation value 29 corresponding to the second recommended exposure time 26 based on the second recommended exposure time 26; and
206: the third depth data 27 is compensated based on the second depth compensation value 29 to generate a third compensated depth data 270 corresponding to the third depth data 27.
According to an embodiment of the present invention, in the step 101, the first preset exposure time 11 is a preferred exposure time preset by the TOF camera module 10 under a certain distance condition, and the TOF camera module 10 can acquire the first depth data 12 and the first gray scale data 13 of the target object 200 with the first preset exposure time 11. It is understood that the TOF camera module 10 further has a series of preset exposure times similar to the first preset exposure time 11, so that the TOF camera module 10 acquires corresponding depth data and gray scale data with the series of preset exposure times.
Specifically, according to a preferred embodiment of the present invention, when the TOF camera module 10 needs to acquire the first depth data 12 and the first gray scale data 13, the RISC control kernel triggers the function entry and begins configuring the registers of the TOF (Time-of-Flight) image sensor; data acquisition is carried out after the TOF image sensor is configured, and a depth image (Depth) and a gray scale image (IR) can be obtained by parsing the acquired data; the acquired data is then distributed over multiple cores so that the algorithm can be processed in parallel, which improves the efficiency of the algorithm and ensures its real-time performance.
According to an embodiment of the present invention, in the step 102, specific data of the first gray scale data 13 is collected to generate a first measured brightness data 14. Preferably, the collected specific data is the gray scale data within a certain range around the center of the gray scale image corresponding to the first gray scale data 13, taken as the average brightness of the first gray scale data 13, so as to generate the first measured brightness data 14. That is, the first measured brightness data 14 reflects the average brightness information of the first gray scale data 13. It should be understood by those skilled in the art that the specific data collected for generating the first measured brightness data 14 can also be obtained by collecting certain feature points in the first gray scale data 13; the specific way in which the first measured brightness data 14 is generated from the first gray scale data 13 should not be construed as a limitation of the present invention as long as the object of the present invention can be achieved.
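As a concrete illustration of step 102, the average brightness of a central window can be computed as follows. The 10×10 window size is borrowed from the calibration procedure described later in this document; everything else (function names, NumPy as the array library) is an assumption.

```python
import numpy as np

def central_brightness(gray_image, window=10):
    # Mean gray level over a window x window region at the image center;
    # this serves as the "specific data" that yields the measured
    # brightness data.
    h, w = gray_image.shape
    top, left = (h - window) // 2, (w - window) // 2
    patch = gray_image[top:top + window, left:left + window]
    return float(patch.mean())
```

A feature-point variant, as the text allows, would simply average a different selection of pixels.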
According to an embodiment of the present invention, in the step 103 a first automatic exposure gain 16 is generated based on the first measured brightness data 14 and the first reference brightness data 15, wherein the first automatic exposure gain 16 is used in the step 104 to calculate the first recommended exposure time 17. Preferably, the value of the first automatic exposure gain 16 is obtained by comparing the value of the first measured brightness data 14 with the value of the first reference brightness data 15, where the first reference brightness data 15 is the reference brightness data corresponding to the first preset exposure time 11 of the TOF camera module 10. It should be understood by those skilled in the art that in other preferred embodiments of the present invention the first automatic exposure gain 16 may also be generated from the first measured brightness data 14 and the first reference brightness data 15 by other algorithms or correspondence relations; the specific calculation of the first automatic exposure gain 16 should not be construed as a limitation of the present invention as long as the object of the present invention can be achieved.
According to an embodiment of the present invention, in the step 104 a first recommended exposure time 17 is generated based on the first automatic exposure gain 16 and the first preset exposure time 11, wherein the first recommended exposure time 17 is used by the TOF camera module 10 to acquire the depth data and gray scale data of the next frame. The value of the first recommended exposure time 17 is equal to the product of the value of the first automatic exposure gain 16 and the value of the first preset exposure time 11.
According to an embodiment of the present invention, in the step 105 the second depth data 21 and the second gray scale data 22 of the target object 200 are acquired by the TOF camera module 10 with the first recommended exposure time 17; that is, the duration of the first recommended exposure time 17 is used as the exposure time of the TOF camera module 10 for this acquisition.
According to an embodiment of the present invention, the automatic exposure compensation method of the TOF camera module further includes:
judging whether the first recommended exposure time 17 is consistent with a second preset exposure time 110, and acquiring the second depth data 21 and the second gray scale data 22 of the target object 200 with the second preset exposure time 110 when the two are consistent; and
acquiring the second depth data 21 and the second gray scale data 22 of the target object 200 with the first recommended exposure time 17 when the first recommended exposure time 17 does not coincide with the second preset exposure time 110. Determining whether the first recommended exposure time 17 and the second preset exposure time 110 coincide further comprises converting the format of the first recommended exposure time 17 so that it matches the format of the register of the TOF image sensor. When the first recommended exposure time 17 is not consistent with the second preset exposure time 110, the exposure time needs to be modified: the first recommended exposure time 17 is written into the register and taken as the exposure time, thereby realizing the automatic exposure function.
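The decision described above can be sketched as follows. The microsecond unit and the integer register granularity are assumptions, since the register format of a TOF image sensor is hardware specific.

```python
def select_exposure(recommended_us, preset_us):
    # Convert the recommended time to the (assumed integer) register
    # format before comparing it with the preset time.
    reg_value = round(recommended_us)
    if reg_value == preset_us:
        # Consistent: keep the preset time, no register write needed.
        return preset_us, False
    # Inconsistent: the recommended time must be configured into the
    # sensor register and used as the exposure time.
    return reg_value, True
```

The boolean flag models whether a register write (and thus an actual exposure change) is required for the next frame.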
According to an embodiment of the present invention, in the step 106, a first depth compensation value 18 corresponding to the first recommended exposure time 17 is obtained based on the first recommended exposure time 17, wherein the first depth compensation value 18 corresponds to the first recommended exposure time 17. Wherein after the first recommended exposure time 17 is determined, the first depth compensation value 18 corresponding to the first recommended exposure time 17 is obtained by referring to a depth compensation correspondence database according to the first recommended exposure time 17. The method for acquiring the depth compensation corresponding relation database comprises the following steps:
301: determining a first optimal exposure time 32 and a second optimal exposure time 33 corresponding to a first distance d1 and a second distance d2 respectively between a TOF camera module 10 and a target 31 to be measured;
302: acquiring depth data of the target 31 to be measured based on the first distance d1 and the first optimal exposure time 32 to generate first measured depth data 34;
303: measuring the actual distance of the first distance d1 to generate first actual distance data 35;
304: generating a first depth error 36 based on the first measured depth data 34 and the first actual distance data 35;
305: establishing a correspondence between the first optimal exposure time 32 and the first depth error 36;
306: acquiring depth data of the target 31 to be measured based on the second distance d2 and the second optimal exposure time 33 to generate second measured depth data 37;
307: measuring the actual distance of the second distance d2 to generate second actual distance data 38;
308: generating a second depth error 39 based on the second measured depth data 37 and the second actual distance data 38;
309: establishing a correspondence between the second optimal exposure time 33 and the second depth error 39; and
310: linear interpolation is performed on the value between the first optimal exposure time 32 and the second optimal exposure time 33 to obtain a depth error corresponding to the exposure time between the first optimal exposure time 32 and the second optimal exposure time 33, so as to obtain a complete depth compensation corresponding relationship.
According to an embodiment of the present invention, in the step 301, to determine a first optimal exposure time 32 and a second optimal exposure time 33 respectively corresponding to a first distance d1 and a second distance d2 between a TOF camera module 10 and a target board 31 to be tested, a reference exposure time P1 and a reference brightness value I1 corresponding to the reference exposure time P1 are first set. A first test gray scale image 311 of the target board 31 is then acquired when the distance between the TOF camera module 10 and the target board 31 is d1, and a second test gray scale image 312 of the target board 31 is acquired when the distance between them is d2.
A 10×10 central region of the first test gray scale image 311 is selected and its brightness value is measured, so as to generate a first test brightness value 313 based on the first test gray scale image 311. Likewise, a 10×10 central region of the second test gray scale image 312 is selected and its brightness value is measured, so as to generate a second test brightness value 314 based on the second test gray scale image 312.
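As a sketch, the central-region brightness measurement might look like the following in Python; the 10×10 window follows the text, while the function name and image contents are illustrative:

```python
def central_roi_brightness(gray, roi=10):
    """Mean brightness over the central roi x roi window of a 2-D
    grayscale image given as a list of rows, as used to derive the
    test brightness values 313 and 314."""
    h, w = len(gray), len(gray[0])
    top, left = (h - roi) // 2, (w - roi) // 2
    window = [row[left:left + roi] for row in gray[top:top + roi]]
    return sum(sum(row) for row in window) / (roi * roi)

# 20x20 image: outer pixels 0, the central 10x10 block set to 100
img = [[0] * 20 for _ in range(20)]
for r in range(5, 15):
    for c in range(5, 15):
        img[r][c] = 100
# central_roi_brightness(img) -> 100.0
```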
It is known that the luminance value of the gray scale image generated by the TOF camera module reflects the intensity of the received reflected light signal, and that the received light intensity I relates to the distance d between the TOF camera module and the target to be measured, the exposure time P and the reflectivity R as I = C·(R·P)/d², where C is a constant.
In the preferred embodiment, the target 31 is a flat white board having a reflectivity of 0.75.
The first optimal exposure time 32 corresponding to the first distance d1 is determined based on the first test brightness value 313, the reference brightness value I1 and the reference exposure time P1. The second optimal exposure time 33 corresponding to the second distance d2 is determined based on the second test brightness value 314, the reference brightness value I1 and the reference exposure time P1.
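One way to read this determination, assuming brightness scales linearly with exposure time at a fixed distance and reflectivity per the relation I = C·(R·P)/d², is the following sketch (the function name and values are illustrative):

```python
def optimal_exposure(p_ref, i_ref, i_test):
    """Exposure time expected to produce the reference brightness i_ref,
    given that a test image captured at exposure p_ref measured the
    brightness i_test. Assumes brightness is proportional to exposure
    time at a fixed distance, per I = C*(R*P)/d^2."""
    return p_ref * i_ref / i_test

# A test image at 200 us measured brightness 50 against a reference of 100,
# so the exposure would need to double to reach the reference brightness:
# optimal_exposure(200.0, 100.0, 50.0) -> 400.0
```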
According to an embodiment of the present invention, in the step 302, the depth data of the target board 31 to be measured is collected based on the first distance d1 and the first optimal exposure time 32 to generate the first measured depth data 34, where the first measured depth data 34 is the TOF camera module's measurement of the first distance d1.
In step 303, the actual distance of the first distance d1 is measured to generate the first actual distance data 35, wherein the actual distance of the first distance d1 is measured by a third-party measuring tool.
In step 304, the first depth error 36 is generated based on the first measured depth data 34 and the first actual distance data 35, wherein the value of the first depth error 36 is equal to the value of the first actual distance data 35 minus the value of the first measured depth data 34.
In step 305, the correspondence between the first optimal exposure time 32 and the first depth error 36 is established; that is, the numerical value of the first optimal exposure time 32 is associated with the numerical value of the first depth error 36, so that the corresponding depth error compensation data can later be retrieved by exposure time.
Likewise, the second measured depth data 37 at the distance d2 is generated in the same manner in the step 306; in step 307, the actual distance of the second distance d2 is measured to generate the second actual distance data 38; in step 308, the value of the second measured depth data 37 is subtracted from the value of the second actual distance data 38 to generate the second depth error 39; and the correspondence between the second optimal exposure time 33 and the second depth error 39 is then established in step 309.
In the step 310, linear interpolation is performed over the exposure times between the first optimal exposure time 32 and the second optimal exposure time 33 to obtain the depth error corresponding to any exposure time in that range. This establishes a continuous correspondence between exposure time and depth error, i.e. the complete depth compensation correspondence, from which the depth error compensation data for a given exposure time can be looked up.
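A minimal sketch of the interpolation in step 310, using illustrative calibration values:

```python
def depth_error_for(exposure, p1, err1, p2, err2):
    """Linearly interpolate the depth error for an exposure time lying
    between the two calibrated optimal exposure times (p1, err1) and
    (p2, err2), per step 310."""
    if not min(p1, p2) <= exposure <= max(p1, p2):
        raise ValueError("exposure time outside the calibrated range")
    t = (exposure - p1) / (p2 - p1)
    return err1 + t * (err2 - err1)

# With calibration points (100 us, 2 mm) and (500 us, 10 mm):
# depth_error_for(250.0, 100.0, 2.0, 500.0, 10.0) -> 5.0
```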
According to an embodiment of the present invention, in the step 107, a second compensated depth data 19 corresponding to the second depth data 21 is generated based on the first depth compensation value 18 and the second depth data 21. The numerical value of the second compensated depth data 19 is equal to the sum of the first depth compensation value 18 and the second depth data 21; that is, the second compensated depth data 19 is obtained by adding the first depth compensation value 18 to the second depth data 21, and the TOF camera module 10 outputs the second compensated depth data 19 as the measured depth data.
Likewise, in the steps 201 to 203, the second recommended exposure time 26 is generated in the same manner as the first recommended exposure time 17: first, the second measured luminance data 23 is generated based on the second gray scale data 22 in the step 201; the second automatic exposure gain 25 is generated in the step 202; and the second recommended exposure time 26 is then generated in the step 203.
In step 204, the third depth data 27 and the third gray scale data 28 are acquired with the generated second recommended exposure time 26 as the exposure time.
In step 205, the second depth compensation value 29 corresponding to the second recommended exposure time 26 is obtained by looking up the correspondence between exposure time and depth error, with the second recommended exposure time 26 as the lookup index.
In the step 206, the second depth compensation value 29 is added to the third depth data 27 to obtain the third compensated depth data 270 corresponding to the third depth data 27.
Referring to fig. 4, a schematic flow diagram of a depth compensation method of a TOF camera module according to the present invention is shown. First, a kernel trigger function entry is invoked to configure the register of the TOF image sensor through the RISC. After the register of the TOF image sensor is configured, a depth image (depth) and a gray scale image (IR) are acquired. It is then judged whether the acquired depth image and gray scale image are first-frame data: when they are, the exposure time suggestion algorithm is executed; when they are not, the depth compensation algorithm is executed. It is judged whether the depth compensation algorithm is finished: when it is finished, the exposure time suggestion algorithm is executed and the compensated depth data is output simultaneously; when it is not finished, execution waits for the depth compensation algorithm to complete. The suggested exposure time generated by the exposure time suggestion algorithm is then format-converted. Finally, it is judged whether the suggested exposure time is consistent with the corresponding preset exposure time in the register: when they are inconsistent, the suggested exposure time is configured into the register; when they are consistent, no configuration is performed.
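The fig. 4 control flow can be roughly sketched as below; `capture`, `suggest`, `compensate` and `configure` are illustrative stand-ins for the patent's processing blocks, not real APIs:

```python
def frame_loop(capture, suggest, compensate, configure, n_frames=3):
    """Sketch of the fig. 4 flow: the first frame only seeds the exposure
    suggestion; each later frame is depth-compensated and output, then a
    new suggestion is computed and, if changed, written to the register."""
    outputs = []
    for i in range(n_frames):
        depth, gray = capture()
        if i > 0:
            outputs.append(compensate(depth))  # depth compensation algorithm
        exposure = suggest(gray)               # exposure time suggestion algorithm
        configure(exposure)                    # writes only on change (checked inside)
    return outputs
```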
It will be appreciated by those skilled in the art that in other preferred embodiments of the invention, after the TOF image sensor acquires the depth image and gray scale image data, the exposure time suggestion algorithm can also be executed before the depth compensation algorithm, i.e. the suggestion algorithm runs first and the compensation algorithm second.
Referring to fig. 6, a flow chart of the process for establishing the exposure compensation parameters provided by the present invention is shown. First, the optimal exposure parameters at different distances are configured; the depths at the different distances are measured with the optimal exposure parameter corresponding to each distance as the exposure time; the corresponding actual distances are measured using a third-party tool; the depth error is calculated from each measured depth and the corresponding actual distance; and an exposure compensation LUT (lookup table) is established.
Referring to fig. 5, a flow chart of the exposure time calculation process provided by the present invention is shown. First, a gray scale image acquired by the TOF image sensor is obtained; a region of interest in the gray scale image is selected; a quantized light intensity value of the region of interest is measured; and a suggested exposure time is then calculated based on the quantized light intensity value, the reference brightness value and the input exposure time.
Referring to fig. 7, a flow chart of the depth compensation process provided by the present invention is shown. First, a depth image acquired by the TOF image sensor is obtained; the corresponding depth compensation value is fetched from the exposure compensation LUT according to the corresponding input exposure time; and the compensated depth image data is obtained.
Referring to fig. 8, the present invention further provides, according to another aspect of the present invention, an automatic exposure compensation system for a TOF camera module. The automatic exposure compensation system of the TOF camera module comprises a data acquisition unit 40, a brightness measuring unit 50, an exposure time suggesting unit 60 and a depth compensation unit 70. The data acquisition unit 40 is communicably connected to the TOF camera module 10 and can acquire depth image data and gray scale image data of the target object 200 through the TOF camera module 10. The brightness measuring unit 50 is communicably connected to the data acquisition unit 40, and can receive the gray scale image data from the data acquisition unit 40 and generate measured brightness data based on the gray scale image data. The exposure time suggesting unit 60 is communicably connected to the brightness measuring unit 50; it can receive the measured brightness data from the brightness measuring unit 50 and, based on the measured brightness data, calculate a suggested exposure time with which the TOF camera module 10 acquires the next frame of depth image data and gray scale image data. The depth compensation unit 70 is communicably connected to the data acquisition unit 40, and can acquire a depth compensation value corresponding to the suggested exposure time, so as to compensate the corresponding depth data and improve the accuracy of the depth data.
Specifically, the data obtaining unit 40 can obtain a first depth data 12 and a first gray scale data 13 acquired by the TOF camera module 10 within a first preset exposure time 11, and a second depth data 21 and a second gray scale data 22 acquired by the TOF camera module 10 within a first recommended exposure time 17.
The brightness measuring unit 50 can receive the first gray scale data 13 and the second gray scale data 22 from the data obtaining unit 40, and the brightness measuring unit 50 can collect feature data in the first gray scale data 13 and the second gray scale data 22 respectively, so as to generate a first measured brightness data 14 and a second measured brightness data 23 corresponding to the first gray scale data 13 and the second gray scale data 22 respectively.
The exposure time suggesting unit 60 can receive the first measured brightness data 14 and the second measured brightness data 23 from the brightness measuring unit 50, the exposure time suggesting unit 60 can generate the first suggested exposure time 17 based on the first measured brightness data 14 and the first reference brightness data 15, and the exposure time suggesting unit 60 can generate the second suggested exposure time 26 based on the second measured brightness data 23 and the second reference brightness data 24. The first suggested exposure time 17 is used for the TOF camera module 10 to acquire a second frame depth image data and a second frame gray scale image data, and the second suggested exposure time 26 is used for the TOF camera module 10 to acquire a third frame depth image data and a third frame gray scale image data.
The depth compensation unit 70 is capable of obtaining the second depth data 21 from the data obtaining unit 40, the depth compensation unit 70 is further capable of obtaining the first depth compensation value 18 corresponding to the first recommended exposure time 17, and the depth compensation unit 70 is capable of compensating the second depth data 21 based on the first depth compensation value 18 for generating a second compensated depth data 19. The second compensated depth data 19 is output as depth data of a second frame depth image acquired by the TOF camera module 10, and compared with the second depth data 21, the second compensated depth data 19 is compensated by the first depth compensation value 18, so that the depth data of the second compensated depth data 19 is closer to the real depth data and more accurate.
Further, preferably, in the present preferred embodiment, the brightness measuring unit 50 selects the gray scale data corresponding to the central area range of the gray scale image corresponding to the first gray scale data 13 as the feature data, and calculates and generates the first measured brightness data 14 corresponding to the first gray scale data 13. The first measured luminance data 14 reflects luminance information of a gray scale image corresponding to the first gray scale data 13. In other preferred embodiments of the present invention, the brightness measuring unit 50 can also calculate and generate the first measured brightness data 14 by selecting other data of the first gray scale data 13, and the invention should not be limited herein as long as the object of the invention can be achieved.
Accordingly, the brightness measuring unit 50 selects the gray scale data corresponding to the central area range of the gray scale image corresponding to the second gray scale data 22 as the feature data, and calculates and generates the second measured brightness data 23 corresponding to the second gray scale data 22.
The exposure time suggesting unit 60 further includes an exposure gain calculating module 61 and a suggested exposure calculating module 62. The exposure gain calculation module 61 is communicatively connected to the proposed exposure calculation module 62, and the exposure gain calculation module 61 is also communicatively connected to the brightness measurement unit 50.
The exposure gain calculation module 61 can receive the first measured brightness data 14 and the second measured brightness data 23 from the brightness measuring unit 50, and can further obtain a first reference brightness data 15 and a second reference brightness data 24 corresponding to the first measured brightness data 14 and the second measured brightness data 23, respectively.
The exposure gain calculation module 61 is capable of calculating and generating the first automatic exposure gain 16 based on the first measured brightness data 14 and the first reference brightness data 15. Specifically, the value of the first automatic exposure gain 16 is equal to the value of the first measured brightness data 14 divided by the value of the first reference brightness data 15.
The exposure gain calculation module 61 is capable of calculating and generating the second automatic exposure gain 25 based on the second measured brightness data 23 and the second reference brightness data 24. Specifically, the value of the second automatic exposure gain 25 is equal to the value of the second measured brightness data 23 divided by the value of the second reference brightness data 24.
The proposed exposure calculation module 62 can receive the first and second automatic exposure gains 16 and 25 from the exposure gain calculation module 61. The proposed exposure calculation module 62 is further capable of obtaining the first preset exposure time 11 corresponding to the first automatic exposure gain 16.
The proposed exposure calculation module 62 is capable of calculating and generating the first proposed exposure time 17 based on the first automatic exposure gain 16 and the first preset exposure time 11. Specifically, the value of the first recommended exposure time 17 is equal to the product of the value of the first automatic exposure gain 16 and the value of the first preset exposure time 11.
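The two formulas above reduce to a couple of lines; the function names and numbers are illustrative:

```python
def auto_exposure_gain(measured_brightness, reference_brightness):
    # Per the text: gain = measured brightness / reference brightness
    return measured_brightness / reference_brightness

def suggested_exposure(gain, previous_exposure):
    # Per the text: suggested exposure time = gain x previous exposure time;
    # the first suggestion uses the preset time 11, the second uses the
    # first suggested time 17.
    return gain * previous_exposure

# Measured 120 against reference 100 -> gain 1.2;
# applied to a 500 us preset -> suggested 600 us
```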
The recommended exposure calculation module 62 is further capable of calculating and generating the second recommended exposure time 26 based on the second automatic exposure gain 25 and the first recommended exposure time 17. Specifically, the value of the second recommended exposure time 26 is equal to the product of the value of the second automatic exposure gain 25 and the value of the first recommended exposure time 17.
The depth compensation unit 70 is operably connected to the recommended exposure calculation module 62, and the depth compensation unit 70 is capable of receiving the first recommended exposure time 17 and the second recommended exposure time 26 from the recommended exposure calculation module 62.
Further, the depth compensation unit 70 further includes a compensation index module 71 and a depth compensation module 72. The compensation index module 71 is communicatively connected to the depth compensation module 72, and the compensation index module 71 is further communicatively connected to the exposure time suggesting unit 60. Specifically, the compensation index module 71 is communicatively connected to the recommended exposure calculation module 62.
The compensation indexing module 71 is capable of receiving the first recommended exposure time 17 and the second recommended exposure time 26 from the recommended exposure calculation module 62. The compensation indexing module 71 can index a first depth compensation value 18 corresponding to the first recommended exposure time 17 from a depth compensation correspondence index table based on the first recommended exposure time 17; the first depth compensation value 18 is used to compensate the second depth data 21. It should be noted that the depth compensation correspondence index table records the correspondence between exposure time and depth error, with each exposure time within a certain range mapped to a depth error; when the value of the first recommended exposure time 17 falls within the exposure time range of the table, the compensation indexing module 71 can look up the corresponding first depth compensation value 18 from the table.
Accordingly, the compensation indexing module 71 can further index the second depth compensation value 29 corresponding to the second recommended exposure time 26 from the depth compensation correspondence index table based on the second recommended exposure time 26. The second depth compensation value 29 is used for performing depth compensation on the third frame depth data acquired by the TOF camera module 10.
The depth compensation module 72 is communicatively coupled to the data acquisition unit 40 and the compensation indexing module 71, respectively. The depth compensation module 72 can obtain the second depth data 21 from the data acquisition unit 40 and the first depth compensation value 18 from the compensation indexing module 71, and can compensate the second depth data 21 based on the first depth compensation value 18 to generate the second compensated depth data 19 corresponding to the second depth data 21.
Further, the exposure compensation system of the TOF camera module further includes a judging unit 80, and the judging unit 80 is operably connected to the recommended exposure calculation module 62 of the exposure time suggesting unit 60. The judging unit 80 can acquire the first recommended exposure time 17 and the second recommended exposure time 26 from the recommended exposure calculation module 62, and can compare whether the first recommended exposure time 17 and the second preset exposure time 110 are consistent. When the first recommended exposure time 17 is consistent with the second preset exposure time 110, the TOF camera module 10 acquires the second depth data 21 and the second gray scale data 22 with the second preset exposure time 110 as the exposure time; when they are inconsistent, the TOF camera module 10 acquires the second depth data 21 and the second gray scale data 22 with the first recommended exposure time 17 as the exposure time.
Specifically, before the judging unit 80 determines whether the first recommended exposure time 17 is consistent with the second preset exposure time 110, the format of the first recommended exposure time 17 needs to be converted to match the format of the register of the TOF sensor. The first recommended exposure time 17 is then compared with the second preset exposure time 110; if they are inconsistent, the preset exposure time needs to be modified, and the first recommended exposure time 17 is configured into the register of the TOF sensor, thereby realizing the automatic exposure function.
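A sketch of this check, assuming purely for illustration that the register stores exposure in counts of 10 µs (the actual register format is sensor-specific and not given in the text):

```python
def maybe_configure(register, suggested_us):
    """Convert the suggested exposure time to the register's unit
    (assumed here: counts of 10 us), then write it only when the value
    differs from what is already configured."""
    reg_value = round(suggested_us / 10)  # illustrative format conversion
    if register.get("exposure") != reg_value:
        register["exposure"] = reg_value
        return True   # register reconfigured
    return False      # consistent with preset; no write

reg = {"exposure": 40}
# maybe_configure(reg, 400.0) -> False (already 40 counts)
# maybe_configure(reg, 550.0) -> True  (writes 55 counts)
```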
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Moreover, various embodiments or examples and features of various embodiments or examples described in this specification can be combined and combined by one skilled in the art without being mutually inconsistent.
It will be appreciated by persons skilled in the art that the embodiments of the invention described above and shown in the drawings are given by way of example only and are not limiting of the invention. The objects of the invention have been fully and effectively accomplished. The functional and structural principles of the present invention have been shown and described in the examples, and any variations or modifications of the embodiments of the present invention may be made without departing from the principles.

Claims (9)

1. An automatic exposure compensation method of a TOF camera module is characterized by comprising the following steps:
acquiring first depth data and first gray data of a target object by a TOF camera module within a first preset exposure time;
collecting specific data of the first gray scale data to generate first measured brightness data;
generating a first automatic exposure gain based on the first measured brightness data and a first reference brightness data;
generating a first suggested exposure time based on the first automatic exposure gain and the first preset exposure time;
acquiring second depth data and second gray data of the target object by the TOF camera module according to the first suggested exposure time;
acquiring a first depth compensation value corresponding to the first recommended exposure time based on the first recommended exposure time; and
compensating the second depth data based on the first depth compensation value to generate second compensated depth data corresponding to the second depth data.
2. The automatic exposure compensation method of the TOF camera module of claim 1, further comprising:
collecting specific data of the second gray scale data to generate second measured brightness data;
generating a second automatic exposure gain based on the second measured brightness data and a second reference brightness data;
generating a second recommended exposure time based on the second automatic exposure gain and the first recommended exposure time;
acquiring third depth data and third gray data of the target object by the TOF camera module according to the second suggested exposure time;
acquiring a second depth compensation value corresponding to the second recommended exposure time based on the second recommended exposure time; and
compensating the third depth data based on the second depth compensation value to generate third compensated depth data corresponding to the third depth data.
3. The automatic exposure compensation method of the TOF camera module of claim 1, further comprising:
judging whether the first suggested exposure time is consistent with a second preset exposure time or not, and collecting the second depth data and the second gray scale data of the target object in the second preset exposure time when the first suggested exposure time is consistent with the second preset exposure time;
and when the first recommended exposure time is inconsistent with the second preset exposure time, acquiring the second depth data and the second gray scale data of the target object at the first recommended exposure time.
4. The automatic exposure compensation method of the TOF camera module of claim 2, further comprising:
judging whether the second recommended exposure time is consistent with a third preset exposure time or not, and collecting third depth data and third gray scale data of the target object in the third preset exposure time when the second recommended exposure time is consistent with the third preset exposure time;
and when the second recommended exposure time is inconsistent with the third preset exposure time, acquiring the third depth data and the third gray scale data of the target object at the second recommended exposure time.
5. The automatic exposure compensation method for the TOF camera module according to claim 1, wherein the specific data of the first gray scale data collected in the step of collecting the first measured luminance data is the gray scale data corresponding to the central region of the gray scale image corresponding to the first gray scale data.
6. The automatic exposure compensation method of the TOF camera module of claim 1, wherein in the step of obtaining a first depth compensation value corresponding to the first proposed exposure time based on the first proposed exposure time, the first depth compensation value corresponding to the first proposed exposure time is obtained by a depth compensation correspondence, wherein the obtaining of the depth compensation correspondence comprises:
determining a first optimal exposure time and a second optimal exposure time respectively corresponding to a first distance and a second distance between a depth camera and a target to be measured;
acquiring depth data of the target to be measured based on the first distance and the first optimal exposure time to generate first measurement depth data;
measuring the actual distance of the first distance to generate first actual distance data;
generating a first depth error based on the first measured depth data and the first actual distance data;
establishing a corresponding relation between the first optimal exposure time and the first depth error;
acquiring depth data of the target to be measured based on the second distance and the second optimal exposure time to generate second measurement depth data;
measuring the actual distance of the second distance to generate second actual distance data;
generating a second depth error based on the second measured depth data and the second actual distance data;
establishing a corresponding relation between the second optimal exposure time and the second depth error; and
performing linear interpolation on the values between the first optimal exposure time and the second optimal exposure time to obtain the complete depth compensation correspondence.
7. An automatic exposure compensation system for a TOF camera module, characterized by comprising:
a data acquisition unit capable of acquiring a first depth data and a first gray data acquired at a first preset exposure time, and a second depth data and a second gray data acquired at a first recommended exposure time;
a brightness measuring unit capable of acquiring specific data of the first gray scale data to generate first measured brightness data and acquiring specific data of the second gray scale data to generate second measured brightness data;
an exposure time suggesting unit capable of generating the first suggested exposure time based on the first preset exposure time, the first measured brightness data and a first reference brightness data; and
a depth compensation unit capable of obtaining a first depth compensation value corresponding to the first recommended exposure time and compensating the second depth data based on the first depth compensation value to generate a second compensated depth data.
8. The automatic exposure compensation system of the TOF camera module of claim 7, further comprising a determining unit capable of determining whether the first proposed exposure time is consistent with a second preset exposure time, and acquiring the second depth data and the second gray scale data with the second preset exposure time when the first proposed exposure time is consistent with the second preset exposure time; and when the first recommended exposure time is inconsistent with the second preset exposure time, acquiring the second depth data and the second gray scale data with the first recommended exposure time.
9. The system of claim 7, wherein the depth compensation unit further comprises a compensation index module and a depth compensation module communicatively coupled to the compensation index module, the compensation index module is capable of obtaining the first recommended exposure time and obtaining a corresponding first depth compensation value based on the first recommended exposure time, and the depth compensation module is capable of obtaining the first depth compensation value and compensating the second depth data with the first depth compensation value to generate the second compensated depth data.
CN201811587701.4A 2018-12-25 2018-12-25 Automatic exposure compensation method and system for TOF camera module Active CN111372005B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811587701.4A CN111372005B (en) 2018-12-25 2018-12-25 Automatic exposure compensation method and system for TOF camera module

Publications (2)

Publication Number Publication Date
CN111372005A CN111372005A (en) 2020-07-03
CN111372005B true CN111372005B (en) 2023-03-24

Family

ID=71211305

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811587701.4A Active CN111372005B (en) 2018-12-25 2018-12-25 Automatic exposure compensation method and system for TOF camera module

Country Status (1)

Country Link
CN (1) CN111372005B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111935423B (en) * 2020-08-02 2023-05-05 珠海一微半导体股份有限公司 Method for acquiring depth image data by robot and control system thereof
CN112363145A (en) * 2020-11-09 2021-02-12 浙江光珀智能科技有限公司 Vehicle-mounted laser radar temperature compensation system and method
CN114697560B (en) * 2020-12-31 2024-09-06 浙江舜宇智能光学技术有限公司 Active exposure method based on TOF imaging system and exposure time calculation method
CN114866703A (en) * 2021-02-03 2022-08-05 浙江舜宇智能光学技术有限公司 Active exposure method and device based on TOF imaging system and electronic equipment
CN113038028B (en) * 2021-03-24 2022-09-23 浙江光珀智能科技有限公司 Image generation method and system

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101262567A (en) * 2008-04-07 2008-09-10 北京中星微电子有限公司 Automatic exposure method and device
CN104796613A (en) * 2011-09-28 2015-07-22 原相科技股份有限公司 Imaging system
CN108848320A (en) * 2018-07-06 2018-11-20 京东方科技集团股份有限公司 Depth detection system and its exposure time adjusting method
CN108965732A (en) * 2018-08-22 2018-12-07 Oppo广东移动通信有限公司 Image processing method, device, computer readable storage medium and electronic equipment

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7379100B2 (en) * 2004-02-12 2008-05-27 Canesta, Inc. Method and system to increase dynamic range of time-of-flight (TOF) and/or imaging sensors
TWI285047B (en) * 2005-11-24 2007-08-01 Sunplus Technology Co Ltd Method of automatic exposure control and automatic exposure compensated apparatus
WO2014208018A1 (en) * 2013-06-26 2014-12-31 パナソニックIpマネジメント株式会社 Distance measuring system
US9638791B2 (en) * 2015-06-25 2017-05-02 Qualcomm Incorporated Methods and apparatus for performing exposure estimation using a time-of-flight sensor

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research on data processing methods for a time-of-flight 3D imaging camera; Pan Huadong et al.; Journal of Zhejiang University (Engineering Science); 2010-06-15 (No. 06); full text *

Also Published As

Publication number Publication date
CN111372005A (en) 2020-07-03

Similar Documents

Publication Publication Date Title
CN111372005B (en) Automatic exposure compensation method and system for TOF camera module
US11436748B2 (en) Volume measurement method and system, apparatus and computer-readable storage medium
CN104966308B (en) A kind of method for calculating laser beam spot size
CN110260820B (en) Underwater binocular stereo vision measurement system and method based on dynamic reference coordinate system
JP6876445B2 (en) Data compressors, control methods, programs and storage media
CN108871185B (en) Method, device and equipment for detecting parts and computer readable storage medium
CN112689097B (en) Automatic brightness control method and system for line laser and storage medium
US20220113171A1 (en) Sensor diagnosis device and computer readable medium
CN113251925A (en) Glue amount detection method and device based on area array laser
CN116935369A (en) Ship water gauge reading method and system based on computer vision
CN116167932A (en) Image quality optimization method, device, equipment and storage medium
TWI468658B (en) Lens test device and method
CN117434568A (en) Intelligent positioning system based on remote sensing satellite
CN114708243A (en) Cigarette end face tobacco missing quantitative detection method and device based on deep learning
CN113888420A (en) Underwater image restoration method and device based on correction model and storage medium
CN116433661B (en) Method, device, equipment and medium for detecting semiconductor wafer by multitasking
CN112270712A (en) Temperature drift calibration method and system based on depth camera module
CN114663519A (en) Multi-camera calibration method and device and related equipment
CN106646677B (en) Rainfall detection method and device
CN214410073U (en) Three-dimensional detection positioning system combining industrial camera and depth camera
CN112444302B (en) Image acquisition device, water level detection device, method and storage medium
US7821652B2 (en) System and method for focusing discrete points on an under-measured object
CN115496810A (en) External parameter evaluation method based on Lidar calibration camera
CN115218781A (en) Object surface 3D measuring method, device, equipment and computer readable storage medium
TWI494549B (en) A luminance inspecting method for backlight modules based on multiple kernel support vector regression and apparatus thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant