CN111586307A - Exposure method and image sensing device using same - Google Patents

Exposure method and image sensing device using same

Info

Publication number
CN111586307A
Authority
CN
China
Prior art keywords
value
light intensity
pixel unit
confidence value
intensity confidence
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910435784.3A
Other languages
Chinese (zh)
Other versions
CN111586307B (en)
Inventor
魏守德
陈韦志
吴峻豪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lite On Technology Corp
Original Assignee
Lite On Technology Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lite On Technology Corp
Priority to US16/793,850 (granted as US11223759B2)
Publication of CN111586307A
Application granted
Publication of CN111586307B
Legal status: Active (current)
Anticipated expiration

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/02: Systems using the reflection of electromagnetic waves other than radio waves
    • G01S 17/06: Systems determining position data of a target
    • G01S 17/08: Systems determining position data of a target for measuring distance only
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00: Details of systems according to groups G01S 13/00, G01S 15/00, G01S 17/00
    • G01S 7/48: Details of systems according to groups G01S 13/00, G01S 15/00, G01S 17/00, of systems according to group G01S 17/00
    • G01S 7/497: Means for monitoring or calibrating

Abstract

An exposure method and an image sensing device using the same are provided. The exposure method includes the following steps: obtaining a first light intensity confidence value of each pixel unit under a first exposure time; obtaining a second light intensity confidence value of each pixel unit under a second exposure time, wherein the second light intensity confidence value is different from the first light intensity confidence value; and taking the phase difference value corresponding to one of the first and second light intensity confidence values of each pixel unit as the output value of the corresponding pixel unit.

Description

Exposure method and image sensing device using same
Technical Field
The present invention relates to an exposure method and an image sensing device using the same, and more particularly, to an exposure method using a plurality of different exposure times and an image sensing device using the same.
Background
A conventional image capturing device captures an image of a target and analyzes the distance between the target and the device from the captured image. For example, the image capturing device emits light toward the target; the light is reflected from the target onto a pixel unit of the device, changing the difference between the stored charges of the pixel unit's two capacitors. The image capturing device calculates the phase difference between the signal of the emitted light and the signal of the reflected light from this charge difference, and then calculates the distance between the target and the device from the phase difference.
However, a scene may contain both near and distant objects. When the exposure time set by the distance measuring device is short, near objects in the image are properly exposed, but distant objects are underexposed. Conversely, when the exposure time is long, distant objects become properly exposed, but near objects are overexposed. Providing a better exposure method is therefore an ongoing goal of those skilled in the art.
Disclosure of Invention
In view of the above-mentioned problems of the prior art, an object of the present invention is to provide an exposure method and an image sensing apparatus using the same.
According to an embodiment of the present invention, an exposure method is provided. The exposure method comprises the following steps: obtaining a first light intensity confidence value of each pixel unit under a first exposure time; obtaining a second light intensity confidence value of each pixel unit under a second exposure time, wherein the second light intensity confidence value is different from the first light intensity confidence value; and taking the phase difference value corresponding to one of the first and second light intensity confidence values of each pixel unit as the output value of the corresponding pixel unit.
According to another embodiment of the present invention, an image sensing device is provided. The image sensing device comprises an image sensor and a controller. The image sensor comprises a plurality of pixel units. The controller is configured to: obtain a first light intensity confidence value of each pixel unit under a first exposure time; obtain a second light intensity confidence value of each pixel unit under a second exposure time, wherein the second light intensity confidence value is different from the first light intensity confidence value; and take the phase difference value corresponding to one of the first and second light intensity confidence values of each pixel unit as the output value of the corresponding pixel unit.
The invention is described in detail below with reference to the drawings and specific examples, but the invention is not limited thereto.
Drawings
FIG. 1A is a functional block diagram of an electronic device according to a first embodiment of the invention;
FIG. 1B is a schematic diagram of a plurality of pixel units of the image sensor shown in FIG. 1A;
FIG. 1C is a schematic diagram of an exposure circuit of the pixel unit shown in FIG. 1B;
FIG. 1D is a schematic diagram of signals of the emitted light and the reflected light of FIG. 1A and control signals of the first switch and the second switch of FIG. 1C;
FIG. 1E is a signal diagram of several emitted light beams with different phase delays compared to the emitted light beam of FIG. 1D;
FIG. 2 is a flow chart showing an exposure method according to a first embodiment of the invention;
FIG. 3A is a graph showing the relationship between the three light intensity confidence values obtained by one pixel unit of FIG. 1B under three different exposure times;
FIG. 3B is a diagram illustrating another relationship between the three light intensity confidence values obtained by one pixel unit of FIG. 1B under three different exposure times;
FIG. 3C is a graph showing yet another relationship between the three light intensity confidence values obtained by one pixel unit of FIG. 1B under three different exposure times;
FIG. 4 is a flowchart illustrating a method for determining output values of all pixel units of an image sensor according to a second embodiment of the invention;
FIGS. 5A-5B are diagrams illustrating a process of determining output values of pixel units of an image sensor according to a second embodiment of the invention;
FIG. 6 is a flowchart illustrating a method for determining output values of pixel units of an image sensor according to another embodiment of the invention;
FIG. 7 is a graph showing the relationship between the three light intensity confidence values obtained by the pixel unit under three different exposure times according to an embodiment of the present invention;
FIG. 8 is a flowchart illustrating a method for determining output values of pixel units of an image sensor according to a third embodiment of the invention;
FIG. 9 is a graph showing the relationship between the three light intensity confidence values obtained by the pixel unit of the third embodiment under three different exposure times.
Detailed Description
The invention is described in detail below with reference to the drawings, which are provided for illustration purposes only.
Referring to FIG. 1A to FIG. 1E: FIG. 1A is a functional block diagram of an electronic device 10 according to the first embodiment of the invention; FIG. 1B is a schematic diagram of a plurality of pixel units PN×M of the image sensor 120 of FIG. 1A; FIG. 1C is a schematic diagram of the exposure circuit of the pixel unit PN×M of FIG. 1B; FIG. 1D is a schematic diagram of the signals of the emitted light L1 and the reflected light L2 of FIG. 1A and of the control signals of the first switch SW1 and the second switch SW2 of FIG. 1C; and FIG. 1E is a schematic diagram of the signals of several emitted lights with different phase delays compared to the emitted light of FIG. 1D.
The electronic device 10 is, for example, an image pickup device, a distance measuring device, or a face recognition device. The electronic device 10 includes an image sensing device 11 and a processor 12, wherein the processor 12 is electrically coupled to the image sensing device 11 and processes the information it provides. The image sensing device 11 includes a light source 110, an image sensor 120, and a controller 130. In one embodiment, the image sensor 120 and the controller 130 may be integrated into a single component. The image sensor 120 and/or the controller 130 are, for example, solid-state circuits formed by a semiconductor process. In one embodiment, the image sensing device 11 is, for example, a Time-of-Flight (ToF) device.
The light source 110 is, for example, a Light Emitting Diode (LED) or a Laser Diode (LD), and emits light L1 to irradiate the object O so as to capture an image of the surface OS of the object O. The light L1 is, for example, infrared light, but the embodiment of the invention is not limited thereto. The light L1 becomes reflected light L2 after being reflected from the object O, and the reflected light L2 travels to the image sensor 120, which converts it into a received signal S12. The controller 130 performs the corresponding processing and calculation on the received signal S12 and generates an output value S2 for the processor 12. The output value S2 is, for example, the phase difference between the signal of the emitted light L1 and the signal of the reflected light L2, and the processor 12 calculates the distance between the electronic device 10 and the object O from this phase difference.
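For reference, the phase difference maps to distance through the standard time-of-flight relation (this is general ToF background rather than text from this patent; here Δφ is the phase difference between the emitted and reflected signals, f_mod the modulation frequency of the light L1, and c the speed of light):

d = (c / (4π · f_mod)) · Δφ

For example, at f_mod = 20 MHz a phase difference of π/2 corresponds to a distance of about 1.87 m.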
As shown in FIG. 1B, the image sensor 120 includes a plurality of pixel units PN×M, where N and M are any positive integers equal to or greater than 1 and may be equal or different. As shown in FIG. 1C, each pixel unit PN×M includes an exposure circuit 121 that senses the reflected light L2 to generate the received signal S12; the exposure circuit includes a first capacitor C1, a second capacitor C2, a photodiode 1211, switches SW1 and SW2, and switches R1 and R2.
The exposure process of any pixel unit PN×M during one exposure time is as follows. First, the switches R1 and R2 are turned on to charge the first capacitor C1 and the second capacitor C2 simultaneously. When both capacitors are fully charged, the switches R1 and R2 are turned off. Then the first switch SW1 and the second switch SW2 are turned on and off alternately. For example, as shown in FIG. 1D, the control signals S13 and S14 turn their switches on in the low-level regions and off in the high-level regions; in FIG. 1D the control signals S13 and S14 are 180 degrees out of phase, and, as their complementarity shows, the first switch SW1 and the second switch SW2 conduct alternately. Thus, when the photodiode 1211 receives photons (in the signal S12 of FIG. 1D, the high-level regions indicate that the photodiode 1211 receives the reflected light L2) while the first switch SW1 and the second switch SW2 are turned on in turn, the first capacitor C1 and the second capacitor C2 are discharged in turn.
As shown in FIG. 1D, the dashed block A indicates that the first capacitor C1 is in a discharging state (the photodiode 1211 receives the reflected light L2 and the first switch SW1 is turned on), and the dashed block B indicates that the second capacitor C2 is in a discharging state (the photodiode 1211 receives the reflected light L2 and the second switch SW2 is turned on). In other words, the dashed block A represents the discharge amount of the first capacitor C1 and the dashed block B represents the discharge amount of the second capacitor C2. Therefore, after a period of exposure time, the discharge amounts of the first capacitor C1 and the second capacitor C2 can be obtained from the sizes of the dashed blocks A and B. Under control signals of different phases, the changes in the charge amounts of the first capacitor C1 and the second capacitor C2 (equal to the discharge amounts) are expressed as Image(0°), Image(180°), Image(90°), and Image(270°) in equations (1) and (2) below. Image(0°) is the discharge amount of the first capacitor C1 read with a control signal of zero-degree phase, and Image(180°) is the discharge amount of the second capacitor C2 read with a control signal of 180-degree phase; Image(90°) is the discharge amount of the first capacitor C1 read with a control signal of 90-degree phase, and Image(270°) is the discharge amount of the second capacitor C2 read with a control signal of 270-degree phase. The "exposure time" herein is defined as follows: for the first capacitor C1, it is the time during which the first switch SW1 is turned on while the light source 110 emits light; for the second capacitor C2, it is the time during which the second switch SW2 is turned on while the light source 110 emits light.
The surface OS of the object O has a three-dimensional (3D) contour, so the distances between the individual pixel units PN×M and the surface OS are not exactly the same, and the received signal S12 generated by each pixel unit PN×M may differ. The processor 12 obtains the overall three-dimensional contour of the surface OS from the phase difference value of each pixel unit.
In FIG. 1D, the signal S11 represents the emission signal pattern of the emitted light L1, the signal S12 represents the reception signal pattern of the reflected light L2 received by one of the pixel units, and ΔS is the phase difference between the emission signal S11 and the reception signal S12.
To increase the accuracy of the distance measurement, the light source 110 emits four light beams L1 with different phase delays, and the controller 130 calculates the distance between the electronic device 10 and the target O according to the four received signals S12 of the four reflected light beams L2. For example, the emission signal S11 in FIG. 1D corresponds to the first emitted light L1; the emission signal S11' in FIG. 1E corresponds to the second emitted light, whose phase is delayed by 90 degrees relative to the first emitted light L1; the emission signal S11'' in FIG. 1E corresponds to the third emitted light, whose phase is delayed by 180 degrees; and the emission signal S11''' in FIG. 1E corresponds to the fourth emitted light, whose phase is delayed by 270 degrees. The four emitted light beams L1 become four reflected light beams L2 after being reflected from the surface OS of the target O. The four reflected light beams L2 travel to the image sensor 120, and the image sensor 120 converts them into four received signals S12 (each reflected light beam L2 is similar to FIG. 1D, except that the phase difference varies with the actual situation). The controller 130 can calculate the distance between the electronic device 10 and the target O more precisely according to the four emission signals S11 and the four received signals S12.
For example, for one exposure time of a pixel unit (capturing one image), the controller 130 obtains Image(0°), Image(90°), Image(180°), and Image(270°) as described above and calculates the output value S2 according to the following equations (1) to (3).
I = Image(0°) − Image(180°) .....(1)
Q = Image(90°) − Image(270°) .....(2)
S2 = tan⁻¹(Q / I) .....(3)
The controller 130 obtains the output value S2 according to equations (1) to (3) and transmits it to the processor 12, which calculates the distance between the electronic device 10 and the target O from the output value S2. Applying the above principle and equations (1) to (3) to all of the pixel units, the controller 130 obtains the phase difference values between the plurality of pixel units and the surface OS of the object O, from which the three-dimensional contour of the surface OS is obtained.
In the present embodiment, the controller 130 further obtains the light intensity confidence value C of each pixel unit according to the four received signals S12 (the signals of the four reflected light beams L2); see equation (4) below. From equation (4), the light intensity confidence value C is proportional to the intensity of the reflected light L2. Therefore, the light intensity confidence value C of each pixel unit can be calculated under different exposure times, and the output value S2 of each pixel unit can be determined or selected according to the light intensity confidence value C.
C = √(I² + Q²) = √((Image(0°) − Image(180°))² + (Image(90°) − Image(270°))²) .....(4)
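As a minimal illustration of equations (1) to (4), the following Python sketch (not part of the patent disclosure; the array names are hypothetical, and equation (4) is used in the reconstructed √(I² + Q²) form shown above) computes the output value S2 and the light intensity confidence value C for every pixel unit at once:

```python
import numpy as np

def phase_and_confidence(img0, img90, img180, img270):
    """Per-pixel output value S2 and light intensity confidence value C
    from the four phase images, following equations (1)-(4).
    Inputs are 2-D arrays of capacitor discharge amounts."""
    I = img0.astype(np.float64) - img180   # equation (1)
    Q = img90.astype(np.float64) - img270  # equation (2)
    S2 = np.arctan2(Q, I)                  # equation (3); arctan2 keeps the quadrant
    C = np.hypot(I, Q)                     # equation (4): sqrt(I^2 + Q^2)
    return S2, C
```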
Referring to fig. 2, a flow chart of an exposure method according to a first embodiment of the invention is shown. Steps S110 to S170 are applied to each pixel unit, and one pixel unit is exemplified below.
In step S110, please refer to FIG. 3A, which illustrates the relationship between the three light intensity confidence values obtained by one pixel unit of FIG. 1B under three different exposure times (capturing three images). In this step, the controller 130 obtains a first light intensity confidence value C1 of the pixel unit under a first exposure time T1.
In step S120, as shown in FIG. 3A, the controller 130 obtains a second light intensity confidence value C2 of the pixel unit under a second exposure time T2, wherein the second exposure time T2 is longer than the first exposure time T1 and the second light intensity confidence value C2 is different from the first light intensity confidence value C1.
In step S130, as shown in FIG. 3A, the controller 130 obtains a third light intensity confidence value C3 of the pixel unit under a third exposure time T3, wherein the third exposure time T3 is longer than the second exposure time T2 and the third light intensity confidence value C3 is different from the first and second light intensity confidence values C1 and C2. In one embodiment, for the same pixel unit, the longer the exposure time, the higher the light intensity confidence value; conversely, the shorter the exposure time, the lower the light intensity confidence value.
In step S140, the controller 130 determines whether the first, second and third light intensity confidence values C1, C2 and C3 all lie between the confidence upper limit CU and the confidence lower limit CL (hereinafter the "qualified interval"). If so, the flow proceeds to step S150; if not, the flow proceeds to step S160. Light intensity confidence values within the qualified interval are referred to herein as "qualified light intensity confidence values", and those outside it as "unqualified light intensity confidence values".
In step S150, as shown in FIG. 3A, since the first, second and third light intensity confidence values C1, C2 and C3 are all qualified, the controller 130 can use the phase difference value corresponding to any one of them as the output value S2 of the pixel unit.
In step S160, please refer to FIG. 3B, which illustrates another relationship between the three light intensity confidence values obtained by one pixel unit of FIG. 1B under three different exposure times. Unlike FIG. 3A, the first, second and third light intensity confidence values C1, C2 and C3 in FIG. 3B are all lower than the confidence lower limit CL and are therefore unqualified. The controller 130 can analyze the variation trend of the first, second and third light intensity confidence values C1, C2 and C3 by a suitable mathematical method, such as linear regression, to obtain a qualified fourth light intensity confidence value C4 and its corresponding fourth exposure time T4.
Then, in step S170, the controller 130 controls the image sensing device 11 to capture the object O for a fourth exposure time T4 to obtain an output value S2 of the pixel unit at the fourth exposure time T4.
In another embodiment, please refer to FIG. 3C, which illustrates yet another relationship between the three light intensity confidence values obtained by one pixel unit of FIG. 1B under three different exposure times. Unlike FIG. 3B, the first, second and third light intensity confidence values C1, C2 and C3 in FIG. 3C are all higher than the confidence upper limit CU and are therefore unqualified. The controller 130 can again analyze their variation trend by a suitable mathematical method, such as linear regression, to obtain a qualified fourth light intensity confidence value C4 and its corresponding fourth exposure time T4. Then, in step S170, the controller 130 controls the image sensing device 11 to capture the object O for the fourth exposure time T4 to obtain the output value S2 under the fourth exposure time T4.
As the embodiments of FIG. 3B and FIG. 3C show, when a pixel unit fails to produce a qualified light intensity confidence value under several predetermined exposure times (e.g., the first exposure time T1, the second exposure time T2 and the third exposure time T3), a qualified light intensity confidence value can still be obtained through steps S160-S170 of FIG. 2.
In addition, in the exposure method of other embodiments, step S130 of FIG. 2 may be omitted; that is, the output value is determined from the two light intensity confidence values obtained under two exposure times.
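The following Python sketch walks one pixel unit through the FIG. 2 flow. It assumes that the trend analysis of step S160 is a first-order linear regression (named in the text as one suitable method) and that the regression targets the midpoint of the qualified interval, which the patent does not prescribe:

```python
import numpy as np

def output_value_fig2(times, confs, phases, CL, CU):
    """Steps S110-S150: if any light intensity confidence value measured
    at the tried exposure times is qualified, use its phase difference.
    Steps S160-S170: otherwise fit C = a*T + b over the measurements and
    return a fourth exposure time T4 at which to re-capture."""
    for conf, phase in zip(confs, phases):
        if CL <= conf <= CU:
            return phase, None             # qualified: phase is the output value S2
    a, b = np.polyfit(times, confs, 1)     # linear trend of confidence vs. exposure time
    T4 = (0.5 * (CL + CU) - b) / a         # exposure time aimed at mid-interval
    return None, T4                        # caller captures again at T4
```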
Fig. 2 to 3C illustrate an example of determining the output value S2 of one pixel unit. The process of determining the output values of all the pixel units of the image sensor 120 is described next.
Referring to FIG. 4 and FIGS. 5A-5B, FIG. 4 is a flowchart of determining the output values of all pixel units of the image sensor according to the second embodiment of the invention, and FIGS. 5A-5B are diagrams illustrating the process of determining the output values S2 of the pixel units of the image sensor according to the second embodiment of the invention.
First, as shown in FIG. 4, the controller 130 obtains three light intensity confidence values for each pixel unit under three different exposure times according to steps S110 to S130 of FIG. 2. The controller 130 may then eliminate the unqualified light intensity confidence values from all the light intensity confidence values of all the pixel units; the number of qualified light intensity confidence values retained by a pixel unit may be three, two, one, or zero.
Then, in step S210 of FIG. 4, among the qualified light intensity confidence values of each pixel unit, the controller 130 preferentially uses the phase difference value corresponding to the qualified light intensity confidence value obtained under the middle exposure time (e.g., the second light intensity confidence value C2 obtained under the second exposure time T2 of FIG. 3A) as the output value S2 of the pixel unit. That is, the controller 130 determines whether the second light intensity confidence value C2 is within the qualified interval; if so, the flow proceeds to S220, and if not, to S230.
In step S220, the controller 130 uses the phase difference value corresponding to the second light intensity confidence value C2 as the output value S2 of the corresponding pixel unit. The hatched pixel units in FIG. 5A (e.g., pixel units P11, P21, P54) are those whose second light intensity confidence value C2 is qualified, while the non-hatched pixel units (e.g., pixel units P33, P24, P35, P45) are those whose second light intensity confidence value C2 is unqualified.
In step S230, for the non-hatched pixel units of FIG. 5A, the controller 130 next uses the phase difference value corresponding to the qualified light intensity confidence value obtained under the high exposure time (e.g., the third light intensity confidence value C3 obtained under the third exposure time T3 of FIG. 3A) as the output value S2 of the pixel unit. That is, since the second light intensity confidence value C2 is outside the qualified interval, the controller 130 determines whether the third light intensity confidence value C3 is within it; if so, the flow proceeds to S240, and if not, to S250.
In step S240, the controller 130 uses the phase difference value corresponding to the third light intensity confidence value C3 as the output value S2 of the corresponding pixel unit. The non-hatched pixel units in FIG. 5B are those whose second and third light intensity confidence values C2 and C3 are both unqualified.
In step S250, for the pixel units that remain non-hatched in FIG. 5B (e.g., pixel units P35, P45), the controller 130 further uses the phase difference value corresponding to the qualified light intensity confidence value obtained under the low exposure time (e.g., the first light intensity confidence value C1 obtained under the first exposure time T1 of FIG. 3A) as the output value S2 of the pixel unit. That is, since the second and third light intensity confidence values C2 and C3 are outside the qualified interval, the controller 130 determines whether the first light intensity confidence value C1 is within it; if so, the flow proceeds to step S260.
In step S260, the controller 130 uses the phase difference value corresponding to the first confidence value C1 as the output value S2 of the corresponding pixel unit.
In step S250, if the first light intensity confidence value C1 is also outside the qualified interval (no), the pixel unit cannot produce a qualified light intensity confidence value under the low, middle or high exposure time, and the controller 130 may determine a qualified light intensity confidence value for such pixel unit(s) by steps S160 and S170 of FIG. 2.
As can be seen, in one embodiment, for each pixel unit the controller 130 is configured to: after obtaining the light intensity confidence value of each pixel unit under the low, middle and high exposure times, preferentially determine the output value of the pixel unit from the qualified light intensity confidence value under the middle exposure time; then, for pixel units whose output value is still undetermined, determine the output value from the qualified light intensity confidence value under the high exposure time; then, for pixel units whose output value is still undetermined, determine the output value from the qualified light intensity confidence value under the low exposure time; and finally, for pixel units (if any) that fail to produce a qualified light intensity confidence value under the low, middle and high exposure times, determine a qualified light intensity confidence value by steps S160 and S170 of FIG. 2. The embodiment of the invention does not limit the numerical ranges of the high, middle and low exposure times.
In another embodiment, for each pixel unit the controller 130 is configured to: after obtaining the light intensity confidence value of each pixel unit under the low and high exposure times, determine the output value of the pixel unit from the qualified light intensity confidence value under the high exposure time; then, for pixel units whose output value is still undetermined, determine the output value from the qualified light intensity confidence value under the low exposure time; and then, for pixel units (if any) that fail to produce a qualified light intensity confidence value under the high and low exposure times, determine a qualified light intensity confidence value by steps S160 and S170 of FIG. 2, where the exposure time corresponding to that qualified value lies between the high and low exposure times and thus belongs to a middle exposure time.
In other embodiments, for each pixel unit the controller 130 is configured to: after obtaining the light intensity confidence values of each pixel unit under a plurality of different exposure times, determine the output value of the pixel unit from the qualified light intensity confidence value under any one of the different exposure times; then, for pixel units whose output value is still undetermined, determine the output value from the qualified light intensity confidence value under another one of the different exposure times; and then, for pixel units (if any) that fail to produce a qualified light intensity confidence value, determine a qualified light intensity confidence value by steps S160 and S170 of FIG. 2, where the exposure time corresponding to that qualified value may be higher or lower than any of the different exposure times.
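A compact sketch of this priority-ordered selection follows; the dictionary keys "mid", "high" and "low" and the function name are illustrative, not terms from the patent:

```python
def output_value_by_priority(confs, phases, CL, CU,
                             order=("mid", "high", "low")):
    """Try the exposure times in the stated priority order (FIG. 4 uses
    middle, then high, then low) and return the phase difference value of
    the first qualified light intensity confidence value. None means no
    exposure time qualified and steps S160-S170 of FIG. 2 are needed."""
    for level in order:
        if CL <= confs[level] <= CU:
            return phases[level]
    return None
```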
Referring to FIG. 6 and FIG. 7, FIG. 6 is a flowchart of determining the output values S2 of the pixel units of the image sensor according to another embodiment of the invention, and FIG. 7 is a graph showing the relationship between three light intensity confidence values obtained by a pixel unit under three different exposure times according to an embodiment of the invention.
First, as shown in FIG. 6, the controller 130 obtains three light intensity confidence values for each pixel unit under three different exposure times according to steps S110 to S130 of FIG. 2. The controller 130 may then eliminate the unqualified light intensity confidence values from all the light intensity confidence values of all the pixel units; the number of qualified light intensity confidence values retained by a pixel unit may be three, two, one, or zero.
In step S310 of FIG. 6, as shown in FIG. 7, the controller 130 obtains a first absolute difference ΔC1 between each first light intensity confidence value C1 and a preset confidence value Ci, where the preset confidence value Ci is any value within the qualified interval between the confidence upper limit CU and the confidence lower limit CL.
In step S320, as shown in FIG. 7, the controller 130 obtains a second absolute difference ΔC2 between each second light intensity confidence value C2 and the preset confidence value Ci.
In step S330, as shown in FIG. 7, the controller 130 obtains a third absolute difference ΔC3 between each third light intensity confidence value C3 and the preset confidence value Ci. Each of the absolute differences ΔC1, ΔC2 and ΔC3 is, as the name indicates, taken as an absolute value.
In step S340, for each pixel unit, the controller 130 sets the phase difference corresponding to the minimum of the first absolute difference Δ C1, the second absolute difference Δ C2, and the third absolute difference Δ C3 as the output value S2 of the pixel unit. For example, as shown in fig. 7, among the first absolute difference Δ C1, the second absolute difference Δ C2 and the third absolute difference Δ C3, the second absolute difference Δ C2 is the smallest, so that the controller 130 uses the phase difference corresponding to the second absolute difference Δ C2 as the output value S2 of the pixel unit.
In this embodiment, the controller 130 uses the phase difference value corresponding to the minimum of the first, second and third absolute differences ΔC1, ΔC2 and ΔC3 as the output value S2 of the pixel unit regardless of whether the first, second and third light intensity confidence values C1, C2 and C3 are within the qualified interval.
In another embodiment, the controller 130 considers only the absolute differences whose corresponding light intensity confidence values lie within the qualified interval and selects the smallest of them, using the corresponding phase difference value as the output value S2 of the pixel unit. For example, in the case of FIG. 7, since the second and third light intensity confidence values C2 and C3 are within the qualified interval, only the second and third absolute differences ΔC2 and ΔC3 are considered, and the first absolute difference ΔC1 is ignored. In addition, for pixel units whose first, second and third light intensity confidence values C1, C2 and C3 are all outside the qualified interval, the controller 130 can determine a qualified light intensity confidence value by steps S160 and S170 of FIG. 2 and then repeat steps S310 to S340 of FIG. 6 to determine the output value S2 of the pixel unit.
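A sketch of the FIG. 6 selection (steps S310 to S340), covering both variants described above; function and parameter names are illustrative, and the fallback to all values when none qualifies stands in for the patent's recourse to steps S160-S170:

```python
def output_value_min_abs_diff(confs, phases, Ci, CL=None, CU=None):
    """Return the phase difference value whose light intensity confidence
    value is closest to the preset confidence value Ci. When CL and CU
    are supplied, only confidence values inside the qualified interval
    compete (the stricter variant); if none qualifies, all values are
    considered again."""
    pairs = list(zip(confs, phases))
    if CL is not None and CU is not None:
        inside = [(c, p) for c, p in pairs if CL <= c <= CU]
        if inside:
            pairs = inside
    _, best_phase = min(pairs, key=lambda cp: abs(cp[0] - Ci))
    return best_phase
```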
Referring to fig. 8 and 9, fig. 8 is a flowchart illustrating a method for determining output values S2 of pixel units of an image sensor according to a third embodiment of the invention, and fig. 9 is a graph illustrating a relationship between three intensity confidence values obtained by the pixel units of the third embodiment of the invention during three different exposure times.
In step S410, the controller 130 obtains the light intensity confidence value of each pixel unit under the high exposure time. For example, as shown in FIG. 9, the controller 130 obtains the second light intensity confidence value C2 of each pixel unit under the second exposure time T2. The second light intensity confidence value C2 may be higher than the confidence upper limit CU, lower than the confidence lower limit CL, or within the qualified interval.
In step S420, the controller 130 determines whether the ratio of the number of second light intensity confidence values C2 of all the pixel units that exceed a light intensity confidence threshold to the total number of second light intensity confidence values C2 is higher than a threshold ratio. If the ratio is higher than the threshold ratio, the flow proceeds to step S430; if not, the flow of the exposure method ends. The light intensity confidence threshold is, for example, the confidence upper limit CU, and the threshold ratio is, for example, approximately 5%.
In step S430, the controller 130 obtains the light intensity confidence value of each pixel unit under the low exposure time. For example, as shown in FIG. 9, the controller 130 obtains the first light intensity confidence value C1 of each pixel unit under the first exposure time T1, wherein the first exposure time T1 is shorter than the second exposure time T2. The first light intensity confidence value C1 may likewise be higher than the confidence upper limit CU, lower than the confidence lower limit CL, or within the qualified interval.
In step S440, the controller 130 analyzes the variation trend of the first and second light intensity confidence values C1 and C2 by a suitable mathematical method, such as linear regression, to determine for each pixel unit a third light intensity confidence value C3 and its corresponding third exposure time T3. In this embodiment the third light intensity confidence value C3 falls within the qualified interval, although in other cases it may still be higher than the confidence upper limit CU or lower than the confidence lower limit CL.
In step S450, the controller 130 controls the image sensing device 11 to capture the object O for the third exposure time T3 to obtain the output value S2 of each pixel unit under the third exposure time T3.
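A sketch of the FIG. 8 flow under stated assumptions: the light intensity confidence threshold defaults to the confidence upper limit CU and the threshold ratio to 5%, following the examples in the text, while reducing the per-pixel trend lines to a single re-capture time T3 via the median is an added assumption of this sketch:

```python
import numpy as np

def third_embodiment_T3(conf_high, conf_low, T_high, T_low, CL, CU,
                        threshold=None, threshold_ratio=0.05):
    """conf_high/conf_low: per-pixel confidence arrays captured at the
    high and low exposure times (steps S410 and S430). Returns None when
    too few pixel units exceed the threshold (step S420 'no'), else a
    third exposure time T3 aimed at the qualified interval (step S440)."""
    threshold = CU if threshold is None else threshold
    if np.mean(conf_high > threshold) <= threshold_ratio:
        return None                                    # flow of the method ends
    slope = (conf_high - conf_low) / (T_high - T_low)  # per-pixel linear trend
    T3 = T_low + (0.5 * (CL + CU) - conf_low) / slope  # per-pixel candidates
    return float(np.median(T3))                        # one T3 for the re-capture
```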
In summary, one phase difference value of each pixel unit can be obtained within one exposure time (one shot of the target). The embodiments of the invention capture the target under a plurality of different exposure times (shooting the target several times) to obtain a plurality of phase difference values for each pixel unit, and then determine (or select), according to the light intensity confidence values, the phase difference value that represents proper exposure (neither overexposed nor underexposed).
The present invention may be embodied in other specific forms without departing from the spirit or essential attributes thereof, and it should be understood that various changes and modifications can be effected therein by one skilled in the art without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (12)

1. An exposure method for obtaining a plurality of phase difference values of a plurality of pixel units of an image sensor, the exposure method comprising:
obtaining a first light intensity confidence value of each pixel unit under a first exposure time;
obtaining a second light intensity confidence value of each pixel unit in a second exposure time, wherein the second light intensity confidence value is different from the first light intensity confidence value; and
the phase difference value corresponding to one of the first and second light intensity confidence values of each pixel unit is used as an output value of the corresponding pixel unit.
2. The exposure method of claim 1, wherein the first exposure time is less than the second exposure time, and wherein the step of taking the phase difference value corresponding to one of the first and second light intensity confidence values of each pixel unit as the output value of the corresponding pixel unit comprises:
judging whether the second light intensity confidence value is within a qualified interval of a confidence upper limit value and a confidence lower limit value;
if the second light intensity confidence value is within the qualified interval, the phase difference value corresponding to the second light intensity confidence value is taken as the output value of the corresponding pixel unit;
if the second light intensity confidence value is outside the qualified interval, judging whether the first light intensity confidence value is within the qualified interval; and
if the first light intensity confidence value is within the qualified interval, the phase difference value corresponding to the first light intensity confidence value is used as the output value of the corresponding pixel unit.
3. The exposure method according to claim 2, wherein the step of taking the phase difference value corresponding to one of the first and second light intensity confidence values of each pixel unit as the output value of the corresponding pixel unit further comprises:
if the first light intensity confidence value and the second light intensity confidence value do not fall within the qualified interval, a third light intensity confidence value of the corresponding pixel unit in a third exposure time is obtained according to the variation trend of the first light intensity confidence value and the second light intensity confidence value, wherein the third light intensity confidence value is located within the qualified interval.
4. The exposure method according to claim 1, wherein the step of taking the phase difference value corresponding to one of the first and second light intensity confidence values of each pixel unit as the output value of the corresponding pixel unit comprises:
obtaining a plurality of first absolute differences between the first light intensity confidence values and a preset confidence value;
obtaining a plurality of second absolute differences between the second light intensity confidence values and the preset confidence value; and
for each pixel unit, taking the phase difference value corresponding to the smaller of its first absolute difference and its second absolute difference as the output value of the corresponding pixel unit.
5. The exposure method according to claim 1, wherein the step of taking the phase difference value corresponding to one of the first and second light intensity confidence values of each pixel unit as the output value of the corresponding pixel unit comprises:
analyzing the variation trend of the first and second light intensity confidence values to obtain a third light intensity confidence value of each pixel unit in a third exposure time, wherein the third light intensity confidence value is within the qualified interval.
6. The exposure method of claim 5, wherein the first exposure time is less than the second exposure time.
7. The exposure method according to claim 5, wherein the step of taking the phase difference value corresponding to one of the first and second light intensity confidence values of each pixel unit as the output value of the corresponding pixel unit comprises:
judging whether a ratio of the number of the second light intensity confidence values of the pixel units that are higher than a light intensity confidence threshold to the total number of the second light intensity confidence values is higher than a threshold ratio; and
if the ratio is higher than the threshold ratio, performing the step of obtaining the first light intensity confidence value of each pixel unit under the first exposure time.
8. The exposure method according to claim 5, wherein the step of taking the phase difference value corresponding to one of the first and second light intensity confidence values of each pixel unit as the output value of the corresponding pixel unit comprises:
judging whether a ratio of the number of the second light intensity confidence values of the pixel units that are higher than a light intensity confidence threshold to the total number of the second light intensity confidence values is higher than a threshold ratio; and
if the ratio is lower than the threshold ratio, ending the flow of the exposure method.
9. An image sensing device, comprising:
an image sensor, including a plurality of pixel units; and
a controller for:
obtaining a first light intensity confidence value of each pixel unit under a first exposure time;
obtaining a second light intensity confidence value of each pixel unit in a second exposure time, wherein the second light intensity confidence value is different from the first light intensity confidence value; and
the phase difference value corresponding to one of the first and second light intensity confidence values of each pixel unit is used as an output value of the corresponding pixel unit.
10. The image sensing device of claim 9, wherein the controller is further configured to:
obtaining a third light intensity confidence value of each pixel unit under a third exposure time, wherein the first exposure time is shorter than the second exposure time, and the second exposure time is shorter than the third exposure time;
wherein, in the step of taking the phase difference value corresponding to one of the first and second light intensity confidence values of each pixel unit as the output value of the corresponding pixel unit, the controller is further configured to:
judging whether the second light intensity confidence value is within a qualified interval of a confidence upper limit value and a confidence lower limit value;
if the second light intensity confidence value is within the qualified interval, the phase difference value corresponding to the second light intensity confidence value is taken as the output value of the corresponding pixel unit;
if the second light intensity confidence value is outside the qualified interval, judging whether the third light intensity confidence value is within the qualified interval;
if the third light intensity confidence value is within the qualified interval, the phase difference value corresponding to the third light intensity confidence value is used as the output value of the corresponding pixel unit;
if the third light intensity confidence value is outside the qualified interval, judging whether the first light intensity confidence value is within the qualified interval; and
if the first light intensity confidence value is within the qualified interval, the phase difference value corresponding to the first light intensity confidence value is used as the output value of the corresponding pixel unit.
11. The image sensing device as claimed in claim 9, wherein, in the step of taking the phase difference value corresponding to one of the first and second light intensity confidence values of each pixel unit as the output value of the corresponding pixel unit, the controller is further configured to:
obtaining a plurality of first absolute differences between each first light intensity confidence value and a preset confidence value; and
obtaining a plurality of second absolute differences between each second light intensity confidence value and the preset confidence value;
for each pixel unit, the phase difference value corresponding to the smaller of the first absolute difference and the second absolute difference is used as the output value of the corresponding pixel unit.
12. The image sensing device as claimed in claim 9, wherein, in the step of taking the phase difference value corresponding to one of the first and second light intensity confidence values of each pixel unit as the output value of the corresponding pixel unit, the controller is further configured to:
analyzing the variation trend of the first and second light intensity confidence values to obtain a third light intensity confidence value of each pixel unit in a third exposure time, wherein the third light intensity confidence value is within the qualified interval.
CN201910435784.3A 2019-02-19 2019-05-23 Exposure method and image sensing device using same Active CN111586307B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/793,850 US11223759B2 (en) 2019-02-19 2020-02-18 Exposure method and image sensing device using the same

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201962807246P 2019-02-19 2019-02-19
US62/807,246 2019-02-19

Publications (2)

Publication Number Publication Date
CN111586307A (en) 2020-08-25
CN111586307B CN111586307B (en) 2021-11-02

Family

ID=72110768

Family Applications (5)

Application Number Title Priority Date Filing Date
CN201910263049.9A Pending CN111580117A (en) 2019-02-19 2019-04-02 Control method of flight time distance measurement sensing system
CN201910341808.9A Active CN111586306B (en) 2019-02-19 2019-04-25 Anti-overexposure circuit structure and electronic device using same
CN201910435784.3A Active CN111586307B (en) 2019-02-19 2019-05-23 Exposure method and image sensing device using same
CN201910541119.2A Active CN111580067B (en) 2019-02-19 2019-06-21 Operation device, sensing device and processing method based on time-of-flight ranging
CN201910971700.8A Active CN111624612B (en) 2019-02-19 2019-10-14 Verification method and verification system of time-of-flight camera module

Family Applications Before (2)

Application Number Title Priority Date Filing Date
CN201910263049.9A Pending CN111580117A (en) 2019-02-19 2019-04-02 Control method of flight time distance measurement sensing system
CN201910341808.9A Active CN111586306B (en) 2019-02-19 2019-04-25 Anti-overexposure circuit structure and electronic device using same

Family Applications After (2)

Application Number Title Priority Date Filing Date
CN201910541119.2A Active CN111580067B (en) 2019-02-19 2019-06-21 Operation device, sensing device and processing method based on time-of-flight ranging
CN201910971700.8A Active CN111624612B (en) 2019-02-19 2019-10-14 Verification method and verification system of time-of-flight camera module

Country Status (2)

Country Link
CN (5) CN111580117A (en)
TW (2) TWI741291B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112954230B (en) * 2021-02-08 2022-09-09 深圳市汇顶科技股份有限公司 Depth measurement method, chip and electronic device
CN113298778B (en) * 2021-05-21 2023-04-07 奥比中光科技集团股份有限公司 Depth calculation method and system based on flight time and storage medium
CN113219476B (en) * 2021-07-08 2021-09-28 武汉市聚芯微电子有限责任公司 Ranging method, terminal and storage medium
TWI762387B (en) * 2021-07-16 2022-04-21 台達電子工業股份有限公司 Time of flight device and inspecting method for the same

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080205708A1 (en) * 2007-02-27 2008-08-28 Fujifilm Corporation Ranging apparatus and ranging method
CN105245796A (en) * 2014-07-01 2016-01-13 晶相光电股份有限公司 Sensor and sensing method
CN106461763A (en) * 2014-06-09 2017-02-22 松下知识产权经营株式会社 Distance measuring device
CN107229056A (en) * 2016-03-23 2017-10-03 松下知识产权经营株式会社 Image processing apparatus, image processing method and recording medium
CN107765260A (en) * 2016-08-22 2018-03-06 三星电子株式会社 For obtaining the method, equipment and computer readable recording medium storing program for performing of range information
CN108401098A (en) * 2018-05-15 2018-08-14 绍兴知威光电科技有限公司 A kind of TOF depth camera systems and its method for reducing external error
CN108616726A (en) * 2016-12-21 2018-10-02 光宝电子(广州)有限公司 Exposal control method based on structure light and exposure-control device
CN108700664A (en) * 2017-02-06 2018-10-23 松下知识产权经营株式会社 Three-dimensional motion acquisition device and three-dimensional motion adquisitiones

Family Cites Families (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002139818A (en) * 2000-11-01 2002-05-17 Fuji Photo Film Co Ltd Lens-fitted photographic film unit
CN101252802B (en) * 2007-02-25 2013-08-21 电灯专利信托有限公司 Charge pump electric ballast for low input voltage
US8699008B2 (en) * 2009-02-27 2014-04-15 Panasonic Corporation Distance measuring device
CN102735910B (en) * 2011-04-08 2014-10-29 中山大学 Maximum peak voltage detection circuit
CN103181156B (en) * 2011-07-12 2017-09-01 三星电子株式会社 Fuzzy Processing device and method
WO2013009099A2 (en) * 2011-07-12 2013-01-17 삼성전자 주식회사 Device and method for blur processing
EP2728374B1 (en) * 2012-10-30 2016-12-28 Technische Universität Darmstadt Invention relating to the hand-eye calibration of cameras, in particular depth image cameras
AT513589B1 (en) * 2012-11-08 2015-11-15 Bluetechnix Gmbh Recording method for at least two ToF cameras
US9019480B2 (en) * 2013-02-26 2015-04-28 Jds Uniphase Corporation Time-of-flight (TOF) system, sensor pixel, and method
US9681123B2 (en) * 2014-04-04 2017-06-13 Microsoft Technology Licensing, Llc Time-of-flight phase-offset calibration
US9641830B2 (en) * 2014-04-08 2017-05-02 Lucasfilm Entertainment Company Ltd. Automated camera calibration methods and systems
EP2978216B1 (en) * 2014-07-24 2017-08-16 Espros Photonics AG Method for the detection of motion blur
JP6280002B2 (en) * 2014-08-22 2018-02-14 浜松ホトニクス株式会社 Ranging method and ranging device
US10061029B2 (en) * 2015-01-06 2018-08-28 Samsung Electronics Co., Ltd. Correction of depth images from T-O-F 3D camera with electronic-rolling-shutter for light modulation changes taking place during light integration
CN104677277B (en) * 2015-02-16 2017-06-06 武汉天远视科技有限责任公司 A kind of method and system for measuring object geometric attribute or distance
CN106152947B (en) * 2015-03-31 2019-11-29 北京京东尚科信息技术有限公司 Measure equipment, the method and apparatus of dimension of object
US9945936B2 (en) * 2015-05-27 2018-04-17 Microsoft Technology Licensing, Llc Reduction in camera to camera interference in depth measurements using spread spectrum
CN107850664B (en) * 2015-07-22 2021-11-05 新唐科技日本株式会社 Distance measuring device
US9716850B2 (en) * 2015-09-08 2017-07-25 Pixart Imaging (Penang) Sdn. Bhd. BJT pixel circuit capable of cancelling ambient light influence, image system including the same and operating method thereof
TWI625538B (en) * 2015-09-10 2018-06-01 義明科技股份有限公司 Non-contact optical sensing device and method for sensing depth and position of an object in three-dimensional space
TWI557393B (en) * 2015-10-08 2016-11-11 微星科技股份有限公司 Calibration method of laser ranging and device utilizing the method
US10057526B2 (en) * 2015-11-13 2018-08-21 Pixart Imaging Inc. Pixel circuit with low power consumption, image system including the same and operating method thereof
US9762824B2 (en) * 2015-12-30 2017-09-12 Raytheon Company Gain adaptable unit cell
CN106997582A (en) * 2016-01-22 2017-08-01 北京三星通信技术研究有限公司 The motion blur removing method and equipment of flight time three-dimension sensor
US10516875B2 (en) * 2016-01-22 2019-12-24 Samsung Electronics Co., Ltd. Method and apparatus for obtaining depth image by using time-of-flight sensor
CN107040732B (en) * 2016-02-03 2019-11-05 原相科技股份有限公司 Image sensing circuit and method
US10762651B2 (en) * 2016-09-30 2020-09-01 Magic Leap, Inc. Real time calibration for time-of-flight depth measurement
JP6862751B2 (en) * 2016-10-14 2021-04-21 富士通株式会社 Distance measuring device, distance measuring method and program
US20180189977A1 (en) * 2016-12-30 2018-07-05 Analog Devices Global Light detector calibrating a time-of-flight optical system
US10557921B2 (en) * 2017-01-23 2020-02-11 Microsoft Technology Licensing, Llc Active brightness-based strategy for invalidating pixels in time-of-flight depth-sensing
JP7133220B2 (en) * 2017-02-17 2022-09-08 北陽電機株式会社 object capture device
EP3644281A4 (en) * 2017-06-20 2021-04-28 Sony Interactive Entertainment Inc. Calibration device, calibration chart, chart pattern generation device, and calibration method
EP3418681B1 (en) * 2017-06-22 2020-06-17 Hexagon Technology Center GmbH Calibration of a triangulation sensor
TWI622960B (en) * 2017-11-10 2018-05-01 財團法人工業技術研究院 Calibration method of depth image acquiring device
CN112363150A (en) * 2018-08-22 2021-02-12 Oppo广东移动通信有限公司 Calibration method, calibration controller, electronic device and calibration system

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080205708A1 (en) * 2007-02-27 2008-08-28 Fujifilm Corporation Ranging apparatus and ranging method
CN106461763A (en) * 2014-06-09 2017-02-22 松下知识产权经营株式会社 Distance measuring device
CN105245796A (en) * 2014-07-01 2016-01-13 晶相光电股份有限公司 Sensor and sensing method
CN107229056A (en) * 2016-03-23 2017-10-03 松下知识产权经营株式会社 Image processing apparatus, image processing method and recording medium
CN107765260A (en) * 2016-08-22 2018-03-06 三星电子株式会社 For obtaining the method, equipment and computer readable recording medium storing program for performing of range information
CN108616726A (en) * 2016-12-21 2018-10-02 光宝电子(广州)有限公司 Exposal control method based on structure light and exposure-control device
CN108700664A (en) * 2017-02-06 2018-10-23 松下知识产权经营株式会社 Three-dimensional motion acquisition device and three-dimensional motion adquisitiones
CN108401098A (en) * 2018-05-15 2018-08-14 绍兴知威光电科技有限公司 A kind of TOF depth camera systems and its method for reducing external error

Also Published As

Publication number Publication date
TW202032154A (en) 2020-09-01
CN111580067A (en) 2020-08-25
CN111624612B (en) 2023-04-07
TWI741291B (en) 2021-10-01
CN111586306B (en) 2022-02-01
CN111580117A (en) 2020-08-25
TW202032155A (en) 2020-09-01
CN111624612A (en) 2020-09-04
CN111586306A (en) 2020-08-25
CN111586307B (en) 2021-11-02
CN111580067B (en) 2022-12-02
TWI696841B (en) 2020-06-21

Similar Documents

Publication Publication Date Title
CN111586307B (en) Exposure method and image sensing device using same
US20210181317A1 (en) Time-of-flight-based distance measurement system and method
US11769775B2 (en) Distance-measuring imaging device, distance measuring method of distance-measuring imaging device, and solid-state imaging device
CN109863418B (en) Interference handling in time-of-flight depth sensing
CN109100702B (en) Photoelectric sensor and method for measuring distance to object
JP6241793B2 (en) Three-dimensional measuring apparatus and three-dimensional measuring method
JP6863342B2 (en) Optical ranging device
US10616561B2 (en) Method and apparatus for generating a 3-D image
KR101737518B1 (en) Method and system for determining optimal exposure time and frequency of structured light based 3d camera
US9978148B2 (en) Motion sensor apparatus having a plurality of light sources
JP6709335B2 (en) Optical sensor, electronic device, arithmetic unit, and method for measuring distance between optical sensor and detection target
CN110520756B (en) Optical measuring device
CN110596727A (en) Distance measuring device for outputting precision information
US11223759B2 (en) Exposure method and image sensing device using the same
CN107615010B (en) Light receiving device, control method, and electronic apparatus
CN104219455A (en) Imaging apparatus, method of detecting flicker, and information processing unit
CN109547764B (en) Image depth sensing method and image depth sensing device
CN110895336B (en) Object detection device based on avalanche diode
CN109031333B (en) Distance measuring method and device, storage medium, and electronic device
JP2020160044A (en) Distance measuring device and distance measuring method
WO2020049126A1 (en) Time of flight apparatus and method
US11914039B2 (en) Range finding device and range finding method
KR20180096332A (en) Method for measuring distance
US20220003875A1 (en) Distance measurement imaging system, distance measurement imaging method, and non-transitory computer readable storage medium
JP2020051991A (en) Depth acquisition device, depth acquisition method, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant