CN108495115A - Image sensor, pixel group and pixel array thereof, and method for obtaining image information

Publication number: CN108495115A (granted as CN108495115B)
Application number: CN201810340537.0A
Authority: CN (China)
Inventors: 龚劲峰, 王永刚, 常建光
Original assignee: Huaian Imaging Device Manufacturer Corp
Current assignee: Huaian Xide Industrial Design Co ltd
Legal status: Active (granted)
Abstract

The present invention discloses an image sensor, a pixel group and pixel array thereof, and a method for obtaining image information. The pixel group of the image sensor includes: 2n photosensitive elements arranged in a matrix, a color filter corresponding to the 2n photosensitive elements, and several microlenses, wherein every two adjacent photosensitive elements in a row correspond to one microlens, and n is a natural number. The image sensor of the present invention can generate both a depth map and a color image, without requiring an additional sensor or complex precision optical elements, and without complicated image processing algorithms.

Description

Image sensor, pixel group and pixel array thereof, and method for obtaining image information
Technical field
The present invention belongs to the technical field of image sensors, and in particular relates to an image sensor, a pixel array and pixel group of an image sensor, a method for obtaining image information, and an electronic device.
Background technology
CMOS image sensors have been mass-produced and widely applied. Traditional image sensors generate two-dimensional (2D) images and video; recently, image sensors and systems capable of generating three-dimensional (3D) images have received significant attention. These 3D image sensors can be applied to face recognition, augmented reality (AR)/virtual reality (VR), unmanned aerial vehicles, and so on.
There are three main implementations of existing 3D image sensing: stereo binocular vision, structured light, and time of flight (TOF).
Stereo binocular vision uses two cameras and the triangulation principle to measure distance. It uses off-the-shelf color image sensors (such as RGB image sensors) and can generate a color image and a depth map simultaneously. However, this approach requires two cameras and complex image recognition algorithms, and the power consumption of image recognition and pattern recognition is inevitably high.
Structured light projects a pattern onto an object with a laser light source and captures the reflected image with an infrared camera; by processing the difference between the projected and reflected patterns, depth or distance information can be calculated. However, this approach requires a complex light source and image generation system, and it needs both an infrared sensor and a color image sensor to generate the color image and the depth map.
Time of flight uses specially designed pixels to measure distance by timing the flight and return of photons. It does not require additional complex algorithms, but current technology cannot yet generate depth maps of sufficient accuracy, and it also requires a color image sensor to generate the color image.
Therefore, most existing 3D imaging technologies require more than one sensor; some also require complex precision optical elements, and some require complex image processing algorithms.
Summary of the invention
The technical problem to be solved by the present invention is to provide an image sensor that can generate both a depth map and a color image.
To solve the above technical problem, the present invention provides a pixel group for an image sensor, including 2n photosensitive elements arranged in a matrix, a color filter corresponding to the 2n photosensitive elements, and several microlenses; every two adjacent photosensitive elements in a row correspond to one microlens, and n is a natural number.
Optionally, the 2n photosensitive elements are arranged in a square matrix.
Optionally, n=2 or 8.
To solve the above technical problem, the present invention also provides a pixel array for an image sensor, including several green pixel groups, several red pixel groups, and several blue pixel groups, each being a pixel group as described above. The color filter of each green pixel group is a green filter, the color filter of each red pixel group is a red filter, and the color filter of each blue pixel group is a blue filter; the green filters, red filters, and blue filters are arranged in a Bayer color filter array.
To solve the above technical problem, the present invention also provides an image sensor including the above pixel array and further including: a substrate layer in which the photosensitive elements are formed; a filter layer on the substrate layer, the filter layer including the green filters, red filters, and blue filters; and a lens layer on the filter layer, the lens layer including the microlenses.
Optionally, the image sensor further includes an interlayer dielectric layer between the substrate layer and the filter layer.
Optionally, the image sensor further includes a planarization layer between the filter layer and the lens layer.
To solve the above technical problem, the present invention also provides a method for obtaining image information using the above image sensor. Each row of the image sensor is divided into several subregions, each containing 2m+1 pixel pairs, where m is a natural number. Each pixel pair consists of a first photosensitive element and a second photosensitive element, which are two adjacent photosensitive elements under the same microlens.
The method for obtaining image information includes the following steps: for each subregion, determining the signal difference of the subregion based on the first signals and second signals of the pixel pairs in the subregion, wherein a first signal is obtained from a first photosensitive element, a second signal is obtained from a second photosensitive element, the signal difference of the subregion refers to the difference between the first signals and second signals of the pixel pairs in the subregion, and the signal difference is expressed in numbers of pixel pairs; obtaining, based on a correspondence between signal difference and object distance, or based on a correspondence between signal difference and image distance, the object distance corresponding to the signal difference of the subregion; and obtaining the depth information of the image based on the object distances corresponding to the signal differences of the subregions.
Optionally, determining the signal difference of the subregion based on the first signals and second signals of the pixel pairs in the subregion includes: according to D(x) = ΣP1(a)P2(a+x), calculating each value D(x) for x from -m to m, wherein the pixel pairs in the subregion are labeled successively from -m to m, P1(a) denotes the signal value of the first signal of the pixel pair labeled a in the subregion, -m ≤ a ≤ m, P2(a+x) denotes the signal value of the second signal of the pixel pair located x pixel pairs away from the pixel pair labeled a, and a and x are integers; and determining the x for which D(x) is maximal as the signal difference of the subregion.
Optionally, determining the signal difference of the subregion based on the first signals and second signals of the pixel pairs in the subregion includes: according to D(x) = ΣP1(a)P2(a+x), calculating each value D(x) for x from -m to m, wherein the pixel pairs in the subregion are labeled successively from -m to m, P1(a) denotes the signal value of the first signal of the pixel pair labeled a in the subregion, -m ≤ a ≤ m, P2(a+x) denotes the signal value of the second signal of the pixel pair located x pixel pairs away from the pixel pair labeled a, and a and x are integers; forming a curve function D(X) from the calculated D(x) values and the corresponding x; and determining the X for which D(X) is maximal as the signal difference of the subregion.
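A minimal sketch of the integer-shift variant of this computation, with illustrative names not taken from the patent; out-of-range second-signal samples are simply treated as zero here, whereas the detailed embodiment borrows them from adjacent subregions:

```python
def signal_difference(p1, p2, m):
    """Return the integer shift x in [-m, m] maximizing D(x) = sum_a P1(a)*P2(a+x).

    p1, p2: lists of length 2m+1 holding the first/second signals of the
    pixel pairs labeled -m..m (list index = label + m). Samples of p2 that
    fall outside the subregion are omitted in this simplified sketch."""
    n = 2 * m + 1
    assert len(p1) == n and len(p2) == n
    best_x, best_d = 0, float("-inf")
    for x in range(-m, m + 1):
        d = 0.0
        for a in range(n):        # a runs over list indices for labels -m..m
            j = a + x
            if 0 <= j < n:        # neighbor-subregion samples skipped here
                d += p1[a] * p2[j]
        if d > best_d:
            best_x, best_d = x, d
    return best_x
```

For example, a signal bump in p2 shifted right by two pixel pairs relative to p1 yields a signal difference of +2, while identical signals (accurate focus) yield 0.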
Optionally, obtaining the object distance corresponding to the signal difference of the subregion based on the correspondence between signal difference and image distance includes: determining, based on the correspondence between signal difference and image distance, the image distance corresponding to the signal difference of the subregion; and calculating the object distance corresponding to the signal difference of the subregion from the focal length of the camera lens and the image distance corresponding to the signal difference of the subregion.
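The last step is consistent with the thin-lens equation 1/f = 1/u + 1/v, with f the focal length, v the image distance, and u the object distance; the patent does not write out the formula, so the sketch below is an assumption rather than the claimed computation:

```python
def object_distance(f, v):
    """Thin-lens relation 1/f = 1/u + 1/v solved for the object distance u,
    given focal length f and image distance v (same length units). v would
    come from the calibrated signal-difference-to-image-distance table."""
    if v <= f:
        raise ValueError("image distance must exceed the focal length")
    return f * v / (v - f)
```

With the illustrative numbers f = 50 and v = 60 (same units), the object distance comes out to 300.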
Optionally, the correspondence between signal difference and image distance is obtained by calibrating the relative position of the camera lens and the image sensor.
Optionally, the correspondence between signal difference and object distance is obtained by calibrating the relative position of the object and the camera lens.
Optionally, the method for obtaining image information further includes: determining the color information of each pixel group based on the signals obtained by the photosensitive elements under the Bayer color filter array; and obtaining the color information of the image based on the color information of the pixel groups.
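One plausible reading of this claim is sketched below: the photodiode signals of each 2x2 pixel group are averaged into a single color sample per group, and the group samples then form a Bayer mosaic with one sample per color filter. Averaging is an assumption for illustration; the patent only states that each group's color information is determined from its photosensitive elements' signals.

```python
def group_color_samples(raw, n_side=2):
    """Average each n_side x n_side pixel group of the raw photodiode grid
    into one color sample per group (averaging is an illustrative choice,
    not specified by the patent)."""
    h, w = len(raw), len(raw[0])
    samples = []
    for r in range(0, h, n_side):
        row = []
        for c in range(0, w, n_side):
            total = sum(raw[r + i][c + j]
                        for i in range(n_side) for j in range(n_side))
            row.append(total / (n_side * n_side))
        samples.append(row)
    return samples
```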
To solve the above technical problem, the present invention also provides an electronic device including a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the program, implements the steps of the above method for obtaining image information.
Compared with the prior art, the present invention has the following beneficial effects. In each row of the image sensor, every two adjacent photosensitive elements share one microlens; when out of focus, the signal difference between two adjacent photosensitive elements can be used to obtain the depth information of the image region corresponding to those two elements relative to the in-focus region, thereby generating a depth map. Moreover, the color filters of the image sensor are arranged in a Bayer filter array, so the color information of the image can be obtained from the color information of adjacent pixel groups, thereby generating a color image. Therefore, the image sensor of the present invention can generate both a depth map and a color image, without requiring an additional sensor or complex precision optical elements, and without complicated image processing algorithms.
Description of the drawings
Fig. 1A and Fig. 1B are structural schematic diagrams of a Bayer filter array and its pixel units;
Fig. 2A to Fig. 2C are structural schematic diagrams of pixel groups of the image sensor of an embodiment of the present invention;
Fig. 3 is an example schematic of the pixel array of the image sensor of an embodiment of the present invention;
Fig. 4A to Fig. 4C are signal difference schematics for different imaging conditions in an embodiment of the present invention;
Fig. 5 is a flow diagram of the method for obtaining image information of an embodiment of the present invention;
Fig. 6A and Fig. 6B are example schematics of an image and the corresponding signal differences in an embodiment of the present invention;
Fig. 7 is a flow diagram of the method for obtaining image information of another embodiment of the present invention.
Detailed description
A Bayer color filter array (Bayer Color Filter Array, or Bayer array for short) is one of the main techniques enabling image sensors to capture color images. Taking RGB color images as an example, as shown in Figs. 1A and 1B (Fig. 1A is a top view; Fig. 1B is a sectional view along line A-A of Fig. 1A), a Bayer array is a 4x4 array composed of 8 green pixel units G, 4 blue pixel units B, and 4 red pixel units R. Each pixel unit includes one photosensitive element (usually a photodiode) 10, one color filter, and one microlens 13; the color filter of a green pixel unit G is a green filter g, the color filter of a blue pixel unit B is a blue filter b (not shown), and the color filter of a red pixel unit R is a red filter r. The Bayer filter array mimics the color sensitivity of the human eye, converting grayscale information into color information using a 1-red, 2-green, 1-blue arrangement. Since each pixel unit actually carries only one kind of color information, interpolation with a demosaicking algorithm is needed to obtain full color information and finally reconstruct the color image.
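As a minimal illustration of the interpolation just mentioned (a generic bilinear-demosaicking step, not an algorithm specified by the patent), the missing green value at a red or blue site of the mosaic can be estimated as the average of its four green neighbors:

```python
def interp_green(mosaic, r, c):
    """Estimate the missing green value at a red or blue site (r, c) of a
    Bayer mosaic as the average of the four green neighbors above, below,
    left, and right (interior sites only in this sketch)."""
    return (mosaic[r - 1][c] + mosaic[r + 1][c]
            + mosaic[r][c - 1] + mosaic[r][c + 1]) / 4
```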
A depth map (Depth Map) is a representation of three-dimensional scene information; the gray value of each pixel of the depth image can be used to characterize the distance from a certain point in the scene to the camera lens (i.e., the object distance).
To obtain both a color image and a depth image with the same image sensor, the inventors of the present invention, through research, propose a pixel group for an image sensor and a pixel array made up of such pixel groups.
The pixel group of the image sensor includes 2n photosensitive elements arranged in a matrix, a color filter corresponding to the 2n photosensitive elements, and several microlenses; every two adjacent photosensitive elements in a row correspond to one microlens, and n is a natural number.
The pixel array of the image sensor includes several green pixel groups, several red pixel groups, and several blue pixel groups. Each pixel group includes 2n photosensitive elements arranged in a matrix, a color filter corresponding to the 2n photosensitive elements, and several microlenses, with every two adjacent photosensitive elements in a row corresponding to one microlens. The color filter of each green pixel group is a green filter, that of each red pixel group is a red filter, and that of each blue pixel group is a blue filter; the green filters, red filters, and blue filters are arranged in a Bayer color filter array.
In the embodiment of the present invention, taking an RGB image sensor with n = 2 as an example, as shown in Figs. 2A to 2C (Fig. 2A is a top view, Fig. 2B is a sectional view along line A-A of Fig. 2A, and Fig. 2C is a sectional view along line C-C of Fig. 2A), the figures show 2 green pixel groups Gn, 1 red pixel group Rn, and 1 blue pixel group Bn.
In one example, each pixel group includes 4 photosensitive elements 20 arranged in a 2x2 matrix, 1 color filter (red filter r, green filter g, or blue filter b), and 1 microlens 23; in other words, the 4 photosensitive elements 20 share 1 color filter and 1 microlens 23.
In another example, each pixel group includes 4 photosensitive elements 20 arranged in a 2x2 matrix, 1 color filter (red filter r, green filter g, or blue filter b), and 2 microlenses 23 arranged in a 2x1 matrix; in other words, the 4 photosensitive elements share 1 color filter, and every 2 adjacent photosensitive elements 20 in a row share 1 microlens 23.
The green filters g of the green pixel groups Gn, the red filters r of the red pixel groups Rn, and the blue filters b of the blue pixel groups Bn are arranged in a Bayer color filter array, thereby forming the pixel array shown in Fig. 3; in each row of this pixel array, every two adjacent photosensitive elements share one microlens.
Accordingly, as shown in Figs. 2B and 2C, the image sensor of the embodiment of the present invention includes: a substrate layer in which several photosensitive elements 20 are formed in a matrix; a filter layer on the substrate layer, including several green filters g, red filters r, and blue filters b arranged in a Bayer color filter array, each filter covering four photosensitive elements 20 arranged in a 2x2 matrix in the substrate layer; and a lens layer on the filter layer, including several microlenses 23. The lens layer can be implemented in two ways. In one implementation, each microlens 23 covers one color filter (a red filter r, green filter g, or blue filter b) and the corresponding four photosensitive elements 20 arranged in a 2x2 matrix in the substrate layer; that is, below each microlens 23 there are 1 color filter and 4 photosensitive elements 20. In the other implementation, every two microlenses 23 are arranged in a 2x1 matrix and cover one color filter and the corresponding four photosensitive elements 20 arranged in a 2x2 matrix in the substrate layer; that is, below each color filter there are 4 photosensitive elements 20, above each color filter there are 2 microlenses 23, and each microlens 23 covers 2 adjacent photosensitive elements 20 in the same row.
Further, the image sensor may also include an interlayer dielectric layer 21 between the substrate layer and the filter layer. The interlayer dielectric layer 21 is used for isolation between film layers or for metal interconnection, and may be configured as a single-layer or multi-layer structure according to actual demand. In addition, since filters of different colors have different thicknesses, the image sensor may also include a planarization layer 22 between the filter layer and the lens layer, so that the lens layer can be formed on a flat surface.
It should be noted that the embodiment of the present invention takes an RGB color image sensor as an example, in which the color filters may be red filters, green filters, and blue filters. For image sensors of other color types, the color filters may be configured for the corresponding colors according to the practical application or demand.
In addition, depending on the required image resolution, the pixel array of the image sensor of the embodiment includes a great number of the arrays shown in Fig. 3; compared with the traditional structure, the image resolution is one quarter of the traditional one. Since existing sensors have reached levels above ten million pixels, with single pixels (pixel units) already close to or below one micron, the image sensor of the embodiment is still much smaller than most existing 3D image sensors (whose single pixels exceed ten microns). Moreover, for most 3D applications the required image resolution is between hundreds of thousands and a million, so the resolution of the image sensor of the embodiment is still sufficient.
The inventors found that after two adjacent photosensitive elements share a color filter and a microlens, different photosensitive elements in a row, under the same color filter and the same microlens, can obtain different signals. Compare Fig. 1B with Fig. 2B. In the traditional structure shown in Fig. 1B, with identical incident light propagation paths, two adjacent photosensitive elements receive identical optical signals. In the structure of this embodiment shown in Fig. 2B, with identical incident light propagation paths, the photosensitive elements at the p1 positions of adjacent microlenses receive identical optical signals, but the two adjacent photosensitive elements under the same microlens (i.e., positions p1 and p2) receive different optical signals; for example, in Fig. 2B the photosensitive element at position p1 receives an optical signal while the photosensitive element at position p2 does not. Therefore, after the photosensitive elements convert the optical signals into electrical signals, the signal read from the photosensitive element at position p1 and the signal read from the photosensitive element at position p2 will differ, and this difference manifests as one signal leading or lagging the other.
Specifically, referring to Figs. 4A to 4C, after incident light with different propagation paths is focused by the camera lens L: if the focal point is too close to the camera lens L (focused too close), the imaging plane of the object OB falls in front of the plane of the image sensor S (in other words, the image sensor S is too far from the camera lens L), and the signal read from the photosensitive element at position p1 lags the signal read from the photosensitive element at position p2; as shown in Fig. 4A, the signal edge of p1 lags behind the signal edge of p2. If focusing is accurate, the imaging plane of the object OB falls on the plane of the image sensor S, and the signal read from the photosensitive element at position p1 essentially coincides with the signal read from the photosensitive element at position p2, as shown in Fig. 4B. If the focal point is too far from the camera lens L (focused too far away), the imaging plane of the object OB falls behind the plane of the image sensor S (in other words, the image sensor S is too close to the camera lens L), and the signal read from the photosensitive element at position p1 leads the signal read from the photosensitive element at position p2; as shown in Fig. 4C, the signal edge of p1 is ahead of the signal edge of p2. It can be seen that, in a row, the signal difference between two adjacent photosensitive elements under the same color filter and the same microlens is correlated with the object distance (the distance between the object and the camera lens) and the image distance (the distance between the image sensor and the camera lens).
Based on the above principle, it will be understood that the pixel group of the image sensor of the present invention is not limited to the 4 photosensitive elements of the above embodiment; that is, a pixel group may include an even number (2n) of photosensitive elements, at least one color filter, and n microlenses, as long as every two adjacent photosensitive elements in each row correspond to (share) one microlens. The color filter may be one filter or several filters of the same color, with multiple same-color filters arranged in an array.
Further, considering that images may need to be scaled proportionally in length and width, the 2n photosensitive elements may be arranged in a square matrix; for example, 4 photosensitive elements in a 2x2 matrix, 16 photosensitive elements in a 4x4 matrix, 36 photosensitive elements in a 6x6 matrix, and so on. It will be appreciated that the more photosensitive elements a pixel group has, the lower the resolution; therefore, to ensure high resolution, n = 2 or n = 8 is usually taken.
Using the above image sensor, the method for obtaining image information of the embodiment of the present invention, as shown in Fig. 5, includes the following steps:
Step S11: for each subregion, determining the signal difference of the subregion based on the first signals and second signals of the pixel pairs in the subregion;
Step S12a: obtaining the object distance corresponding to the signal difference of the subregion based on the correspondence between signal difference and object distance;
Step S13: obtaining the depth information of the image based on the object distances corresponding to the signal differences of the subregions.
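The steps S12a and S13 above amount to looking up each subregion's signal difference in a calibrated table and assembling the resulting distances into the depth map. A minimal sketch, with a wholly invented calibration table (per the claims, the real table would come from calibrating the relative position of object and camera lens):

```python
# Assumed calibration: signal difference (in pixel pairs) -> object distance (mm).
# These values are illustrative only.
CALIBRATION = {-2: 250.0, -1: 400.0, 0: 600.0, 1: 900.0, 2: 1500.0}

def depth_row(signal_differences, calibration=CALIBRATION):
    """Map each subregion's signal difference to an object distance,
    yielding one row of the depth map."""
    return [calibration[d] for d in signal_differences]
```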
Each step is described in detail below. Each row of the image sensor is divided into several subregions, each containing 2m+1 pixel pairs, where m is a natural number. Each pixel pair consists of a first photosensitive element and a second photosensitive element, which are two adjacent photosensitive elements under the same microlens; the pixel pairs in each subregion are labeled from -m to m, from left to right.
Step S11: for each subregion, determine the signal difference of the subregion based on the first signals and second signals of the pixel pairs in the subregion. As shown in Fig. 2B, the first photosensitive element is the photosensitive element at position p1 and the second photosensitive element is the photosensitive element at position p2; for convenience, the first photosensitive element is denoted p1 and the second photosensitive element p2 below. The first signal is obtained from the first photosensitive element p1, and the second signal from the second photosensitive element p2. The signal difference of the subregion refers to the difference between the first signals and second signals of the pixel pairs in the subregion and is expressed in numbers of pixel pairs.
Specifically, determining the signal difference of the subregion based on the first signals and second signals of the pixel pairs in the subregion includes: according to D(x) = ΣP1(a)P2(a+x), calculating each value D(x) for x from -m to +m, where the +/- sign before m indicates the lead or lag of the signal; and determining the x (an integer) for which D(x) is maximal as the signal difference. Here the pixel pairs in the subregion are labeled successively from -m to m, P1 denotes the signal value of a first signal and P2 that of a second signal, P1(a) denotes the signal value of the first signal of the pixel pair labeled a in the subregion, -m ≤ a ≤ m, and P2(a+x) denotes the signal value of the second signal of the pixel pair located x pixel pairs away from the pixel pair labeled a. This calculation may require the signals of pixel pairs in adjacent subregions. When x is negative, it may use the signals of pixel pairs in the subregion adjacent on the left (adjacent to the first photosensitive element of the pixel pair labeled -m in this subregion); when x is positive, it may use the signals of pixel pairs in the subregion adjacent on the right (adjacent to the second photosensitive element of the pixel pair labeled m in this subregion). When -m ≤ a+x ≤ m, P2(a+x) denotes the signal value of the second signal of the pixel pair labeled a+x in this subregion; when a+x < -m, P2(a+x) is actually the signal value of the second signal of the pixel pair labeled 2m+1+a+x in the subregion adjacent on the left; when a+x > m, P2(a+x) is actually the signal value of the second signal of the pixel pair labeled a+x-(2m+1) in the subregion adjacent on the right.
As mentioned above, in a row, the signal difference between two adjacent photosensitive elements under the same color filter and the same microlens is correlated with the object distance (the distance between the object and the camera lens) and the image distance (the distance between the image sensor and the camera lens). With accurate focusing, the first signal and second signal of a pixel pair should coincide. When out of focus, the first signal and second signal differ according to the severity of the defocus. In 3D ranging applications, one usually focuses on some part of the image (so-called point focusing or region focusing), and the other parts of the image, being too far or too close, appear out of focus; the pixel pairs corresponding to those parts then exhibit a signal difference. This signal difference, also called the relative displacement or phase difference between the first signal and the second signal, can be determined by a correlation algorithm, with the relative displacement information expressed in numbers of pixel pairs.
Take m = 5 as an example. Each row of the image sensor is divided into several subregions, each containing 11 pixel pairs labeled -5, -4, -3, -2, -1, 0, 1, 2, 3, 4, 5 from left to right. In the in-focus case, the first signal and second signal of an identically labeled pixel pair should be identical. In the out-of-focus case, for one subregion, the sums of products are calculated as follows.
For a relative displacement of +5 pixel pairs:
D(+5) = ΣP1(a)P2(a+5) = P1(-5)*P2(0) + P1(-4)*P2(1) + P1(-3)*P2(2) + P1(-2)*P2(3) + P1(-1)*P2(4) + P1(0)*P2(5) + P1(1)*P2(6) + P1(2)*P2(7) + P1(3)*P2(8) + P1(4)*P2(9) + P1(5)*P2(10), where P2(6), P2(7), P2(8), P2(9), P2(10) are, in order, P2(-5), P2(-4), P2(-3), P2(-2), P2(-1) of the subregion adjacent on the right.
For a relative displacement of +4 pixel pairs:
D(+4) = ΣP1(a)P2(a+4) = P1(-5)*P2(-1) + P1(-4)*P2(0) + P1(-3)*P2(1) + P1(-2)*P2(2) + P1(-1)*P2(3) + P1(0)*P2(4) + P1(1)*P2(5) + P1(2)*P2(6) + P1(3)*P2(7) + P1(4)*P2(8) + P1(5)*P2(9), where P2(6), P2(7), P2(8), P2(9) are, in order, P2(-5), P2(-4), P2(-3), P2(-2) of the subregion adjacent on the right.
For a relative displacement of +3 pixel pairs:
D(+3) = ΣP1(a)P2(a+3) = P1(-5)*P2(-2) + P1(-4)*P2(-1) + P1(-3)*P2(0) + P1(-2)*P2(1) + P1(-1)*P2(2) + P1(0)*P2(3) + P1(1)*P2(4) + P1(2)*P2(5) + P1(3)*P2(6) + P1(4)*P2(7) + P1(5)*P2(8), where P2(6), P2(7), P2(8) are, in order, P2(-5), P2(-4), P2(-3) of the subregion adjacent on the right.
For a relative displacement of +2 pixel pairs:
D(+2) = ΣP1(a)P2(a+2) = P1(-5)*P2(-3) + P1(-4)*P2(-2) + P1(-3)*P2(-1) + P1(-2)*P2(0) + P1(-1)*P2(1) + P1(0)*P2(2) + P1(1)*P2(3) + P1(2)*P2(4) + P1(3)*P2(5) + P1(4)*P2(6) + P1(5)*P2(7), where P2(6), P2(7) are, in order, P2(-5), P2(-4) of the subregion adjacent on the right.
For a relative displacement of +1 pixel pair:
D(+1) = ΣP1(a)P2(a+1) = P1(-5)*P2(-4) + P1(-4)*P2(-3) + P1(-3)*P2(-2) + P1(-2)*P2(-1) + P1(-1)*P2(0) + P1(0)*P2(1) + P1(1)*P2(2) + P1(2)*P2(3) + P1(3)*P2(4) + P1(4)*P2(5) + P1(5)*P2(6), where P2(6) is P2(-5) of the subregion adjacent on the right.
For a relative displacement of 0 pixel pairs:
D(0) = ΣP1(a)P2(a) = P1(-5)*P2(-5) + P1(-4)*P2(-4) + P1(-3)*P2(-3) + P1(-2)*P2(-2) + P1(-1)*P2(-1) + P1(0)*P2(0) + P1(1)*P2(1) + P1(2)*P2(2) + P1(3)*P2(3) + P1(4)*P2(4) + P1(5)*P2(5).
For a relative displacement of -1 pixel pair:
D(-1) = ΣP1(a)P2(a-1) = P1(5)*P2(4) + P1(4)*P2(3) + P1(3)*P2(2) + P1(2)*P2(1) + P1(1)*P2(0) + P1(0)*P2(-1) + P1(-1)*P2(-2) + P1(-2)*P2(-3) + P1(-3)*P2(-4) + P1(-4)*P2(-5) + P1(-5)*P2(-6), where P2(-6) is P2(5) of the subregion adjacent on the left.
For a relative displacement of -2 pixel pairs:
D(-2) = ΣP1(a)P2(a-2) = P1(5)*P2(3) + P1(4)*P2(2) + P1(3)*P2(1) + P1(2)*P2(0) + P1(1)*P2(-1) + P1(0)*P2(-2) + P1(-1)*P2(-3) + P1(-2)*P2(-4) + P1(-3)*P2(-5) + P1(-4)*P2(-6) + P1(-5)*P2(-7), where P2(-6), P2(-7) are, in order, P2(5), P2(4) of the subregion adjacent on the left.
For a relative displacement of -3 pixel pairs:
D(-3) = ΣP1(a)P2(a-3) = P1(5)*P2(2) + P1(4)*P2(1) + P1(3)*P2(0) + P1(2)*P2(-1) + P1(1)*P2(-2) + P1(0)*P2(-3) + P1(-1)*P2(-4) + P1(-2)*P2(-5) + P1(-3)*P2(-6) + P1(-4)*P2(-7) + P1(-5)*P2(-8), where P2(-6), P2(-7), P2(-8) are, in order, P2(5), P2(4), P2(3) of the subregion adjacent on the left.
For a relative displacement of -4 pixel pairs:
D(-4) = ΣP1(a)P2(a-4) = P1(5)*P2(1) + P1(4)*P2(0) + P1(3)*P2(-1) + P1(2)*P2(-2) + P1(1)*P2(-3) + P1(0)*P2(-4) + P1(-1)*P2(-5) + P1(-2)*P2(-6) + P1(-3)*P2(-7) + P1(-4)*P2(-8) + P1(-5)*P2(-9), where P2(-6), P2(-7), P2(-8), P2(-9) are, in order, P2(5), P2(4), P2(3), P2(2) of the subregion adjacent on the left.
Calculate relative displacement be -5 pairs of pixels pair multiplier and:D (- 5)=∑ P1 (a) P2 (a-5)=P1 (5) * P2 (0)+ P1(4)*P2(-1)+P1(3)*P2(-2)+P1(2)*P2(-3)+P1(1)*P2(-4)+P1(0)*P2(-5)+P1(1)*P2(-6) +P1(2)*P2(-7)+P1(3)*P2(-8)+P1(4)*P2(-9)+P1(5)*P2(-10).Wherein, P2 (- 6), P2 (- 7), P2 (- 8), P2 (- 9), P2 (- 10) are followed successively by P2 (5), P2 (4), P2 (3), P2 (2), the P2 of the adjacent subregion in the subregion left side (1).From obtained each D (x) value calculated above, determine that corresponding x is the signal difference of the subregion when D (x) value maximums, it can It can be overlapped with another signal to rear with being interpreted as one of signal movement how many pixel.
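The product-sum search described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the function name, the flat-list signal layout, and the assumption that the neighboring subregions' values are available in the same arrays are all choices made here for clarity.

```python
def signal_difference(p1, p2, start, m=5):
    """Return the integer displacement x in [-m, m] maximizing
    D(x) = sum over a of P1(a) * P2(a + x).

    p1, p2 : first/second signal values of one sensor row (flat lists);
             out-of-range labels are served by the neighboring subregions.
    start  : index of the pixel pair labeled -m in the subregion of interest.
    """
    n = 2 * m + 1  # pixel pairs per subregion, e.g. 11 for m = 5
    best_x, best_d = 0, float("-inf")
    for x in range(-m, m + 1):
        # Label a corresponds to index start + (a + m); shifting by x walks
        # into the left/right neighboring subregion exactly as in the text.
        d = sum(p1[start + i] * p2[start + i + x] for i in range(n))
        if d > best_d:
            best_x, best_d = x, d
    return best_x
```

With an in-focus row (p2 identical to p1) the maximum falls at x = 0; when the second signal is a shifted copy of the first, the returned x equals the shift.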
Further, the signal difference calculated in the manner above is an integer, but in practical applications the signal difference corresponding to the true maximum may lie between two adjacent integers. To obtain a more accurate signal difference, a curve function D(X) can therefore first be formed from the calculated D(x) values and the corresponding x, and the X corresponding to the maximum of D(X) is then determined to be the signal difference of the subregion. Specifically, represented as a curve in a coordinate diagram, with the X-axis denoting the relative displacement x and the Y-axis the product sum D(x) for that displacement, an existing algorithm such as interpolation can connect the calculated discrete points D(x) into a curve described by the curve function D(X). The X value at the peak of the curve, i.e. at the maximum of D(X), is then the actual relative displacement between the first signal and the second signal of the pixel pairs of the subregion, i.e. the signal difference of the subregion; X may or may not be an integer.
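The sub-integer refinement can be sketched with a three-point parabolic fit, one simple stand-in for the interpolation the text leaves open (the function name and the choice of a parabola are assumptions made here, not taken from the patent):

```python
def refine_peak(d_values, xs):
    """Given d_values[i] = D(xs[i]) at consecutive integer displacements,
    return the real-valued X at the fitted peak of the curve D(X)."""
    i = max(range(len(d_values)), key=d_values.__getitem__)
    if i == 0 or i == len(d_values) - 1:
        return float(xs[i])  # peak at the boundary: no neighbors to fit
    dl, dc, dr = d_values[i - 1], d_values[i], d_values[i + 1]
    denom = dl - 2 * dc + dr
    if denom == 0:
        return float(xs[i])  # three collinear points: keep the integer peak
    # Vertex of the parabola through (-1, dl), (0, dc), (+1, dr),
    # shifted back to the x-axis of the coordinate diagram.
    return xs[i] + 0.5 * (dl - dr) / denom
```

A symmetric peak returns the integer displacement unchanged, while an asymmetric one (e.g. neighbors 1 and 3 around a maximum of 4) lands between two integers, matching the observation that X need not be an integer.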
In a similar way, the actual relative displacement between the first signal and the second signal of the pixel pairs in the other subregions can be obtained. It should be noted that the relative displacement may exceed the width of a single subregion; the value of m determines the measurable range of relative displacement, and thus determines the resolution of the depth map and the maximum measurement range.
Step S12a: based on the correspondence between signal difference and object distance, obtain the object distance corresponding to the signal difference of the subregion.
The correspondence between signal difference and object distance may be stored in advance in a readable memory; after the signal difference of a subregion is obtained through step S11, the object-distance value corresponding to that signal difference can be obtained by reading the readable memory.
In the present embodiment, the correspondence between signal difference and object distance is obtained by calibrating the relative position of an object and the camera lens. Specifically, during calibration the camera lens and the image sensor remain stationary, and a 2D plan view with a specific pattern (typically stripes) is placed at positions at different distances from the camera lens, while the difference (i.e. the relative displacement) between the first signal and the second signal in a subregion is measured. Referring to Fig. 6A and Fig. 6B, the left side is the image and the right side is the corresponding signal plot. When focus is accurate, the image is clear and the first signal and the second signal essentially coincide, as shown in Fig. 6A; when out of focus, the image is blurred and there is a relative displacement between the first signal and the second signal, as shown in Fig. 6B, which is obtained using the product-sum calculation described above. For example, suppose the picture is placed 50 centimeters from the camera lens and the measured relative displacement is 5 pixel pairs; if 5 pixel pairs correspond to a distance of 5 millimeters in the image, the calibration yields a correspondence between object distance and relative displacement of 50 centimeters / 5 millimeters. In practical application, if the relative displacement of a subregion of the image obtained through step S11 is 5 pixel pairs (i.e. 5 millimeters), step S12a determines that this subregion has a depth of 50 centimeters relative to the focus area. The representation of the correspondence between signal difference and object distance is not limited to a curve in a coordinate diagram; it can also be expressed in other forms such as an equation or a table.
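The stored correspondence of step S12a could take the form of a small calibration table with interpolation between calibrated points. The sample displacement/distance values below are illustrative only, not taken from the patent, and the function and table names are assumptions made here:

```python
# Calibrated (relative displacement in pixel pairs, object distance in cm) points.
CALIBRATION = [(1.0, 250.0), (2.0, 125.0), (5.0, 50.0)]

def object_distance(displacement):
    """Look up the object distance for a measured relative displacement,
    interpolating linearly between calibrated points and clamping at the ends."""
    pts = sorted(CALIBRATION)
    if displacement <= pts[0][0]:
        return pts[0][1]
    for (x0, u0), (x1, u1) in zip(pts, pts[1:]):
        if displacement <= x1:
            t = (displacement - x0) / (x1 - x0)
            return u0 + t * (u1 - u0)
    return pts[-1][1]
```

This realizes the remark that the correspondence need not be a curve in a coordinate diagram: a table plus interpolation serves equally well.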
Step S13: based on the object distance corresponding to the signal difference of each subregion, obtain the depth information of the image. Through the steps above, the object distance (i.e. depth) corresponding to the signal difference of every subregion of the image can be obtained, and thus the depth information of the whole image; the depth corresponding to each subregion can be represented by a gray value, from which a depth map can be generated.
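The gray-value representation of step S13 can be sketched as a simple normalization of the per-subregion distances to an 8-bit range. The patent only says depth is represented by a gray value; the normalization direction and the function name here are assumptions:

```python
def depth_map(distances):
    """distances: 2D list of per-subregion object distances.
    Returns a 2D list of gray values in 0..255 (nearest = 0, farthest = 255)."""
    flat = [d for row in distances for d in row]
    lo, hi = min(flat), max(flat)
    span = (hi - lo) or 1  # avoid division by zero for a flat scene
    return [[round(255 * (d - lo) / span) for d in row] for row in distances]
```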
Further, as shown in Fig. 5, the method of obtaining image information of the present embodiment further includes:
Step S14: determine the color information of each pixel group based on the signals obtained by the photosensitive elements of the Bayer color filter array. Specifically, for each pixel-group region, the color information of the color filter corresponding to that region can be determined, for example, by calculating a signal mean, and the other color information can be obtained, for example, by interpolation. For example, for a green pixel group, the green information of the group is determined from the average of the signals of the photosensitive elements in the group, and interpolation is performed on the blue information of the adjacent blue pixel groups and the red information of the adjacent red pixel groups to obtain the blue information and red information of the green pixel group. This algorithm is similar to the algorithm for obtaining the color information of the pixels of an existing Bayer array and is not elaborated here.
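For one green pixel group, step S14 can be sketched as below. Plain averaging of the neighboring groups stands in for the interpolation the text references, and the function name and dictionary keys are choices made here, not the patent's:

```python
def group_color(green_signals, neighbor_blues, neighbor_reds):
    """Color of one green pixel group: its own green from the mean of its
    photosensitive-element signals; blue and red interpolated (here: averaged)
    from the adjacent blue and red pixel groups' values."""
    mean = lambda v: sum(v) / len(v)
    return {
        "g": mean(green_signals),   # average over the 2n elements of this group
        "b": mean(neighbor_blues),  # from adjacent blue pixel groups
        "r": mean(neighbor_reds),   # from adjacent red pixel groups
    }
```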
Step S15: based on the color information of the pixel groups, obtain the color information of the image. Through step S14 the color information of each pixel-group region is obtained, and thus the color information of the whole image, from which a color image can be generated.
Using the image sensor described above, the method of obtaining image information of another embodiment of the present invention, shown in Fig. 7, includes the following steps:
Step S11: for each subregion, determine the signal difference of the subregion based on the first signals and second signals of the pixel pairs in the subregion;
Step S12b: based on the correspondence between signal difference and image distance, obtain the object distance corresponding to the signal difference of the subregion;
Step S13: based on the object distance corresponding to the signal difference of each subregion, obtain the depth information of the image;
Step S14: determine the color information of each pixel group based on the signals obtained by the photosensitive elements of the Bayer color filter array;
Step S15: based on the color information of the pixel groups, obtain the color information of the image.
Step S12b of the embodiment shown in Fig. 7 differs from step S12a of the embodiment shown in Fig. 5. Specifically, obtaining the object distance corresponding to the signal difference of the subregion based on the correspondence between signal difference and image distance may include: determining, based on that correspondence, the image distance corresponding to the signal difference of the subregion; and calculating, from the focal length of the camera lens and that image distance, the object distance corresponding to the signal difference of the subregion. With the focal length f and the image distance v known, the object distance u can be obtained according to the basic optical principle 1/f = 1/u + 1/v.
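The thin-lens relation used in step S12b reduces to a one-line solve for u (the function name is an assumption; the formula is the document's own):

```python
def object_distance_from_image_distance(f_mm, v_mm):
    """Solve 1/f = 1/u + 1/v for the object distance u.
    f_mm: focal length; v_mm: image distance (both in millimeters, v > f)."""
    return 1.0 / (1.0 / f_mm - 1.0 / v_mm)
```

For instance, with f = 20 mm and a calibrated image distance v = 30 mm, the object distance comes out to 60 mm, consistent with 1/20 = 1/60 + 1/30.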
In the present embodiment, the correspondence between signal difference and image distance is obtained by calibrating the relative position of the camera lens and the image sensor. Specifically, during calibration a 2D plan view with a specific pattern (typically stripes) is placed at a fixed distance from the camera lens, the camera lens is moved to positions at different distances from the image sensor, and the difference (i.e. the relative displacement) between the first signal and the second signal in a subregion is measured. When focus is accurate, the image is clear and the first signal and the second signal essentially coincide; when out of focus, the image is blurred and there is a relative displacement between the first signal and the second signal, which is measured using the product-sum calculation described above. For example, suppose the camera lens is 30 millimeters from the image sensor and the measured relative displacement is 5 pixel pairs; if 5 pixel pairs correspond to a distance of 5 millimeters in the image, the calibration yields a correspondence between image distance and relative displacement of 30 millimeters / 5 millimeters. In practical application, if the relative displacement of a subregion of the image obtained through step S11 is 5 pixel pairs (i.e. 5 millimeters), step S12b determines that the image distance corresponding to the signal difference of this subregion is 30 millimeters; since the focal length of the camera lens is fixed, the depth of this subregion relative to the focus area can then be calculated from 1/f = 1/u + 1/v. The representation of the correspondence between signal difference and image distance is not limited to a curve in a coordinate diagram; it can also be expressed in other forms such as an equation or a table.
The other steps of this embodiment are implemented in the same way as the corresponding steps of the embodiment shown in Fig. 5 and are not described again here.
Although the present invention is disclosed above by way of preferred embodiments, they are not intended to limit the present invention. Any person skilled in the art may, without departing from the spirit and scope of the present invention, make possible variations and modifications to the technical solution of the present invention using the methods and technical content disclosed above. Therefore, any simple modification, equivalent change, or refinement made to the above embodiments according to the technical essence of the present invention, without departing from the content of the technical solution of the present invention, falls within the protection scope of the technical solution of the present invention.

Claims (15)

1. A pixel group of an image sensor, characterized by comprising 2n photosensitive elements arranged in a matrix, a color filter corresponding to the 2n photosensitive elements, and several microlenses; wherein every two adjacent photosensitive elements in a row correspond to one microlens, and n is a natural number.
2. The pixel group of an image sensor according to claim 1, characterized in that the 2n photosensitive elements are arranged in a square matrix.
3. The pixel group of an image sensor according to claim 1 or 2, characterized in that n = 2 or 8.
4. A pixel array of an image sensor, characterized by comprising several green pixel groups, several red pixel groups, and several blue pixel groups; wherein each pixel group is the pixel group according to any one of claims 1 to 3, the color filter of the green pixel groups is a green filter, the color filter of the red pixel groups is a red filter, and the color filter of the blue pixel groups is a blue filter; and the several green filters, several red filters, and several blue filters are arranged in a Bayer color filter array.
5. An image sensor, characterized by comprising the pixel array according to claim 4, and further comprising:
a substrate layer, in which the photosensitive elements are formed;
a filter layer on the substrate layer, the filter layer comprising the green filters, the red filters, and the blue filters; and
a lens layer on the filter layer, the lens layer comprising the microlenses.
6. The image sensor according to claim 5, characterized by further comprising an interlayer dielectric layer between the substrate layer and the filter layer.
7. The image sensor according to claim 5 or 6, characterized by further comprising a planarization layer between the filter layer and the lens layer.
8. A method of obtaining image information using the image sensor according to any one of claims 5 to 7, characterized in that each row of the image sensor is divided into several subregions, each subregion comprising 2m+1 pixel pairs, where m is a natural number; each pixel pair comprises a first photosensitive element and a second photosensitive element, the first photosensitive element and the second photosensitive element of a pixel pair being two adjacent photosensitive elements under the same microlens;
the method of obtaining image information comprising the following steps:
for each subregion, determining the signal difference of the subregion based on the first signals and second signals of the pixel pairs in the subregion, wherein a first signal is obtained from a first photosensitive element and a second signal is obtained from a second photosensitive element, the signal difference of a subregion refers to the difference between the first signals and second signals of the pixel pairs in the subregion, and the signal difference is related to a number of pixel pairs;
obtaining the object distance corresponding to the signal difference of the subregion based on a correspondence between signal difference and object distance, or based on a correspondence between signal difference and image distance; and
obtaining the depth information of the image based on the object distances corresponding to the signal differences of the subregions.
9. The method of obtaining image information according to claim 8, characterized in that determining the signal difference of the subregion based on the first signals and second signals of the pixel pairs in the subregion comprises:
calculating, according to D(x) = ∑P1(a)P2(a+x), the D(x) value for each x from -m to m, wherein the pixel pairs in the subregion are labeled -m to m in order, P1(a) denotes the signal value of the first signal of the pixel pair labeled a, -m ≤ a ≤ m, P2(a+x) denotes the signal value of the second signal of the pixel pair that is x pixel pairs away from the pixel pair labeled a, and a and x are integers; and
determining the x corresponding to the maximum D(x) value as the signal difference of the subregion.
10. The method of obtaining image information according to claim 8, characterized in that determining the signal difference of the subregion based on the first signals and second signals of the pixel pairs in the subregion comprises:
calculating, according to D(x) = ∑P1(a)P2(a+x), the D(x) value for each x from -m to m, wherein the pixel pairs in the subregion are labeled -m to m in order, P1(a) denotes the signal value of the first signal of the pixel pair labeled a, -m ≤ a ≤ m, P2(a+x) denotes the signal value of the second signal of the pixel pair that is x pixel pairs away from the pixel pair labeled a, and a and x are integers;
forming a curve function D(X) from the calculated D(x) values and the corresponding x; and
determining the X corresponding to the maximum D(X) value as the signal difference of the subregion.
11. The method of obtaining image information according to claim 8, characterized in that obtaining the object distance corresponding to the signal difference of the subregion based on the correspondence between signal difference and image distance comprises:
determining, based on the correspondence between signal difference and image distance, the image distance corresponding to the signal difference of the subregion; and
calculating, from the focal length of the camera lens and the image distance corresponding to the signal difference of the subregion, the object distance corresponding to the signal difference of the subregion.
12. The method of obtaining image information according to claim 8, characterized in that the correspondence between signal difference and image distance is obtained by calibrating the relative position of the camera lens and the image sensor.
13. The method of obtaining image information according to claim 8, characterized in that the correspondence between signal difference and object distance is obtained by calibrating the relative position of an object and the camera lens.
14. The method of obtaining image information according to any one of claims 8 to 13, characterized by further comprising: determining the color information of each pixel group based on the signals obtained by the photosensitive elements of the Bayer color filter array; and obtaining the color information of the image based on the color information of the pixel groups.
15. An electronic device, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, characterized in that the processor, when executing the program, implements the steps of the method of obtaining image information according to any one of claims 8 to 14.
CN201810340537.0A 2018-04-17 2018-04-17 Imaging sensor and its pixel group and pixel array, the method for obtaining image information Active CN108495115B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810340537.0A CN108495115B (en) 2018-04-17 2018-04-17 Imaging sensor and its pixel group and pixel array, the method for obtaining image information


Publications (2)

Publication Number Publication Date
CN108495115A true CN108495115A (en) 2018-09-04
CN108495115B CN108495115B (en) 2019-09-10

Family

ID=63316109

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810340537.0A Active CN108495115B (en) 2018-04-17 2018-04-17 Imaging sensor and its pixel group and pixel array, the method for obtaining image information

Country Status (1)

Country Link
CN (1) CN108495115B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111353405A (en) * 2019-07-17 2020-06-30 上海思立微电子科技有限公司 Fingerprint identification device, fingerprint identification system and electronic equipment
CN113271395A (en) * 2020-02-14 2021-08-17 戴克斯莱恩有限公司 Camera system adopting complementary Pixlet structure
CN114125243A (en) * 2021-11-30 2022-03-01 维沃移动通信有限公司 Image sensor, camera module, electronic equipment and pixel information acquisition method
WO2022083234A1 (en) * 2020-10-22 2022-04-28 Oppo广东移动通信有限公司 Image sensor, camera assembly and electronic device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105191285A (en) * 2013-06-04 2015-12-23 索尼公司 Solid-state imaging device, electronic apparatus, lens control method, and imaging module
CN106210497A (en) * 2015-05-07 2016-12-07 原相科技股份有限公司 Object distance calculating method and object distance calculate device
CN106352847A (en) * 2015-07-14 2017-01-25 原相科技股份有限公司 Phase difference based distance measurement device and distance measurement method
CN107222734A (en) * 2017-06-30 2017-09-29 联想(北京)有限公司 A kind of image collecting device and electronic equipment
CN206727072U (en) * 2016-05-19 2017-12-08 半导体元件工业有限责任公司 Imaging system with global shutter phase-detection pixel




Similar Documents

Publication Publication Date Title
CN108495115B (en) Imaging sensor and its pixel group and pixel array, the method for obtaining image information
CN105210361B (en) Full light imaging apparatus
Manakov et al. A reconfigurable camera add-on for high dynamic range, multispectral, polarization, and light-field imaging
CN102385168B (en) Picture element matrix is divided into stereoscopic imaging method and the system of subgroup
US9383548B2 (en) Image sensor for depth estimation
CN108780142A (en) 3D imaging systems and method
CN108055452A (en) Image processing method, device and equipment
CN107465866A (en) Image processing equipment and method, picture pick-up device and computer-readable recording medium
TW201230773A (en) Image capture using three-dimensional reconstruction
CN102917235A (en) Image processing apparatus, image processing method, and program
CN102457682A (en) Camera system and imaging method using multiple lens and aperture units
JPWO2008099589A1 (en) Image processing system, method and apparatus
JP5923754B2 (en) 3D imaging device
Forman et al. Continuous parallax in discrete pixelated integral three-dimensional displays
CN103098480B (en) Image processing apparatus and method, three-dimensional image pickup device
CN106170086B (en) Method and device thereof, the system of drawing three-dimensional image
CN108053438A (en) Depth of field acquisition methods, device and equipment
CN111145269A (en) Calibration method for external orientation elements of fisheye camera and single-line laser radar
CN106067937A (en) Camera lens module array, image sensering device and digital zooming image interfusion method
CN103621078B (en) Image processing apparatus and image processing program
CN108805921A (en) Image-taking system and method
CN109792511B (en) Full photon aperture view slippage for richer color sampling
CN103563369B (en) Image processing apparatus, camera head and image processing method
US20220295038A1 (en) Multi-modal and multi-spectral stereo camera arrays
Goldlücke et al. Plenoptic Cameras.

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20221228

Address after: 223001 Room 318, Building 6, east of Zhenda Steel Pipe Company, south of Qianjiang Road, Huaiyin District, Huai'an City, Jiangsu Province

Patentee after: Huaian Xide Industrial Design Co.,Ltd.

Address before: 223300 no.599, East Changjiang Road, Huaiyin District, Huai'an City, Jiangsu Province

Patentee before: HUAIAN IMAGING DEVICE MANUFACTURER Corp.

TR01 Transfer of patent right