WO2023028769A1 - Imaging module, imaging system, image processing method and terminal - Google Patents

Imaging module, imaging system, image processing method and terminal

Info

Publication number
WO2023028769A1
WO2023028769A1 · PCT/CN2021/115402 · CN2021115402W
Authority
WO
WIPO (PCT)
Prior art keywords
image
light
pixel
optical filter
imaging
Prior art date
Application number
PCT/CN2021/115402
Other languages
English (en)
French (fr)
Inventor
张召杰 (Zhang Zhaojie)
Original Assignee
Oppo广东移动通信有限公司 (Guangdong OPPO Mobile Telecommunications Corp., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Oppo广东移动通信有限公司 (Guangdong OPPO Mobile Telecommunications Corp., Ltd.)
Priority to PCT/CN2021/115402 priority Critical patent/WO2023028769A1/zh
Priority to CN202180099921.6A priority patent/CN117581155A/zh
Publication of WO2023028769A1 publication Critical patent/WO2023028769A1/zh


Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00Mountings, adjusting means, or light-tight connections, for optical elements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/50Image enhancement or restoration using two or more images, e.g. averaging or subtraction

Definitions

  • the present application relates to the field of imaging technology, and more specifically, to an imaging module, an imaging system, an image processing method, and a terminal.
  • Embodiments of the present application provide an imaging module, an imaging system, an image processing method, and a terminal.
  • the imaging module includes a module body, a lens group, an image sensor and an optical filter.
  • the lens group is accommodated in the module body.
  • the image sensor is accommodated in the module body and located on the image side of the lens group.
  • the optical filter is used to filter out at least part of the light in the 530nm-580nm wavelength range from the light entering the image sensor.
  • the imaging system in the embodiment of the present application includes an imaging component, an imaging module, and one or more processors.
  • the imaging component is used to receive the first light to acquire the first image.
  • the imaging module includes a module body, a lens group, an image sensor and a filter.
  • the lens group is accommodated in the module body.
  • the image sensor is accommodated in the module body and located on the image side of the lens group.
  • the optical filter is used to filter out at least part of the light in the 530nm-580nm wavelength range from the light entering the image sensor.
  • the imaging module is used to receive second light, the portion of the first light outside the 530nm-580nm wavelength range, to obtain a second image, and the pixels of the second image correspond to pixels of the first image.
  • One or more of the processors are configured to acquire a target image based on the first image and the second image.
  • the imaging system in the embodiment of the present application includes an imaging module and a movable filter.
  • the imaging module includes an image sensor.
  • When the optical filter is moved out of the incident optical path of the image sensor, the image sensor is used to receive the first light to obtain the first image; when the optical filter is moved into the incident optical path of the image sensor, the image sensor is configured to receive second light outside a specific wavelength range in the first light to obtain a second image, and the pixels of the second image correspond to pixels of the first image.
  • the imaging system in the embodiment of the present application includes an imaging component, an imaging module, and one or more processors.
  • the imaging component is used to receive the first light to acquire the first image.
  • the imaging module is used to receive second light outside a specific wavelength range in the first light to obtain a second image, and the pixels of the second image correspond to pixels of the first image.
  • the one or more processors are configured to: generate a gain data map according to the pixel values of pixels in the first image and the pixel values of the corresponding pixels in the second image; and obtain the target image according to the first image and the gain data map.
  • the image processing method in the embodiment of the present application includes: receiving first light to obtain a first image; receiving second light in the first light outside a specific wavelength range to obtain a second image, the pixels of the second image corresponding to pixels of the first image; generating a gain data map according to the pixel values of pixels in the first image and the pixel values of the corresponding pixels in the second image; and obtaining the target image according to the first image and the gain data map.
  • the terminal in the embodiment of the present application includes an imaging module, and the imaging module includes a module body, a lens group, an image sensor, and an optical filter.
  • the lens group is accommodated in the module body.
  • the image sensor is accommodated in the module body and is located on the image side of the lens group.
  • the optical filter is used to filter out at least part of the light in the 530nm-580nm wavelength range from the light entering the image sensor.
  • the terminal in the embodiment of the present application includes an imaging system, and the imaging system includes an imaging component, an imaging module, and one or more processors.
  • the imaging component is used to receive the first light to acquire the first image.
  • the imaging module includes a module body, a lens group, an image sensor and a filter.
  • the lens group is accommodated in the module body.
  • the image sensor is accommodated in the module body and located on the image side of the lens group.
  • the optical filter is used to filter out at least part of the light in the 530nm-580nm wavelength range from the light entering the image sensor.
  • the imaging module is used to receive second light, the portion of the first light outside the 530nm-580nm wavelength range, to obtain a second image, and the pixels of the second image correspond to pixels of the first image.
  • One or more of the processors are configured to acquire a target image based on the first image and the second image.
  • the terminal in the embodiment of the present application includes an imaging system, and the imaging system includes an imaging module and a movable filter.
  • the imaging module includes an image sensor.
  • When the optical filter is moved out of the incident optical path of the image sensor, the image sensor is used to receive the first light to obtain the first image; when the optical filter is moved into the incident optical path of the image sensor, the image sensor is configured to receive second light outside a specific wavelength range in the first light to obtain a second image, and the pixels of the second image correspond to pixels of the first image.
  • the terminal in the embodiment of the present application includes an imaging system, and the imaging system includes an imaging component, an imaging module, and one or more processors.
  • the imaging component is used to receive the first light to acquire the first image.
  • the imaging module is used to receive second light outside a specific wavelength range in the first light to obtain a second image, and the pixels of the second image correspond to pixels of the first image.
  • the one or more processors are configured to: generate a gain data map according to the pixel values of pixels in the first image and the pixel values of the corresponding pixels in the second image; and obtain the target image according to the first image and the gain data map.
  • the terminal in the embodiments of the present application includes one or more processors, and the one or more processors are configured to implement the image processing method described in the embodiments of the present application.
  • the image processing method includes: receiving first light to obtain a first image; receiving second light in the first light outside a specific wavelength range to obtain a second image, the pixels of the second image corresponding to pixels of the first image; generating a gain data map according to the pixel values of pixels in the first image and the pixel values of the corresponding pixels in the second image; and obtaining the target image according to the first image and the gain data map.
  • the imaging module, imaging system, image processing method and terminal in the embodiments of the present application use an optical filter to filter out at least part of the 530nm-580nm light entering the image sensor.
  • the main factors affecting the skin color of the portrait are the content of melanin and hemoglobin.
  • the reflectance of melanin corresponding to the light in this wavelength range is higher than that of hemoglobin.
  • When the light entering the image sensor is filtered by the filter to remove part of the light in this wavelength range, the difference between the reflectance of melanin and the reflectance of hemoglobin can be reduced, and the relative sensitivity of the green channel in the pixels of the image sensor is reduced more than the relative sensitivities of the red channel and the blue channel, which makes skin melanin less obvious and the skin color pinker. Therefore, after the filter removes part of the 530nm-580nm light entering the image sensor, there is no need to distinguish the skin-color area of the image formed by the imaging module and adjust it individually; the computational burden is reduced, and an image with better skin rendering satisfactory to the user can be obtained, ensuring both the efficiency and the effect of portrait skin adjustment.
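The reflectance argument above can be illustrated with a small numeric sketch. The step-function curves and the notch transmittance below are invented stand-ins (not data from this application), chosen only so that melanin reflects more than hemoglobin inside the 530nm-580nm band:

```python
# Hypothetical sketch: step-function stand-ins for the melanin and hemoglobin
# reflectance curves; none of these numbers come from the application itself.

def notch(w):
    """Assumed filter transmittance: attenuate 530-580 nm, pass other light."""
    return 0.3 if 530 <= w <= 580 else 1.0

def melanin(w):
    # Melanin reflectance is assumed higher than hemoglobin's, especially in-band.
    return 0.5 if 530 <= w <= 580 else 0.35

def hemoglobin(w):
    return 0.2 if 530 <= w <= 580 else 0.3

wavelengths = range(400, 701, 10)  # visible band, 10 nm steps

def reflectance_gap(use_filter):
    """Integrated melanin-minus-hemoglobin difference as seen by the sensor."""
    t = notch if use_filter else (lambda w: 1.0)
    return sum((melanin(w) - hemoglobin(w)) * t(w) for w in wavelengths)

gap_without = reflectance_gap(False)
gap_with = reflectance_gap(True)
# gap_with < gap_without: attenuating 530-580 nm light shrinks the visible
# contrast between melanin and hemoglobin, making skin melanin less obvious.
```

Under these assumed numbers the integrated gap drops from 3.05 to 1.79, i.e. the filtered sensor sees a much smaller melanin/hemoglobin contrast, which is the mechanism the paragraph above describes.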
  • FIG. 1 is a schematic structural diagram of an imaging module in some embodiments of the present application.
  • Fig. 2 is a schematic diagram of the relationship between light in different wavelength bands and the corresponding reflectance of melanin and hemoglobin, and the transmittance of the optical filter, for the imaging module according to some embodiments of the present application;
  • Fig. 3 is a schematic diagram of the relationship between light in different wavelength bands and the reflectivity of corresponding melanin and hemoglobin after the imaging module in some embodiments of the present application is filtered by an optical filter;
  • Fig. 4 is a schematic diagram of the relationship between light in different wavelength bands and the relative sensitivity of the corresponding red channel, green channel, and blue channel when the imaging module in some embodiments of the present application is not filtered by an optical filter;
  • Fig. 5 is a schematic diagram of the relationship between light of different wavelength bands and the relative sensitivities of the corresponding red channel, green channel, and blue channel after the imaging module in some embodiments of the present application is filtered by an optical filter;
  • FIG. 6 is a schematic structural diagram of an imaging system in some embodiments of the present application.
  • Fig. 7 is a schematic structural diagram of a lens of an imaging module according to some embodiments of the present application.
  • Fig. 8 is a schematic structural diagram of an image sensor of an imaging module according to some embodiments of the present application.
  • FIGS. 9 to 11 are schematic diagrams of the installation of optical filters of imaging modules in some embodiments of the present application.
  • FIG. 12 is a schematic diagram of a first image and a second image of an imaging system according to some embodiments of the present application.
  • Fig. 13 is a schematic diagram of the relationship between light in different wavelength bands and the corresponding reflectance of melanin and hemoglobin after the imaging system in some embodiments of the present application is filtered by two filters;
  • Fig. 14 is a schematic diagram of the relationship between light in different wavelength bands and the corresponding reflectance of melanin and hemoglobin after the imaging system in some embodiments of the present application is filtered by three filters;
  • FIG. 15 is a schematic structural diagram of an imaging system in some embodiments of the present application.
  • FIG. 16 is a structural schematic diagram of another angle of an imaging system in some embodiments of the present application.
  • FIG. 17 and FIG. 18 are schematic diagrams of scenarios in which a driver drives an optical filter in an imaging system according to some embodiments of the present application;
  • FIG. 24 is a schematic structural diagram of a terminal in some embodiments of the present application.
  • FIGS. 25 to 27 are schematic diagrams of scenarios of terminals in some embodiments of the present application.
  • an embodiment of the present application provides an imaging module 100 .
  • the imaging module 100 includes a module body 10 , a lens group 20 , an image sensor 30 and a filter 40 .
  • the lens group 20 is accommodated in the module body 10 .
  • the image sensor 30 is accommodated in the module body 10 and located on the image side 202 of the lens group 20 .
  • the optical filter 40 is used to filter at least part of the light entering the image sensor 30 with a wavelength range of 530nm-580nm.
  • the module body 10 defines a housing space 11 for housing the lens group 20 , the image sensor 30 and the optical filter 40 .
  • FIG. 2(a) shows the reflectivity of hemoglobin and melanin under different wavelength bands of light when the imaging module 100 does not add a filter 40
  • the abscissa represents the wavelength of the light
  • the ordinate represents the reflectance of the light
  • the curve H represents the reflectance of hemoglobin under different wave bands
  • the curve M represents the reflectance of melanin under different wave bands.
  • FIG. 2( b ) shows a graph of the light filtered by the filter 40 and the transmittance of the light.
  • the abscissa represents the wavelength of the light
  • the ordinate represents the transmittance of the light.
  • FIG. 3 shows the reflectance of hemoglobin and melanin under different wavelength bands of light after the imaging module 100 filters at least part of the light in the wavelength range of 530nm to 580nm through the filter 40.
  • the abscissa indicates the wavelength of the light and the ordinate indicates the reflectance; similarly, curve H represents the reflectance of hemoglobin under different wave bands, and curve M represents the reflectance of melanin under different wave bands.
  • the imaging module 100 should reduce the transmittance in the 530nm-580nm wavelength range, so that skin is rendered better in the image formed by the imaging module 100.
  • FIG. 4 shows the relative sensitivity of the R (red) channel, G (green) channel, and B (blue) channel when the imaging module 100 is not filtered by the filter 40
  • the curve R is the relative sensitivity curve of R channel
  • curve G is the relative sensitivity curve of G channel
  • curve B is the relative sensitivity curve of B channel.
  • FIG. 5 shows the change degree of the relative sensitivity of the R (red) channel, the G (green) channel, and the B (blue) channel after the imaging module 100 is filtered by the optical filter 40 .
  • Curve R, curve G and curve B are respectively the relative sensitivity curves of the R channel, G channel and B channel when the imaging module 100 is not filtered by the optical filter 40; curve R1, curve G1 and curve B1 are respectively the relative sensitivity curves of the R channel, G channel and B channel after filtering by the optical filter 40.
  • the decrease in the relative sensitivity of the green channel (the gap between curve G and curve G1) is greater than the decrease in the relative sensitivity of the red channel (the gap between curve R and curve R1), and is also greater than the decrease in the relative sensitivity of the blue channel (the gap between curve B and curve B1).
  • the skin color shifts towards pink in the image formed by the imaging module 100 , and the skin expression effect is better.
  • When the filter 40 is added to filter out at least part of the 530nm-580nm light entering the image sensor 30, the light transmittance in this wavelength range decreases, and the reflectance of melanin for light in the 530nm-580nm range begins to approach the reflectance of hemoglobin for light in that range. As shown in FIGS. 4 and 5, after the filter 40 is added, although the relative sensitivities of the G channel, R channel and B channel for light in the 400nm-700nm range all decrease, the decrease in the relative sensitivity of the G channel, which corresponds to the 530nm-580nm light, is greater than the decrease in the relative sensitivity of the R channel and the B channel.
  • Filtering at least part of the 530nm-580nm light through the optical filter 40 can simultaneously ensure that: the transmittance of light in this wavelength range is reduced; the reflectance of melanin in this range approaches the reflectance of hemoglobin; and the ratio between the relative sensitivity of the red channel and that of the green channel increases, as does the ratio between the relative sensitivity of the blue channel and that of the green channel, so that skin color in the final image shifts toward pink and skin is rendered better.
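The sensitivity shift can likewise be sketched numerically. The Gaussian channel curves, peak wavelengths and in-band transmittance below are assumptions for illustration only; they mimic the roles of curves R/G/B and R1/G1/B1, not the measured data of FIGS. 4 and 5:

```python
import math

def notch(w):
    """Assumed filter transmittance: attenuate 530-580 nm, pass other light."""
    return 0.3 if 530 <= w <= 580 else 1.0

def channel_response(w, peak, width=40.0):
    """Illustrative Gaussian stand-in for a channel's relative sensitivity."""
    return math.exp(-((w - peak) / width) ** 2)

wavelengths = range(400, 701, 10)
peaks = {"R": 600, "G": 540, "B": 460}  # assumed channel peak wavelengths

unfiltered, filtered = {}, {}
for name, peak in peaks.items():
    unfiltered[name] = sum(channel_response(w, peak) for w in wavelengths)
    filtered[name] = sum(channel_response(w, peak) * notch(w)
                         for w in wavelengths)

# Fractional sensitivity loss per channel after adding the notch filter.
loss = {n: 1 - filtered[n] / unfiltered[n] for n in peaks}

# The G channel peaks inside the notch, so it loses the most sensitivity;
# consequently both R/G and B/G sensitivity ratios increase after filtering.
ratio_rg_before = unfiltered["R"] / unfiltered["G"]
ratio_rg_after = filtered["R"] / filtered["G"]
ratio_bg_before = unfiltered["B"] / unfiltered["G"]
ratio_bg_after = filtered["B"] / filtered["G"]
```

With these assumed curves, the green channel's loss dominates and both ratios grow, which is exactly the condition the paragraph above ties to the pink shift in skin tone.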
  • the imaging module 100 in the embodiment of the present application uses the filter 40 to filter out at least part of the 530nm-580nm light entering the image sensor 30.
  • the main factors affecting the skin color of the portrait are the content of melanin and hemoglobin.
  • the reflectance of melanin corresponding to the light in this wavelength range is higher than the reflectance of hemoglobin.
  • When the light entering the image sensor 30 is filtered by the filter 40 to remove part of the light in this wavelength range, the difference between the reflectance of melanin and the reflectance of hemoglobin can be reduced, and the relative sensitivity of the green channel in the pixels of the image sensor 30 is reduced more than the relative sensitivity of the red channel, and also more than the relative sensitivity of the blue channel, which makes skin melanin less obvious and the skin color pinker. Therefore, after the filter 40 removes part of the 530nm-580nm light entering the image sensor 30, it is not necessary to distinguish the skin-color area of the image formed by the imaging module 100 and adjust it individually, thereby reducing the computational burden and enabling users to obtain satisfactory images with better skin rendering.
  • the embodiment of the present application further provides an imaging system 1000 .
  • the imaging system 1000 includes an imaging module 100 , an imaging component 200 and one or more processors 300 .
  • the imaging component 200 is used for receiving the first light to acquire the first image.
  • the imaging module 100 is used for receiving second light outside a specific wavelength range in the first light to obtain a second image, and the pixels of the second image correspond to pixels of the first image.
  • the specific band range is a band range in which the reflectance of melanin is higher than that of hemoglobin, and the specific band range is within the red band range.
  • the specific wavelength range in the embodiments of the present application is 530nm-580nm.
  • When the optical filter 40 of the imaging module 100 filters out at least part of the 530nm-580nm light entering the image sensor 30, the transmittance of that light is reduced, the ratio between the relative sensitivity of the red channel and that of the green channel increases, and the ratio between the relative sensitivity of the blue channel and that of the green channel also increases, so that the skin color in the finally generated target image shifts toward pink and skin is rendered better.
  • the lens group 20 includes at least one lens, and the lens group 20 has an object side 201 and an image side 202 .
  • the positions of the object side 201 and the image side 202 are related to the incident direction of the light: the object side 201 is the side of the light-incident surface 21, which light reaches before entering the lens, and the image side 202 is the side of the light-emitting surface 22, from which light exits after being converged by the lens.
  • the image sensor 30 is located at the image side 202 of the lens group 20 , and when the image sensor 30 receives the second light after the first light is filtered by the filter 40 , the second image can be obtained.
  • the pixels in the second image correspond to pixels in the first image.
  • the image sensor 30 may include a microlens array 31 , a filter array 32 and a pixel array 33 .
  • the microlens array 31 , the filter array 32 and the pixel array 33 are sequentially stacked.
  • the micro-lens array 31 includes a plurality of micro-lenses 311 , and the micro-lenses 311 can converge the light emitted by the lens to guide more of the incident light to the filter array 32 .
  • the filter array 32 includes a plurality of filters 321, and the filters 321 are used to filter part of the light so that light of three channels, red, green and blue (or red, yellow and blue, or other channel combinations), enters the pixel array 33.
  • the pixel array 33 includes a plurality of pixels 331 for converting received optical signals into electrical signals.
  • the filter array 32 is arranged between the microlens array 31 and the pixel array 33, each microlens 311 corresponds to a filter 321 and a pixel 331, along the light receiving direction of the image sensor 30, the light passes through the microlens 311 It reaches the optical filter 321 , and then is filtered by the optical filter 321 to reach the corresponding pixel 331 .
  • the optical filter 40 is used to filter out at least part of the light in the 530nm-580nm wavelength range from the light entering the image sensor 30.
  • the optical filter 40 can be disposed outside the module body 10 , and the optical filter 40 can also be disposed inside the module body 10 .
  • the optical filter 40 when the optical filter 40 is disposed outside the module body 10 , the optical filter 40 needs to be disposed on the object side 201 of the lens group 20 .
  • the optical filter 40 is arranged on the top wall of the module body 10; before the light enters the module body 10, it first passes through the optical filter 40, which filters out at least part of the 530nm-580nm light, so that when the light enters the module body 10 and reaches the image sensor 30, the light incident on the image sensor 30 is the second light, and the image sensor 30 can obtain the second image.
  • a protective cover 50 is provided outside the module body 10, and the protective cover 50 does not affect the imaging module 100 receiving light. Therefore, when the optical filter 40 is disposed outside the module body 10, as shown in FIG. 9, the optical filter 40 can also be integrated in the protective cover 50. Specifically, as shown in FIG. 9(b), the optical filter 40 can be embedded in the protective cover 50; as shown in FIG. 9(c), the optical filter 40 can be arranged on the lower surface of the protective cover 50 (the surface close to the module body 10); as shown in FIG. 9(d), the optical filter 40 can also be arranged on the upper surface of the protective cover 50.
  • the protective cover 50 may be a light-transmitting cover integrating a cover glass and a display screen.
  • When the optical filter 40 is arranged inside the module body 10, the optical filter 40 can be arranged on the object side 201 of the lens group 20, or on the image side 202 of the lens group 20, between the lens group 20 and the image sensor 30.
  • the optical filter 40 can be a filter film, and the optical filter 40 can be arranged on the light-emitting surface 22 or the light-incident surface 21 of any lens in the lens group 20 (shown in FIG. 7 ).
  • when the light passes through the lens group 20, it is filtered by the filter 40 disposed on the light-emitting surface 22 or the light-incident surface 21 of the lens, so that the image sensor 30 receives the second light and generates the second image.
  • the optical filter 40 may also be a single filter structure integrated on the image sensor 30 .
  • the optical filter 40 can be arranged on each microlens 311 of the microlens array 31, that is, the optical filter 40, the microlens array 31, the optical filter array 32 and the pixel array 33 are stacked in sequence;
  • the filter 40 can also be integrated in the optical filter array 32: for example, between the optical filter array 32 and the microlens array 31, or between the optical filter array 32 and the pixel array 33. That is, the microlens array 31, the filter 40, the filter array 32 and the pixel array 33 are stacked in sequence, or the microlens array 31, the filter array 32, the filter 40 and the pixel array 33 are stacked in sequence, ensuring that the light has been filtered by the filter 40 before entering the pixel array 33, so that the light entering the pixel array 33 is the second light and the imaging device obtains the second image.
  • the optical filter 40 can also be a single filter structure independent of the image sensor 30 and the lens group 20 .
  • the optical filter 40 can be disposed outside or inside the module body 10, and the optical filter 40 can be moved. When the optical filter 40 is located outside the incident optical path of the image sensor 30, the image sensor 30 is used to acquire the first light; when the optical filter 40 moves into the incident optical path of the image sensor 30, the image sensor 30 is used to acquire the second light, the portion of the first light outside the 530nm-580nm wavelength range.
  • When the optical filter 40 is arranged on the outside of the module body 10, for example on the top wall of the module body 10 as shown in FIG. 9(a), the optical filter 40 can be moved relative to the top wall of the module body 10, so that the filter 40 can be moved out of or into the incident light path of the image sensor 30; the image sensor 30 can thus be controlled to receive the first light or the second light, and thereby obtain the first image or the second image.
  • When the optical filter 40 is integrated in the protective cover 50, the optical filter 40 can move relative to the protective cover 50: for example, when the optical filter 40 is embedded in the protective cover 50, it can move inside the protective cover 50; and when the optical filter 40 is arranged on the upper surface or the lower surface of the protective cover 50, it can move relative to the protective cover 50 along its outside, so that the optical filter 40 moves out of or into the incident light path of the image sensor 30.
  • When the optical filter 40 is accommodated inside the module body 10, the optical filter 40 can be arranged between the top wall of the module body 10 and the lens group 20 (as shown in FIG. 11(a)), between any two lenses in the lens group 20 (as shown in FIG. 11(b)), or between the lens group 20 and the image sensor 30 (as shown in FIG. 1).
  • the optical filter 40 can be rotated and/or translated relative to the optical axis of the lens group 20, so that the optical filter 40 can selectively move out of or into the incident light path of the image sensor 30, to selectively obtain the first image or the second image.
  • the imaging system 1000 allows the user to flexibly choose whether to use the filter 40 to filter the light. When the user chooses not to use the optical filter 40, the imaging system 1000 acquires the first image, so that the captured image is more complete, clear and real; when the user chooses to use the optical filter 40, the imaging component 200 acquires the first image and the imaging module 100 acquires the second image, an image with better skin rendering. Therefore, the imaging system 1000 can fuse the first image according to the second image, so that the captured image renders skin better.
  • the imaging component 200 is used for receiving the first light to acquire the first image.
  • the imaging component 200 may include a complementary metal oxide semiconductor (CMOS, Complementary Metal Oxide Semiconductor) photosensitive element or a charge-coupled device (CCD, Charge-coupled Device) photosensitive element.
  • the number of pixels in the first image captured by the imaging component 200 may be greater than or equal to the number of pixels in the second image captured by the imaging module 100, so that each pixel in the second image has a corresponding pixel in the first image.
  • the one or more processors 300 are configured to acquire the target image according to the first image and the second image. Specifically, the first image is generated by the imaging system 1000 receiving the first light through the imaging component 200, and the second image is generated by the imaging system 1000 receiving, through the imaging module 100, the second light of the first light outside the 530nm-580nm wavelength range after filtering by the filter 40.
  • the difference between the first image and the second image lies in the light received; thus, the gains of the R pixels, G pixels and B pixels of the second image relative to the corresponding pixels of the first image can be obtained.
  • the one or more processors 300 can change the pixel values of R pixels, G pixels and B pixels in the first image according to the first image and the gain amount, so as to obtain the target image.
  • the imaging system 1000 acquires the first image 60 and the second image 70.
  • the pixel values of the R channel, the G channel and the B channel of each pixel in the first image 60 and the second image 70 can be obtained. Take the pixels P00, P01, P10 and P11 of the first image 60 in FIG. 12 and the corresponding pixels P00', P01', P10' and P11' of the second image 70 as examples.
  • the pixel P00 in the first image 60 is the R channel, and the pixel value is R1; P01 and P10 are the G channel, and the pixel values are G1 and G1' respectively; P11 is the B channel, and the pixel value is B1, the pixel P00' in the second image 70 is the R channel, and the pixel value is R2; P01' and P10' are the G channel, and the pixel values are G2 and G2' respectively; P11' is the B channel, and the pixel value is B2, then pass The ratio of each pixel in the second image 70 corresponding to the position of the first image 60 can obtain the gain of the second image 70 relative to the first image 60 .
  • the gain value of the pixel P00 and the pixel P00' is R2/R1
  • the gain value of the pixel P01 and the pixel P01' is G2/G1
  • the gain value of the pixel P10 and the pixel P10' is G2'/G1'
• the gain value of the pixel P11 and the pixel P11' is B2/B1.
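The per-pixel ratios above can be sketched in a few lines of Python. This is an illustration only: the function name and the sample pixel values are ours, not from the application.

```python
# Sketch of the gain computation described above: each pixel of the
# second image is divided by the corresponding pixel of the first.
# The 2x2 values below are illustrative, not from the application.

def gain_map(first, second):
    """Element-wise gain of the second image relative to the first."""
    return [[p2 / p1 for p1, p2 in zip(row1, row2)]
            for row1, row2 in zip(first, second)]

# First image 2x2 Bayer block (R G / G B): P00=R1, P01=G1, P10=G1', P11=B1
first_block = [[100.0, 120.0],
               [118.0, 140.0]]
# Second image block: P00'=R2, P01'=G2, P10'=G2', P11'=B2
second_block = [[130.0, 114.0],
                [112.0, 133.0]]

gains = gain_map(first_block, second_block)  # gains[0][0] is R2/R1, etc.
```

With these sample values, the R-channel gain `gains[0][0]` is 130/100 and the B-channel gain `gains[1][1]` is 133/140, matching the ratios R2/R1 and B2/B1 listed above.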
• Fig. 13 shows the reflectance of hemoglobin and melanin under light of different wavebands when two optical filters 40 are provided; the abscissa represents the wavelength of the light and the ordinate represents the reflectance.
• Fig. 14 shows the reflectance of hemoglobin and melanin under light of different wavebands when three optical filters 40 are provided; the abscissa represents the wavelength of the light and the ordinate represents the reflectance.
• As more optical filters 40 are provided, the ratio between the relative sensitivity of the red channel and that of the green channel becomes greater, and the ratio between the relative sensitivity of the blue channel and that of the green channel also becomes greater, so the skin color in the target image appears pinker.
• Since the number of optical filters 40 in the imaging module 100 is only one, if the user is to be able to adjust the image and obtain the image he or she desires, the pixel values of the R, G and B channels of each pixel in the target image need to be obtained by calculation.
• The target image is the image obtained after the user adjusts pixel values on the basis of the first image according to actual needs.
• Rnew is the R-channel pixel value of each pixel in the target image;
• R1 is the R-channel pixel value of each pixel in the first image;
• R2 is the R-channel pixel value of the corresponding pixel in the second image;
• N is the multiple that the user needs to adjust according to actual needs, and it can be an integer, a decimal, or a negative number. The result of the calculation constitutes the gain data of the R pixel channel.
• The pixel value calculation formulas for the G pixel channel and the B pixel channel are similar to that for the R pixel channel. Thus, the pixel values of all pixels in the target image can be obtained.
• Since N can be a decimal, the user can fine-tune a more satisfactory target image; in this way, without actually adding more optical filters to the imaging system 1000, the target image generated by the imaging system 1000 can show the imaging effect of adding a plurality of optical filters 40.
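The formulas themselves are not reproduced in this text. One reconstruction consistent with the described behavior (N = 0 leaves the first image unchanged, N = 1 reproduces the second image, larger N emulates stacking more filters) is Rnew = R1 · (R2/R1)^N. The sketch below uses that assumed form; it is an illustration, not the application's literal formula.

```python
def adjusted_value(v1, v2, n):
    """Assumed reconstruction of the per-channel adjustment: apply the
    gain (v2 / v1) n times over. n = 0 reproduces the first image,
    n = 1 the second image; n may be a decimal or negative, matching
    the behavior described for the coefficient N."""
    return v1 * (v2 / v1) ** n

R1, R2 = 100.0, 130.0                      # illustrative channel values
no_filter  = adjusted_value(R1, R2, 0)     # first-image value
one_filter = adjusted_value(R1, R2, 1)     # second-image value
stacked    = adjusted_value(R1, R2, 2.5)   # emulates several filters
```

Because the gain here is greater than 1, increasing N pushes the channel value further in the same direction, which is the "more filters" effect the text describes.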
  • the present application also provides an imaging system 2000 , which may include an imaging module 100 , a movable filter 40 , one or more processors 300 and a driver 600 .
  • the imaging module 100 may include a module body 10 , a lens group 20 , and an image sensor 30 .
  • the module body 10 defines a housing space 11 for housing the lens group 20 and the image sensor 30 .
  • the optical filter 40 is used to filter part of the first light entering the image sensor 30 within a specific wavelength range, so that the image sensor 30 receives the second light.
  • the specific band range is a band range in which the reflectance of melanin is higher than that of hemoglobin, and the specific band range is within the red band range.
  • the specific wavelength range in the embodiments of the present application is 530nm-580nm.
• After the first light entering the imaging module 100 is filtered by the optical filter 40, the transmittance of the light in the 530 nm-580 nm band that enters the image sensor 30 is at least partly reduced. In this way, the ratio between the relative sensitivity of the red channel and the relative sensitivity of the green channel increases, and the ratio between the relative sensitivity of the blue channel and the relative sensitivity of the green channel increases, so that the skin color in the finally generated target image is shifted toward pink and the skin is rendered better.
  • the lens group 20 includes at least one lens, and the lens group 20 has an object side 201 and an image side 202 .
• The positions of the object side 201 and the image side 202 are related to the incident direction of the light: the object side 201 is the side where the light-incident surface 21 of the lens is located, before the light enters the lens, and the image side 202 is the side where the light-emitting surface 22 of the lens is located, from which the light exits after being converged by the lens.
• The image sensor 30 is located on the image side 202 of the lens group 20. When the image sensor 30 receives the first light, which is not filtered by the optical filter 40, the first image can be obtained; when the image sensor 30 receives the second light, obtained by filtering the first light through the optical filter 40, the second image can be obtained. The pixels in the second image have corresponding pixels in the first image.
  • the image sensor 30 includes a microlens array 31 , a filter array 32 and a pixel array 33 .
  • the microlens array 31 , the filter array 32 and the pixel array 33 are sequentially stacked.
  • the micro-lens array 31 includes a plurality of micro-lenses 311 , and the micro-lenses 311 can converge the light emitted by the lens to guide more of the incident light to the filter array 32 .
• the filter array 32 includes a plurality of filters 321, and the filters 321 are used to filter part of the light so that light of the red, green and blue channels (or of the red, yellow and blue channels, or of other channel combinations) enters the pixel array 33.
  • the pixel array 33 includes a plurality of pixels 331 for converting received optical signals into electrical signals.
• the filter array 32 is arranged between the microlens array 31 and the pixel array 33, and each microlens 311 corresponds to one filter 321 and one pixel 331; along the light-receiving direction of the image sensor 30, the light passes through the microlens 311 to reach the filter 321, and after being filtered by the filter 321 it reaches the corresponding pixel 331.
• the driver 600 is used to drive the optical filter 40 to rotate and/or translate relative to the optical axis 23 of the lens group 20, so that the optical filter 40 is selectively moved into or out of the incident light path of the image sensor 30.
• A connecting member 601 connected to the optical filter 40 can be provided on the driving member 600; the connecting member 601 can rotate or translate on the driving member 600 relative to the optical axis 23 of the lens group 20, so as to drive the optical filter 40 to rotate or translate relative to the optical axis 23 of the lens group 20. When the driving member 600 drives the optical filter 40 to translate relative to the optical axis 23 of the lens group 20, away from the module body 10, the connecting member 601 moves away from the module body 10, so that the optical filter 40 is located outside the incident light path of the image sensor 30.
• In this case the first light entering the imaging module 100 is not filtered by the optical filter 40 within the specific wavelength range; the light received by the image sensor 30 is the first light, so the image sensor 30 acquires the first image.
• When the driving member 600 drives the optical filter 40 to translate relative to the optical axis 23 of the lens group 20, toward the module body 10, the connecting member 601 moves closer to the module body 10 and the optical filter 40 enters the incident light path of the image sensor 30. The first light entering the imaging module 100 is then filtered by the optical filter 40 within the specific wavelength range, so that the image sensor 30 receives the second light and can obtain the second image.
• The driving member 600 may also be disposed in the imaging module 100, to drive the optical filter 40 to translate inside the imaging module 100.
• For example, the optical filter 40 can be located between the lens group 20 and the image sensor 30 and in the incident light path of the image sensor 30, so that the light entering the image sensor 30 is the second light and the image sensor 30 acquires the second image. When the driving member 600 drives the optical filter 40 to rotate relative to the optical axis of the lens group 20, the optical filter 40 can be moved outside the incident light path of the image sensor 30, so that the light entering the image sensor 30 is the first light and the image sensor 30 acquires the first image.
• The driving member 600 and the optical filter 40 can also be arranged between any two lenses of the lens group 20, with the optical filter 40 selectively moved into or out of the incident light path of the image sensor 30. The optical filter 40 is arranged on the driving member 600 so that this movement can be realized by the driving member 600.
• When the optical filter 40 is outside the incident light path, the image sensor 30 receives the first light to obtain the first image; when the optical filter 40 is in the incident light path, the image sensor 30 receives the second light, that is, the part of the first light outside the specific wavelength range, to obtain the second image, and the pixels of the second image have corresponding pixels in the first image.
  • the optical filter 40 can be disposed outside the module body 10 , and the optical filter 40 can also be accommodated inside the module body 10 .
• When the optical filter 40 is disposed outside the module body 10, the optical filter 40 can be disposed on the top wall of the module body 10. Before the light enters the module body 10, it first passes through the optical filter 40, which filters out at least part of the light within the specific wavelength range; the light that then enters the module body 10 and reaches the image sensor 30 is the second light, and the image sensor 30 can obtain the second image.
• In this case the driver 600 is located on the outer side of the module body 10 and can drive the optical filter 40 to translate relative to the optical axis 23 of the lens group 20, so that the optical filter 40 translates across the top wall of the module body 10 and selectively covers it. The light entering the image sensor 30 can then be either the first light or the second light, and the image sensor 30 can acquire the first image or the second image accordingly.
• A protective cover 50 is provided outside the module body 10, and the protective cover 50 does not affect the imaging module 100 receiving light. Therefore, when the optical filter 40 is disposed outside the module body 10, as shown in FIG. 9, the optical filter 40 can also be integrated on the protective cover 50. Specifically, as shown in Figure 9(b), the optical filter 40 can be embedded in the protective cover 50; as shown in Figure 9(c), the optical filter 40 can be arranged on the lower surface of the protective cover 50 (the surface away from the module body 10).
  • the protective cover 50 may be a light-transmitting cover integrated with a coverglass and a display screen.
• The driver 600 can also be disposed on the protective cover 50, and can drive the optical filter 40 to rotate and/or translate relative to the optical axis of the lens group 20. For example, the driver 600 can drive the optical filter 40 to translate on the protective cover 50, so that the optical filter 40 moves into or out of the incident light path of the image sensor 30 and the first image or the second image is selectively acquired.
• When the optical filter 40 is accommodated inside the module body 10, the optical filter 40 can be arranged between the top wall of the module body 10 and the lens group 20 (as shown in FIG. 11(a)), between any two lenses in the lens group 20 (as shown in FIG. 11(b)), or between the lens group 20 and the image sensor 30 (as shown in FIG. 1). In this way, when the light enters the imaging module 100, it is filtered by the optical filter 40 before entering the pixel array 33 of the image sensor 30, so that the image sensor 30 receives the second light and can generate the second image.
• The driver 600 is also accommodated inside the module body 10, to drive the optical filter 40 to rotate and/or translate relative to the optical axis 23 of the lens group 20, so that the optical filter 40 selectively moves into or out of the incident light path of the image sensor 30 and the first image or the second image is selectively acquired.
• When the user uses the imaging system 2000 to capture images, the user can flexibly choose whether to use the optical filter 40 to filter the light. When the user chooses not to use the optical filter 40, the imaging system 2000 acquires the first image, so that the captured image is closer to reality; when the user chooses to use the optical filter 40, the imaging system 2000 acquires the second image, which has a pinker skin color and better skin rendering. In addition, the imaging system 2000 can also acquire the first image and the second image separately, and fuse the first image according to the second image, so that the captured image renders skin better.
  • the one or more processors 300 are configured to generate a gain data map according to pixel values of pixels in the first image and pixel values of corresponding pixels in the second image; and obtain a target image according to the first image and the gain data map.
  • the imaging system 2000 acquires the first image 60 and the second image 70
• The pixel values of the R, G and B channels of each pixel in the first image 60 and the second image 70 can be obtained. Take as an example the pixels P00, P01, P10 and P11 of the first image 60 (in the subscript, the first number is the row and the second number is the column) and the corresponding pixels P00', P01', P10' and P11' of the second image 70.
• The pixel P00 in the first image 60 is an R-channel pixel with value R1; P01 and P10 are G-channel pixels with values G1 and G1' respectively; P11 is a B-channel pixel with value B1. The pixel P00' in the second image 70 is an R-channel pixel with value R2; P01' and P10' are G-channel pixels with values G2 and G2' respectively; P11' is a B-channel pixel with value B2. The gain of the second image 70 relative to the first image 60 can then be obtained from the ratio of each pixel of the second image 70 to the pixel at the corresponding position of the first image 60:
  • the gain value of the pixel P00 and the pixel P00' is R2/R1
  • the gain value of the pixel P01 and the pixel P01' is G2/G1
  • the gain value of the pixel P10 and the pixel P10' is G2'/G1'
• the gain value of the pixel P11 and the pixel P11' is B2/B1.
• Fig. 13 shows the reflectance of hemoglobin and melanin under light of different wavebands when two optical filters 40 are provided; the abscissa represents the wavelength of the light and the ordinate represents the reflectance.
• Fig. 14 shows the reflectance of hemoglobin and melanin under light of different wavebands when three optical filters 40 are provided; the abscissa represents the wavelength of the light and the ordinate represents the reflectance.
• As more optical filters 40 are provided, the ratio between the relative sensitivity of the red channel and that of the green channel becomes greater, and the ratio between the relative sensitivity of the blue channel and that of the green channel also becomes greater, so the skin color in the target image appears pinker.
• Since the number of optical filters 40 in the imaging module 100 is only one, if the user is to be able to adjust the image and obtain the image he or she desires, the pixel values of the R, G and B channels of each pixel in the target image need to be adjusted by calculation.
  • the specific calculation method is the same as the above calculation method and will not be repeated here.
  • the target image generated by the imaging system 2000 can show the imaging effect of adding multiple optical filters 40 .
  • the embodiment of this application also provides an image processing method, the image processing method includes steps:
• The first image and the second image can be generated by the imaging assembly 200 and the imaging module 100 of the above-mentioned embodiment respectively, or both can be generated by the imaging module 100 (with the optical filter 40 selectively moved into or out of the light-receiving path of the image sensor 30).
  • the pixels of the second image generated by the imaging module 100 have pixels corresponding to the pixels of the first image, that is, the number of pixels of the second image is at least equal to the number of pixels of the first image.
  • the first light may be ambient light, or light after infrared light and ultraviolet light are filtered by the filter of the image sensor 30 .
• the second light is obtained after the optical filter 40 filters out, entirely or partially, the light within the specific wavelength range from the first light.
  • the first image is the original image taken by the imaging system 1000
  • the second image is an improved image acquired by the imaging system 1000 after the filter 40 filters part of the light in a specific wavelength range.
  • the specific band range is a band range in which the reflectance of melanin is higher than that of hemoglobin, and the specific band range is within the red band range.
  • the specific wavelength range in the embodiments of the present application is 530nm-580nm.
• When the first image and the second image are generated by the imaging assembly 200 and the imaging module 100 of the above-mentioned embodiment respectively, the shooting positions (with shooting parallax) and the times at which the first image and the second image are generated may not be exactly the same, because the two images are acquired by different components. Therefore, before the gain data map is calculated, a method such as the scale-invariant feature transform (SIFT) needs to be used to align the pixels in the second image with the corresponding pixels in the first image; the gain data map may then be generated according to the pixel values of the pixels in the first image and the pixel values of the corresponding pixels in the second image.
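SIFT itself is normally provided by an image-processing library; as a self-contained stand-in, the sketch below aligns a second image to a first by searching small integer shifts and minimizing the squared pixel difference. The function name, search range, and sample data are ours, for illustration only.

```python
def best_shift(ref, moving, max_shift=2):
    """Find the integer (dy, dx) shift of `moving` that best matches
    `ref`, by exhaustive search over small offsets -- a simplified
    stand-in for the feature-based (e.g. SIFT) alignment mentioned
    in the text; real parallax needs a full registration method."""
    h, w = len(ref), len(ref[0])
    best, best_err = (0, 0), float("inf")
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            err, count = 0.0, 0
            for y in range(h):
                for x in range(w):
                    sy, sx = y + dy, x + dx
                    if 0 <= sy < h and 0 <= sx < w:
                        err += (ref[y][x] - moving[sy][sx]) ** 2
                        count += 1
            err /= count  # mean squared error over the overlap
            if err < best_err:
                best_err, best = err, (dy, dx)
    return best

# A reference image and a copy shifted one pixel to the right.
ref = [[1, 2, 3, 4],
       [5, 6, 7, 8],
       [9, 10, 11, 12]]
moving = [[0, 1, 2, 3],
          [0, 5, 6, 7],
          [0, 9, 10, 11]]
shift = best_shift(ref, moving)
```

Once the shift (or, in the real method, the feature-based correspondence) is known, each pixel of the second image can be paired with its counterpart in the first image before the gain data map is computed.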
• The target image is the image desired by the user, obtained by adjusting the pixel value of each pixel in the first image so that each pixel in the target image meets the user's expectations.
• In the image processing method, the first image and the second image are obtained by receiving the first light and the second light respectively, the gain data map is generated according to the pixel values of the corresponding pixels of the first image and the second image, and the target image is finally obtained according to the first image and the gain data map.
• Since the second light is obtained by filtering out, entirely or partially, the light in the specific wavelength range from the first light, the skin color area of the second image is rendered better and healthier than that of the first image, without the skin color area having to be identified and distinguished first. Therefore, after the pixel value of each pixel in the first image is adjusted through the gain data map, the skin in the resulting target image is rendered better and healthier.
• Step 07, generating a gain data map according to the pixel values of the pixels in the first image and the pixel values of the corresponding pixels in the second image, also includes the steps of:
• A ratio map (as shown in Figure 12(c)) can be generated according to the ratios between the pixel values of the pixels in the first image and the pixel values of the corresponding pixels in the second image.
• The pixel value of each pixel in the first image and the second image can be obtained. Take as an example the pixels P00, P01, P10, P11 of the first image 60 (in the subscript, the first number is the row and the second number is the column) and the pixels P00', P01', P10', P11' at the corresponding positions in the second image 70. The pixel P00 in the first image 60 is an R-channel pixel with value R1; P01 and P10 are G-channel pixels with values G1 and G1' respectively; P11 is a B-channel pixel with value B1. The pixel P00' in the second image 70 is an R-channel pixel with value R2; P01' and P10' are G-channel pixels with values G2 and G2' respectively; P11' is a B-channel pixel with value B2. Then, from the ratio of each pixel of the second image 70 to the pixel at the corresponding position of the first image 60:
  • the gain value of the pixel P00 and the pixel P00' is R2/R1
  • the gain value of the pixel P01 and the pixel P01' is G2/G1
  • the gain value of the pixel P10 and the pixel P10' is G2'/G1'
• the gain value of the pixel P11 and the pixel P11' is B2/B1.
• In this way, a ratio map (as shown in Figure 12(c)) can be generated, which shows how the pixel value of each pixel in the first image changes relative to the pixel value of the pixel at the corresponding position in the second image when the first light is filtered by one optical filter 40.
• Rnew is the R-channel pixel value of each pixel in the target image;
• R1 is the R-channel pixel value of each pixel in the first image;
• R2 is the R-channel pixel value of the corresponding pixel in the second image;
• N is the preset adjustment coefficient, that is, the multiple that the user needs to adjust according to actual needs; N can be an integer, a decimal, or a negative number. The result of the calculation constitutes the gain data of the R pixel channel.
• The pixel value calculation formulas for the G pixel channel and the B pixel channel are similar to that for the R pixel channel. Therefore, before the pixel value of each pixel in the target image is obtained, the size of N can be adjusted to determine the gain data by which each pixel in the target image is to be adjusted on the basis of the first image; the gain data of the plurality of pixels constitute the gain data map.
• When N gradually increases from 1, the final target image presents an image filtered by a larger number of optical filters 40; that is, the skin in the target image is rendered better and better and the skin color becomes pinker and pinker. When N gradually decreases from 1, the final target image presents an image not filtered by the optical filter 40, or even reversely adjusted; that is, the skin in the target image is rendered worse and the complexion becomes yellower and yellower. In this way, the user can flexibly adjust the preset adjustment coefficient to obtain a target image that meets his or her own needs.
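Assuming the same multiplicative reconstruction of the formula as above (each gain datum is (v2/v1)^N, applied to the first image), the whole-image version of the gain data map can be sketched as follows. Function names and pixel values are illustrative, not from the application.

```python
def gain_data_map(first, second, n):
    """Per-pixel gain data map under the assumed multiplicative form
    (v2 / v1) ** n, where n is the preset adjustment coefficient."""
    return [[(p2 / p1) ** n for p1, p2 in zip(r1, r2)]
            for r1, r2 in zip(first, second)]

def apply_gain(first, gains):
    """Multiply each pixel of the first image by its gain datum."""
    return [[p * g for p, g in zip(row, grow)]
            for row, grow in zip(first, gains)]

first  = [[100.0, 120.0], [118.0, 140.0]]   # illustrative first image
second = [[130.0, 114.0], [112.0, 133.0]]   # illustrative second image

# n = 1 reproduces the second image; n = 0 leaves the first unchanged.
target_n1 = apply_gain(first, gain_data_map(first, second, 1.0))
target_n0 = apply_gain(first, gain_data_map(first, second, 0.0))
```

Intermediate or larger values of n interpolate between the two images or extrapolate beyond the second image, which is the adjustable-filter effect described above.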
• Step 07, generating a gain data map according to the pixel values of the pixels in the first image and the pixel values of the corresponding pixels in the second image, also includes the steps:
  • 075 Generate a gain data map according to the pixel value of the pixel of the skin color area in the first image and the pixel value of the pixel corresponding to the skin color area in the second image;
• Step 09, obtaining the target image according to the first image and the gain data map, further includes the steps:
• In this way, the gain data map can be generated from the pixel values of the pixels in the skin color area of the portrait in the first image and the pixel values of the corresponding pixels of the skin color area in the second image, so that the finally acquired target image adjusts only the skin color of the skin color area in the first image, based on the skin color area in the first image and the gain data map corresponding to the skin color area; the skin color of the skin color area in the first image thus becomes pinker and the skin is rendered better. Meanwhile, the color of the background area in the target image is not affected, so that the skin area in the target image is improved while the background color is preserved.
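The skin-area-only adjustment above amounts to masking the gain data map. A minimal sketch, assuming a boolean skin mask is already available (skin detection itself is outside this snippet; names and values are illustrative):

```python
def adjust_skin_only(first, gains, skin_mask):
    """Apply the gain data map only where the skin mask is True,
    leaving background pixels of the first image untouched."""
    return [[p * g if m else p
             for p, g, m in zip(row, grow, mrow)]
            for row, grow, mrow in zip(first, gains, skin_mask)]

first = [[100.0, 50.0],
         [100.0, 50.0]]
gains = [[1.3, 1.3],
         [1.3, 1.3]]
mask  = [[True, False],     # left column: skin; right column: background
         [True, False]]

out = adjust_skin_only(first, gains, mask)
# Left column (skin) is boosted; right column (background) is unchanged.
```

This mirrors the behavior described above: the skin color area is shifted toward pink while the background keeps its original color.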
  • the first image includes a skin color area
  • the second image includes a skin color area
  • the image processing method also includes steps:
• The one or more processors 300 first detect whether the collected first image and second image contain a skin color area, so as to determine whether to generate the gain data map and obtain the target image according to the first image and the gain data map.
• If it is detected that either the first image or the second image does not contain a skin color area, it indicates that the parallax between the first image and the second image is relatively large or that the captured scenes deviate greatly, and the two images cannot be aligned. Since the one or more processors 300 cannot generate the gain data map corresponding to the skin color area, the one or more processors 300 stop working to prompt the user to take another shot. Alternatively, the processor 300 does not generate a gain data map and directly outputs the first image or the second image as the target image.
• If both images contain a skin color area, the one or more processors 300 continue with generating the gain data map and generate the target image according to the first image and the gain data map, so that the skin color area in the target image is individually adjusted and a target image with better skin rendering and pinker color is obtained.
• The one or more processors 300 may also calculate the difference between the average pixel value of all pixels in the first image and the average pixel value of all pixels at the corresponding positions in the second image. If this pixel difference is greater than a preset difference, it indicates that the parallax between the first image and the second image is relatively large or that the captured scenes deviate greatly, and the images cannot be aligned; the one or more processors 300 cannot generate the gain data map corresponding to the skin color area, and stop working to prompt the user to take another shot.
• For example, suppose the preset difference is 50. If the average pixel value of all pixels in the first image is 140 and the average pixel value of all pixels at the corresponding positions in the second image is 200, the pixel difference of 60 exceeds the preset difference, and the one or more processors 300 stop working to prompt the user to take another shot. If instead the average pixel value of the first image is 160 and that of the second image is 200, the pixel difference of 40 is within the preset difference, so the one or more processors 300 proceed to generate the gain data map and generate the target image according to the first image and the gain data map, so that the skin color area in the target image is individually adjusted and a target image with better skin rendering and pinker color is obtained.
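The plausibility check in the example above can be sketched as a small predicate. The function name is ours; the preset difference of 50 and the averages 140/200 and 160/200 come from the example in the text.

```python
def can_align(avg_first, avg_second, preset_diff=50):
    """Check described above: if the average pixel values of the two
    images differ by more than the preset difference, the images are
    treated as unalignable and the user is prompted to reshoot."""
    return abs(avg_first - avg_second) <= preset_diff

# Averages 140 vs 200: the difference of 60 exceeds 50 -> reshoot.
# Averages 160 vs 200: the difference of 40 is within 50 -> proceed
# to generate the gain data map.
```

A call site would branch on this predicate: stop and prompt a retake when it is false, otherwise continue with gain-map generation.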
• After acquiring the first image and the second image, the one or more processors 300 also judge whether the difference between the average pixel value of all pixels in the first image and the average pixel value of all pixels at the corresponding positions in the second image is greater than a preset threshold. If the pixel difference is greater than the preset threshold, there is a large difference in the rendering of the skin color area between the first image and the second image; in this case, the one or more processors 300 continue to generate the gain data map and generate the target image according to the first image and the gain data map, so as to individually adjust the skin color area in the target image and obtain a target image with better skin rendering and pinker color. If the pixel difference is smaller than the preset threshold, the skin color areas in the first image and the second image are similar, there is no need to generate a gain data map, and the imaging system 1000 directly uses the second image as the target image.
• The skin color area may include the areas corresponding to all exposed human skin in the portrait, for example the face area, the hand areas exposed outside the clothing, the leg areas or the foot areas, etc. This ensures that, when the captured image is a portrait, the final target image only improves the skin color area and does not affect areas other than the skin color area.
• By detecting whether skin color areas exist in both the first image and the second image, the image processing method of the present application ensures that the finally generated target image adjusts only the skin color area, without affecting the color of non-skin areas, so as to obtain a target image with better skin rendering and pinker color.
  • the image processing method also includes steps:
• The one or more processors 300 first detect whether the acquired first image and second image contain a skin color area, and detect the gender associated with the skin color in the first image and the second image, so as to determine whether to generate a gain data map and obtain the target image according to the first image and the gain data map.
• After acquiring the first image and the second image, if the one or more processors 300 detect that at least one of them does not contain a skin color area, it indicates that the parallax between the first image and the second image is large or that the captured scenes deviate greatly, and the images cannot be aligned. The one or more processors 300 cannot generate a gain data map corresponding to the skin color area, so they directly stop working, without going on to determine the gender of the skin color area, and prompt the user to take another shot.
• After acquiring the first image and the second image, if the one or more processors 300 detect that both the acquired first image and second image contain a skin color area, the one or more processors 300 continue to detect the gender of the skin color in the first image and the second image.
• The one or more processors 300 may also calculate the difference between the average pixel value of all pixels in the first image and the average pixel value of all pixels at the corresponding positions in the second image. If the pixel difference is greater than the preset difference, it indicates that the parallax between the first image and the second image is relatively large or that the captured scenes deviate greatly, and the images cannot be aligned; the one or more processors 300 cannot generate the gain data map, and stop working to prompt the user to take another shot.
  • For example, with a preset difference of 50, a first-image pixel-value average of 140 and a corresponding second-image average of 200 indicate excessive parallax or scene deviation.
  • With a preset difference of 50, a first-image pixel-value average of 160 and a corresponding second-image average of 200 indicate that the two images are normal.
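The average-difference validation described in the bullets above can be sketched as follows; representing the images as NumPy arrays and the function name `images_alignable` are illustrative assumptions, not part of the patent:

```python
import numpy as np

def images_alignable(first, second, max_diff=50):
    """Compare the mean pixel values of the two captures.

    A gap larger than the preset difference suggests the parallax or
    scene deviation is too large for the images to be aligned.
    """
    gap = abs(float(np.mean(first)) - float(np.mean(second)))
    return gap <= max_diff
```

With the document's own example values, averages of 140 and 200 fail the check (gap 60 > 50) while averages of 160 and 200 pass it (gap 40 ≤ 50).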
  • The one or more processors 300 then proceed with generating the gain data map and produce the target image from the first image and the gain data map, so that the skin-color region of the target image is adjusted separately to obtain a target image with better-rendered, pinker skin.
  • After acquiring the first image and the second image, if the one or more processors 300 detect that both contain a skin-color region, they further judge whether the difference between the average pixel value of all pixels of the first image and the average of all pixels at corresponding positions in the second image exceeds a preset threshold. If it does, the skin-color regions of the two images differ considerably, and the one or more processors 300 proceed to generate the gain data map and produce the target image from the first image and the gain data map, adjusting the skin-color region separately to obtain a target image with better-rendered, pinker skin. If the difference is below the threshold, the skin-color regions of the two images are similar, no gain data map is needed, and the imaging system 1000 directly uses the second image as the target image.
  • The one or more processors 300 then generate the gain data map and, based on the first image and the gain data map, separately adjust the skin-color regions whose skin belongs to females, obtaining a target image in which female skin is rendered better and pinker.
  • In the image processing method of embodiments of the present application, the processors detect whether the first image and the second image contain a skin-color region and, when both do, judge whether the gender of the skin in the two images is female, so that skin-tone adjustment is applied specifically to target images containing female subjects.
  • The present application also provides a terminal 5000.
  • The terminal 5000 may include the imaging module 100 of any of the above embodiments.
  • The imaging module 100 may be mounted in the housing of the terminal 5000 and connected to the main board of the terminal 5000.
  • The present application may also provide a terminal 5000.
  • The terminal 5000 may include the imaging system 1000 of any of the above embodiments.
  • The imaging system 1000 may be mounted in the housing of the terminal 5000 and connected to the main board of the terminal 5000; the imaging system 1000 is used for imaging.
  • The present application may also provide a terminal 5000.
  • The terminal 5000 may include the imaging system 2000 of any of the above embodiments.
  • The imaging system 2000 may be mounted in the housing of the terminal 5000 and connected to the main board of the terminal 5000; the imaging system 2000 is used for imaging.
  • The present application may also provide a terminal 5000 that includes one or more processors 300, which can be used to implement the image processing method of any of the above embodiments.
  • The terminal 5000 can be used to implement one or more of step 01, step 03, step 04, step 06, step 07, step 071, step 073, step 075, step 09, and step 091.
  • The terminal 5000 acquires an image through the imaging system 1000 or the imaging system 2000 and displays the image on the terminal 5000.
  • The image displayed by the terminal 5000 is then only the image filtered through a single optical filter 40, i.e., the second image.
  • An adjustment-coefficient area 500 (namely N) may be displayed on the terminal 5000; the user may drag from left to right to increase the value of N, thereby presenting an image as if filtered by a plurality of optical filters 40. When the user has tuned an image that meets their needs, the target image is obtained.
  • The terminal 5000 can also display an adjustment area 700.
  • Before dragging the adjustment-coefficient area 500, the user can select the skin-color region 701 among the options of the adjustment area 700, so that when the user adjusts the value of N, only the skin-color part of the image changes.
  • The terminal 5000 can also display a gender option 900.
  • Before dragging the adjustment-coefficient area 500, the user can also select female 901 in the gender option 900, so that after the skin-color region 701 of the adjustment area 700 is selected, adjusting the value of N changes only the female skin within the image's skin-color region.
  • The adjustment-coefficient area 500, the adjustment area 700, and the gender option 900 of the terminal 5000 can all be set by the user before the terminal 5000 acquires an image, so that the image the terminal 5000 acquires already matches the user's expectation, i.e., is the target image.
  • "First" and "second" are used for descriptive purposes only and cannot be interpreted as indicating or implying relative importance or implicitly specifying the number of the indicated technical features.
  • Features defined as "first" and "second" may explicitly or implicitly include at least one such feature.
  • "Plurality" means at least two, such as two or three, unless otherwise specifically defined.


Abstract

An imaging module (100), an imaging system (1000), an image processing method, and a terminal (5000). The imaging module (100) includes a module body (10), a lens group (20) housed in the module body (10), an image sensor (30) housed in the module body (10) and located on the image side of the lens group (20), and an optical filter (40). The optical filter (40) is used to filter out at least part of the light in the 530 nm–580 nm band entering the image sensor (30).

Description

Imaging Module, Imaging System, Image Processing Method, and Terminal
Technical Field
The present application relates to the field of imaging technology, and more specifically, to an imaging module, an imaging system, an image processing method, and a terminal.
Background
With the proliferation of digital cameras and camera-equipped mobile phones, users pay increasing attention to how skin tones are rendered in captured images. Currently, when a camera adjusts the skin of a captured portrait, it must first identify the skin-color region of the image so as to distinguish it from other color regions and then adjust the skin-color region separately. Identifying and segmenting the skin-color region adds computational load, and the identification is easily affected by the light source, so neither the efficiency nor the quality of portrait skin adjustment is satisfactory.
Summary
Embodiments of the present application provide an imaging module, an imaging system, an image processing method, and a terminal.
The imaging module of embodiments of the present application includes a module body, a lens group, an image sensor, and an optical filter. The lens group is housed in the module body. The image sensor is housed in the module body and located on the image side of the lens group. The optical filter is used to filter out at least part of the light in the 530 nm–580 nm band entering the image sensor.
The imaging system of embodiments of the present application includes an imaging assembly, an imaging module, and one or more processors. The imaging assembly is configured to receive first light to obtain a first image. The imaging module includes a module body, a lens group, an image sensor, and an optical filter. The lens group is housed in the module body. The image sensor is housed in the module body and located on the image side of the lens group. The optical filter is used to filter out at least part of the light in the 530 nm–580 nm band entering the image sensor. The imaging module is configured to receive second light, i.e., the first light excluding the 530 nm–580 nm band, to obtain a second image, the pixels of the second image having pixels corresponding to those of the first image. The one or more processors are configured to obtain a target image from the first image and the second image.
The imaging system of embodiments of the present application includes an imaging module and a movable optical filter. The imaging module includes an image sensor. When the optical filter moves out of the incident light path of the image sensor, the image sensor receives first light to obtain a first image; when the optical filter moves into the incident light path of the image sensor, the image sensor receives second light, i.e., the first light excluding a specific band, to obtain a second image, the pixels of the second image having pixels corresponding to those of the first image.
The imaging system of embodiments of the present application includes an imaging assembly, an imaging module, and one or more processors. The imaging assembly is configured to receive first light to obtain a first image. The imaging module is configured to receive second light, i.e., the first light excluding a specific band, to obtain a second image, the pixels of the second image having pixels corresponding to those of the first image. The one or more processors are configured to: generate a gain data map from the pixel values of pixels of the first image and the pixel values of corresponding pixels of the second image; and obtain a target image from the first image and the gain data map.
The image processing method of embodiments of the present application includes: receiving first light to obtain a first image; receiving second light, i.e., the first light excluding a specific band, to obtain a second image, the pixels of the second image having pixels corresponding to those of the first image; generating a gain data map from the pixel values of pixels of the first image and the pixel values of corresponding pixels of the second image; and obtaining a target image from the first image and the gain data map.
The terminal of embodiments of the present application includes an imaging module. The imaging module includes a module body, a lens group, an image sensor, and an optical filter. The lens group is housed in the module body. The image sensor is housed in the module body and located on the image side of the lens group. The optical filter is used to filter out at least part of the light in the 530 nm–580 nm band entering the image sensor.
The terminal of embodiments of the present application includes an imaging system. The imaging system includes an imaging assembly, an imaging module, and one or more processors. The imaging assembly is configured to receive first light to obtain a first image. The imaging module includes a module body, a lens group, an image sensor, and an optical filter. The lens group is housed in the module body. The image sensor is housed in the module body and located on the image side of the lens group. The optical filter filters out at least part of the light in the 530 nm–580 nm band entering the image sensor. The imaging module receives second light, i.e., the first light excluding the 530 nm–580 nm band, to obtain a second image whose pixels correspond to those of the first image. The one or more processors obtain a target image from the first image and the second image.
The terminal of embodiments of the present application includes an imaging system. The imaging system includes an imaging module and a movable optical filter. The imaging module includes an image sensor. When the optical filter moves out of the incident light path of the image sensor, the image sensor receives first light to obtain a first image; when the optical filter moves into the incident light path, the image sensor receives second light, i.e., the first light excluding a specific band, to obtain a second image whose pixels correspond to those of the first image.
The terminal of embodiments of the present application includes an imaging system. The imaging system includes an imaging assembly, an imaging module, and one or more processors. The imaging assembly receives first light to obtain a first image. The imaging module receives second light, i.e., the first light excluding a specific band, to obtain a second image whose pixels correspond to those of the first image. The one or more processors are configured to: generate a gain data map from the pixel values of pixels of the first image and the pixel values of corresponding pixels of the second image; and obtain a target image from the first image and the gain data map.
The terminal of embodiments of the present application includes one or more processors configured to implement the image processing method of embodiments of the present application. The image processing method includes: receiving first light to obtain a first image; receiving second light, i.e., the first light excluding a specific band, to obtain a second image whose pixels correspond to those of the first image; generating a gain data map from the pixel values of pixels of the first image and the pixel values of corresponding pixels of the second image; and obtaining a target image from the first image and the gain data map.
In the imaging module, imaging system, image processing method, and terminal of embodiments of the present application, the optical filter filters out at least part of the light in the 530 nm–580 nm band entering the image sensor. The main factors affecting the skin tone of a portrait are the melanin and hemoglobin content, and within this band the reflectance of melanin is higher than that of hemoglobin. After the optical filter removes part of this band from the light entering the image sensor, the difference between the reflectance of melanin and that of hemoglobin is reduced, and the relative sensitivity of the green channel of the image sensor's pixels decreases more than that of the red and blue channels, so melanin features in the skin become less prominent and the skin tone appears pinker. Therefore, once part of the 530 nm–580 nm band has been filtered out of the light entering the image sensor, the skin-color region can be adjusted separately without first segmenting it in the image formed by the imaging module, reducing the computational load, yielding images with skin rendering that satisfies the user, and ensuring both the efficiency and the quality of portrait skin adjustment.
Additional aspects and advantages of embodiments of the present application will be given in part in the following description, will in part become apparent from the following description, or will be learned through practice of embodiments of the present application.
Brief Description of the Drawings
The above and/or additional aspects and advantages of the present application will become apparent and readily understood from the following description of embodiments taken in conjunction with the accompanying drawings, in which:
Fig. 1 is a schematic structural view of an imaging module according to certain embodiments of the present application;
Fig. 2 shows, for an imaging module according to certain embodiments, the relationship between light of different bands and the corresponding reflectance of melanin and hemoglobin, and the transmittance;
Fig. 3 shows the relationship between light of different bands and the corresponding reflectance of melanin and hemoglobin after filtering by the optical filter of an imaging module according to certain embodiments;
Fig. 4 shows the relative sensitivities of the red, green, and blue channels for light of different bands when the imaging module is not filtered by the optical filter;
Fig. 5 shows the relative sensitivities of the red, green, and blue channels for light of different bands after the imaging module is filtered by the optical filter;
Fig. 6 is a schematic structural view of an imaging system according to certain embodiments;
Fig. 7 is a schematic structural view of a lens of an imaging module according to certain embodiments;
Fig. 8 is a schematic structural view of an image sensor of an imaging module according to certain embodiments;
Figs. 9 to 11 are schematic views of the mounting of the optical filter of an imaging module according to certain embodiments;
Fig. 12 is a schematic view of the first image and the second image of an imaging system according to certain embodiments;
Fig. 13 shows the relationship between light of different bands and the corresponding reflectance of melanin and hemoglobin after filtering by two optical filters;
Fig. 14 shows the relationship between light of different bands and the corresponding reflectance of melanin and hemoglobin after filtering by three optical filters;
Fig. 15 is a schematic structural view of an imaging system according to certain embodiments;
Fig. 16 is a schematic structural view of the imaging system from another angle;
Figs. 17 and 18 are schematic views of scenarios in which the driving member of an imaging system drives the optical filter;
Figs. 19 to 23 are schematic flowcharts of image processing methods according to certain embodiments;
Fig. 24 is a schematic structural view of a terminal according to certain embodiments;
Figs. 25 to 27 are schematic views of scenarios of a terminal according to certain embodiments.
Detailed Description
Embodiments of the present application are described in detail below, examples of which are shown in the accompanying drawings, where identical or similar reference numerals throughout denote identical or similar elements or elements with identical or similar functions. The embodiments described below with reference to the drawings are exemplary, intended to explain the present application, and are not to be construed as limiting it.
Referring to Fig. 1, an embodiment of the present application provides an imaging module 100. The imaging module 100 includes a module body 10, a lens group 20, an image sensor 30, and an optical filter 40. The lens group 20 is housed in the module body 10. The image sensor 30 is housed in the module body 10 and located on the image side 202 of the lens group 20. The optical filter 40 is used to filter out at least part of the light in the 530 nm–580 nm band entering the image sensor 30.
Specifically, the module body 10 defines a receiving space 11 for housing the lens group 20, the image sensor 30, and the optical filter 40.
It should be noted that the pigments determining human skin color are melanin, hemoglobin, bilirubin, carotene, and so on. Of these, melanin and hemoglobin are present in large and highly variable amounts (with large individual differences), and these two largely determine the skin tone shown in images formed by the imaging module 100. When the reflectance of melanin is significantly higher than that of hemoglobin, moles, blemishes, and the like become prominent in the skin.
Specifically, referring to Figs. 2 and 3: Fig. 2(a) shows the reflectance of hemoglobin and melanin under light of different bands when the imaging module 100 has no optical filter 40; the abscissa is the wavelength of the light and the ordinate its reflectance, curve H being the reflectance of hemoglobin across bands and curve M that of melanin. Fig. 2(b) is a plot of transmittance versus wavelength for light filtered by the optical filter 40. Fig. 3 shows the reflectance of hemoglobin and melanin under light of different bands after the optical filter 40 has removed at least part of the 530 nm–580 nm band; again, curve H is hemoglobin and curve M is melanin.
From Fig. 2(a) it can be seen that, under illumination in the 530 nm–580 nm band, the reflectance of melanin is higher than that of hemoglobin, which makes moles, blemishes, and the like prominent in the skin of images formed by the imaging module 100. From Fig. 2(b) it can be seen that, with the optical filter 40 added, the transmittance of the 530 nm–580 nm band is lower than that of other bands, i.e., relatively little light in this band reaches the image sensor 30. From Fig. 3 it can be seen that, with the optical filter 40 added, the reflectance of melanin in this band approaches that of hemoglobin, i.e., the difference between the two gradually decreases, so moles, blemishes, and the like become less prominent in the images formed. The imaging module 100 should therefore reduce the transmittance of the 530 nm–580 nm band so that skin is rendered well in the images it forms.
Further, referring to Figs. 4 and 5: Fig. 4 shows the relative sensitivities of the R (red), G (green), and B (blue) channels when the imaging module 100 is not filtered by the optical filter 40; curves R, G, and B are the respective relative-sensitivity curves. Fig. 5 shows how the relative sensitivities of the R, G, and B channels change after filtering by the optical filter 40: curves R, G, and B are the unfiltered relative-sensitivity curves, and curves R1, G1, and B1 the filtered ones.
Fig. 4 shows that, without the optical filter 40, for light in the 530 nm–580 nm band entering the image sensor 30, the relative sensitivity of the G channel in the formed image's pixels is higher than that of both the R channel and the B channel. Fig. 5 shows that, with the optical filter 40, the drop in the relative sensitivity of the green channel (the gap between curves G and G1) is larger than the drop for the red channel (the gap between R and R1) and for the blue channel (the gap between B and B1); consequently, compared with the image formed without the filter, the skin tone in the image formed by the imaging module 100 shifts toward pink and the skin is rendered better.
In summary, combining Figs. 2 and 3, once the optical filter 40 is added to remove at least part of the 530 nm–580 nm band from the light entering the image sensor 30, the transmittance of this band decreases and the reflectance of melanin in this band approaches that of hemoglobin. Figs. 4 and 5 show that, with the filter 40 added, although the relative sensitivities of the G, R, and B channels all decrease across the 400 nm–700 nm range, the G-channel drop for the 530 nm–580 nm band exceeds the R- and B-channel drops.
Therefore, filtering out at least part of the 530 nm–580 nm band with the optical filter 40 simultaneously ensures that: the transmittance of this band is reduced; the melanin reflectance of this band approaches the hemoglobin reflectance; and the ratio of the R1 (red) channel's relative sensitivity to the G1 (green) channel's increases, as does the ratio of the B1 (blue) channel's relative sensitivity to the G1 (green) channel's, so that in the final image the skin tone shifts toward pink and the skin is rendered better.
The imaging module 100 of embodiments of the present application uses the optical filter 40 to remove at least part of the 530 nm–580 nm band from the light entering the image sensor 30. The main factors affecting portrait skin tone are the melanin and hemoglobin content, and within this band melanin reflects more strongly than hemoglobin. After the filter 40 removes part of this band, the difference between the melanin and hemoglobin reflectance is reduced, and the green channel's relative sensitivity in the image sensor 30's pixels drops more than the red channel's and the blue channel's, so melanin features become less prominent and the skin tone appears pinker. Thus, once part of the 530 nm–580 nm band has been filtered out of the light entering the image sensor 30, the skin-color region can be adjusted separately without first segmenting it in the image formed by the imaging module 100, reducing the computational load and yielding images with skin rendering that satisfies the user.
Further description follows with reference to the drawings.
Referring to Fig. 6, an embodiment of the present application further provides an imaging system 1000. The imaging system 1000 includes an imaging module 100, an imaging assembly 200, and one or more processors 300.
The imaging assembly 200 is configured to receive first light to obtain a first image.
The imaging module 100 is configured to receive second light, i.e., the first light excluding a specific band, to obtain a second image whose pixels correspond to those of the first image.
The specific band is the band in which the reflectance of melanin is higher than that of hemoglobin, and it lies within the red band. In embodiments of the present application, the specific band is 530 nm–580 nm.
Referring to Figs. 2 to 5, once the imaging module 100 uses the optical filter 40 to remove at least part of the 530 nm–580 nm band from the light entering the image sensor 30, the transmittance of that light decreases so that the melanin reflectance of this band approaches the hemoglobin reflectance; and after part of this band is removed, the ratio of the red channel's relative sensitivity to the green channel's increases, as does the blue-to-green ratio, so that in the final target image the skin tone shifts toward pink and the skin is rendered better.
Specifically, the lens group 20 includes at least one lens and has an object side 201 and an image side 202. As shown in Fig. 7, taking a lens group 20 with a single lens as an example, the positions of the object side 201 and the image side 202 depend on the direction of incidence: the object side 201 is the side of the light-entry surface 21 before the light enters the lens, and the image side 202 is the side of the light-exit surface 22 from which the light, having entered the lens and been converged by it, leaves the lens.
Referring to Figs. 1 and 8, the image sensor 30 is located on the image side 202 of the lens group 20; when it receives the second light, i.e., the first light filtered by the optical filter 40, it obtains the second image, whose pixels correspond to those of the first image.
In one example, the image sensor 30 may include a microlens array 31, a color filter array 32, and a pixel array 33, stacked in that order.
Specifically, the microlens array 31 includes a plurality of microlenses 311, which converge the light leaving the lens so as to guide more of the incident light to the color filter array 32. The color filter array 32 includes a plurality of color filters 321, each of which filters part of the light so that light of the red, green, and blue channels (or red, yellow, and blue, or other channels) enters the pixel array 33. The pixel array 33 includes a plurality of pixels 331, which convert the received optical signals into electrical signals.
The color filter array 32 is disposed between the microlens array 31 and the pixel array 33, and each microlens 311 corresponds to one color filter 321 and one pixel 331. Along the light-receiving direction of the image sensor 30, light passes through the microlens 311 to the color filter 321 and, after filtering by the color filter 321, reaches the corresponding pixel 331.
Referring again to Fig. 1, the optical filter 40 filters out at least part of the light in the 530 nm–580 nm band entering the image sensor 30. The optical filter 40 may be disposed outside the module body 10 or inside it.
Specifically, when disposed outside the module body 10, the optical filter 40 must be placed on the object side 201 of the lens group 20.
In one embodiment, as shown in Fig. 9(a), the optical filter 40 is disposed on the top wall of the module body 10. Light passes through the filter 40 before entering the module body 10, so that at least part of the 530 nm–580 nm band is removed; the light entering the module body 10 and reaching the image sensor 30 is then the second light, and the image sensor 30 can obtain the second image.
In another embodiment, to protect the imaging module 100 from damage, a protective cover plate 50 is generally provided outside the module body 10, the cover plate 50 not affecting the module's light reception. Thus, when the optical filter 40 is outside the module body 10, as shown in Fig. 9, it may be integrated with the protective cover plate 50. Specifically, as shown in Fig. 9(b), the filter 40 may be embedded in the cover plate 50; as shown in Fig. 9(c), it may be disposed on the lower surface of the cover plate 50 (the surface close to the module body 10); as shown in Fig. 9(d), it may be disposed on the upper surface (the surface away from the module body 10). In each case light passes through the filter 40 before entering the module body 10, removing at least part of the 530 nm–580 nm band, so the light entering the imaging module 100 is already the second light and the module 100 obtains the second image. Note that the protective cover plate 50 may be a light-transmitting plate integrating a cover glass and a display screen.
When the optical filter 40 is housed inside the module body 10, it may be placed on the object side 201 of the lens group 20, or on the image side 202 between the lens group 20 and the image sensor 30.
In one embodiment, as shown in Fig. 10, the optical filter 40 may be a filter film disposed on the light-exit surface 22 or light-entry surface 21 (see Fig. 7) of any lens of the lens group 20. When light passes through the lens group 20, it is filtered by the filter 40 on that surface, so that the image sensor 30 receives the second light and generates the second image.
In another embodiment, the optical filter 40 may be a monolithic filter structure integrated on the image sensor 30. Referring to Fig. 8, the filter 40 may be placed on each microlens 311 of the microlens array 31, i.e., the filter 40, microlens array 31, color filter array 32, and pixel array 33 are stacked in that order; or it may be integrated with the color filter array 32, either between the color filter array 32 and the microlens array 31 or between the color filter array 32 and the pixel array 33, i.e., stacked in the order 31, 40, 32, 33 or the order 31, 32, 40, 33. This ensures the light is filtered by the filter 40 before entering the pixel array 33, so the light entering the pixel array 33 is the second light and the imaging device obtains the second image.
Further, the optical filter 40 may also be a monolithic filter structure independent of the image sensor 30 and the lens group 20. Likewise, it may be placed outside or inside the module body 10 and is movable. When the filter 40 is outside the incident light path of the image sensor 30, the sensor receives the first light; when it moves into the incident light path, the sensor receives the second light, i.e., the first light excluding the 530 nm–580 nm band.
In one embodiment, when the filter 40 is outside the module body 10 on its top wall, as in Fig. 9(a), the filter 40 can move relative to the top wall so as to move out of or into the incident light path of the image sensor 30, allowing the sensor to receive the first or second light and obtain the first or second image. Likewise, when integrated with the protective cover plate 50, the filter 40 can move relative to the plate: embedded in the plate 50, it can move within it; on the upper or lower surface of the plate 50, it can move over the plate relative to it, thereby moving out of or into the incident light path of the image sensor 30.
In another embodiment, when housed inside the module body 10, the filter 40 may be placed between the top wall and the lens group 20 (Fig. 11(a)), between any two lenses of the lens group 20 (Fig. 11(b)), or between the lens group 20 and the image sensor 30 (Fig. 1). Light entering the imaging module 100 is then filtered by the filter 40 before reaching the pixel array 33 of the image sensor 30, so the sensor receives the second light and generates the second image. In addition, the filter 40 can rotate and/or translate relative to the optical axis of the lens group 20 so as to selectively move into or out of the incident light path of the image sensor 30 and selectively obtain the first or second image.
Thus, when capturing images with the imaging system 1000, the user can freely choose whether to filter the light with the optical filter 40. If the user chooses not to, both the imaging assembly 200 and the imaging module 100 obtain the first image, making the captured image more complete, clear, and faithful; if the user chooses to, the imaging assembly 200 obtains the first image and the imaging module 100 the second image. Since the second image has pinker, better-rendered skin, the imaging system 1000 can fuse the first image with the second image so that skin is rendered better in the captured image.
The imaging assembly 200 is configured to receive the first light to obtain the first image. The imaging assembly 200 may include a complementary metal-oxide-semiconductor (CMOS) photosensitive element or a charge-coupled device (CCD) photosensitive element. The number of pixels of the first image obtained by the imaging assembly 200 may be greater than or equal to the number of pixels of the second image obtained by the imaging module 100, so that the pixels of the second image have corresponding pixels in the first image.
The one or more processors 300 obtain the target image from the first image and the second image. Specifically, the first image is generated by the imaging system 1000 via the imaging assembly 200 receiving the first light, while the second image is generated via the imaging module 100 receiving the second light, i.e., the first light with the 530 nm–580 nm band removed by the optical filter 40. The two images thus differ only in the light received; from this, the gain of the second image's R, G, and B pixels relative to the corresponding pixels of the first image can be obtained, and the one or more processors 300 can use the first image and this gain to change the pixel values of the R, G, and B pixels of the first image and obtain the target image.
Referring to Fig. 12, once the imaging system obtains the first image 60 and the second image 70, the R-, G-, and B-channel pixel values of every pixel in both images are available. Take pixels P00, P01, P10, P11 of the first image 60 in Fig. 12 (first subscript digit: row; second: column) and the corresponding pixels P00', P01', P10', P11' of the second image 70 as an example. In the first image 60, pixel P00 is an R pixel with value R1, P01 and P10 are G pixels with values G1 and G1', and P11 is a B pixel with value B1; in the second image 70, P00' is an R pixel with value R2, P01' and P10' are G pixels with values G2 and G2', and P11' is a B pixel with value B2. The ratio of each second-image pixel to the first-image pixel at the corresponding position then gives the gain of the second image 70 relative to the first image 60: the gain for P00 and P00' is R2/R1, for P01 and P01' is G2/G1, for P10 and P10' is G2'/G1', and for P11 and P11' is B2/B1. This reveals how, after one optical filter 40 is added, the pixel values of all pixels of the generated second image 70 change relative to the corresponding pixels of the first image 60, and from this change a comparison map 80 covering every pixel of the first image 60 is generated.
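A minimal sketch of the per-pixel gain computation described above, assuming the first image 60 and second image 70 are available as NumPy arrays of raw channel values; the function name `gain_map` and the epsilon guard against zero pixels are illustrative assumptions:

```python
import numpy as np

def gain_map(first, second, eps=1e-6):
    """Per-pixel gain of the filtered (second) image relative to the
    unfiltered (first) image: each entry is e.g. R2/R1, G2/G1,
    G2'/G1', or B2/B1 for a 2x2 Bayer-style neighbourhood."""
    return second / (first + eps)

# Hypothetical 2x2 neighbourhood [R1 G1; G1' B1] whose filtered
# counterpart has every channel value halved by the filter:
first = np.array([[100.0, 80.0], [90.0, 60.0]])
second = np.array([[50.0, 40.0], [45.0, 30.0]])
gains = gain_map(first, second)  # close to 0.5 everywhere
```

Computing this ratio at every position yields the comparison map of the second image against the first.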
Referring also to Figs. 2(a), 3, 13, and 14: Fig. 13 shows the reflectance of hemoglobin and melanin under light of different bands when two optical filters 40 are provided, and Fig. 14 when three are provided (abscissa: wavelength; ordinate: reflectance). Figs. 2(a) and 3 show that, with one filter added, the difference between the melanin reflectance and the hemoglobin reflectance for part of the 530 nm–580 nm band decreases relative to Fig. 2(a). Combining Figs. 13 and 14 further shows that as the number of filters 40 increases, the melanin reflectance of this band approaches the hemoglobin reflectance; hence, the more filters 40, the better the skin rendering in the target image. Likewise, the more filters 40, the larger the ratio of the red channel's relative sensitivity to the green channel's and of the blue channel's to the green channel's, and the pinker the skin tone in the target image.
Since the target image should be the image the user expects, and the imaging module 100 generally has only one optical filter 40, to make the result user-adjustable so that the user obtains the image they expect, the R-, G-, and B-channel pixel values of every pixel of the target image are obtained from the following formula. The target image is obtained by adjusting pixel values on the basis of the first image according to the user's actual needs.
Rnew = R1 × (R2 / R1)^N
where Rnew is the pixel value of each R-channel pixel of the target image, R1 the pixel value of each R-channel pixel of the first image, R2 the pixel value of each R-channel pixel of the second image, and N the multiple by which the user needs to adjust; N may be an integer, a decimal, or negative.
(R2 / R1)^N is the gain data of the R pixel channel. The pixel-value formulas for the G and B pixel channels are analogous:
Gnew = G1 × (G2 / G1)^N,  Bnew = B1 × (B2 / B1)^N
From these, the pixel values of all pixels of the target image are obtained.
From the above formula it can be seen that when the user selects N = 0 on a terminal 5000 using this imaging system 1000, Rnew = R1, i.e., the target image is the first image, meaning the user wants an image not filtered by the optical filter 40. When N = 1, Rnew = R2, i.e., the target image is the second image, an image filtered once by the filter 40. The larger the N the user selects, the more the target image appears as if filtered by a larger number of filters 40.
Thus, when the user wants better-rendered skin, a larger N can be selected, and N may be fractional, allowing the user to tune a satisfactory target image; with a single optical filter 40 added to the imaging system 1000, the generated target image can exhibit the imaging effect of multiple filters 40.
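The formula above — Rnew = R1 × (R2/R1)^N, with analogous expressions for the G and B channels — can be sketched as follows; `adjust` is a hypothetical name and the clamp guarding against zero-valued pixels is an added assumption:

```python
import numpy as np

def adjust(first, second, n):
    """target = first * (second / first) ** n.

    n = 0 reproduces the first (unfiltered) image, n = 1 the second
    (once-filtered) image; larger n emulates stacking more filters.
    n may be fractional or negative.
    """
    ratio = second / np.maximum(first, 1e-6)
    return first * ratio ** n

f = np.array([[100.0, 80.0]])  # hypothetical first-image channel values
s = np.array([[50.0, 40.0]])   # same pixels after one filter
```

Here `adjust(f, s, 0)` returns the first image, `adjust(f, s, 1)` the second, and `adjust(f, s, 2)` an image as if filtered twice.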
Referring to Fig. 15, the present application further provides an imaging system 2000, which may include an imaging module 100, a movable optical filter 40, one or more processors 300, and a driving member 600.
Referring to Fig. 16, the imaging module 100 may include a module body 10, a lens group 20, and an image sensor 30. The module body 10 defines a receiving space 11 for housing the lens group 20 and the image sensor 30. The optical filter 40 filters out part of the light within the specific band from the first light entering the image sensor 30, so that the image sensor 30 receives the second light.
The specific band is the band in which the reflectance of melanin is higher than that of hemoglobin, and it lies within the red band. In embodiments of the present application, the specific band is 530 nm–580 nm.
Referring to Figs. 2 to 5, once the imaging module 100 uses the optical filter 40 to remove at least part of the 530 nm–580 nm band from the light entering the image sensor 30, the transmittance of that light decreases and the melanin reflectance of this band approaches the hemoglobin reflectance; after part of this band is removed, the ratio of the red channel's relative sensitivity to the green channel's increases, as does the blue-to-green ratio, so that in the final target image the skin tone shifts toward pink and the skin is rendered better.
Specifically, the lens group 20 includes at least one lens and has an object side 201 and an image side 202. As shown in Fig. 7, taking a lens group 20 with a single lens as an example, the positions of the object side 201 and the image side 202 depend on the direction of incidence: the object side 201 is the side of the light-entry surface 21 before the light enters the lens, and the image side 202 is the side of the light-exit surface 22 from which the light, having been converged by the lens, leaves it.
Referring to Figs. 1 and 8, the image sensor 30 is located on the image side 202 of the lens group 20. When it receives the first light, unfiltered by the optical filter 40, it obtains the first image; when it receives the second light, i.e., the first light filtered by the filter 40, it obtains the second image, whose pixels correspond to those of the first image.
In one example, the image sensor 30 includes a microlens array 31, a color filter array 32, and a pixel array 33, stacked in that order.
Specifically, the microlens array 31 includes a plurality of microlenses 311, which converge the light leaving the lens so as to guide more of the incident light to the color filter array 32. The color filter array 32 includes a plurality of color filters 321, each of which filters part of the light so that light of the red, green, and blue channels (or red, yellow, and blue, or other channels) enters the pixel array 33. The pixel array 33 includes a plurality of pixels 331, which convert the received optical signals into electrical signals.
The color filter array 32 is disposed between the microlens array 31 and the pixel array 33, and each microlens 311 corresponds to one color filter 321 and one pixel 331. Along the light-receiving direction of the image sensor 30, light passes through the microlens 311 to the color filter 321 and, after filtering by the color filter 321, reaches the corresponding pixel 331.
Referring to Figs. 15 and 16, the driving member 600 drives the optical filter 40 to rotate and/or translate relative to the optical axis 23 of the lens group 20, so that the filter 40 selectively moves out of or into the incident light path of the image sensor 30.
Specifically, the driving member 600 may be provided with a connecting member 601 connected to the optical filter 40; the connecting member 601 can rotate or translate on the driving member 600 relative to the optical axis of the lens group 20, carrying the filter 40 in rotation or translation relative to the optical axis 23.
In one embodiment, as shown in Fig. 16, when the driving member 600 translates the filter 40 relative to the optical axis 23 in the direction away from the module body 10, the connecting member 601 moves away from the module body 10 so that the filter 40 lies outside the incident light path of the image sensor 30. The first light entering the imaging module 100 then does not have part of the specific band removed by the filter 40; the sensor receives the first light and obtains the first image.
In another embodiment, as shown in Fig. 17, when the driving member 600 translates the filter 40 relative to the optical axis 23 toward the module body 10, the connecting member 601 moves toward the module body 10 and the filter 40 enters the incident light path of the image sensor 30. The first light entering the imaging module 100 then has part of the specific band removed by the filter 40, so the sensor receives the second light and obtains the second image.
In certain embodiments, the driving member 600 may also be disposed inside the imaging module 100 to translate the optical filter 40 within it. As shown in the left part of Fig. 18, when the driving member 600 has not rotated the filter 40 relative to the optical axis 23 of the lens group 20, the filter 40 may lie between the lens group 20 and the image sensor 30, within the incident light path, so that the light entering the sensor is the second light and the second image is obtained. After the driving member 600 rotates the filter 40 relative to the optical axis (right part of Fig. 18), the filter 40 lies outside the incident light path, the light entering the sensor is the first light, and the first image is obtained. Likewise, the driving member 600 and the filter 40 may be disposed between any two lenses of the lens group 20 and can selectively move out of or into the incident light path, so that the image sensor 30 can selectively obtain the first or second image.
The optical filter 40 is mounted on the driving member 600 so that it can be moved by it. When the filter 40 moves out of the incident light path of the image sensor 30, the sensor receives the first light to obtain the first image; when it moves into the incident light path, the sensor receives the second light, i.e., the first light excluding the specific band, to obtain the second image, whose pixels correspond to those of the first image.
The optical filter 40 may be disposed outside the module body 10 or housed inside it.
In one embodiment, when disposed outside the module body 10, the optical filter 40 may be placed on the top wall of the module body 10. Light passes through the filter 40 before entering the module body 10, removing at least part of the specific band, so the light entering the module body 10 and reaching the image sensor 30 is the second light and the second image is obtained. In this case, as shown in Fig. 15, the driving member 600 is located to one side outside the module body 10 and can translate the filter 40 relative to the optical axis 23 of the lens group 20 so that it slides across the top wall, selectively covering it; the light reaching the image sensor 30 can then be the first or second light, and the sensor obtains the first or second image.
In another embodiment, to protect the imaging module 100 from damage, a protective cover plate 50 is generally provided outside the module body 10, the cover plate 50 not affecting the module's light reception. Thus, when the optical filter 40 is outside the module body 10, as shown in Fig. 9, it may be integrated on the protective cover plate 50. Specifically, as shown in Fig. 9(b), the filter 40 may be embedded in the cover plate 50; as shown in Fig. 9(c), it may be disposed on the lower surface of the cover plate 50 (the surface close to the module body 10); as shown in Fig. 9(d), it may be disposed on the upper surface (the surface away from the module body 10). In each case light passes through the filter 40 before entering the module body 10, removing at least part of the specific band, so the light entering the imaging module 100 is already the second light and the module 100 obtains the second image. Note that the protective cover plate 50 may be a light-transmitting plate integrating a cover glass and a display screen.
In addition, the driving member 600 may be disposed on the protective cover plate 50 and drive the optical filter 40 to rotate and/or translate relative to the optical axis of the lens group 20. For example, with the filter 40 embedded in the cover plate 50, the driving member 600 can translate it within the plate, moving it into or out of the light path of the image sensor 30. As another example, with the filter 40 on the upper or lower surface of the cover plate 50, the driving member 600 can translate it over the plate, likewise moving it into or out of the light path of the image sensor 30, so that the first or second image is selectively obtained.
In yet another embodiment, when housed inside the module body 10, the optical filter 40 may be placed between the top wall of the module body 10 and the lens group 20 (Fig. 11(a)), between any two lenses of the lens group 20 (Fig. 11(b)), or between the lens group 20 and the image sensor 30 (Fig. 1). Light entering the imaging module 100 is then filtered by the filter 40 before reaching the pixel array 33 of the image sensor 30, so the sensor receives the second light and generates the second image. In this case the driving member 600 is likewise housed inside the module body 10 to rotate and/or translate the filter 40 relative to the optical axis 23 of the lens group 20, so that the filter selectively moves into or out of the incident light path and the first or second image is selectively obtained.
Thus, when capturing images with the imaging system, the user can freely choose whether to filter the light with the optical filter 40. If the user chooses not to, the imaging system 2000 obtains the first image, making the captured image closer to reality; if the user chooses to, the imaging system 2000 obtains the second image, which has pinker, better-rendered skin. The imaging system 2000 can also obtain the first and second images separately and fuse the first image with the second, so that skin is rendered better in the captured image.
The one or more processors 300 are configured to generate a gain data map from the pixel values of pixels of the first image and the pixel values of corresponding pixels of the second image, and to obtain the target image from the first image and the gain data map.
Referring to Fig. 12, once the imaging system 2000 obtains the first image 60 and the second image 70, the R-, G-, and B-channel pixel values of every pixel in both images are available. Take pixels P00, P01, P10, P11 of the first image 60 in Fig. 12 (first subscript digit: row; second: column) and the corresponding pixels P00', P01', P10', P11' of the second image 70 as an example. In the first image 60, pixel P00 is an R pixel with value R1, P01 and P10 are G pixels with values G1 and G1', and P11 is a B pixel with value B1; in the second image 70, P00' is an R pixel with value R2, P01' and P10' are G pixels with values G2 and G2', and P11' is a B pixel with value B2. The ratio of each second-image pixel to the first-image pixel at the corresponding position then gives the gain of the second image 70 relative to the first image 60: the gain for P00 and P00' is R2/R1, for P01 and P01' is G2/G1, for P10 and P10' is G2'/G1', and for P11 and P11' is B2/B1. This reveals how, after one optical filter 40 is added, the pixel values of all pixels of the generated second image 70 change relative to the corresponding pixels of the first image 60, and from this change a comparison map 80 covering every pixel of the first image 60 is generated.
Referring also to Figs. 2(a), 3, 13, and 14: Fig. 13 shows the reflectance of hemoglobin and melanin under light of different bands with two optical filters 40, and Fig. 14 with three (abscissa: wavelength; ordinate: reflectance). Figs. 2(a) and 3 show that, with one filter added, the difference between the melanin and hemoglobin reflectance for part of the 530 nm–580 nm band decreases relative to Fig. 2(a). Combining Figs. 13 and 14 further shows that as the number of filters 40 increases, the melanin reflectance of this band approaches the hemoglobin reflectance; hence, the more filters 40, the better the skin rendering in the target image. Likewise, the more filters 40, the larger the red-to-green and blue-to-green relative-sensitivity ratios, and the pinker the skin tone in the target image.
Since the target image should be the image the user expects and the imaging module 100 generally has only one optical filter 40, making the result user-adjustable requires computing adjusted R-, G-, and B-channel pixel values for every pixel of the target image. The specific computation is the same as described above and is not repeated here.
Thus, with a single optical filter 40 added to the imaging system 2000, the generated target image can exhibit the imaging effect of multiple filters 40.
Referring to Fig. 19, an embodiment of the present application further provides an image processing method, including the steps:
01: receiving first light to obtain a first image;
03: receiving second light, obtained by filtering out all or part of a specific band from the first light, to obtain a second image, the pixels of the second image having pixels corresponding to those of the first image;
07: generating a gain data map from the pixel values of pixels of the first image and the pixel values of corresponding pixels of the second image; and
09: obtaining a target image from the first image and the gain data map.
Specifically, the first and second images may be generated respectively by the imaging assembly 200 and the imaging module 100 of the above embodiments, or both by the imaging module 100 (when the optical filter 40 can selectively move into or out of the light-receiving path of the image sensor 30). The pixels of the second image generated by the imaging module 100 correspond to those of the first image, i.e., the number of pixels of the second image is at least equal to the number of pixels of the first image.
The first light may be ambient light, or light whose infrared and ultraviolet components have been removed by the filters of the image sensor 30. The second light is obtained from the first light by the optical filter 40 removing all or part of the specific band. The first image is thus the original image captured by the imaging system 1000, and the second image the improved image obtained after the filter 40 removes part of the specific band.
The specific band is the band in which the reflectance of melanin is higher than that of hemoglobin, and it lies within the red band. In embodiments of the present application, the specific band is 530 nm–580 nm.
From Figs. 2 to 5, compared with the first image, the skin tone in the second image shifts toward pink and the reflectance of melanin in the skin is suppressed, so skin is rendered better in the second image.
Further, when the first and second images are generated respectively by the imaging assembly 200 and the imaging module 100 of the above embodiments, the shooting positions (with shooting parallax) and times of the two images cannot be exactly identical, since they are obtained by different components. Therefore, before computing the gain data map, a method such as the scale-invariant feature transform (SIFT) is needed to align the pixels of the second image with the corresponding pixels of the first image; the gain data map can then be generated from the pixel values of pixels of the first image and the pixel values of corresponding pixels of the second image.
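As a stand-in for the SIFT-based alignment mentioned above, the sketch below recovers a whole-pixel translation between two single-channel captures by phase correlation; a real alignment pipeline would also handle rotation, scale, and sub-pixel shifts, and the function name is an assumption:

```python
import numpy as np

def estimate_shift(a, b):
    """Estimate the (dy, dx) roll that aligns image b onto image a.

    The inverse FFT of the normalized cross-power spectrum peaks at
    the translation between the two images.
    """
    f = np.fft.fft2(a) * np.conj(np.fft.fft2(b))
    corr = np.fft.ifft2(f / (np.abs(f) + 1e-9)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = a.shape
    return (dy - h if dy > h // 2 else dy, dx - w if dx > w // 2 else dx)

rng = np.random.default_rng(0)
a = rng.random((16, 16))
b = np.roll(a, (2, 3), axis=(0, 1))  # b is a, shifted down 2 and right 3
shift = estimate_shift(a, b)
```

Applying `np.roll(b, shift, axis=(0, 1))` then reproduces `a`, after which the per-pixel ratios of the two images can be taken position by position.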
More specifically, taking corresponding pixels of the first and second images as an example, comparing the pixel values of the R, G, and B pixel channels of the second image with those of the R, G, and B pixel channels of the first image yields the gain data of the second image's pixel values relative to the first image. Once all corresponding pixels of the two images have been compared, the ratios of all pixel values of the second image to those of the first image are obtained. These ratios indicate how each pixel value of the second image, generated after filtering through one optical filter 40, changes relative to the corresponding pixel of the first image generated without that filtering.
Referring also to Figs. 2(a), 3, 13, and 14, as the number of optical filters 40 increases, the melanin reflectance of this band approaches the hemoglobin reflectance; hence, the more filters 40, the pinker the skin tone in the target image and the better and healthier the skin rendering. Since the target image is the image the user expects, the above ratios need further adjustment — for example multiplicative or exponential scaling — to obtain the gain data map, which is then used to adjust every pixel value of the first image so that the pixel values of the target image match the user's expectation.
In the image processing method of embodiments of the present application, the first and second images are obtained by receiving the first and second light respectively, the gain data map is generated from the pixel values of corresponding pixels of the first and second images, and the target image is finally obtained from the first image and the gain data map. Because the second light is obtained by removing all or part of the specific band from the first light, for the reasons given above the second image renders the skin of the skin-color region better and healthier than the first image without the skin-color region first having to be identified and segmented; therefore, after every pixel value of the first image is adjusted via the gain data map, the resulting target image renders skin better and healthier.
Referring to Fig. 20, in certain modes, step 07 — generating the gain data map from the pixel values of pixels of the first image and the pixel values of corresponding pixels of the second image — further includes the steps:
071: obtaining the ratio between the pixel values of pixels of the first image and the pixel values of corresponding pixels of the second image to generate a ratio map; and
073: generating the gain data map from the ratios in the ratio map and a preselected adjustment coefficient.
After the first and second images are obtained, the ratio map (Fig. 12(c)) can be generated from the ratio between the pixel values of pixels of the first image and the pixel values of corresponding pixels of the second image.
Specifically, after the first and second images are obtained, the pixel value of every pixel in both images is available. Take pixels P00, P01, P10, P11 of the first image 60 in Fig. 12 (first subscript digit: row; second: column) and the corresponding pixels P00', P01', P10', P11' of the second image 70 as an example. In the first image 60, pixel P00 is an R pixel with value R1, P01 and P10 are G pixels with values G1 and G1', and P11 is a B pixel with value B1; in the second image 70, P00' is an R pixel with value R2, P01' and P10' are G pixels with values G2 and G2', and P11' is a B pixel with value B2. The ratio of each second-image pixel to the first-image pixel at the corresponding position then gives the gain of the second image 70 relative to the first image 60: the gain for P00 and P00' is R2/R1, for P01 and P01' is G2/G1, for P10 and P10' is G2'/G1', and for P11 and P11' is B2/B1. From these, the ratio map (Fig. 12(c)) can be generated, which indicates how, after the first light is filtered through one optical filter 40, each pixel of the captured first image changes in value relative to the corresponding pixel of the second image.
Since a single optical filter 40 cannot by itself produce exactly the image the user expects, each ratio in the ratio map needs to be adjusted to obtain the expected target image; the gain data map is generated by applying a preselected adjustment coefficient to the ratios of the ratio map. The specific formula is:
Rnew = R1 × (R2 / R1)^N
where Rnew is the pixel value of each R-channel pixel of the target image, R1 the pixel value of each R-channel pixel of the first image, R2 the pixel value of each R-channel pixel of the second image, and N the preselected adjustment coefficient, i.e., the multiple by which the user needs to adjust; N may be an integer, a decimal, or negative.
(R2 / R1)^N is the gain data of the R pixel channel. The pixel-value formulas for the G and B pixel channels are analogous:
Gnew = G1 × (G2 / G1)^N,  Bnew = B1 × (B2 / B1)^N
Thus, before the pixel values of the target image are obtained, the gain data each pixel of the target image needs relative to the first image can be obtained by tuning the magnitude of N; the gain data of the many pixels together constitute the gain data map.
It should be noted that as N increases gradually from 1, the final target image appears as if filtered by more and more optical filters 40, i.e., the skin in the target image becomes better and better and the skin tone pinker and pinker; as N decreases gradually from 1, the final target image appears unfiltered or even inversely adjusted, i.e., the skin in the target image becomes worse and worse and the skin tone yellower and yellower. The user can thus flexibly tune the preset adjustment coefficient to obtain a target image matching their needs.
Referring to Fig. 21, in certain modes the first image includes a skin-color region and the second image includes a skin-color region, and step 07 — generating the gain data map from the pixel values of pixels of the first image and the pixel values of corresponding pixels of the second image — further includes the step:
075: generating the gain data map from the pixel values of pixels of the skin-color region of the first image and the pixel values of corresponding pixels of the skin-color region of the second image;
and step 09 — obtaining the target image from the first image and the gain data map — further includes the step:
091: adjusting the pixel values of the skin-color region of the first image according to the gain data map to obtain the target image.
Specifically, when both the first and second images contain a portrait, the gain data map may be generated only from the pixel values of pixels of the portrait's skin-color region in the first image and the pixel values of the corresponding pixels of the portrait's skin-color region in the second image, so that the final target image adjusts only the skin tone of the skin-color region of the first image, according to that region and its corresponding gain data map, making the skin tone of the skin-color region of the first image pinker and the skin rendering better.
The colors of the background region of the target image are thus unaffected, so the skin region of the target image is improved while the background colors are preserved.
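The skin-region-only adjustment described here can be sketched as follows, assuming a boolean skin mask is already available from some upstream detector; the mask and the function name are assumptions:

```python
import numpy as np

def adjust_skin_only(first, gain, mask, n):
    """Apply gain ** n only inside the skin mask; background pixels
    keep their original values, so background colors are unaffected."""
    out = first.astype(float).copy()
    out[mask] = out[mask] * gain[mask] ** n
    return out

first = np.full((2, 2), 100.0)
gain = np.full((2, 2), 0.5)  # e.g. R2/R1 entries of the gain data map
mask = np.array([[True, False], [False, True]])  # hypothetical skin mask
result = adjust_skin_only(first, gain, mask, 1)
```

Only the two masked pixels are scaled; the other two retain their original value of 100.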
Referring to Fig. 22, in certain modes the first image includes a skin-color region and the second image includes a skin-color region, and the image processing method further includes the step:
04: detecting the skin-color regions in the first image and the second image.
Specifically, before generating the gain data map, the one or more processors 300 first detect whether the captured first and second images contain a skin-color region, to decide whether to generate the gain data map and obtain the target image from the first image and the gain data map.
In one embodiment, if one of the first and second images is detected to contain no skin-color region, this indicates that the parallax between the two images, or the deviation between the captured scenes, is too large for them to be aligned. The one or more processors 300 then cannot generate a gain data map corresponding to the skin-color region and stop working, prompting the user to reshoot.
In another embodiment, if neither the first image nor the second image is detected to contain a skin-color region, no separate skin adjustment is needed; to keep the colors of the target image faithful, the processors 300 likewise do not generate a gain data map and directly output the first or second image as the target image.
In yet another embodiment, only if both the first and second images are detected to contain a skin-color region do the one or more processors 300 proceed to generate the gain data map and produce the target image from the first image and the gain data map, adjusting the skin-color region of the target image separately to obtain a target image with better-rendered, pinker skin.
In a further embodiment, the one or more processors 300 may also compute the difference between the average pixel value of all pixels of the first image and the average of all pixels at corresponding positions in the second image. If this difference exceeds a preset difference, the parallax or scene deviation is too large for alignment; the processors 300 cannot generate a gain data map corresponding to the skin-color region and stop working, prompting the user to reshoot. For example, with a preset difference of 50, a first-image pixel-value average of 140 and a corresponding second-image average of 200 indicate excessive parallax or scene deviation, so the images cannot be aligned. As another example, with a preset difference of 50, a first-image pixel-value average of 160 and a corresponding second-image average of 200 indicate the two images are normal; the processors 300 then proceed with the gain data map and generate the target image from the first image and the gain data map, adjusting the skin-color region separately to obtain a target image with better-rendered, pinker skin.
In still another embodiment, if both the first and second images are detected to contain a skin-color region, the one or more processors 300 further judge whether the difference between the first image's average pixel value and the second image's corresponding average exceeds a preset threshold. If it does, the skin-color regions of the two images differ considerably, and only then do the processors 300 proceed to generate the gain data map and produce the target image from the first image and the gain data map, adjusting the skin-color region separately to obtain a target image with better-rendered, pinker skin. If the difference is below the threshold, the skin-color regions of the two images are similar, no gain data map is needed, and the imaging system 1000 directly takes the second image as the target image.
The skin-color region may include all regions corresponding to exposed human skin in the portrait, for example the face and the hands, legs, or feet exposed outside clothing, ensuring that, when the captured image is a portrait, the final target image improves only the skin-color region of the image without affecting regions other than the skin-color region.
In the image processing method of the present application, detecting whether the first and second images both contain a skin-color region ensures that the final target image adjusts only the skin-color region, without affecting the colors of non-skin regions, yielding a target image with better-rendered, pinker skin.
Referring to Fig. 23, in certain modes the image processing method further includes the steps:
04: detecting the skin-color regions in the first image and the second image; and
06: when both the first image and the second image contain a skin-color region, detecting the gender of the skin in the first image and the second image.
Specifically, after obtaining the first and second images, the one or more processors 300 first detect whether the captured images contain a skin-color region and detect the gender of the skin in the two images, to decide whether to generate the gain data map and obtain the target image from the first image and the gain data map.
For example, after obtaining the two images, if the one or more processors 300 detect that at least one of them contains no skin-color region, the parallax between the images or the deviation between the captured scenes is too large for alignment. The processors 300 cannot generate a gain data map corresponding to the skin-color region; they stop working immediately, do not go on to judge the gender of the skin-color region, and prompt the user to reshoot.
As another example, after obtaining the two images, if the one or more processors 300 detect that both contain a skin-color region, they continue to detect the gender of the skin in the first and second images.
As a further example, after obtaining the images, the one or more processors 300 may also compute the difference between the average pixel value of all pixels of the first image and the average of all pixels at corresponding positions in the second image. If it exceeds the preset difference, the parallax or scene deviation is too large for alignment; the processors 300 cannot generate the gain data map and stop working, prompting a reshoot. For example, with a preset difference of 50, a first-image average of 140 and a second-image average of 200 indicate excessive parallax or scene deviation; with a preset difference of 50, a first-image average of 160 and a second-image average of 200 indicate the images are normal, so the processors 300 proceed with the gain data map and generate the target image from the first image and the gain data map, adjusting the skin-color region separately to obtain a target image with better-rendered, pinker skin.
As yet another example, after obtaining the images, if the one or more processors 300 detect that both contain a skin-color region, they further judge whether the difference between the first image's average pixel value and the second image's corresponding average exceeds a preset threshold. If it does, the skin-color regions of the two images differ considerably, and the processors 300 proceed with the gain data map and generate the target image from the first image and the gain data map, adjusting the skin-color region separately to obtain a target image with better-rendered, pinker skin. If the difference is below the threshold, the skin-color regions are similar, no gain data map is needed, and the imaging system 1000 directly takes the second image as the target image.
Next, only when at least one of the detected skin genders in the first and second images is female do the one or more processors 300 generate the gain data map and, from the first image and the gain data map, separately adjust the skin-color regions whose skin belongs to females, obtaining a target image in which female skin is rendered better and pinker.
In the image processing method of embodiments of the present application, detecting whether the first and second images contain a skin-color region and, when both do, judging whether the gender of the skin in the two images is female enables skin-tone adjustment specifically for target images containing female subjects.
Referring to Fig. 24, in certain embodiments the present application further provides a terminal 5000, which may include the imaging module 100 of any of the above embodiments; the imaging module 100 may be mounted in the housing of the terminal 5000 and connected to the main board of the terminal 5000.
Referring to Fig. 24, in certain embodiments the present application may further provide a terminal 5000 including the imaging system 1000 of any of the above embodiments; the imaging system 1000 may be mounted in the housing of the terminal 5000 and connected to the main board of the terminal 5000, the imaging system 1000 being used for imaging.
Referring to Fig. 24, in certain embodiments the present application may further provide a terminal 5000 including the imaging system 2000 of any of the above embodiments; the imaging system 2000 may be mounted in the housing of the terminal 5000 and connected to the main board of the terminal 5000, the imaging system 2000 being used for imaging.
Referring to Fig. 24, in certain embodiments the present application may further provide a terminal 5000 including one or more processors 300, which may be used to implement the image processing method of any of the above embodiments; for example, the terminal 5000 may be used to implement one or more of step 01, step 03, step 04, step 06, step 07, step 071, step 073, step 075, step 09, and step 091.
Referring to Fig. 25, in one embodiment, after the terminal 5000 obtains an image through the imaging system 1000 or the imaging system 2000 and displays it, the displayed image is only the image filtered through one optical filter 40, i.e., the second image. An adjustment-coefficient area 500 (namely N) may then be displayed on the terminal 5000; the user can drag from left to right to increase the value of N, thereby presenting an image as if filtered by multiple optical filters 40, and the target image is obtained once the user has tuned an image that meets their needs.
Moreover, as shown in Fig. 26, the terminal 5000 may also display an adjustment area 700. Before dragging the adjustment-coefficient area 500, the user may select the skin-color region 701 among the options of the adjustment area 700, so that when the user adjusts the value of N, only the skin-color part of the image changes.
Finally, as shown in Fig. 27, the terminal 5000 may also display a gender option 900. Before dragging the adjustment-coefficient area 500, the user may also select female 901 in the gender option 900, so that after the skin-color region 701 of the adjustment area 700 is selected, adjusting the value of N changes only the female skin within the image's skin-color region.
It should be noted that the adjustment-coefficient area 500, the adjustment area 700, and the gender option 900 of the terminal 5000 can all be set by the user before the terminal 5000 acquires an image, so that the image the terminal 5000 acquires already matches the user's expectation, i.e., is the target image.
In the description of this specification, reference to the terms "one embodiment", "some embodiments", "example", "specific example", or "some examples" means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present application. In this specification, schematic expressions of these terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in a suitable manner in any one or more embodiments or examples. In addition, where no contradiction arises, those skilled in the art may combine different embodiments or examples described in this specification, and features of different embodiments or examples.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be understood as indicating or implying relative importance or implicitly specifying the number of the indicated technical features. Thus, features defined with "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present application, "a plurality of" means at least two, for example two or three, unless otherwise specifically defined.
Any process or method description in a flowchart, or otherwise described herein, may be understood as representing a module, segment, or portion of code comprising one or more executable instructions for implementing specific logical functions or steps of the process; and the scope of preferred embodiments of the present application includes additional implementations in which functions may be executed out of the order shown or discussed, including substantially concurrently or in reverse order according to the functions involved, as should be understood by those skilled in the art to which the embodiments of the present application belong.
Although embodiments of the present application have been shown and described above, it will be understood that the above embodiments are exemplary and are not to be construed as limiting the present application; within the scope of the present application, those of ordinary skill in the art may make changes, modifications, substitutions, and variations to the above embodiments.

Claims (37)

  1. An imaging module, comprising:
    a module body;
    a lens group housed in the module body;
    an image sensor housed in the module body and located on the image side of the lens group; and
    an optical filter for filtering out at least part of the light in the 530 nm–580 nm band entering the image sensor.
  2. The imaging module of claim 1, wherein the optical filter is disposed outside the module body and located on the object side of the lens group.
  3. The imaging module of claim 2, wherein the optical filter is disposed on a top wall of the module body; or
    the optical filter is integrated on a protective cover plate.
  4. The imaging module of claim 1, wherein the optical filter is housed inside the module body.
  5. The imaging module of claim 4, wherein the optical filter is a monolithic filter structure independent of the image sensor and the lens group.
  6. The imaging module of claim 4, wherein the lens group comprises at least one lens and the optical filter is a filter film disposed on any one of the lenses.
  7. The imaging module of claim 4, wherein the optical filter is a monolithic filter structure integrated on the image sensor.
  8. The imaging module of any one of claims 1-5, wherein the optical filter is movable; when the optical filter is outside the incident light path of the image sensor, the image sensor is configured to receive first light; and when the optical filter moves into the incident light path of the image sensor, the image sensor is configured to receive second light, i.e., the first light excluding the 530 nm–580 nm band.
  9. An imaging system, comprising:
    an imaging assembly configured to receive first light to obtain a first image;
    the imaging module of any one of claims 1-8, configured to receive second light, i.e., the first light excluding the 530 nm–580 nm band, to obtain a second image, the pixels of the second image having pixels corresponding to those of the first image; and
    one or more processors configured to obtain a target image from the first image and the second image.
  10. An imaging system, comprising:
    an imaging module comprising an image sensor; and
    a movable optical filter, wherein when the optical filter moves out of the incident light path of the image sensor, the image sensor is configured to receive first light to obtain a first image, and when the optical filter moves into the incident light path of the image sensor, the image sensor is configured to receive second light, i.e., the first light excluding a specific band, to obtain a second image, the pixels of the second image having pixels corresponding to those of the first image.
  11. The imaging system of claim 10, further comprising:
    one or more processors configured to: generate a gain data map from the pixel values of pixels of the first image and the pixel values of corresponding pixels of the second image; and obtain a target image from the first image and the gain data map.
  12. The imaging system of claim 10, wherein the specific band is a band in which the reflectance of melanin is higher than the reflectance of hemoglobin.
  13. The imaging system of claim 12, wherein the specific band lies within the red band.
  14. The imaging system of claim 12, wherein the specific band is 530 nm–580 nm.
  15. The imaging system of claim 11, wherein the optical filter is disposed outside the imaging module.
  16. The imaging system of claim 15, wherein the optical filter is disposed on a top wall of the imaging module; or
    the optical filter is integrated on a protective cover plate.
  17. The imaging system of claim 14, wherein the optical filter is housed inside the imaging module.
  18. The imaging system of claim 17, wherein the imaging module further comprises:
    a module body, the image sensor being housed in the module body; and
    a lens group housed in the module body, the optical filter being located between the image sensor and the lens group.
  19. The imaging system of claim 10, further comprising:
    a driving member for driving the optical filter to rotate and/or translate relative to the optical axis of the lens group, so that the optical filter selectively moves out of or into the incident light path of the image sensor.
  20. An imaging system, comprising:
    an imaging assembly configured to receive first light to acquire a first image;
    an imaging module configured to receive second light to acquire a second image, the second light being the first light excluding a specific wavelength band, wherein pixels of the second image correspond to pixels of the first image; and
    one or more processors configured to: generate a gain data map according to pixel values of pixels in the first image and pixel values of pixels at corresponding positions in the second image; and acquire a target image according to the first image and the gain data map.
  21. The imaging system according to claim 20, wherein the specific wavelength band is a wavelength band in which the reflectance of melanin is higher than the reflectance of hemoglobin.
  22. The imaging system according to claim 21, wherein the specific wavelength band is within a red wavelength band.
  23. The imaging system according to claim 21, wherein the specific wavelength band is 530 nm to 580 nm.
  24. The imaging system according to claim 20, wherein the imaging module comprises:
    a module body;
    a lens group received in the module body;
    an image sensor received in the module body and located on an image side of the lens group; and
    an optical filter configured to filter out at least part of light in a wavelength band of 530 nm to 580 nm entering the image sensor.
  25. The imaging system according to claim 24, wherein the optical filter is disposed outside the module body and located on an object side of the lens group.
  26. The imaging system according to claim 25, wherein the optical filter is disposed on a top wall of the module body; or
    the optical filter is integrated on a protective cover plate.
  27. The imaging system according to claim 24, wherein the optical filter is received inside the module body.
  28. The imaging system according to claim 27, wherein the optical filter is a standalone filter structure independent of the image sensor and the lens group.
  29. The imaging system according to claim 27, wherein the lens group comprises at least one lens, and the optical filter is a filter film disposed on any one of the at least one lens.
  30. The imaging system according to claim 27, wherein the optical filter is a standalone filter structure integrated on the image sensor.
  31. The imaging system according to any one of claims 24 to 28, wherein the optical filter is movable; when the optical filter is located outside an incident light path of the image sensor, the image sensor is configured to acquire the first light; and when the optical filter is moved into the incident light path of the image sensor, the image sensor is configured to acquire the second light.
  32. An image processing method, comprising:
    receiving first light to acquire a first image;
    receiving second light to acquire a second image, the second light being obtained by filtering all or part of light in a specific wavelength band out of the first light, wherein pixels of the second image correspond to pixels of the first image;
    generating a gain data map according to pixel values of pixels in the first image and pixel values of pixels at corresponding positions in the second image; and
    acquiring a target image according to the first image and the gain data map.
  33. The image processing method according to claim 32, wherein the generating the gain data map according to pixel values of pixels in the first image and pixel values of pixels at corresponding positions in the second image comprises:
    obtaining ratios of the pixel values of the pixels in the first image to the pixel values of the pixels at the corresponding positions in the second image to generate a ratio map; and
    generating the gain data map according to the ratios in the ratio map and a preselected adjustment coefficient.
  34. The image processing method according to claim 32, wherein the first image comprises a skin-color region and the second image comprises a skin-color region; the generating the gain data map according to pixel values of pixels in the first image and pixel values of pixels at corresponding positions in the second image comprises:
    generating the gain data map according to pixel values of pixels in the skin-color region of the first image and pixel values of pixels at corresponding positions in the skin-color region of the second image; and
    the acquiring the target image according to the first image and the gain data map comprises:
    adjusting pixel values of the skin-color region in the first image according to the gain data map to acquire the target image.
  35. The image processing method according to claim 32, wherein the first image comprises a skin-color region, the second image comprises a skin-color region, and the image processing method further comprises:
    detecting the skin-color regions in the first image and the second image;
    wherein the generating the gain data map according to pixel values of pixels in the first image and pixel values of pixels at corresponding positions in the second image, and the acquiring the target image according to the first image and the gain data map, are performed when a skin-color region exists in both the first image and the second image.
  36. The image processing method according to claim 32, wherein the image processing method further comprises:
    detecting skin-color regions in the first image and the second image;
    when a skin-color region exists in both the first image and the second image, detecting the gender to which the skin color in the first image and the second image belongs;
    wherein the generating the gain data map according to pixel values of pixels in the first image and pixel values of pixels at corresponding positions in the second image, and the acquiring the target image according to the first image and the gain data map, are performed when the gender to which the skin color belongs is female.
  37. A terminal, wherein:
    the terminal comprises the imaging module according to any one of claims 1 to 8; or
    the terminal comprises the imaging system according to any one of claims 9 to 31; or
    the terminal comprises one or more processors configured to implement the image processing method according to any one of claims 32 to 36.
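The ratio-map and gain-map steps recited in claims 32 to 34 can be sketched in plain Python. This is a minimal illustration rather than the claimed implementation: the claims do not specify how the gain data map is applied to the first image, so a simple per-pixel multiplicative adjustment is assumed here, and the function names `gain_map` and `apply_gain`, the adjustment coefficient `k`, and the `eps` divide-by-zero guard are all illustrative rather than taken from the disclosure.

```python
def gain_map(first_img, second_img, k=1.0, eps=1e-6):
    """Claim 33 sketch: compute the ratio of each first-image pixel value
    to the pixel value at the corresponding position in the second image
    (the ratio map), then scale by a preselected adjustment coefficient k
    to produce the gain data map."""
    return [[k * p1 / (p2 + eps)          # eps guards against a zero pixel
             for p1, p2 in zip(row1, row2)]
            for row1, row2 in zip(first_img, second_img)]

def apply_gain(first_img, gain, mask=None):
    """Claims 32/34 sketch: adjust the first image with the gain data map
    to obtain the target image. The multiplicative update is an assumed
    form; mask, when given, restricts the adjustment to a detected
    skin-color region as in claim 34."""
    out = []
    for i, row in enumerate(first_img):
        new_row = []
        for j, p in enumerate(row):
            if mask is None or mask[i][j]:
                p = min(255, max(0, round(p * gain[i][j])))
            new_row.append(p)
        out.append(new_row)
    return out
```

With a second image captured through the 530 nm to 580 nm filter, pixels whose values are nearly identical in both captures yield ratios near 1 and are left almost unchanged, while pixels that differ most between the two captures (on this reading, skin regions affected by the filtered band) receive the largest adjustment.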
PCT/CN2021/115402 2021-08-30 2021-08-30 Imaging module, imaging system, image processing method, and terminal WO2023028769A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2021/115402 WO2023028769A1 (zh) 2021-08-30 2021-08-30 Imaging module, imaging system, image processing method, and terminal
CN202180099921.6A CN117581155A (zh) 2021-08-30 2021-08-30 Imaging module, imaging system, image processing method, and terminal

Publications (1)

Publication Number Publication Date
WO2023028769A1 (zh)

Family

ID=85411770

Country Status (2)

Country Link
CN (1) CN117581155A (zh)
WO (1) WO2023028769A1 (zh)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104781847A (zh) * 2012-09-28 2015-07-15 Japan Science and Technology Agency Visual illusion analysis device and method, and device and method for generating an image with reference to visual illusion
CN104966267A (zh) * 2015-07-02 2015-10-07 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Method and device for beautifying a user image
JP2016178600A (ja) * 2015-03-23 2016-10-06 Casio Computer Co., Ltd. Image processing apparatus, image processing method, and program
CN107343188A (zh) * 2017-06-16 2017-11-10 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Image processing method, device, and terminal
CN107872608A (zh) * 2016-09-26 2018-04-03 Huawei Technologies Co., Ltd. Image capture device and image processing method
CN108460732A (zh) * 2016-12-22 2018-08-28 Apical Ltd Image processing
CN110365891A (zh) * 2019-08-20 2019-10-22 Huizhou TCL Mobile Communication Co., Ltd. Camera module and mobile terminal
CN113259546A (zh) * 2020-02-11 2021-08-13 Huawei Technologies Co., Ltd. Image acquisition device and image acquisition method

Also Published As

Publication number Publication date
CN117581155A (zh) 2024-02-20

Legal Events

121: EP has been informed by WIPO that EP was designated in this application. Ref document number: 21955349; country of ref document: EP; kind code of ref document: A1.
WWE (WIPO information: entry into national phase): ref document number: 202180099921.6; country of ref document: CN.
NENP (non-entry into the national phase): ref country code: DE.