CN110868506A - Image processing method and electronic device

Info

Publication number
CN110868506A
CN110868506A (application CN201810907479.5A)
Authority
CN
China
Prior art keywords
image
camera module
infrared light
distribution information
light pattern
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201810907479.5A
Other languages
Chinese (zh)
Inventor
陈冠宏
李宗政
林君翰
周祥禾
詹明山
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanchang OFilm Biometric Identification Technology Co Ltd
Original Assignee
Nanchang OFilm Biometric Identification Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanchang OFilm Biometric Identification Technology Co Ltd filed Critical Nanchang OFilm Biometric Identification Technology Co Ltd
Priority to CN201810907479.5A priority Critical patent/CN110868506A/en
Publication of CN110868506A publication Critical patent/CN110868506A/en
Pending legal-status Critical Current

Classifications

    • H ELECTRICITY
      • H04 ELECTRIC COMMUNICATION TECHNIQUE
        • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
          • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
            • H04N 23/50 Constructional details
              • H04N 23/55 Optical parts specially adapted for electronic image sensors; Mounting thereof
            • H04N 23/80 Camera processing pipelines; Components thereof
              • H04N 23/81 Camera processing pipelines; Components thereof for suppressing or minimising disturbance in the image signal generation
              • H04N 23/84 Camera processing pipelines; Components thereof for processing colour signals
                • H04N 23/88 Camera processing pipelines; Components thereof for processing colour signals for colour balance, e.g. white-balance circuits or colour temperature control
          • H04N 9/00 Details of colour television systems
            • H04N 9/64 Circuits for processing colour signals

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The invention discloses an image processing method and an electronic device. The image processing method is applied to the electronic device. The electronic device includes a structured light projector and a camera module. The structured light projector emits an infrared light pattern toward a target object when turned on, and the camera module can receive visible light and the infrared light pattern for imaging. The image processing method comprises the following steps: controlling the structured light projector to turn on and acquiring an opening image corresponding to the camera module; controlling the structured light projector to turn off and acquiring a closed image corresponding to the camera module; acquiring distribution information of the infrared light pattern according to the opening image and the closed image; and acquiring depth information of the target object according to the distribution information of the infrared light pattern. Because the distribution information of the infrared light pattern emitted by the structured light projector and modulated by the target object is obtained from the opening image and the closed image, the influence of ambient light is removed, and the detection accuracy of the depth information is improved.

Description

Image processing method and electronic device
Technical Field
The present invention relates to the field of image processing technologies, and in particular, to an image processing method and an electronic device.
Background
Currently, electronic devices such as cell phones and tablets typically include a structured light projector and an infrared camera. The structured light projector emits an infrared light pattern toward a target object, and the infrared camera receives the infrared light pattern reflected by the target object to acquire depth information of the target object. However, when the ambient light interferes with the infrared light pattern, the detection accuracy of the depth information may be affected.
Disclosure of Invention
The embodiment of the invention provides an image processing method and an electronic device.
An image processing method of an embodiment of the present invention is applied to an electronic device including a structured light projector for emitting an infrared light pattern toward a target object when turned on and a camera module capable of receiving visible light and the infrared light pattern for imaging. The image processing method includes: controlling the structured light projector to turn on and acquiring an opening image corresponding to the camera module; controlling the structured light projector to turn off and acquiring a closed image corresponding to the camera module; acquiring distribution information of the infrared light pattern according to the opening image and the closed image; and acquiring depth information of the target object according to the distribution information of the infrared light pattern.
In the image processing method of the embodiment of the invention, the opening image acquired while the structured light projector is on contains the visible light information, the infrared light pattern information modulated by the target object, and the infrared light information in the environment in the same band as the infrared light pattern. The closed image acquired while the structured light projector is off contains only the visible light information and the ambient infrared light information in that band. The visible light information and the ambient infrared light information corresponding to the closed image can therefore be removed from the opening image, so that the resulting distribution pattern includes only the infrared light pattern information emitted by the structured light projector and modulated by the target object, namely the distribution information of the infrared light pattern. The influence of the ambient light is thus removed, which helps to improve the detection accuracy of the depth information.
In some embodiments, the step of controlling the structured light projector to turn on and acquire an on image corresponding to the camera module comprises: when the structured light projector is started, controlling the camera module to collect a plurality of frames of first initial images; and selecting the opening image from the plurality of frames of first initial images according to focusing data corresponding to the plurality of frames of first initial images. And/or the step of controlling the structured light projector to close and acquire a closed image corresponding to the camera module comprises: when the structured light projector is closed, controlling the camera module to collect a plurality of frames of second initial images; and selecting the closed image from the plurality of frames of second initial images according to focusing data corresponding to the plurality of frames of second initial images.
The opening image and the closing image are selected according to a plurality of continuous initial images and corresponding focusing data collected by the camera module, and the selected opening image and the selected closing image are clear, so that the detection precision of the depth information is improved. Or, only the starting image is selected according to a plurality of continuous initial images and corresponding focusing data collected by the camera module, so that the calculation amount can be reduced, and the detection speed of the depth information is improved. Or, only the closed image is selected according to a plurality of continuous initial images and corresponding focusing data collected by the camera module, so that the calculation amount can be reduced, and the detection speed of the depth information is improved.
In some embodiments, before the step of acquiring the distribution information of the infrared light pattern according to the on image and the off image, the image processing method further includes: carrying out noise reduction processing on the opening image; and/or performing noise reduction processing on the closed image.
The opening image and the closing image are subjected to noise reduction processing, so that the quality of the opening image and the quality of the closing image can be improved, and the detection precision of the depth information is favorably improved. Or, only the noise reduction processing is carried out on the opening image, so that the calculation amount can be saved, and the detection speed of the depth information can be improved. Or, only the closed image is subjected to noise reduction processing, so that the calculation amount can be saved, and the detection speed of the depth information can be improved.
In some embodiments, the electronic device further includes an ambient light sensor for sensing a color temperature and a brightness of ambient light, and the image processing method further includes, before the step of acquiring distribution information of the infrared light pattern from the on image and the off image: carrying out color correction on the opening image according to the color temperature and the brightness; and/or performing color correction on the closed image according to the color temperature and the brightness.
Performing color correction on the opening image and the closed image according to the color temperature and the brightness of the ambient light makes the colors of the two images more accurate, which helps to improve the detection accuracy of the depth information. Alternatively, only the opening image is color-corrected according to the color temperature and brightness of the ambient light, which reduces the calculation amount of the color correction and improves the detection speed of the depth information. Alternatively, only the closed image is color-corrected, which likewise reduces the calculation amount of the color correction and improves the detection speed of the depth information.
In some embodiments, the electronic device has predetermined distribution information stored in a database, and the step of obtaining the depth information of the target object according to the distribution information of the infrared light pattern includes: and comparing the distribution information of the infrared light pattern with the preset distribution information to obtain the depth information.
By comparing the difference between the distribution information of the infrared light pattern and the preset distribution information, the depth information of the target object can be obtained more accurately.
In some embodiments, the camera modules include a first camera module and a second camera module, and the opening image includes a first opening image corresponding to the first camera module and a second opening image corresponding to the second camera module; the closed images comprise a first closed image corresponding to the first camera module and a second closed image corresponding to the second camera module; the step of acquiring the distribution information of the infrared light pattern according to the opening image and the closing image comprises the following steps: acquiring first distribution information of the infrared light pattern according to the first opening image and the first closing image and acquiring second distribution information of the infrared light pattern according to the second opening image and the second closing image; the step of acquiring the depth information of the target object according to the distribution information of the infrared light pattern includes: determining corresponding feature points in the first opening image and the second opening image in an auxiliary manner according to the first distribution information and the second distribution information; and obtaining the depth information of the target object through a binocular imaging algorithm according to the characteristic points.
The corresponding feature points in the first opening image and the second opening image are determined in an auxiliary mode through the first distribution information and the second distribution information, the matching speed of the feature points can be increased, and therefore the detection speed of the depth information is improved.
An electronic device of an embodiment of the invention includes a structured light projector, a camera module, and a processor. The structured light projector emits an infrared light pattern toward a target object when turned on. The camera module can receive visible light and the infrared light pattern for imaging. The processor is configured to control the structured light projector to turn on and acquire an opening image corresponding to the camera module, control the structured light projector to turn off and acquire a closed image corresponding to the camera module, acquire distribution information of the infrared light pattern according to the opening image and the closed image, and acquire depth information of the target object according to the distribution information of the infrared light pattern.
The electronic device of the embodiment of the invention acquires, while the structured light projector is on, an opening image containing the visible light information, the infrared light pattern information modulated by the target object, and the infrared light information in the environment in the same band as the infrared light pattern; while the structured light projector is off, it acquires a closed image containing only the visible light information and the ambient infrared light information in that band. The visible light information and the ambient infrared light information corresponding to the closed image can then be removed from the opening image, so that the resulting distribution pattern includes only the infrared light pattern information emitted by the structured light projector and modulated by the target object, namely the distribution information of the infrared light pattern. The influence of the ambient light is removed, which helps to improve the detection accuracy of the depth information.
In some embodiments, the processor is further configured to control the camera module to capture a plurality of frames of first initial images and select the opening image from the plurality of frames of first initial images according to focusing data corresponding to the plurality of frames of first initial images when the structured light projector is turned on. And/or controlling the camera module to collect a plurality of frames of second initial images when the structured light projector is closed; and selecting the closed image from the plurality of frames of second initial images according to focusing data corresponding to the plurality of frames of second initial images.
The opening image and the closing image are selected according to a plurality of continuous initial images and corresponding focusing data collected by the camera module, and the selected opening image and the selected closing image are clear, so that the detection precision of the depth information is improved. Or, only the starting image is selected according to a plurality of continuous initial images and corresponding focusing data collected by the camera module, so that the calculation amount can be reduced, and the detection speed of the depth information is improved. Or, only the closed image is selected according to a plurality of continuous initial images and corresponding focusing data collected by the camera module, so that the calculation amount can be reduced, and the detection speed of the depth information is improved.
In some embodiments, the processor is further configured to perform noise reduction processing on the on image. And/or performing noise reduction processing on the closed image.
The electronic device performs noise reduction on the opening image and the closing image, so that the quality of the opening image and the quality of the closing image can be improved, and the detection precision of the depth information is improved. Or, only the noise reduction processing is carried out on the opening image, so that the calculation amount can be saved, and the detection speed of the depth information can be improved. Or, only the closed image is subjected to noise reduction processing, so that the calculation amount can be saved, and the detection speed of the depth information can be improved.
In some embodiments, the electronic device further comprises an ambient light sensor configured to sense a color temperature and a brightness of ambient light, and the processor is further configured to color correct the on image according to the color temperature and the brightness. And/or performing color correction on the closed image according to the color temperature and the brightness.
The electronic device performs color correction on the opening image and the closed image according to the color temperature and the brightness of the ambient light, so that the colors of the opening image and the closed image are more accurate, which helps to improve the detection accuracy of the depth information. Alternatively, only the opening image is color-corrected according to the color temperature and brightness of the ambient light, which reduces the calculation amount of the color correction and improves the detection speed of the depth information. Alternatively, only the closed image is color-corrected, which likewise reduces the calculation amount of the color correction and improves the detection speed of the depth information.
In some embodiments, the predetermined distribution information is stored in a database of the electronic device, and the processor is further configured to compare the distribution information of the infrared light pattern with the predetermined distribution information to obtain the depth information.
The electronic device can obtain the depth information of the target object more accurately by comparing the difference between the distribution information of the infrared light pattern and the preset distribution information.
In some embodiments, the camera modules include a first camera module and a second camera module, and the opening image includes a first opening image corresponding to the first camera module and a second opening image corresponding to the second camera module; the closed images comprise a first closed image corresponding to the first camera module and a second closed image corresponding to the second camera module; the processor is further configured to obtain first distribution information of the infrared light pattern according to the first opening image and the first closing image, obtain second distribution information of the infrared light pattern according to the second opening image and the second closing image, assist in determining corresponding feature points in the first opening image and the second opening image according to the first distribution information and the second distribution information, and obtain depth information of the target object through a binocular imaging algorithm according to the feature points.
The electronic device determines the corresponding feature points in the first opening image and the second opening image in an auxiliary mode through the first distribution information and the second distribution information, so that the matching speed of the feature points can be increased, and the detection speed of the depth information is improved.
Additional aspects and advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts.
FIG. 1 is a schematic flow diagram of an image processing method according to some embodiments of the invention;
FIG. 2 is a schematic structural diagram of an electronic device according to some embodiments of the present invention;
FIG. 3 is a schematic view of a camera module according to some embodiments of the present invention;
FIG. 4 is a schematic flow chart diagram of an image processing method according to some embodiments of the invention;
FIG. 5 is a schematic flow chart diagram of an image processing method according to some embodiments of the invention;
FIG. 6 is a schematic flow chart diagram of an image processing method according to some embodiments of the invention;
FIG. 7 is a schematic flow chart diagram of an image processing method according to some embodiments of the invention;
fig. 8 to 9 are schematic structural views of an electronic device according to some embodiments of the invention;
FIG. 10 is a schematic flow chart diagram of an image processing method according to some embodiments of the invention;
fig. 11 to 12 are schematic structural views of electronic devices according to some embodiments of the invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the accompanying drawings are illustrative only for the purpose of explaining the present invention, and are not to be construed as limiting the present invention.
In the description of the present invention, it is to be understood that the terms "first", "second" and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implying any number of technical features indicated. Thus, features defined as "first", "second", may explicitly or implicitly include one or more of the described features. In the description of the present invention, "a plurality" means two or more unless specifically defined otherwise.
In the description of the present invention, it should be noted that, unless otherwise explicitly specified or limited, the terms "mounted" and "connected" are to be construed broadly, e.g., as a fixed connection, a removable connection, or an integral connection; the connection may be mechanical or electrical, or the elements may communicate with each other; it may be direct, or indirect through an intervening medium, or an internal connection between two elements. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to the specific situation.
The following disclosure provides many different embodiments or examples for implementing different features of the invention. To simplify the disclosure of the present invention, the components and arrangements of specific examples are described below. Of course, they are merely examples and are not intended to limit the present invention. Furthermore, the present invention may repeat reference numerals and/or letters in the various examples, such repetition is for the purpose of simplicity and clarity and does not in itself dictate a relationship between the various embodiments and/or configurations discussed. In addition, the present invention provides examples of various specific processes and materials, but one of ordinary skill in the art may recognize applications of other processes and/or uses of other materials.
Referring to fig. 1 and fig. 2, an image processing method according to an embodiment of the invention is applied to an electronic device 100. The electronic device 100 includes a structured light projector 20 and a camera module 10. The structured light projector 20 is operative to emit an infrared light pattern toward a target object when turned on. The camera module 10 is capable of receiving visible light and infrared light patterns for imaging. The image processing method comprises the following steps:
012: controlling the structured light projector 20 to start and acquire a start image corresponding to the camera module 10;
014: controlling the structured light projector 20 to close and acquire a closed image corresponding to the camera module 10;
016: acquiring distribution information of the infrared light pattern according to the opening image and the closing image; and
018: and acquiring the depth information of the target object according to the distribution information of the infrared light pattern.
Referring to fig. 2, an electronic device 100 according to an embodiment of the invention includes a structured light projector 20, a camera module 10, and a processor 30. The structured light projector 20 is operative to emit an infrared light pattern toward a target object when turned on. The camera module 10 is capable of receiving visible light and infrared light patterns for imaging. The processor 30 is configured to control the structured light projector 20 to turn on and acquire an on image corresponding to the camera module 10, control the structured light projector 20 to turn off and acquire an off image corresponding to the camera module 10, acquire distribution information of the infrared light pattern according to the on image and the off image, and acquire depth information of the target object according to the distribution information of the infrared light pattern.
That is, 012, 014, 016, and 018 can be implemented by processor 30.
Specifically, the processor 30 is electrically connected to both the structured light projector 20 and the camera module 10 (as shown in FIG. 8). First, the processor 30 controls the structured light projector 20 to turn on. After being turned on, the structured light projector 20 emits an infrared light pattern toward the target object, and the camera module 10 collects the visible light in the environment, the infrared light pattern modulated by the target object, and the ambient infrared light in the same band as the infrared light pattern to form an opening image, which the processor 30 acquires from the camera module 10. Then, the processor 30 controls the structured light projector 20 to turn off. At this time the structured light projector 20 no longer emits the infrared light pattern, so the camera module 10 collects only the ambient visible light and the ambient infrared light in the same band as the infrared light pattern to form a closed image, which the processor 30 acquires from the camera module 10. Next, the processor 30 obtains the distribution information of the infrared light pattern from the opening image (which contains the ambient visible light information, the infrared light pattern information modulated by the target object, and the ambient infrared light information in the same band as the infrared light pattern) and the closed image (which contains the ambient visible light information and the ambient infrared light information in the same band as the infrared light pattern). The opening image differs from the closed image only in the infrared light pattern information, so removing from the opening image the visible light information and the ambient infrared light information corresponding to the closed image yields an image that contains only the infrared light pattern information emitted by the structured light projector 20 and modulated by the target object, that is, the distribution information of the infrared light pattern. Finally, the depth information of the target object can be obtained from the distribution information of the infrared light pattern.
The image processing method and the electronic device 100 according to the embodiment of the present invention thus obtain, from the opening image and the closed image, distribution information that includes only the infrared light pattern emitted by the structured light projector 20 and modulated by the target object, removing the influence of the ambient light (i.e., the ambient visible light and the ambient infrared light in the same band as the infrared light pattern) and helping to improve the detection accuracy of the depth information.
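The removal step described above amounts to a per-pixel difference of two frames. The following minimal sketch (not the patent's own implementation) assumes the opening and closed images are registered, same-exposure integer frames from the same camera module; everything common to both frames cancels, leaving only the projected pattern.

```python
import numpy as np

def extract_pattern(opening_img: np.ndarray, closed_img: np.ndarray) -> np.ndarray:
    """Recover the projected infrared pattern by differencing two frames.

    opening_img: frame captured with the structured light projector on
                 (ambient visible light + ambient IR + projected IR pattern).
    closed_img:  frame captured with the projector off
                 (ambient visible light + ambient IR only).
    Both are assumed to be registered uint8/uint16 arrays of the same shape.
    """
    on = opening_img.astype(np.int32)
    off = closed_img.astype(np.int32)
    # Ambient visible light and ambient IR cancel; only the pattern remains.
    diff = np.clip(on - off, 0, np.iinfo(opening_img.dtype).max)
    return diff.astype(opening_img.dtype)
```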
The electronic device 100 according to the embodiment of the present invention may be a monitoring camera, a mobile phone, a tablet computer, a laptop computer, a game machine, a head display device, a smart watch, an access control system, a teller machine, etc., and the embodiment of the present invention is described by taking the electronic device 100 as a mobile phone as an example, it is understood that the specific form of the electronic device 100 may be other, and is not limited herein.
Referring to fig. 3, in some embodiments, the camera module 10 includes a substrate 11, a filter assembly 12, an image sensor 14, a lens holder 15 and a lens assembly 16. The lens assembly 16 is disposed on the lens holder 15 and corresponds to the image sensor 14. The filter assembly 12 is disposed in the lens holder 15 and corresponds to the image sensor 14. The lens assembly 16, the filter assembly 12, and the image sensor 14 are sequentially disposed along the light incident direction. The filter assembly 12 filters light entering from the lens assembly 16 such that the filtered light enters the image sensor 14.
The filter assembly 12 includes a filter 122, a bandpass coating 124, and a bandstop coating 126. The filter 122 is a full-transmission glass sheet and has a high transmittance. The filter 122 includes an object side 1222 and an image side 1224, and in this embodiment, the bandpass coating film 124 is disposed on the object side 1222 and the band-stop coating film 126 is disposed on the image side 1224. The bandpass coating 124 is used to allow only visible light and infrared light to pass through. The band-stop coating 126 is used for filtering infrared light with a predetermined cut-off waveband.
During imaging, light entering from the lens assembly 16 passes in sequence through the bandpass coating 124 disposed on the object side 1222, the filter 122, and the band-stop coating 126 disposed on the image side 1224. When the light passes through the band-pass coating 124, everything except visible light and infrared light is filtered out; in other words, only visible light and infrared light can pass through the band-pass coating 124. The visible light and infrared light then pass through the filter 122 and reach the band-stop coating 126, where the infrared light in the predetermined cut-off band is filtered out, so that only the visible light and the infrared light in the predetermined pass band reach the image sensor 14 for imaging.
In other embodiments, the filter assembly 12 may be flipped when assembled with the lens assembly 16, i.e., the band-stop coating 126 is disposed on the object side 1222 and the bandpass coating 124 is disposed on the image side 1224. In this case, the light passes through the band-stop coating 126, the filter 122, and the band-pass coating 124 in sequence before reaching the image sensor 14 for imaging. When the light passes through the band-stop coating 126, the infrared light in the predetermined cut-off band is filtered out; the remaining light passes through the filter 122 and reaches the band-pass coating 124, after which only the visible light and the infrared light in the predetermined pass band enter the image sensor 14 for imaging. It will be appreciated that the wavelength band of infrared light is typically 700 nm to 950 nm. In the embodiment of the invention, the predetermined pass band is 930 nm to 950 nm and the predetermined cut-off band is 700 nm to 930 nm. That is, the filter assembly 12 only allows infrared light in the 930 nm to 950 nm band and visible light (400 nm to 700 nm) to pass through to the image sensor 14 for imaging. The electronic device 100 of the embodiment of the invention can receive visible light and the infrared light pattern for imaging with only one camera module 10; compared with providing one visible light camera for receiving visible light and another infrared camera for receiving the infrared light pattern, the number of camera modules 10 is reduced, which helps to reduce the volume of the electronic device 100.
In some embodiments, the bandpass coating 124 has a transmittance of greater than 85% for light in the 400 nm to 955 nm band and the band-stop coating 126 has a transmittance of less than 15% for light in the 700 nm to 925 nm band.
Since the band-pass coating 124 has a high transmittance for light between 400 nm and 955 nm and the band-stop coating 126 has a low transmittance for light between 700 nm and 925 nm, the transmittance is high for both the visible light band of 400 nm to 700 nm and the 925 nm to 955 nm band. Margin bands (925 nm to 930 nm and 950 nm to 955 nm) are reserved around the predetermined pass band of 930 nm to 950 nm to tolerate errors in manufacturing the coatings, which ensures that infrared light in the 925 nm to 955 nm range passes and therefore guarantees the transmission of infrared light in the predetermined pass band.
Referring to fig. 3, the substrate 11 may be a flexible circuit board, a hard circuit board or a rigid-flex board, and has a wide application range. The image sensor 14 is disposed on the substrate 11 and electrically connected to the substrate 11, and the image sensor 14 is used for receiving light to form an image. The image sensor 14 may be a Complementary Metal Oxide Semiconductor (CMOS) image sensor chip or a Charge-coupled Device (CCD) image sensor chip.
A lens holder 15 is disposed on the substrate 11, and the lens assembly 16 is mounted on the end of the lens holder 15 remote from the substrate 11. The lens assembly 16 includes a barrel 162 and a lens group 164. The lens barrel 162 is combined with the lens holder 15 and forms a receiving cavity 110 together with the substrate 11. The lens barrel 162 may be connected to the lens holder 15 by screwing, gluing, snapping, and the like. The lens group 164, the filter assembly 12, and the image sensor 14 are all accommodated in the receiving cavity 110 and are arranged in sequence along the light incident path. The lens group 164 may be a single lens, which may be a convex lens or a concave lens; or it may be a plurality of lenses, which may all be convex lenses, all be concave lenses, or partly convex and partly concave.
The lens assembly 16 is configured with a suitable lens having the same focal length in the visible light band and the predetermined pass band, for example, the lens assembly 16 is a day and night confocal lens. In this way, clear images can be obtained in different environments (day and night) without refocusing.
The camera module 10 according to the embodiment of the invention allows visible light and infrared light of a specific wavelength band to pass through by respectively disposing the band-pass coating film 124 and the band-stop coating film 126 on two surfaces of the optical filter 122, so that the visible light and the infrared light can pass through no matter in daytime or at night, the light utilization scene is not limited, and an ICR filter (dual filter switcher) is not required, which is beneficial to reducing the volume of the electronic device 100. In addition, since the camera module 10 only allows infrared light of a specific wavelength band to pass through, for example, the above-mentioned 930 nm to 950 nm infrared light, and is not affected by infrared light of other wavelength bands in the environment, the imaging quality is better.
In some embodiments, the wavelength band of the infrared light pattern emitted by the structured light projector 20 is within a predetermined pass wavelength band range that the camera module 10 allows to pass.
That is to say, the infrared light pattern emitted by the structured light projector 20 can be received by the camera module 10. The camera module 10 is not affected by ambient infrared light outside the predetermined pass band, and it can cooperate with the structured light projector 20 to obtain an opening image containing the infrared light pattern information. A single camera module 10 can therefore obtain an opening image containing the ambient visible light information, the infrared light pattern information modulated by the target object, and the ambient infrared light information in the predetermined pass band, as well as a closed image containing the ambient visible light information and the ambient infrared light information in the predetermined pass band, and the depth information of the target object can then be obtained by the image processing method of the embodiment of the present invention. In addition, when the structured light projector 20 is off, the camera module 10 avoids the influence of infrared light outside the predetermined pass band; the acquired image contains the visible light information plus only a small amount of ambient infrared light information in the predetermined pass band, so it can be used as a visible light image without much impact on imaging quality. In other words, one camera module 10 can both detect the depth information of the target object (acquire the depth image) and acquire the visible light image, realizing function multiplexing and reducing the volume and cost of the electronic device 100.
Referring to fig. 4, in some embodiments, 012 includes:
0122: when the structured light projector 20 is turned on, controlling the camera module 10 to collect a plurality of frames of first initial images;
0124: selecting a starting image from the multiple frames of first initial images according to focusing data corresponding to the multiple frames of first initial images; and/or
014 comprises:
0142: when the structured light projector 20 is turned off, controlling the camera module 10 to collect a plurality of frames of second initial images;
0144: and selecting a closed image from the plurality of frames of second initial images according to focusing data corresponding to the plurality of frames of second initial images.
In some embodiments, in particular, the processor 30 is configured to control the camera module 10 to capture a plurality of frames of the first initial image when the structured light projector 20 is turned on; selecting a starting image from the multiple frames of first initial images according to focusing data corresponding to the multiple frames of first initial images; and/or, when the structured light projector 20 is turned off, controlling the camera module 10 to collect a plurality of frames of second initial images; and selecting a closed image from the plurality of frames of second initial images according to focusing data corresponding to the plurality of frames of second initial images.
That is, 0122, 0124, 0142, and 0144 may be implemented by the processor 30.
More specifically, when the structured light projector 20 is turned on, the camera module 10 collects a plurality of consecutive frames of first initial images. The camera module 10 goes through a focusing process during shooting in order to obtain a clearer image, and the processor 30 selects the sharpest frame of the first initial images as the opening image according to the focusing data corresponding to each frame. Similarly, when the structured light projector 20 is turned off, the camera module 10 collects a plurality of consecutive frames of second initial images, and the processor 30 selects the sharpest frame of the second initial images as the closed image according to the focusing data corresponding to each frame; since the selected opening image and closed image are sharp, the detection accuracy of the depth information is improved. Alternatively, the selection may be performed only when the structured light projector 20 is on, with the processor 30 selecting the sharpest first initial image as the opening image, which reduces the calculation amount and improves the detection speed of the depth information. Alternatively, the selection may be performed only when the structured light projector 20 is off, with the processor 30 selecting the sharpest second initial image as the closed image, which likewise reduces the calculation amount and improves the detection speed of the depth information.
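The patent does not specify how the focusing data are turned into a sharpness ranking; a common stand-in, assumed here purely for illustration, is to score each frame by the variance of its Laplacian and keep the highest-scoring frame.

```python
import cv2
import numpy as np

def pick_sharpest(frames: list[np.ndarray]) -> np.ndarray:
    """Return the frame with the highest focus score (Laplacian variance)."""
    def focus_score(img: np.ndarray) -> float:
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY) if img.ndim == 3 else img
        # Higher variance of the Laplacian generally means sharper edges.
        return float(cv2.Laplacian(gray, cv2.CV_64F).var())
    return max(frames, key=focus_score)
```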
In some embodiments, the turning on and off of the structured light projector 20 is performed at predetermined periods.
Specifically, for example, within one period T, the structured light projector 20 is in the on state for (3/4)T and in the off state for (1/4)T. Here the on time of the structured light projector 20 is longer than its off time; that is, within one period T more time is spent acquiring the opening image, which ensures that the opening image containing the infrared light pattern information is clearer and helps to improve the detection accuracy of the depth information. Of course, the time allocation between the on state and the off state may be determined according to actual requirements.
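A rough sketch of one such acquisition period is shown below. The `projector` and `camera` objects and their `turn_on()`, `turn_off()`, and `capture()` methods are hypothetical stand-ins for whatever device interface the hardware actually exposes; only the 3/4 on, 1/4 off split comes from the text above.

```python
import time

def acquisition_period(projector, camera, period_s: float = 0.2, on_fraction: float = 0.75):
    """One period T: projector on for on_fraction*T, then off for the remainder."""
    projector.turn_on()
    t_end = time.monotonic() + period_s * on_fraction
    opening_frames = []
    while time.monotonic() < t_end:          # collect first initial images
        opening_frames.append(camera.capture())

    projector.turn_off()
    t_end = time.monotonic() + period_s * (1.0 - on_fraction)
    closed_frames = []
    while time.monotonic() < t_end:          # collect second initial images
        closed_frames.append(camera.capture())

    return opening_frames, closed_frames
```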
Referring to fig. 5, in some embodiments, the image processing method further includes, before 016:
011: carrying out noise reduction processing on the opening image; and/or
013: and performing noise reduction processing on the closed image.
In some embodiments, processor 30 is also configured to perform noise reduction processing on the on image; and/or, performing noise reduction processing on the closed image.
That is, 011 and 013 can be implemented by processor 30.
Specifically, when the camera module 10 collects an image, the collected image may contain noise due to the ambient light and the image sensor 14 (shown in FIG. 3) of the camera module 10 itself. After obtaining the opening image and the closed image from the images collected by the camera module 10, the processor 30 performs noise reduction on them, for example by filtering, which improves the quality of the opening image and the closed image and helps to improve the detection accuracy of the depth information. Alternatively, the processor 30 performs noise reduction only on the opening image, which saves calculation and increases the detection speed of the depth information. Alternatively, the processor 30 performs noise reduction only on the closed image, which likewise saves calculation and increases the detection speed of the depth information.
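The patent only mentions filter-based noise reduction without naming a filter; one plausible choice, used here as an assumption, is a bilateral filter, which smooths sensor noise while preserving edges and the speckle dots better than a plain Gaussian blur.

```python
import cv2
import numpy as np

def denoise(img: np.ndarray) -> np.ndarray:
    """Edge-preserving noise reduction for an 8-bit opening or closed image."""
    # d: neighbourhood diameter; sigmaColor/sigmaSpace: filter strengths.
    return cv2.bilateralFilter(img, d=5, sigmaColor=25, sigmaSpace=25)
```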
Referring to fig. 6, in some embodiments, the electronic device 100 further includes an ambient light sensor 40. The ambient light sensor 40 is used to sense the color temperature and brightness of ambient light. The image processing method comprises before 016:
015: carrying out color correction on the opening image according to the color temperature and the brightness of the ambient light; and/or
017: the closed image is color corrected according to the color temperature and brightness of the ambient light.
In some embodiments, the processor 30 is further configured to color correct the on-image according to the color temperature and brightness of the ambient light; and/or, color correcting the closed image according to the color temperature and brightness of the ambient light.
That is, 015 and 017 may be implemented by processor 30.
Specifically, since the camera module 10 simultaneously receives visible light and infrared light in the predetermined band, the image it acquires may show color shift and similar problems; with different ambient light sources the color temperature and brightness of the ambient light also differ, which can likewise cause color shift or uneven brightness in the acquired image. In the embodiment of the present invention, the processor 30 controls the ambient light sensor 40 to detect the color temperature and the brightness of the ambient light, and then performs color correction on the opening image and the closed image collected by the camera module 10 according to the detected color temperature and brightness, so that the colors of the opening image and the closed image are more accurate, which helps to improve the detection accuracy of the depth information. For example, when the color temperature of the current ambient light is high, the image tends to be reddish, and the color correction may correct the colors of the opening image and the closed image by white balancing. As another example, when the brightness of the current environment is too high or too low, the brightness of the opening image and the closed image may be corrected by decreasing or increasing it. Alternatively, the processor 30 may perform color correction only on the opening image according to the color temperature and brightness of the ambient light, or only on the closed image; correcting only one of the two images reduces the calculation amount of the color correction and improves the detection speed of the depth information.
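The mapping from the ambient light sensor's color temperature and brightness readings to correction gains is device-specific and not given in the text. As a sketch only, the example below applies a gray-world white balance plus a global brightness gain; the `brightness_gain` parameter is an assumed stand-in for whatever value would be derived from the sensor readings.

```python
import numpy as np

def correct_color(img_bgr: np.ndarray, brightness_gain: float = 1.0) -> np.ndarray:
    """Gray-world white balance plus a global brightness gain (illustrative only)."""
    img = img_bgr.astype(np.float32)
    means = img.reshape(-1, 3).mean(axis=0)          # per-channel B, G, R means
    gains = means.mean() / np.maximum(means, 1e-6)   # equalise the channel means
    balanced = img * gains * brightness_gain
    return np.clip(balanced, 0, 255).astype(np.uint8)
```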
Referring to fig. 7, in some embodiments, the predetermined distribution information is stored in a database of the electronic device 100. 018 includes:
0182: and comparing the distribution information of the infrared light pattern with the preset distribution information to obtain depth information.
In some embodiments, the processor 30 is specifically configured to compare the distribution information of the infrared light pattern with predetermined distribution information to obtain depth information.
That is, 0182 may be implemented by processor 30.
Specifically, after obtaining the distribution information of the infrared light pattern, the processor 30 compares the distribution information of the infrared light pattern with the predetermined distribution information stored in the database of the electronic device 100, and based on the difference between the distribution information of the infrared light pattern and the predetermined distribution information, for example, the change in size, shape, arrangement, position, and the like, the depth information of the target object can be obtained more accurately.
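One common way to realize this comparison for a speckle pattern (the patent does not prescribe a specific algorithm) is to block-match the captured pattern against the stored reference pattern recorded at a known distance and convert the horizontal shift into depth by triangulation. The sketch below assumes rectified 8-bit pattern images and illustrative camera parameters; the sign of the shift term depends on the projector-camera geometry, and unreliable matches would need to be filtered out in practice.

```python
import cv2
import numpy as np

def depth_from_reference(pattern: np.ndarray, reference: np.ndarray,
                         fx: float, baseline_m: float, ref_depth_m: float,
                         block: int = 15, max_shift: int = 64) -> np.ndarray:
    """Estimate depth by matching the captured IR pattern against a stored
    reference pattern recorded at distance ref_depth_m (structured light
    triangulation: 1/Z = 1/Z_ref + d / (fx * b), sign convention assumed)."""
    h, w = pattern.shape
    depth = np.zeros((h, w), np.float32)
    for y in range(0, h - block, block):
        for x in range(0, w - block, block):
            tmpl = pattern[y:y + block, x:x + block]
            x0, x1 = max(0, x - max_shift), min(w, x + block + max_shift)
            strip = reference[y:y + block, x0:x1]
            # Normalised cross-correlation over horizontal shifts only.
            res = cv2.matchTemplate(strip, tmpl, cv2.TM_CCOEFF_NORMED)
            shift = (x0 + int(res.argmax())) - x          # disparity in pixels
            z = 1.0 / (1.0 / ref_depth_m + shift / (fx * baseline_m))
            depth[y:y + block, x:x + block] = z
    return depth
```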
Referring to fig. 8 and 9, in one example, the electronic device 100 may have a projection window 50 corresponding to the structured light projector 20 and a collection window 60 corresponding to the camera module 10. The structured light projector 20 projects an infrared light pattern in the predetermined pass band toward the target object through the projection window 50, and the camera module 10 receives visible light and infrared light in the predetermined pass band for imaging. When the structured light projector 20 is turned on, as shown in fig. 8, the infrared light pattern emitted by the structured light projector 20 is a speckle pattern, and the camera module 10 collects multiple frames of first initial images through the collection window 60; each first initial image contains the ambient visible light information, the infrared light pattern information modulated by the target object, and the ambient infrared light information in the predetermined pass band. When the structured light projector 20 is turned off, as shown in fig. 9, the camera module 10 collects multiple frames of second initial images through the collection window 60; each second initial image contains the ambient visible light information and the ambient infrared light information in the predetermined pass band, and because only visible light and infrared light in the predetermined pass band reach the image sensor 14, the second initial images can be used as visible light images without much impact on imaging quality. The processor 30 is electrically connected to the camera module 10 and the structured light projector 20, and processes the first initial images to select the opening image and the second initial images to select the closed image. After obtaining the opening image and the closed image, the processor 30 obtains the distribution information of the infrared light pattern from them. Since the opening image differs from the closed image only in the infrared light pattern information, the visible light information and the ambient infrared light information in the predetermined pass band corresponding to the closed image are removed from the opening image by subtracting (differencing) the closed image from the opening image, and the resulting image contains only the infrared light pattern information emitted by the structured light projector 20 and modulated by the target object, that is, the distribution information of the infrared light pattern.
Finally, the processor 30 obtains distribution information of the infrared light pattern (for example, the infrared light pattern is a speckle pattern or a coded structured light pattern with a specific code), compares the distribution information of the infrared light pattern with predetermined distribution information, and obtains depth information of the target object according to a difference between the distribution information of the infrared light pattern and the predetermined distribution information, such as a change in size, shape, brightness, arrangement, position, and the like of the distribution information of the infrared light pattern.
Referring to fig. 10 to 12, in some embodiments, the camera module 10 includes a first camera module 17 and a second camera module 18. The opening image includes a first opening image corresponding to the first camera module 17 and a second opening image corresponding to the second camera module 18. The closed image includes a first closed image corresponding to the first camera module 17 and a second closed image corresponding to the second camera module 18. 016 includes:
0162: and acquiring first distribution information of the infrared light pattern according to the first opening image and the first closing image and acquiring second distribution information of the infrared light pattern according to the second opening image and the second closing image.
018 includes:
0184: determining corresponding feature points in the first opening image and the second opening image in an auxiliary manner according to the first distribution information and the second distribution information;
0186: and obtaining the depth information of the target object through a binocular imaging algorithm according to the feature points.
In some embodiments, the camera module 10 includes a first camera module 17 and a second camera module 18. The opening image includes a first opening image corresponding to the first camera module 17 and a second opening image corresponding to the second camera module 18. The closed image includes a first closed image corresponding to the first camera module 17 and a second closed image corresponding to the second camera module 18. The processor 30 is specifically configured to obtain first distribution information of the infrared light pattern according to the first opening image and the first closing image, obtain second distribution information of the infrared light pattern according to the second opening image and the second closing image, assist in determining corresponding feature points in the first opening image and the second opening image according to the first distribution information and the second distribution information, and obtain depth information of the target object according to the feature points through a binocular imaging algorithm.
That is, steps 0162, 0184, and 0186 may be implemented by the processor 30.
Specifically, the processor 30 may select a first opening image from consecutive multiple frames of first initial opening images collected by the first camera module 17 and a first closed image from consecutive multiple frames of first initial closed images collected by the first camera module 17, and may select a second opening image from consecutive multiple frames of second initial opening images collected by the second camera module 18 and a second closed image from consecutive multiple frames of second initial closed images collected by the second camera module 18. First distribution information of the infrared light pattern can then be obtained according to the first opening image and the first closed image, and second distribution information of the infrared light pattern can be obtained according to the second opening image and the second closed image. Comparing feature points would normally require comparing all parts of the first opening image and the second opening image one by one, which demands a large amount of calculation. Because infrared light pattern information exists in both the first opening image and the second opening image, the first distribution information and the second distribution information of the infrared light pattern can be used to assist in determining the feature points, so that the feature point positions corresponding to the first opening image and the second opening image are determined quickly. Finally, based on a binocular ranging algorithm, the depth information of the target object is quickly obtained according to the viewing angle difference between the two images and the corresponding feature point positions, which helps improve the detection speed of the depth information.
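The claims select a frame "according to focusing data"; as a stand-in for that selection step, the following sketch scores each captured frame with the variance of its Laplacian, a common sharpness proxy, and keeps the sharpest one. The metric and function name are assumptions for illustration only.

```python
import cv2

def select_sharpest_frame(frames):
    """Return the frame with the highest Laplacian variance, used here as a
    stand-in for selecting a frame according to focusing data."""
    def sharpness(img):
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY) if img.ndim == 3 else img
        return float(cv2.Laplacian(gray, cv2.CV_64F).var())
    return max(frames, key=sharpness)
```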
In one example, referring to fig. 11 and 12, the first camera module 17 and the second camera module 18 may constitute a binocular imaging system. When the number of camera modules 10 is greater than two, the multiple camera modules 10 may form a multi-view imaging system. The embodiment of the present invention is described with two camera modules 10; the principle with more camera modules 10 is similar and will not be described again here.
When images are captured, the first camera module 17 and the second camera module 18 are generally mounted at different positions, so the capturing ranges of the images they obtain are generally different. In the embodiment of the present invention, the field-of-view ranges of the first camera module 17 and the second camera module 18 have an overlapping portion, that is, the first camera module 17 and the second camera module 18 can photograph the same target object. The processor 30 may combine the two images into one image having a larger shooting range. In other embodiments, when the camera module 10 includes enough camera modules, a panoramic camera may be formed to acquire a panoramic image.
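A minimal sketch of merging two overlapping views into one image with a larger shooting range is given below; OpenCV's stitcher is used purely as an illustrative stand-in for whatever combining step the processor 30 actually performs.

```python
import cv2

def combine_views(image_a, image_b):
    """Stitch two overlapping camera views into a single wider image."""
    stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
    status, merged = stitcher.stitch([image_a, image_b])
    if status != cv2.Stitcher_OK:
        raise RuntimeError(f"stitching failed with status {status}")
    return merged
```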
As shown in fig. 11, when the structured light projector 20 is turned on, the infrared light pattern in the predetermined passing wavelength band emitted by the structured light projector 20 through the projection window 50 is a speckle pattern. The first camera module 17 is configured to collect a first opening image through the collection window 62, where the first opening image includes visible light information in the environment, infrared light pattern information modulated by the target object, and infrared light information in the predetermined passing waveband in the environment. The second camera module 18 is configured to collect a second opening image through the collection window 64, where the second opening image includes visible light information in the environment, infrared light pattern information modulated by the target object, and infrared light information in the predetermined passing band in the environment. As shown in fig. 12, when the structured light projector 20 is turned off, the first camera module 17 is configured to capture a first closed image through the collection window 62, the first closed image including visible light information in the environment and infrared light information in the predetermined passing band in the environment, and the second camera module 18 is configured to capture a second closed image through the collection window 64, the second closed image including visible light information in the environment and infrared light information in the predetermined passing band in the environment. The first opening image, the first closed image, the second opening image, and the second closed image may each be selected from multiple consecutive initial images; the principle is similar to the acquisition of the opening image and the closed image described above and is not repeated here. The processor 30 is configured to obtain first distribution information of the infrared light pattern according to the first opening image and the first closed image, and obtain second distribution information of the infrared light pattern according to the second opening image and the second closed image; the principle is similar to the acquisition of the distribution information of the infrared light pattern described above and is not repeated here. A binocular imaging algorithm generally compares two images shot from different viewing angles and first determines feature points that correspond to each other in the two images, that is, parts of the two images in which the same part of the target object is captured. For example, when a human face is photographed, the nose in the first image corresponds to the nose in the second image, the eyes in the first image correspond to the eyes in the second image, and so on.
Comparing feature points would normally require comparing all parts of the first opening image and the second opening image one by one, which demands a large amount of calculation. Because infrared light pattern information exists in both the first opening image and the second opening image, the first distribution information and the second distribution information of the infrared light pattern can be used to assist in determining the feature points, so that the feature point positions corresponding to the first opening image and the second opening image are determined quickly. Finally, based on a binocular ranging algorithm, the depth information of the target object is quickly obtained according to the viewing angle difference between the two images and the corresponding feature point positions, which helps improve the detection speed of the depth information.
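As a hedged sketch of this pattern-assisted correspondence, the routine below matches the speckle patch around one feature point of the first opening image against the same row of the second opening image (assuming a rectified pair), then converts the resulting disparity to depth with the pinhole relation Z = f·B/d. The focal length, baseline, block size, and search range are illustrative assumptions.

```python
import numpy as np

def match_and_triangulate(pattern_left, pattern_right, point,
                          f=500.0, baseline=0.08, block=11, max_disp=64):
    """Find the disparity of one feature point by matching its speckle patch
    along the same row of the other view, then return its depth Z = f*B/d.
    Assumes a rectified pair and a point far enough from the image border."""
    y, x = point
    half = block // 2
    template = pattern_left[y - half:y + half + 1, x - half:x + half + 1].astype(np.float32)
    best_err, best_d = np.inf, None
    for d in range(1, max_disp + 1):            # search only along the epipolar line
        xr = x - d
        if xr - half < 0:
            break
        candidate = pattern_right[y - half:y + half + 1, xr - half:xr + half + 1].astype(np.float32)
        err = float(np.sum((template - candidate) ** 2))
        if err < best_err:
            best_err, best_d = err, d
    if best_d is None:
        raise ValueError("no correspondence found within the search range")
    return f * baseline / best_d                # depth of the matched feature point
```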
In the description of the present specification, reference to the description of the terms "one embodiment", "some embodiments", "an illustrative embodiment", "an example", "a specific example", or "some examples", etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present invention. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps in the process, and alternate implementations are included within the scope of the preferred embodiment of the present invention in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present invention.
The logic and/or steps represented in the flowcharts or otherwise described herein, such as an ordered listing of executable instructions for implementing logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, various steps or methods may be performed by software or firmware stored in a memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for performing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
It will be understood by those skilled in the art that all or part of the steps of the above method may be implemented by a program instructing the relevant hardware; the program may be stored in a computer-readable storage medium and, when executed, performs one or a combination of the steps of the method embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing module, or each unit may exist alone physically, or two or more units may be integrated into one module. The integrated module may be implemented in the form of hardware or in the form of a software functional module. The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may also be stored in a computer-readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic or optical disk, etc. Although embodiments of the present invention have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present invention, and that variations, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present invention.

Claims (12)

1. An image processing method applied to an electronic device, wherein the electronic device comprises a structured light projector and a camera module, the structured light projector is used for emitting an infrared light pattern towards a target object when the electronic device is started, the camera module can receive visible light and the infrared light pattern for imaging, and the image processing method comprises the following steps:
controlling the structured light projector to be started and acquiring an opening image corresponding to the camera module;
controlling the structured light projector to be closed and acquiring a closed image corresponding to the camera module;
acquiring distribution information of the infrared light pattern according to the opening image and the closing image; and
acquiring the depth information of the target object according to the distribution information of the infrared light pattern.
2. The image processing method according to claim 1, wherein
the step of controlling the structured light projector to be started and acquiring the opening image corresponding to the camera module comprises:
when the structured light projector is started, controlling the camera module to collect a plurality of frames of first initial images;
selecting the opening image from the plurality of frames of first initial images according to focusing data corresponding to the plurality of frames of first initial images;
and/or
The step of controlling the structured light projector to be closed and acquiring the closed image corresponding to the camera module comprises:
when the structured light projector is closed, controlling the camera module to collect a plurality of frames of second initial images;
and selecting the closed image from the plurality of frames of second initial images according to focusing data corresponding to the plurality of frames of second initial images.
3. The image processing method according to claim 1, wherein the image processing method further comprises, before the step of acquiring the distribution information of the infrared light pattern according to the opening image and the closing image:
carrying out noise reduction processing on the opening image; and/or
carrying out noise reduction processing on the closed image.
4. The image processing method according to claim 1, wherein the electronic device further includes an ambient light sensor for sensing a color temperature and a brightness of ambient light, and before the step of acquiring the distribution information of the infrared light pattern according to the opening image and the closing image, the image processing method further includes:
carrying out color correction on the opening image according to the color temperature and the brightness; and/or
carrying out color correction on the closed image according to the color temperature and the brightness.
5. The image processing method according to claim 1, wherein predetermined distribution information is stored in a database of the electronic device, and the step of obtaining the depth information of the target object according to the distribution information of the infrared light pattern comprises:
and comparing the distribution information of the infrared light pattern with the preset distribution information to obtain the depth information.
6. The image processing method according to claim 1, wherein the camera module comprises a first camera module and a second camera module, and the opening image comprises a first opening image corresponding to the first camera module and a second opening image corresponding to the second camera module; the closed images comprise a first closed image corresponding to the first camera module and a second closed image corresponding to the second camera module; the step of acquiring the distribution information of the infrared light pattern according to the opening image and the closing image comprises the following steps:
acquiring first distribution information of the infrared light pattern according to the first opening image and the first closed image, and acquiring second distribution information of the infrared light pattern according to the second opening image and the second closed image;
the step of acquiring the depth information of the target object according to the distribution information of the infrared light pattern includes:
determining corresponding feature points in the first opening image and the second opening image in an auxiliary manner according to the first distribution information and the second distribution information;
and obtaining the depth information of the target object through a binocular imaging algorithm according to the feature points.
7. An electronic device comprising a structured light projector for emitting an infrared light pattern towards a target object when switched on, a camera module capable of receiving visible light and the infrared light pattern for imaging, and a processor for:
controlling the structured light projector to be started and acquiring an opening image corresponding to the camera module;
controlling the structured light projector to be closed and acquiring a closed image corresponding to the camera module;
acquiring distribution information of the infrared light pattern according to the opening image and the closing image; and
acquiring the depth information of the target object according to the distribution information of the infrared light pattern.
8. The electronic device of claim 7, wherein the processor is further configured to:
when the structured light projector is started, controlling the camera module to collect a plurality of frames of first initial images;
selecting the opening image from the plurality of frames of first initial images according to focusing data corresponding to the plurality of frames of first initial images;
and/or
when the structured light projector is closed, controlling the camera module to collect a plurality of frames of second initial images;
and selecting the closed image from the plurality of frames of second initial images according to focusing data corresponding to the plurality of frames of second initial images.
9. The electronic device of claim 7, wherein the processor is further configured to:
carrying out noise reduction processing on the opening image; and/or
carrying out noise reduction processing on the closed image.
10. The electronic device of claim 7, further comprising an ambient light sensor to sense a color temperature and a brightness of ambient light, the processor further to:
carrying out color correction on the opening image according to the color temperature and the brightness; and/or
carrying out color correction on the closed image according to the color temperature and the brightness.
11. The electronic device of claim 7, wherein the predetermined distribution information is stored in a database of the electronic device, and the processor is further configured to:
and comparing the distribution information of the infrared light pattern with the preset distribution information to obtain the depth information.
12. The electronic device of claim 7, wherein the camera module comprises a first camera module and a second camera module, and the opening image comprises a first opening image corresponding to the first camera module and a second opening image corresponding to the second camera module; the closed images comprise a first closed image corresponding to the first camera module and a second closed image corresponding to the second camera module; the processor is further configured to:
acquiring first distribution information of the infrared light pattern according to the first opening image and the first closed image, and acquiring second distribution information of the infrared light pattern according to the second opening image and the second closed image;
determining corresponding feature points in the first opening image and the second opening image in an auxiliary manner according to the first distribution information and the second distribution information;
and obtaining the depth information of the target object through a binocular imaging algorithm according to the feature points.
CN201810907479.5A 2018-08-10 2018-08-10 Image processing method and electronic device Pending CN110868506A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810907479.5A CN110868506A (en) 2018-08-10 2018-08-10 Image processing method and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810907479.5A CN110868506A (en) 2018-08-10 2018-08-10 Image processing method and electronic device

Publications (1)

Publication Number Publication Date
CN110868506A true CN110868506A (en) 2020-03-06

Family

ID=69650844

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810907479.5A Pending CN110868506A (en) 2018-08-10 2018-08-10 Image processing method and electronic device

Country Status (1)

Country Link
CN (1) CN110868506A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112615979A (en) * 2020-12-07 2021-04-06 江西欧迈斯微电子有限公司 Image acquisition method, image acquisition apparatus, electronic apparatus, and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120262553A1 (en) * 2011-04-14 2012-10-18 Industrial Technology Research Institute Depth image acquiring device, system and method
CN104885451A (en) * 2012-11-23 2015-09-02 Lg电子株式会社 Method and apparatus for obtaining 3D image
CN106454287A (en) * 2016-10-27 2017-02-22 深圳奥比中光科技有限公司 Combined camera shooting system, mobile terminal and image processing method
CN106896370A (en) * 2017-04-10 2017-06-27 上海图漾信息科技有限公司 Structure light measurement device and method
WO2018135315A1 (en) * 2017-01-20 2018-07-26 ソニーセミコンダクタソリューションズ株式会社 Image capturing device, image processing method, and image processing system

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120262553A1 (en) * 2011-04-14 2012-10-18 Industrial Technology Research Institute Depth image acquiring device, system and method
CN104885451A (en) * 2012-11-23 2015-09-02 Lg电子株式会社 Method and apparatus for obtaining 3D image
CN106454287A (en) * 2016-10-27 2017-02-22 深圳奥比中光科技有限公司 Combined camera shooting system, mobile terminal and image processing method
WO2018135315A1 (en) * 2017-01-20 2018-07-26 ソニーセミコンダクタソリューションズ株式会社 Image capturing device, image processing method, and image processing system
US20200112662A1 (en) * 2017-01-20 2020-04-09 Sony Semiconductor Solutions Corporation Imaging device, image processing method, and image processing system
CN106896370A (en) * 2017-04-10 2017-06-27 上海图漾信息科技有限公司 Structure light measurement device and method


Similar Documents

Publication Publication Date Title
US9503616B2 (en) Image capturing apparatus
US7414664B2 (en) Image taking apparatus and lens apparatus
KR102124832B1 (en) Auto focus system of camera device, and camera device using the same
CN101491081B (en) Dome type monitor camera device
US10326941B2 (en) Image generating apparatus, imaging observing apparatus, imaging apparatus, and storage medium storing image processing program
US9264612B2 (en) Imaging apparatus and photographing support method
JP6559031B2 (en) Imaging device or surveillance camera device
US10386632B2 (en) Lens, camera, package inspection system and image processing method
JP7156352B2 (en) IMAGING DEVICE, IMAGING METHOD, AND PROGRAM
US10564391B2 (en) Imaging device and control method therefor
CN110691176A (en) Filter assembly, camera module, image capturing device and electronic device
US9635242B2 (en) Imaging apparatus
JP2006317595A (en) Optical apparatus and its control method
CN110868506A (en) Image processing method and electronic device
CN116339049A (en) TOF sensor and projection correction method and system based on TOF sensor
KR20020004523A (en) Method for controlling of ccd camera
JP5451333B2 (en) TV camera device for surveillance
US9723207B2 (en) Imaging device and control method therefor
US9832364B2 (en) Automatic focal adjustment apparatus and method of controlling automatic focal adjustment apparatus, and image capture apparatus
JP2009109792A (en) Autofocusing device and camera using it
JP2008170748A (en) Imaging apparatus and its control method
JP2005024858A (en) Digital single lens reflex camera
WO2020007169A1 (en) Filtering assembly, camera module, image capturing device and electronic device
US10567662B2 (en) Imaging device and control method therefor using shift direction calculation
JP5078779B2 (en) Imaging device

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information
CB02 Change of applicant information

Address after: 330096 No.699 Tianxiang North Avenue, Nanchang hi tech Industrial Development Zone, Nanchang City, Jiangxi Province

Applicant after: Jiangxi OMS Microelectronics Co.,Ltd.

Address before: 330013 No.698 Tianxiang Avenue, high tech Zone, Nanchang City, Jiangxi Province

Applicant before: OFilm Microelectronics Technology Co.,Ltd.

Address after: 330013 No.698 Tianxiang Avenue, high tech Zone, Nanchang City, Jiangxi Province

Applicant after: OFilm Microelectronics Technology Co.,Ltd.

Address before: 330013 No.698 Tianxiang Avenue, high tech Zone, Nanchang City, Jiangxi Province

Applicant before: NANCHANG OFILM BIO-IDENTIFICATION TECHNOLOGY Co.,Ltd.

RJ01 Rejection of invention patent application after publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20200306