CN107370951B - Image processing system and method - Google Patents


Info

Publication number
CN107370951B
Authority
CN
China
Prior art keywords
structured light
image
projection pattern
depth
light
Prior art date
Legal status
Active
Application number
CN201710676499.1A
Other languages
Chinese (zh)
Other versions
CN107370951A (en)
Inventor
周意保
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201710676499.1A priority Critical patent/CN107370951B/en
Publication of CN107370951A publication Critical patent/CN107370951A/en
Application granted granted Critical
Publication of CN107370951B publication Critical patent/CN107370951B/en

Classifications

    • H — ELECTRICITY
    • H04 — ELECTRIC COMMUNICATION TECHNIQUE
    • H04N — PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 — Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 — Control of cameras or camera modules
    • H04N23/56 — Cameras or camera modules provided with illuminating means
    • H04N23/57 — Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices

Abstract

The invention discloses an image processing system and method. The system comprises: a first structured light projector for projecting structured light of a first projection pattern onto a photographic subject; a second structured light projector for projecting structured light of a second projection pattern onto the photographic subject; a camera for capturing a first structured light image of the first-pattern structured light modulated by the subject and a second structured light image of the second-pattern structured light modulated by the subject; and an image signal processor configured to demodulate the first structured light image to extract first depth-of-field information whose depth values are greater than a foreground threshold and less than a background threshold, demodulate the second structured light image to extract second depth-of-field information whose depth values are greater than the foreground threshold and less than the background threshold, and perform image fusion according to the first and second depth-of-field information to generate a target image. The target image obtained by the system reflects the photographic subject more faithfully, yielding a good photographing effect and a good user experience.

Description

Image processing system and method
Technical Field
The present invention relates to the field of information processing technologies, and in particular, to an image processing system and method.
Background
As the photographing functions of mobile terminals such as mobile phones and tablets grow increasingly powerful, more and more people use their phones in place of cameras, and a high-performance image processing system on the mobile terminal can improve the photographing effect and ensure a good user experience.
Among current image processing systems, a single-camera system has one camera fewer than a dual-camera system; accordingly, a mobile terminal equipped with a single-camera image processing system can be made thinner.
However, the photographing performance of existing single-camera image processing systems is inferior to that of dual-camera systems, and mobile terminal manufacturers often have to trade off between the thickness and the photographing performance of the terminal. How to improve the performance of a single-camera image processing system has therefore become an urgent technical problem.
Disclosure of Invention
The invention provides an image processing system and method, aiming to solve the problem in the prior art that the user experience of an image processing system is poor.
A first aspect of the embodiments of the present invention provides an image processing system, including: a first structured light projector, a second structured light projector, a camera, and an image signal processor, where the camera, the first structured light projector, and the second structured light projector are arranged along a common transverse center line. The first structured light projector is configured to project structured light of a first projection pattern onto a photographic subject; the second structured light projector is configured to project structured light of a second projection pattern onto the photographic subject; the camera is configured to capture a first structured light image of the first-pattern structured light modulated by the subject and a second structured light image of the second-pattern structured light modulated by the subject; and the image signal processor is configured to demodulate the first structured light image to extract first depth-of-field information having depth values greater than a foreground threshold and less than a background threshold, demodulate the second structured light image to extract second depth-of-field information having depth values greater than the foreground threshold and less than the background threshold, and perform image fusion according to the first and second depth-of-field information to generate a target image.
A second aspect of the embodiments of the present invention provides an image processing method, applied to a system including: a first structured light projector, a second structured light projector, a camera, and an image signal processor, where the camera, the first structured light projector, and the second structured light projector are arranged along a common transverse center line. The first structured light projector projects structured light of a first projection pattern onto a photographic subject; the second structured light projector projects structured light of a second projection pattern onto the photographic subject; the camera captures a first structured light image of the first-pattern structured light modulated by the subject and a second structured light image of the second-pattern structured light modulated by the subject; the image signal processor demodulates the first structured light image to extract first depth-of-field information having depth values greater than a foreground threshold and less than a background threshold, demodulates the second structured light image to extract second depth-of-field information having depth values greater than the foreground threshold and less than the background threshold, and performs image fusion according to the first and second depth-of-field information to generate a target image.
A third aspect of the embodiments of the present invention provides a terminal device, including a memory and a processor, where the memory stores computer-readable instructions, and the instructions, when executed by the processor, cause the processor to execute the image processing method according to the second aspect of the present invention.
A fourth aspect of the embodiments of the present invention provides a non-transitory computer-readable storage medium on which a computer program is stored, the computer program, when executed by a processor, implementing an image processing method according to an embodiment of the second aspect of the present invention.
The technical scheme provided by the embodiment of the invention has the following beneficial effects:
First, the first structured light projector projects structured light of a first projection pattern onto the photographic subject, and the second structured light projector projects structured light of a second projection pattern onto the subject. Then the camera captures a first structured light image of the first-pattern structured light modulated by the subject and a second structured light image of the second-pattern structured light modulated by the subject. Finally, the image signal processor demodulates the first structured light image to extract first depth-of-field information with depth values greater than the foreground threshold and less than the background threshold, demodulates the second structured light image to extract second depth-of-field information with depth values greater than the foreground threshold and less than the background threshold, and performs image fusion according to the first and second depth-of-field information to generate the target image. The resulting target image reflects the photographic subject more faithfully, giving a good photographing effect and a good user experience.
Additional aspects and advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Drawings
The foregoing and/or additional aspects and advantages of the present invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a schematic diagram of an image processing system according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of an exemplary arrangement of a first structured light projector, a second structured light projector, and a camera of FIG. 1;
FIG. 3 is a schematic illustration of a further exemplary arrangement of a first structured light projector, a second structured light projector, and a camera of FIG. 1;
FIG. 4 is a flow chart of an image processing method according to an embodiment of the invention;
fig. 5 is a schematic structural diagram of an image processing circuit in a terminal device according to an embodiment of the present invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are illustrative and intended to be illustrative of the invention and are not to be construed as limiting the invention.
An image processing system and method according to embodiments of the present invention are described below with reference to the drawings, where the image processing system may be integrated into a mobile terminal. Before describing the image processing system, a brief description of structured light technology is given. Structured light technology projects a pre-designed pattern with a special structure (such as discrete light spots, stripe light, or coded structured light) onto the surface of a three-dimensional object, and then observes, with an imaging device such as the camera 1110, the distortion of the pattern imaged on that surface. If the surface onto which the structured light is projected is a plane, the observed pattern is similar to the projected pattern, undistorted apart from a scale change that depends on distance. If the surface is not flat, however, the observed pattern is distorted differently according to the local geometry of the surface, and from the known projected pattern and the observed, distance-dependent distortion, the three-dimensional shape and depth information of the object can be computed algorithmically.
In other words, when structured light of a given projection pattern is projected onto a real-world object, it is reflected by the object's surface. Because real-world objects are three-dimensional, the reflected structured light no longer carries the same pattern as before reflection; by comparing the projection pattern before reflection with the deformed pattern after reflection, the three-dimensional spatial information of the object can be acquired quickly and accurately. Applying structured light technology to a photographing scene therefore yields good imaging quality and precision in the captured image.
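As a concrete illustration of the triangulation behind this, the depth of a surface point can be recovered from the lateral shift of a projected feature. The sketch below is not from the patent: the function name, the pinhole relation depth = focal x baseline / disparity, and all numeric values are illustrative assumptions.

```python
# Minimal triangulation sketch (illustrative, not from the patent):
# a projected structured-light feature observed shifted by `disparity_px`
# pixels yields depth via the standard relation  depth = f * b / d,
# where f is the focal length in pixels and b the projector-camera
# baseline in millimetres. All values below are hypothetical.

def depth_from_disparity(focal_px: float, baseline_mm: float,
                         disparity_px: float) -> float:
    """Depth (mm) of a surface point from its observed pattern shift."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_mm / disparity_px

# A feature shifted 20 px, with a 600 px focal length and 50 mm baseline,
# lies at 1500 mm; a larger shift (nearer surface) gives a smaller depth.
print(depth_from_disparity(600, 50, 20))  # 1500.0
print(depth_from_disparity(600, 50, 40))  # 750.0
```

Note how disparity and depth are inversely related, which is why the degree of pattern deformation encodes the three-dimensional shape of the surface.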
Fig. 1 is a schematic structural diagram of an image processing system according to an embodiment of the present invention. As shown in fig. 1, the image processing system provided in the present embodiment includes: a first structured light projector 1116, a second structured light projector 1117, a camera 1110, an image signal processor 1130.
Fig. 2 is a schematic diagram of an exemplary arrangement of the first structured light projector 1116, the second structured light projector 1117, and the camera 1110 of Fig. 1; Fig. 3 is a schematic diagram of a further exemplary arrangement. Referring to Fig. 2, the camera 1110, the first structured light projector 1116, and the second structured light projector 1117 are arranged in sequence along a transverse center line. Referring to Fig. 3, the first structured light projector 1116, the camera 1110, and the second structured light projector 1117 are arranged in sequence along a transverse center line. In either arrangement the three components are laid out horizontally in the mobile terminal, and since the mutual distances between the camera 1110, the first structured light projector 1116, and the second structured light projector 1117 directly affect imaging quality, those distances are set strictly by the mobile terminal manufacturer according to the terminal's photographing quality parameters.
Specifically, the first structured light projector 1116 is configured to project structured light of a first projection pattern onto the photographic subject, and the second structured light projector 1117 is configured to project structured light of a second projection pattern onto the subject. The first projection pattern may be the same as or different from the second projection pattern; for example, a projection pattern in this embodiment may be laser stripes, a Gray code, sinusoidal stripes, or a randomly arranged speckle pattern. The first structured light projector 1116 and the second structured light projector 1117 in this embodiment can project structured light of projection patterns of various shapes. For example, either projector can be built from a light source controller and an array light source: the light source controller controls the light emitting state of each light source in the array according to the selected projection pattern, and the light emitted by all the lit light sources forms the projection pattern structured light.
In one possible implementation, the first structured light projector 1116 includes a first light source controller and a first array light source. The first light source controller is configured to control the light emitting state of each light source in the first array light source according to the first projection pattern, the light emitting states being a lit state and an extinguished state. The first array light source is configured to emit the first projection pattern structured light projected onto the photographic subject, the structured light being formed by the light emitted by all the lit light sources.
In one possible implementation, the second structured light projector 1117 includes a second light source controller and a second array light source. The second light source controller is configured to control the light emitting state of each light source in the second array light source according to the second projection pattern, the light emitting states being a lit state and an extinguished state. The second array light source is configured to emit the second projection pattern structured light projected onto the photographic subject, the structured light being formed by the light emitted by all the lit light sources.
Specifically, objects in the real world vary greatly. For some objects the surface topography is very uneven, so even structured light with a relatively simple pattern, once modulated by the object, deforms by a relatively large amount, which suffices to ensure the imaging quality of the target image. Other objects have flat surfaces, and structured light with a more complex pattern must be projected so that the modulated light deforms enough to ensure imaging quality. Compared with using gratings of different shapes to form differently shaped projection patterns, forming them with a light source controller and an array light source is simple, efficient, and easy to control, and can meet photographing requirements in different scenes. Preferably, the array light source is an LED (Light-Emitting Diode) array light source, which offers low power consumption, high brightness, low heat, small volume, long service life, environmental friendliness, and low operating voltage. Applied to the first structured light projector 1116 or the second structured light projector 1117 of this embodiment, an LED array light source allows a miniaturized projector to be manufactured; accordingly, when the projector is integrated into a mobile terminal it does not occupy too much space, in keeping with the trend toward lighter and thinner mobile terminals.
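The light-source-controller idea above can be sketched as follows. This is a hypothetical illustration, with class and method names that are not patent terminology, showing how a boolean projection pattern selects which light sources in the array are lit.

```python
# Hypothetical sketch of a light source controller driving an array
# light source: each cell of the projection pattern decides whether
# the corresponding light source is in the lit or extinguished state.

class ArrayLightSource:
    def __init__(self, rows: int, cols: int):
        # All light sources start in the extinguished state.
        self.state = [[False] * cols for _ in range(rows)]

    def apply_pattern(self, pattern):
        """Light each source whose pattern cell is truthy; extinguish the rest."""
        for r, row in enumerate(pattern):
            for c, lit in enumerate(row):
                self.state[r][c] = bool(lit)

    def lit_count(self) -> int:
        return sum(sum(row) for row in self.state)

# A 3x3 vertical-stripe pattern lights only the middle column:
src = ArrayLightSource(3, 3)
src.apply_pattern([[0, 1, 0], [0, 1, 0], [0, 1, 0]])
print(src.lit_count())  # 3
```

Swapping in a different pattern (stripes, a speckle layout, a Gray-code frame) only changes the input list, which is what makes this approach easy to control compared with fixed gratings.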
The camera 1110 is configured to capture a first structured light image of the first projection pattern structured light modulated by the shooting subject, and capture a second structured light image of the second projection pattern structured light modulated by the shooting subject.
Specifically, the first structured light projector 1116 is turned on first to project the first projection pattern structured light onto the photographic subject and the camera 1110 captures the first structured light image; after a set interval, the second structured light projector 1117 is turned on to project the second projection pattern structured light and the camera 1110 captures the second structured light image. Note that the first structured light projector 1116 and the second structured light projector 1117 are controlled to operate alternately. For example, their on-times and on-sequence may be controlled strictly by the mobile terminal manufacturer according to the terminal's photographing quality parameters, and the alternating control may be performed by the image signal processor 1130 without an additional dedicated control device, so the thickness of the mobile terminal is not increased.
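The alternating drive described above can be sketched as follows; the stub classes are hypothetical stand-ins for the projectors and camera, not patent components.

```python
# Illustrative sketch of the alternating operation: each projector is
# switched on in turn and one structured-light image is captured per
# exposure. StubProjector and StubCamera are placeholders.

class StubProjector:
    def __init__(self):
        self.lit = False
    def on(self):
        self.lit = True
    def off(self):
        self.lit = False

class StubCamera:
    def __init__(self):
        self.frames = iter(["first_image", "second_image"])
    def capture(self):
        return next(self.frames)

def capture_pair(proj1, proj2, camera):
    proj1.on()
    img1 = camera.capture()   # first structured-light image
    proj1.off()
    proj2.on()
    img2 = camera.capture()   # second structured-light image
    proj2.off()
    return img1, img2

p1, p2, cam = StubProjector(), StubProjector(), StubCamera()
print(capture_pair(p1, p2, cam))  # ('first_image', 'second_image')
```

The key property is that the two projectors are never lit simultaneously, so each captured image is modulated by exactly one projection pattern.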
Because the first structured light projector 1116 and the second structured light projector 1117 are arranged at different positions relative to the camera 1110, the light path of the first projection pattern structured light from the first structured light projector 1116 to the photographic subject differs from the light path of the second projection pattern structured light from the second structured light projector 1117 to the subject. With the distances between the camera 1110, the first structured light projector 1116, and the second structured light projector 1117 controlled, the whole subject can be irradiated by structured light, and the target image obtained by processing the first and second structured light images reflects the subject more faithfully, which helps improve the photographing effect and user experience. Preferably, the first projection pattern corresponding to the first projection pattern structured light differs from the second projection pattern corresponding to the second projection pattern structured light; accordingly, the deformation of the first-pattern structured light modulated by the subject relative to the first pattern differs from that of the second-pattern structured light relative to the second pattern, and the captured first and second structured light images may differ.
Compared with photographing the subject with a single structured light projection, or with two projections of the same pattern, photographing with structured light of different projection patterns can eliminate accidental errors; the target image obtained by processing the first and second structured light images reflects the subject more faithfully, which helps improve the photographing effect and user experience.
Specifically, the type of the camera 1110 in this embodiment is not limited; it may be a wide-angle camera or a telephoto camera. The basic imaging principle of the camera 1110 is as follows: an image sensor built into the camera receives light reflected by a real-world object; because the image sensor is an optical sensor, the received light is converted into an electrical signal representing the object, and that signal is processed to output an image of the object. The process of forming the first structured light image in this embodiment is therefore roughly: the first structured light projector 1116 projects the first projection pattern structured light onto the photographic subject, the structured light is reflected by the subject (i.e., modulated by it), and the camera 1110 receives the reflected structured light with its built-in image sensor and, after information processing, outputs the first structured light image. Correspondingly, the second structured light image is formed by the second structured light projector 1117 projecting the second projection pattern structured light onto the subject, the light being modulated by the subject, and the camera 1110 receiving the reflected light with its built-in image sensor and outputting the second structured light image after information processing.
The image signal processor 1130 is configured to demodulate the first structured light image to extract first depth-of-field information having a depth value greater than the foreground threshold and smaller than the background threshold, demodulate the second structured light image to extract second depth-of-field information having a depth value greater than the foreground threshold and smaller than the background threshold, and perform image fusion according to the first depth-of-field information and the second depth-of-field information to generate the target image.
Specifically, by demodulating a structured light image, that is, by computing the degree of deformation of the projection pattern and applying the principle of triangulation, the depth values of the respective points of the photographic subject can be obtained. Obtaining depth values of points in a photographed scene via structured light is known in the prior art and is not described here in detail.
The foreground is a person or scene in front of the photographic subject; it supports the subject or directly helps express the theme, and can enhance the spatial depth of the picture and balance and beautify the composition. The foreground threshold in this embodiment may be set by the mobile terminal manufacturer according to the terminal's photographing quality parameters; multiple foreground thresholds may be set, each corresponding to a different photographing effect, so as to meet the photographing needs of different users. For example, if the user selects a foreground threshold, the final target image will contain the photographic subject together with any person or scene in front of it whose depth value is greater than the foreground threshold.
The background refers to a person or scene behind the photographic subject; it gives the picture a layered, multi-plane modeling effect and enhances the sense of spatial depth. The background threshold in this embodiment may likewise be set by the manufacturer according to the terminal's photographing quality parameters; multiple background thresholds may be set, each corresponding to a different photographing effect, so as to meet the photographing needs of different users. For example, if the user selects a background threshold, the final target image will contain the photographic subject together with any person or scene behind it whose depth value is less than the background threshold.
Specifically, depth of field is the range of distances in front of and behind the subject, measured from the front edge of the camera lens or other imager, within which a sharp image can be obtained. In short, after focusing is completed, a clear image forms within a range in front of and behind the focal point, and that distance range is the depth of field. In general, the larger the depth of field and the farther apart the foreground and background, the stronger the sense of depth in the picture and the more layered and complete the rendering of the scene. In this embodiment, the image corresponding to the first or second depth-of-field information contains the subject, persons or scenes in front of it with depth values greater than the foreground threshold, and persons or scenes behind it with depth values less than the background threshold. By comparing depth values against the foreground and background thresholds, the first and second depth-of-field information select the sharp part of the current shot, so the target image subsequently generated from them has good imaging quality and a good photographing effect. In addition, multiple foreground and background thresholds can be set to control the foreground and background of the target image, which facilitates scene scheduling during shooting and meets the needs of different users.
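The threshold comparison described above amounts to masking the depth map. A minimal sketch follows, assuming a plain nested-list depth map and hypothetical threshold values; the function name is illustrative.

```python
# Keep only pixels whose depth lies strictly between the foreground
# and background thresholds, as in the extraction of the first and
# second depth-of-field information. All values are hypothetical.

def filter_depth(depth_map, foreground, background):
    """Mask of pixels with foreground < depth < background."""
    return [[foreground < d < background for d in row]
            for row in depth_map]

depths = [[0.4, 1.2],
          [2.5, 1.8]]   # metres, hypothetical
mask = filter_depth(depths, foreground=0.5, background=2.0)
print(mask)  # [[False, True], [False, True]]
```

Pixels outside the (foreground, background) interval are excluded, which is exactly how the system drops unwanted near and far content before fusion.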
Besides ensuring the sharpness of the final target image by controlling the acquired first and second depth-of-field information, the target image generated from the first and second structured light images must also have a high degree of overlap. Further, the image signal processor 1130 is specifically configured to: determine image overlap area information according to the first depth-of-field information and the second depth-of-field information; segment the first structured light image according to the overlap area information to obtain a first overlapping image and a first non-overlapping image; segment the second structured light image according to the overlap area information to obtain a second overlapping image and a second non-overlapping image; and fuse the first overlapping image with the second overlapping image to generate the target image. It should be noted that image fusion is an important part of image processing: by cooperatively using image information from multiple sensors observing the same scene, it outputs a fused image better suited to human visual perception or to further processing and analysis by a computer. It can significantly compensate for the shortcomings of a single sensor, improve the clarity and information content of the image, and help acquire information about the target or scene more accurately, reliably, and comprehensively. Fusing the first and second overlapping images determined beforehand yields a target image with better clarity and overlap, and eliminates misalignment during fusion as far as possible.
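The overlap-region fusion step can be sketched as below. The averaging fusion rule is an assumption for illustration, since the patent does not commit to a particular fusion operator, and the image and mask representations are simplified to nested lists.

```python
# Fuse the two structured-light-derived images inside the overlap
# region (here by simple averaging, an illustrative choice) and keep
# the first image elsewhere.

def fuse_overlap(img1, img2, overlap_mask):
    """Average img1 and img2 where overlap_mask is True; else keep img1."""
    fused = []
    for r, mask_row in enumerate(overlap_mask):
        fused_row = []
        for c, in_overlap in enumerate(mask_row):
            if in_overlap:
                fused_row.append((img1[r][c] + img2[r][c]) / 2)
            else:
                fused_row.append(img1[r][c])
        fused.append(fused_row)
    return fused

a = [[10, 20], [30, 40]]
b = [[20, 40], [10, 0]]
m = [[True, False], [True, True]]
print(fuse_overlap(a, b, m))  # [[15.0, 20], [20.0, 20.0]]
```

Restricting fusion to the pre-computed overlap mask is what limits misalignment: pixels seen by only one projection are passed through rather than blended.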
In order to implement the above embodiments, the present invention further provides an image processing method, and an execution subject of the method is an image processing system. The image processing system includes: a first structured light projector 1116, a second structured light projector 1117, a camera 1110, an image signal processor 1130; the camera 1110, the first structured light projector 1116 and the second structured light projector 1117 are arranged in sequence along the transverse center line; alternatively, the first structured light projector 1116, the camera 1110 and the second structured light projector 1117 are arranged in sequence along the transverse center line. Fig. 4 is a flowchart of an image processing method according to an embodiment of the present invention, and as shown in fig. 4, the method includes:
in step 101, the first structured light projector 1116 projects a first projection pattern structured light toward the photographic subject.
In particular, the first structured light projector 1116 comprises a first light source controller, a first array of light sources;
the first light source controller controls a light emitting state of each light source of the first array of light sources according to the first projection pattern, the light emitting state including: a lit state and an extinguished state;
the first array light source emits the first projection pattern structured light to be projected onto the photographic subject, the first projection pattern structured light being the structured light formed by the light emitted by all the light sources in the lit state.
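The per-emitter control just described can be pictured as below (representing the projection pattern as a boolean matrix, and the class and method names, are assumptions for illustration only):

```python
import numpy as np

class LightSourceController:
    """Drives an array of emitters: True in the projection pattern means the
    corresponding light source is lit, False means it is extinguished."""

    def __init__(self, rows, cols):
        # All emitters start in the extinguished state
        self.state = np.zeros((rows, cols), dtype=bool)

    def apply_pattern(self, pattern):
        """Set each emitter's lit/extinguished state from a projection pattern."""
        pattern = np.asarray(pattern, dtype=bool)
        assert pattern.shape == self.state.shape, "pattern must match array size"
        self.state = pattern.copy()
        return self.state

    def lit_count(self):
        """Number of emitters currently contributing structured light."""
        return int(self.state.sum())

ctrl = LightSourceController(2, 3)
ctrl.apply_pattern([[1, 0, 1],
                    [0, 1, 0]])
print(ctrl.lit_count())  # prints 3
```

The second projector (step 102) would use the same mechanism with a different projection pattern, which is what makes the two structured light images distinguishable.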
In step 102, the second structured light projector 1117 projects the second projection pattern structured light to the photographic subject.
Specifically, the second structured light projector 1117 includes a second light source controller, a second array of light sources;
the second light source controller controls a light emitting state of each light source of the second array of light sources according to the second projection pattern, the light emitting state including: a lit state and an extinguished state;
the second array light source emits the second projection pattern structured light to be projected onto the photographic subject, the second projection pattern structured light being the structured light formed by the light emitted by all the light sources in the lit state.
In step 103, the camera 1110 captures a first structured light image of the first projection pattern structured light modulated by the photographic subject, and captures a second structured light image of the second projection pattern structured light modulated by the photographic subject.
In step 104, the image signal processor 1130 demodulates the first structured light image to extract first depth-of-field information having a depth value greater than the foreground threshold and less than the background threshold, demodulates the second structured light image to extract second depth-of-field information having a depth value greater than the foreground threshold and less than the background threshold, and performs image fusion according to the first depth-of-field information and the second depth-of-field information to generate the target image.
Specifically, a possible implementation of performing image fusion according to the first depth-of-field information and the second depth-of-field information to generate the target image in step 104 is as follows: determining image overlapping area information according to the first depth-of-field information and the second depth-of-field information; performing image segmentation on the first structured light image according to the image overlapping area information to obtain a first overlapping image and a first non-overlapping image; performing image segmentation on the second structured light image according to the image overlapping area information to obtain a second overlapping image and a second non-overlapping image; and selecting the first overlapping image and the second overlapping image for image fusion to generate the target image.
Further, the first projection pattern and the second projection pattern are different.
Further, the image signal processor 1130 controls the first structured light projector 1116 and the second structured light projector 1117 to operate alternately.
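The alternate operation of the two projectors might look like the following sketch (frame-indexed toggling is an assumption; the patent does not fix a particular switching scheme):

```python
def active_projector(frame_index):
    """Even frames use the first projector, odd frames the second, so the
    two projection patterns never illuminate the subject at the same time
    and each structured light image can be demodulated unambiguously."""
    return "first" if frame_index % 2 == 0 else "second"

schedule = [active_projector(i) for i in range(4)]
print(schedule)  # prints ['first', 'second', 'first', 'second']
```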
The specific manner in which the steps of the method of this embodiment are performed has been described in detail in the system embodiments and will not be elaborated here.
In the image processing method according to the embodiment of the present invention, first, the first structured light projector 1116 projects first projection pattern structured light to the photographic subject; the second structured light projector 1117 projects a second projection pattern structured light toward the photographic subject; next, the camera 1110 captures a first structured light image of the first projection pattern structured light modulated by the photographic subject, and captures a second structured light image of the second projection pattern structured light modulated by the photographic subject; finally, the image signal processor 1130 demodulates the first structured light image to extract first depth-of-field information having a depth value greater than the foreground threshold and less than the background threshold, demodulates the second structured light image to extract second depth-of-field information having a depth value greater than the foreground threshold and less than the background threshold, and performs image fusion according to the first depth-of-field information and the second depth-of-field information to generate a target image; the obtained target image reflects the shooting subject more truly, and the shooting effect is good and the user experience is good.
In order to implement the above embodiments, the present invention also proposes a terminal device, which includes an image processing circuit. The image processing circuit may be implemented by hardware and/or software components, and may include various processing units defining an ISP (Image Signal Processing) pipeline. Fig. 5 is a schematic structural diagram of an image processing circuit in a terminal device according to an embodiment of the present invention. As shown in Fig. 5, for ease of explanation, only the aspects of the image processing technique associated with embodiments of the present invention are shown.
As shown in Fig. 5, the image processing circuit 110 includes an imaging device 1110, an ISP processor 1130, control logic 1140, a first structured light projector 1116, and a second structured light projector 1117. The imaging device (e.g., camera) 1110 may include one or more lenses 1112 and an image sensor 1114. The first and second structured light projectors 1116 and 1117 project structured light onto an object to be measured. The structured light pattern may be laser stripes, a Gray code, sinusoidal stripes, or a randomly arranged speckle pattern. The image sensor 1114 captures a structured light image projected onto the object to be measured and transmits it to the ISP processor 1130, and the ISP processor 1130 demodulates the structured light image to obtain depth information of the object to be measured. At the same time, the image sensor 1114 can also capture color information of the object to be measured. Of course, the structured light image and the color information of the object to be measured may instead be captured by two separate image sensors 1114.
Taking speckle structured light as an example, the ISP processor 1130 demodulates the structured light image by: acquiring a speckle image of the measured object from the structured light image; performing image data calculation on the speckle image of the measured object and a reference speckle image according to a predetermined algorithm; and obtaining the moving distance of each speckle point of the speckle image on the measured object relative to the corresponding reference speckle point in the reference speckle image. The depth value of each speckle point of the speckle image is then obtained by triangulation, and the depth information of the measured object is obtained from these depth values.
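The triangulation step can be sketched in code as follows (the baseline, focal length, and disparity values are illustrative assumptions; a real system obtains them from calibration):

```python
def depth_from_disparity(disparity_px, baseline_mm, focal_px):
    """Classic triangulation: the depth of a speckle point is inversely
    proportional to its measured shift (disparity) relative to the
    corresponding point in the reference speckle image."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return baseline_mm * focal_px / disparity_px

# Example with assumed calibration values: 50 mm baseline between projector
# and sensor, 600 px focal length, 10 px measured speckle shift
print(depth_from_disparity(disparity_px=10.0, baseline_mm=50.0, focal_px=600.0))
# prints 3000.0 (millimetres)
```

Applying this per speckle point, as the paragraph describes, yields a sparse depth map from which the depth information of the measured object is assembled.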
Of course, the depth image information may also be acquired by a binocular vision method or a time-of-flight (TOF) based method; the approach is not limited here, as long as the depth information of the object to be measured can be acquired or calculated, and all such methods fall within the scope of this embodiment.
After the ISP processor 1130 receives the color information of the object to be measured captured by the image sensor 1114, image data corresponding to the color information of the object to be measured may be processed. ISP processor 1130 analyzes the image data to obtain image statistics that may be used to determine one or more control parameters of imaging device 1110. The image sensor 1114 may include an array of color filters (e.g., Bayer filters), and the image sensor 1114 may acquire light intensity and wavelength information captured with each imaging pixel of the image sensor 1114 and provide a set of raw image data that may be processed by the ISP processor 1130.
ISP processor 1130 processes the raw image data pixel by pixel in a variety of formats. For example, each image pixel may have a bit depth of 8, 10, 12, or 14 bits, and ISP processor 1130 may perform one or more image processing operations on the raw image data, collecting image statistics about the image data. Wherein the image processing operations may be performed with the same or different bit depth precision.
ISP processor 1130 may also receive pixel data from image memory 1120. The image memory 1120 may be a portion of a memory device, a storage device, or a separate dedicated memory within an electronic device, and may include a DMA (Direct Memory Access) feature.
Upon receiving the raw image data, ISP processor 1130 may perform one or more image processing operations.
After the ISP processor 1130 obtains the color information and the depth information of the object to be measured, they may be fused to obtain a three-dimensional image. Features of the object to be measured can be extracted by at least one of an appearance contour extraction method or a contour feature extraction method. For example, the features may be extracted by methods such as the active shape model (ASM), the active appearance model (AAM), principal component analysis (PCA), or the discrete cosine transform (DCT), which are not limited here. The features of the measured object extracted from the depth information and those extracted from the color information are then registered and fused. The fusion processing may directly combine the features extracted from the depth information and the color information, may combine the same features in different images after weighting, or may generate the three-dimensional image from the fused features in another fusion mode.
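The weighted combination of registered features mentioned above admits a minimal sketch (the equal-length feature vectors and the weight value are assumptions for illustration):

```python
import numpy as np

def fuse_features(depth_features, color_features, weight=0.5):
    """Weighted combination of registered features extracted from the
    depth information and from the color information of the same object."""
    depth_features = np.asarray(depth_features, dtype=float)
    color_features = np.asarray(color_features, dtype=float)
    assert depth_features.shape == color_features.shape, "features must be registered"
    return weight * depth_features + (1.0 - weight) * color_features

# Toy registered feature vectors; weight biases the blend toward color
fused = fuse_features([1.0, 2.0, 3.0], [3.0, 2.0, 1.0], weight=0.25)
print(fused)  # weighted blend of the two feature vectors
```

A direct combination corresponds to concatenating the two vectors instead of blending them; which variant is used depends on the chosen fusion mode.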
Image data for a three-dimensional image may be sent to image memory 1120 for additional processing before being displayed. ISP processor 1130 receives processed data from image memory 1120 and performs image data processing on it in the raw domain and in the RGB and YCbCr color spaces. Image data for a three-dimensional image may be output to a display 1160 for viewing by a user and/or for further processing by a Graphics Processing Unit (GPU). Further, the output of ISP processor 1130 may also be sent to image memory 1120, and display 1160 may read image data from image memory 1120. In one embodiment, image memory 1120 may be configured to implement one or more frame buffers. In addition, the output of the ISP processor 1130 may be transmitted to an encoder/decoder 1150 for encoding/decoding image data. The encoded image data may be saved and decompressed before being displayed on the display 1160. The encoder/decoder 1150 may be implemented by a CPU, a GPU, or a coprocessor.
The image statistics determined by the ISP processor 1130 may be sent to the control logic processor 1140 unit. Control logic 1140 may include a processor and/or microcontroller that executes one or more routines (e.g., firmware) that may determine control parameters for imaging device 1110 based on the received image statistics.
The image processing method can be implemented with the image processing technology of Fig. 5 through the following steps:
in step 101, the first structured light projector 1116 projects a first projected pattern structured light toward the photographic subject.
In step 102, the second structured light projector 1117 projects a second projection pattern structured light to the photographic subject.
In step 103, the camera 1110 captures a first structured light image of the first projection pattern structured light modulated by the subject, and captures a second structured light image of the second projection pattern structured light modulated by the subject.
In step 104, the image signal processor 1130 demodulates the first structured light image to extract first depth-of-field information having a depth value greater than the foreground threshold and less than the background threshold, demodulates the second structured light image to extract second depth-of-field information having a depth value greater than the foreground threshold and less than the background threshold, and performs image fusion according to the first depth-of-field information and the second depth-of-field information to generate the target image.
In order to implement the above embodiments, the present invention also proposes a non-transitory computer-readable storage medium having stored thereon a computer program which, when executed by a processor, is capable of implementing the image processing method as described in the foregoing embodiments.
In the description of the present invention, it is to be understood that the terms "first", "second", and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the present invention, "a plurality" means two or more unless specifically defined otherwise.
In the description of the specification, reference to the description of the term "one embodiment", "some embodiments", "an example", "a specific example", or "some examples", etc., means that a particular feature or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and alternate implementations are included within the scope of the preferred embodiment of the present invention in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present invention.
The logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions that can be considered to implement logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
It will be understood by those skilled in the art that all or part of the steps carried by the method for implementing the above embodiments may be implemented by hardware related to instructions of a program, which may be stored in a computer readable storage medium, and when the program is executed, the program includes one or a combination of the steps of the method embodiments.
In addition, functional units in the embodiments of the present invention may be integrated into one processing module, or each unit may exist alone physically, or two or more units are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may also be stored in a computer readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic or optical disk, etc. Although embodiments of the present invention have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present invention, and that variations, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present invention.

Claims (14)

1. An image processing system, comprising: the device comprises a first structured light projector, a second structured light projector, a camera and an image signal processor;
the camera, the first structured light projector and the second structured light projector are arranged along a common transverse center line;
the first structured light projector is used for projecting first projection pattern structured light to the shooting main body;
the second structured light projector is used for projecting second projection pattern structured light to the shooting main body;
the camera is used for shooting a first structured light image of the first projection pattern structured light modulated by the shooting main body and shooting a second structured light image of the second projection pattern structured light modulated by the shooting main body;
the image signal processor is configured to demodulate the first structured light image to extract first depth-of-field information having a depth value greater than a foreground threshold and less than a background threshold, demodulate the second structured light image to extract second depth-of-field information having a depth value greater than a foreground threshold and less than a background threshold, and perform image fusion according to the first depth-of-field information and the second depth-of-field information to generate a target image;
wherein the image signal processor is specifically configured to:
determining image overlapping area information according to the first depth of field information and the second depth of field information;
performing image segmentation on the first structured light image according to the image overlapping area information to obtain a first overlapping image and a first non-overlapping image;
performing image segmentation on the second structured light image according to the image overlapping area information to obtain a second overlapping image and a second non-overlapping image;
and selecting the first overlapped image and the second overlapped image to perform image fusion so as to generate a target image.
2. The system of claim 1 wherein the first structured light projector comprises a first light source controller, a first array of light sources;
the first light source controller is configured to control a light emitting state of each light source of the first array of light sources according to a first projection pattern, where the light emitting state includes: a lit state and an extinguished state;
the first array light source is used for emitting first projection pattern structured light to be projected onto the shooting main body, and the first projection pattern structured light is structured light formed by light rays emitted by the light sources in all lighting states.
3. The system of claim 1 wherein the second structured light projector comprises a second light source controller, a second array of light sources;
the second light source controller is configured to control a light emitting state of each light source of the second array of light sources according to a second projection pattern, where the light emitting state includes: a lit state and an extinguished state;
the second array light source is used for emitting second projection pattern structured light to be projected onto the shooting main body, and the second projection pattern structured light is structured light formed by light rays emitted by the light sources in all lighting states.
4. The system of claim 1,
the first projection pattern and the second projection pattern are different.
5. The system of claim 1,
the image signal processor is further configured to control the first structured light projector and the second structured light projector to alternately operate.
6. The system of claim 1 wherein the camera, the first structured light projector, and the second structured light projector are aligned along a common transverse centerline, comprising: the camera, the first structured light projector and the second structured light projector are arranged in sequence along a common transverse center line;
or the first structured light projector, the camera and the second structured light projector are arranged in sequence along the common transverse center line.
7. An image processing method, applied to an image processing system comprising: a first structured light projector, a second structured light projector, a camera, and an image signal processor;
the camera and the first structured light projector are sequentially arranged along a transverse central line, and the camera and the second structured light projector are sequentially arranged along a longitudinal central line;
the first structured light projector projects first projection pattern structured light to the photographic subject;
the second structured light projector projects second projection pattern structured light to the photographic subject;
the camera shoots a first structured light image of the first projection pattern structured light modulated by the shooting main body, and shoots a second structured light image of the second projection pattern structured light modulated by the shooting main body;
the image signal processor demodulates the first structured light image to extract first depth-of-field information with a depth value larger than a foreground threshold and smaller than a background threshold, demodulates the second structured light image to extract second depth-of-field information with a depth value larger than the foreground threshold and smaller than the background threshold, and performs image fusion according to the first depth-of-field information and the second depth-of-field information to generate a target image;
wherein the performing image fusion according to the first depth of field information and the second depth of field information to generate a target image includes:
determining image overlapping area information according to the first depth of field information and the second depth of field information;
performing image segmentation on the first structured light image according to the image overlapping area information to obtain a first overlapping image and a first non-overlapping image;
performing image segmentation on the second structured light image according to the image overlapping area information to obtain a second overlapping image and a second non-overlapping image;
and selecting the first overlapped image and the second overlapped image to perform image fusion so as to generate a target image.
8. The method of claim 7 wherein the first structured light projector comprises a first light source controller, a first array of light sources;
the first light source controller controls a light emitting state of each light source of the first array of light sources according to a first projection pattern, the light emitting state including: a lit state and an extinguished state;
the first array light source emits first projection pattern structured light to be projected onto the shooting main body, and the first projection pattern structured light is structured light formed by light rays emitted by the light sources in all lighting states.
9. The method of claim 7 wherein the second structured light projector includes a second light source controller, a second array of light sources;
the second light source controller controls a light emitting state of each light source of the second array of light sources according to a second projection pattern, the light emitting state including: a lit state and an extinguished state;
the second array light source emits second projection pattern structured light to be projected onto the shooting main body, and the second projection pattern structured light is structured light formed by light rays emitted by the light sources in all lighting states.
10. The method of claim 7,
the first projection pattern and the second projection pattern are different.
11. The method of claim 7,
the image signal processor also controls the first structured light projector and the second structured light projector to work alternately.
12. The method of claim 7, wherein the camera, the first structured light projector, and the second structured light projector are aligned along a common transverse centerline, comprising: the camera, the first structured light projector and the second structured light projector are arranged in sequence along a common transverse center line;
or the first structured light projector, the camera and the second structured light projector are arranged in sequence along the common transverse center line.
13. A terminal device comprising a memory and a processor, the memory having stored therein computer-readable instructions that, when executed by the processor, cause the processor to perform the image processing method of any one of claims 7 to 12.
14. A non-transitory computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, implements the image processing method according to any one of claims 7 to 12.
CN201710676499.1A 2017-08-09 2017-08-09 Image processing system and method Active CN107370951B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710676499.1A CN107370951B (en) 2017-08-09 2017-08-09 Image processing system and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710676499.1A CN107370951B (en) 2017-08-09 2017-08-09 Image processing system and method

Publications (2)

Publication Number Publication Date
CN107370951A CN107370951A (en) 2017-11-21
CN107370951B true CN107370951B (en) 2019-12-27

Family

ID=60309512

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710676499.1A Active CN107370951B (en) 2017-08-09 2017-08-09 Image processing system and method

Country Status (1)

Country Link
CN (1) CN107370951B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109842789A (en) * 2017-11-28 2019-06-04 奇景光电股份有限公司 Depth sensing device and depth sensing method
CN108154514B (en) * 2017-12-06 2021-08-13 Oppo广东移动通信有限公司 Image processing method, device and equipment
CN108259722A (en) * 2018-02-27 2018-07-06 厦门美图移动科技有限公司 Imaging method, device and electronic equipment
CN108564614B (en) * 2018-04-03 2020-09-18 Oppo广东移动通信有限公司 Depth acquisition method and apparatus, computer-readable storage medium, and computer device
CN113325437A (en) * 2020-02-29 2021-08-31 华为技术有限公司 Image generation method and device
CN113985425A (en) 2020-07-10 2022-01-28 广州印芯半导体技术有限公司 Distance measuring device and distance measuring method
CN113973167A (en) * 2021-10-28 2022-01-25 维沃移动通信有限公司 Camera assembly, electronic device and image generation method

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102760234A (en) * 2011-04-14 2012-10-31 财团法人工业技术研究院 Depth image acquisition device, system and method
WO2014183385A1 (en) * 2013-05-17 2014-11-20 中兴通讯股份有限公司 Terminal and image processing method therefor
CN105096283A (en) * 2014-04-29 2015-11-25 华为技术有限公司 Panoramic image acquisition method and device
CN105407280A (en) * 2015-11-11 2016-03-16 广东欧珀移动通信有限公司 Panoramic image synthesizing method and system

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102760234A (en) * 2011-04-14 2012-10-31 财团法人工业技术研究院 Depth image acquisition device, system and method
WO2014183385A1 (en) * 2013-05-17 2014-11-20 中兴通讯股份有限公司 Terminal and image processing method therefor
CN105096283A (en) * 2014-04-29 2015-11-25 华为技术有限公司 Panoramic image acquisition method and device
CN105407280A (en) * 2015-11-11 2016-03-16 广东欧珀移动通信有限公司 Panoramic image synthesizing method and system

Also Published As

Publication number Publication date
CN107370951A (en) 2017-11-21

Similar Documents

Publication Publication Date Title
CN107370951B (en) Image processing system and method
TWI584634B (en) Electronic apparatus and method of generating depth map
US11115633B2 (en) Method and system for projector calibration
CN108765542B (en) Image rendering method, electronic device, and computer-readable storage medium
CN106851124B (en) Image processing method and device based on depth of field and electronic device
US8334893B2 (en) Method and apparatus for combining range information with an optical image
CN107563304B (en) Terminal equipment unlocking method and device and terminal equipment
CN107734267B (en) Image processing method and device
US11503228B2 (en) Image processing method, image processing apparatus and computer readable storage medium
CN107610080B (en) Image processing method and apparatus, electronic apparatus, and computer-readable storage medium
CN107610171B (en) Image processing method and device
CN107864337B (en) Sketch image processing method, device and equipment and computer readable storage medium
CN107452034B (en) Image processing method and device
CN107517346B (en) Photographing method and device based on structured light and mobile device
JP2003083730A (en) 3-dimensional information acquisition device, projection pattern in 3-dimensional information acquisition and 3- dimensional information acquisition method
CN107395974B (en) Image processing system and method
CN107564050B (en) Control method and device based on structured light and terminal equipment
CN107370950B (en) Focusing process method, apparatus and mobile terminal
CN107493411B (en) Image processing system and method
CN107480615B (en) Beauty treatment method and device and mobile equipment
CN107392874B (en) Beauty treatment method and device and mobile equipment
CN107705278B (en) Dynamic effect adding method and terminal equipment
CN107734264B (en) Image processing method and device
CN107590828B (en) Blurring processing method and device for shot image
CN107483815B (en) Method and device for shooting moving object

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: No. 18 Wusha Beach Road, Chang'an Town, Dongguan 523860, Guangdong Province

Applicant after: OPPO Guangdong Mobile Telecommunications Corp., Ltd.

Address before: No. 18 Wusha Beach Road, Chang'an Town, Dongguan 523860, Guangdong Province

Applicant before: Guangdong OPPO Mobile Telecommunications Corp., Ltd.

GR01 Patent grant
GR01 Patent grant