CN114520866A - Picture bokeh processing method, electronic device and storage medium - Google Patents

Picture bokeh processing method, electronic device and storage medium

Info

Publication number
CN114520866A
CN114520866A (application CN202011306105.1A)
Authority
CN
China
Prior art keywords
image
bokeh
goodness
depth
edge
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011306105.1A
Other languages
Chinese (zh)
Inventor
普罗申齐德·班纳吉
维罗妮卡·沙玛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Oneplus Technology Shenzhen Co Ltd
Original Assignee
Oneplus Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Oneplus Technology Shenzhen Co Ltd filed Critical Oneplus Technology Shenzhen Co Ltd
Priority to CN202011306105.1A
Publication of CN114520866A
Legal status: Pending

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80: Camera processing pipelines; Components thereof
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules
    • H04N23/67: Focus control based on electronic image sensor signals

Abstract

The application discloses an image bokeh processing method, an electronic device and a storage medium, relates to the technical field of terminals, and addresses the problem that a user must take photos many times and run bokeh processing many times before obtaining a satisfactory bokeh image. The image bokeh processing method of the present application includes: acquiring the bokeh goodness of a source image, where the source image is an image to be subjected to bokeh processing and the bokeh goodness is positively correlated with the quality of a bokeh image generated from the source image; and, when the bokeh goodness is greater than or equal to a preset threshold, performing bokeh processing on the source image to obtain a bokeh image.

Description

Picture bokeh processing method, electronic device and storage medium
Technical Field
The present application relates to the technical field of terminals, and in particular to an image bokeh processing method, an electronic device and a storage medium.
Background
A bokeh image keeps the subject in sharp focus while defocusing the background, which makes the subject stand out and the image visually appealing. Most smartphones now provide a bokeh capture function.
A bokeh image cannot be obtained directly from an ordinary photograph; bokeh processing is generally required. Bokeh processing usually first identifies the foreground and background of an image, then keeps the image subject (the foreground) in sharp focus while gradually blurring the background. The quality of the bokeh image depends on successfully segmenting the edge pattern between the foreground and the background. In practice, however, correct segmentation is often difficult, so a user often needs to take many photos and run bokeh processing many times before obtaining a satisfactory bokeh image, which wastes the phone's storage space and shortens its battery life.
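For orientation, here is a minimal sketch of the conventional bokeh pipeline this paragraph describes: a foreground mask keeps the subject sharp while the background is blurred, and the two are composited. It assumes a binary foreground mask is already available; the function name and blur kernel size are illustrative, not taken from the patent.

```python
import cv2
import numpy as np

def naive_bokeh(color_bgr: np.ndarray, fg_mask: np.ndarray) -> np.ndarray:
    """color_bgr: HxWx3 uint8 image; fg_mask: HxW uint8, 255 = foreground."""
    # Defocus the whole frame once; the composite keeps only its background part.
    blurred = cv2.GaussianBlur(color_bgr, (31, 31), 0)
    alpha = cv2.merge([fg_mask] * 3).astype(np.float32) / 255.0
    # Foreground from the sharp image, background from the blurred one.
    out = color_bgr * alpha + blurred * (1.0 - alpha)
    return out.astype(np.uint8)
```

Realistic pipelines vary the blur with estimated depth and feather the mask along the edge pattern, which is exactly where the segmentation difficulty described above arises.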
Disclosure of Invention
In view of this, the present application provides an image bokeh processing method, an electronic device and a storage medium, which can solve the problem that a user needs to take photos many times and perform bokeh processing many times before obtaining a satisfactory bokeh image.
An image bokeh processing method provided by an embodiment of the application includes:
acquiring the bokeh goodness of a source image, where the source image is an image to be subjected to bokeh processing, and the bokeh goodness is positively correlated with the quality of a bokeh image generated from the source image;
and, when the bokeh goodness is greater than or equal to a preset threshold, performing bokeh processing on the source image to obtain a bokeh image.
An embodiment of the present application further provides an electronic device, including a memory and a processor, where the memory stores a program to be executed by the processor to perform any one of the image bokeh processing methods described above.
An embodiment of the present application also provides a readable storage medium storing a program to be executed by a processor to perform any one of the image bokeh processing methods described above.
According to the image bokeh processing method, the electronic device and the storage medium of the application, before a bokeh image is generated from a source image, the bokeh goodness of the source image is evaluated, and bokeh processing is performed on the source image only when the bokeh goodness is greater than or equal to a preset threshold. Because the bokeh goodness is positively correlated with the quality of the bokeh image generated from the source image, a goodness at or above the preset threshold means that the quality of the resulting image meets a certain requirement (the requirement being embodied by the threshold), and only then is the source image processed. Unnecessary bokeh processing of images with low bokeh goodness is thereby avoided, shooting time is saved, the problem of the user having to take photos and run bokeh processing many times is solved, the computing power and storage space of the device are conserved, and battery life is improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below show only some embodiments of the present application; those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic hardware structure diagram of a mobile terminal implementing various embodiments of the present application;
fig. 2 is a flowchart of an image bokeh processing method according to an embodiment of the present application;
fig. 3 is a flowchart of bokeh goodness acquisition provided by an embodiment of the present application;
fig. 4 is a flowchart of another bokeh processing method provided by an embodiment of the present application;
fig. 5 is a color image serving as a source image in the present embodiment;
fig. 6 is a schematic diagram of the foreground, background and edge pattern of fig. 5;
fig. 7 is a schematic diagram of an electronic device provided by an embodiment of the application;
fig. 8 is a schematic diagram of the hardware structure of a mobile phone related to bokeh processing according to an embodiment of the present disclosure;
fig. 9 is a flowchart of a mobile phone capturing a bokeh image according to an embodiment of the present application.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. Where the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present application; rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as detailed in the appended claims.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, reciting an element with the phrase "comprising a ..." does not exclude the presence of additional like elements in the process, method, article, or apparatus that comprises that element. Further, where similarly named elements, features, or components in different embodiments of the disclosure may have the same or different meanings, the particular meaning should be determined by their interpretation in that embodiment or from the context of that embodiment.
It should be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited to these terms. These terms are only used to distinguish one type of information from another. For example, the first edge image may also be referred to as a second edge image, and similarly, the second edge image may also be referred to as a first edge image, without departing from the scope herein.
It will be further understood that the terms "comprises," "comprising," "includes" and/or "including," when used in this specification, specify the presence of stated features, steps, operations, elements, components, items, species, and/or groups, but do not preclude the presence, or addition of one or more other features, steps, operations, elements, components, items, species, and/or groups thereof.
It should be understood that, although the steps in the flowcharts of the embodiments of the present application are shown in the order indicated by the arrows, they are not necessarily performed in that order; unless explicitly stated herein, they may be performed in other orders. Moreover, at least some of the steps in the figures may include multiple sub-steps or stages that are not necessarily performed at the same time but may be performed at different times and in different orders, and may be performed alternately or interleaved with other steps or with at least some of the sub-steps or stages of other steps.
It should be noted that step numbers such as S101 and S102 are used herein for the purpose of more clearly and briefly describing the corresponding contents, and do not constitute a substantial limitation on the sequence, and those skilled in the art may perform S104 first and then perform S101 in specific implementations, but these should be within the scope of the present application.
In the following description, suffixes such as "module", "component", or "unit" used to denote elements are used only for the convenience of description of the present application, and have no specific meaning in themselves.
The device in the following description may be any electronic device that requires image bokeh processing. In general, the device may be a mobile terminal, which may be implemented in various forms. For example, the mobile terminal described in the present application may include mobile terminals such as a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a Personal Digital Assistant (PDA), a Portable Media Player (PMP), a navigation device, a wearable device, a smart band, a pedometer, and the like, as well as fixed terminals such as a digital TV, a desktop computer, and the like.
The following description will be given taking a mobile terminal as an example, and it will be understood by those skilled in the art that the configuration according to the embodiment of the present application can be applied to a fixed type terminal or other electronic devices, in addition to elements particularly used for mobile purposes.
Referring to fig. 1, which is a schematic diagram of a hardware structure of a mobile terminal for implementing various embodiments of the present application, the mobile terminal 100 may include: RF (Radio Frequency) unit 101, WiFi module 102, audio output unit 103, a/V (audio/video) input unit 104, sensor 105, display unit 106, user input unit 107, interface unit 108, memory 109, processor 110, and power supply 111. Those skilled in the art will appreciate that the mobile terminal architecture shown in fig. 1 is not intended to be limiting of mobile terminals, which may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
The following describes each component of the mobile terminal in detail with reference to fig. 1:
the radio frequency unit 101 may be configured to receive and transmit signals during information transmission and reception or during a call, and specifically, receive downlink information of a base station and then process the downlink information to the processor 110; in addition, the uplink data is transmitted to the base station. Typically, radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can also communicate with a network and other devices through wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System for Mobile communications), GPRS (General Packet Radio Service), CDMA2000(Code Division Multiple Access2000 ), WCDMA (Wideband Code Division Multiple Access), TD-SCDMA (Time Division-Synchronous Code Division Multiple Access), FDD-LTE (Frequency Division duplex Long Term Evolution), and TDD-LTE (Time Division duplex Long Term Evolution).
WiFi is a short-range wireless transmission technology. Through the WiFi module 102, the mobile terminal can help the user send and receive e-mail, browse web pages, access streaming media and the like, providing wireless broadband internet access. Although fig. 1 shows the WiFi module 102, it is not an essential part of the mobile terminal and may be omitted as needed without changing the essence of the invention.
The audio output unit 103 may convert audio data received by the radio frequency unit 101 or the WiFi module 102 or stored in the memory 109 into an audio signal and output as sound when the mobile terminal 100 is in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, or the like. Also, the audio output unit 103 may also provide audio output related to a specific function performed by the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 103 may include a speaker, a buzzer, and the like.
The a/V input unit 104 is used to receive audio or video signals. It may include a Graphics Processing Unit (GPU) 1041 and a microphone 1042. The graphics processor 1041 processes image data of still pictures or video obtained by an image capture device (e.g., a camera) in a video capture mode or an image capture mode. The processed image frames may be displayed on the display unit 106, stored in the memory 109 (or other storage medium), or transmitted via the radio frequency unit 101 or the WiFi module 102. The microphone 1042 can receive sound (audio data) in a phone call mode, a recording mode, a voice recognition mode, or the like, and process it into audio data. In the phone call mode, the processed audio (voice) data may be converted into a format transmittable to a mobile communication base station via the radio frequency unit 101. The microphone 1042 may implement various noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated while receiving and transmitting audio signals.
The mobile terminal 100 also includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that may optionally adjust the brightness of the display panel 1061 according to the brightness of ambient light, and a proximity sensor that may turn off the display panel 1061 and/or the backlight when the mobile terminal 100 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally, three axes), detect the magnitude and direction of gravity when stationary, and can be used for applications of recognizing the device posture (such as mobile phone horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; as for other sensors such as a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured on the mobile phone, further description is omitted here.
The display unit 106 is used to display information input by a user or information provided to the user. The Display unit 106 may include a Display panel 1061, and the Display panel 1061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 107 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the mobile terminal. Specifically, the user input unit 107 may include a touch panel 1071 and other input devices 1072. The touch panel 1071, also referred to as a touch screen, may collect a touch operation performed by a user on or near the touch panel 1071 (e.g., an operation performed by the user on or near the touch panel 1071 using a finger, a stylus, or any other suitable object or accessory), and drive a corresponding connection device according to a predetermined program. The touch panel 1071 may include two parts of a touch detection device and a touch controller. Optionally, the touch detection device detects a touch direction of a user, detects a signal caused by a touch operation, and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, transmits the touch point coordinates to the processor 110, and can receive and execute commands transmitted from the processor 110. In addition, the touch panel 1071 may be implemented in various types, such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. In addition to the touch panel 1071, the user input unit 107 may include other input devices 1072. In particular, other input devices 1072 may include, but are not limited to, one or more of a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like, and are not limited to these specific examples.
Further, the touch panel 1071 may cover the display panel 1061, and when the touch panel 1071 detects a touch operation thereon or nearby, the touch panel 1071 transmits the touch operation to the processor 110 to determine the type of the touch event, and then the processor 110 provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although the touch panel 1071 and the display panel 1061 are shown in fig. 1 as two separate components to implement the input and output functions of the mobile terminal, in some embodiments, the touch panel 1071 and the display panel 1061 may be integrated to implement the input and output functions of the mobile terminal, and is not limited herein.
The interface unit 108 serves as an interface through which at least one external device is connected to the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (e.g., data information, power, etc.) from external devices and transmit the received input to one or more elements within the mobile terminal 100 or may be used to transmit data between the mobile terminal 100 and external devices.
The memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a program storage area and a data storage area, and optionally, the program storage area may store an operating system, an application program (such as a sound playing function, an image playing function, and the like) required by at least one function, and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 109 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The processor 110 is a control center of the mobile terminal, connects various parts of the entire mobile terminal using various interfaces and lines, performs various functions of the mobile terminal and processes data by operating or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, thereby integrally monitoring the mobile terminal. Processor 110 may include one or more processing units; preferably, the processor 110 may integrate an application processor and a modem processor, optionally the application processor primarily handles operating systems, user interfaces, application programs, etc., and the modem processor primarily handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 110.
The mobile terminal 100 may further include a power supply 111 (e.g., a battery) for supplying power to various components, and preferably, the power supply 111 may be logically connected to the processor 110 via a power management system, so as to manage charging, discharging, and power consumption management functions via the power management system.
Although not shown in fig. 1, the mobile terminal 100 may further include a bluetooth module or the like, which is not described in detail herein.
Based on the above mobile terminal hardware structure, various embodiments of the present application are provided.
It should be understood that the specific embodiments described herein are merely illustrative of and not restrictive on the broad application. It should also be understood that the mobile terminal hardware architecture described herein is only provided to facilitate understanding of the present application and is not intended to limit the present application. Devices such as smartphones, tablets and any other similar device described below have one or more camera sensors. Optionally, the device may have a depth sensor, such as a depth camera. Optionally, the device may have a display device and a processor.
An embodiment of the present application provides an image bokeh processing method. As shown in fig. 2, the method includes:
s1, obtaining the shot goodness of a source image, wherein the source image is an image to be processed in a shot mode, and the shot goodness is positively correlated with the quality of a shot image generated based on the source image.
The source image is the image to be subjected to bokeh processing; it can be an existing image stored on the device or an image freshly acquired by the device's camera. In some embodiments, the source image contains color information and depth information of the image to be processed. For example, the source image may include a color image and a depth image; alternatively, the source image may include only a color image, from which the color information and depth information can be obtained through image processing, information extraction and similar steps.
In this step, before bokeh processing is performed on the source image, the quality of the bokeh image that would be generated from it is pre-evaluated. The result of this pre-evaluation is represented by the bokeh goodness, which is positively correlated with the quality of the bokeh image generated from the source image. If the bokeh goodness is relatively high, the quality of the generated bokeh image is expected to be good and the user is likely to be satisfied; if it is relatively low, the quality is expected to be poor and the user is likely to be dissatisfied. The quality of a bokeh image generally covers whether the foreground is sharp, how well the background is blurred, whether the foreground and background are successfully segmented, and so on.
It is understood that one skilled in the art may, depending on the specific bokeh processing method, select one or more parameters affecting the quality or bokeh effect of the generated image, or a combination of such parameters, as the bokeh goodness in this step. The application does not limit the specific parameters included in the bokeh goodness, as long as it can reflect the quality (or bokeh effect) of the bokeh image generated from the source image; generally, one or more parameters critical to the bokeh processing result are selected.
For example, when the source image contains color information and depth information of the image to be processed, the quality of the bokeh image generally depends on successfully segmenting the edge pattern between the foreground and the background. The accuracy of the depth image, and the degree of overlap between the foreground/background edge pattern in the color image and the corresponding edge pattern in the depth image, are therefore parameters that largely determine the bokeh processing result and are positively correlated with the quality of the generated bokeh image. The accuracy of the depth image refers to the reliability of the distance (or depth) information from the photographed subject to the camera as obtained from the depth image, and is generally described by the confidence of the pixel depth values. The degree of overlap of the edge pattern of the color image and the edge pattern of the depth image is defined as the consistency of the edge information acquired by different means or devices, and can reflect the focus of the source image.
As noted above, a bokeh image properly separates the background from the foreground. The ideal situation is one in which the subject (i.e., the foreground) and the background are well separated and the foreground is very close to the camera (as in a self-portrait).
Those skilled in the art will appreciate that the accuracy of the depth information in the depth image is related to the distance from the camera: the greater the distance, the less accurate (i.e., less reliable) the acquired depth information. Conversely, the higher the accuracy of the depth image, the closer the photographed subject is to the camera.
The degree of overlap between the foreground/background edge pattern in the color image and the foreground/background edge pattern in the depth image (for brevity, simply the degree of overlap) can serve as a measure of focus. A greater degree of overlap indicates better focus, i.e., no motion blur or defocus of the relevant details in the image.
When the accuracy of the depth image is higher and the overlap between the edge pattern in the color image and the edge pattern in the depth image is larger, the subject is close to the camera and well focused, and accordingly the bokeh image generated from the source image will be of better quality.
Thus, in some embodiments, the bokeh goodness comprises at least a parameter reflecting the accuracy of the depth image and a parameter reflecting the degree of overlap between the foreground/background edge pattern in the color image and that in the depth image.
Fig. 5 shows a color image as a source image. Fig. 6 is a schematic diagram of the segmentation of the foreground and the background, where a white area (corresponding to a portrait area) is the foreground, a black area is the background, and a gray area (corresponding to a portrait edge) between the foreground and the background is an edge pattern between the foreground and the background.
S2, when the bokeh goodness is greater than or equal to a preset threshold, performing bokeh processing on the source image to obtain a bokeh image.
The preset threshold in this step may be set in advance according to the needs of the practical application. It may be set according to the actual bokeh effect, for example determined in advance through experiments. It may also be set with reference to the user's preferences: for example, several bokeh images with different image parameters can be offered for the user to choose from; the user's preference for the bokeh effect is extracted from the selection, converted into image parameters, and the preset threshold is generated on that basis.
In some embodiments, the source image may include a color image and a depth image. The color image may be obtained, for example, by a color camera. The depth image may be obtained by a dedicated depth measurement sensor, such as a depth camera. The depth measurement sensor may use, for example, lidar depth imaging, structured light, time-of-flight, binocular stereo vision, a monocular depth estimation algorithm, or the like.
In some embodiments, the source image includes a bounding box region of a color image and a bounding box region of a depth image. As shown in fig. 3, obtaining the bokeh goodness of the source image then includes:
s11, extracting the edge pattern of the border frame area of the color image, and generating a first edge image according to the extraction result;
and extracting an edge pattern in a bounding box region (bounding box region) of the color image, wherein the extraction result is embodied in the form of a first edge image. The bounding box region of a color image refers herein to the smallest region of the color image that surrounds the image subject called the foreground, for example, in fig. 5, the bounding box region of the color image is a dashed box in the figure. In the step, an edge pattern, namely a contour line of a foreground in the color image, is extracted from a boundary frame area of the color image and is recorded as a first edge image. The method for extracting the first edge image in this step is generally performed by comparing and screening pixel gray levels. Exemplary edge image extraction methods include, for example, differential operator method, laplacian gaussian operator method, Canny operator, fitting method, relaxation method, neural network method, wavelet transform method, wavelet packet decomposition method, morphology method (fractal theory), and the like.
S12, extracting the edge pattern of the bounding box region of the depth image, and generating a second edge image from the extraction result.
An edge pattern is extracted from the bounding box region of the depth image, and the extraction result exists as a second edge image. The bounding box region of the depth image refers to the smallest region of the depth image that encloses the image subject, i.e., the foreground. In this step, the edge pattern, i.e., the edge contour of the foreground in the depth image, is extracted from the bounding box region and recorded as the second edge image. Extraction of the second edge image is generally performed by comparing and screening pixel depth values, using methods such as those listed for step S11.
S13, screening out the pixels that appear as edges in both the first edge image and the second edge image, and generating an overlapping edge image from the screening result.
In this step, pixels that appear as edges in both the first edge image and the second edge image are screened out, which removes pixels that do not actually correspond to the contour of the image subject as well as image noise. The screening result takes the form of an overlapping edge image. For example, a pixel-wise logical AND may be performed on the first edge image and the second edge image, and the result recorded as the overlapping edge image. This computation is explained further in the mobile-phone embodiment of the image processing method provided below.
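A hedged sketch of S13: a pixel-wise logical AND keeps only the pixels that appear as edges in both edge maps, which is how the overlapping edge image is defined above.

```python
import numpy as np

def overlap_edge_image(first_edges: np.ndarray, second_edges: np.ndarray) -> np.ndarray:
    """S13: binary image of pixels that are edges in BOTH input edge maps."""
    both = np.logical_and(first_edges > 0, second_edges > 0)
    return both.astype(np.uint8) * 255
```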
S14, extracting, from the bounding box region of the depth image, the depth values of the pixels corresponding to the edges of the overlapping edge image, together with the confidences of those depth values.
In this step, the pixels corresponding to the edges of the overlapping edge image obtained in step S13 are selected in the bounding box region of the depth image, and the depth information of those pixels and the confidence of that depth information are obtained. Confidence, also known as confidence level, is the degree of trust placed in a measured value of a parameter, here the depth values of the pixels corresponding to the edges of the overlapping edge image.
For example, in consideration of computation and response speed, this step may acquire the depth information and confidence of only a few representative pixels on the contour of the image subject, e.g., representative pixels selected at equal intervals along the contour line (see the sketch below).
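A hedged sketch of this optional subsampling: take roughly every k-th edge pixel of the overlapping edge image to bound the cost of the averaging in S15. The stride k is an illustrative assumption.

```python
import numpy as np

def sample_edge_pixels(overlap: np.ndarray, k: int = 8) -> np.ndarray:
    """Return (row, col) coordinates of roughly every k-th edge pixel."""
    coords = np.argwhere(overlap > 0)   # coordinates of all edge pixels
    return coords[::k] if len(coords) > k else coords
```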
Depth images are acquired by methods such as lidar depth imaging, structured light, time-of-flight, binocular stereo vision or monocular depth estimation, and these methods can provide a confidence value (or reliability score) for each pixel in the image. The confidence value represents how reliable the depth measurement is for each pixel. In general, the error of the depth measurement grows as the distance between the scene and the depth sensor increases. The confidence information can be provided directly by the depth sensor, or computed by a built-in module according to the depth acquisition method of the sensor. The confidence value generally lies between 0 and 1.
In some embodiments, the depth image includes the depth value of each pixel together with confidence information for each pixel's depth value.
S15, calculating a weighted average X of the depths and confidences of these pixels.
Optionally, the weighted average X of the confidences and depth values is computed as:

X = \frac{\sum_{i=1}^{n} C_i D_i}{\sum_{i=1}^{n} D_i}

where C_i is the confidence of the depth value of the i-th pixel (x_i, y_i) corresponding to an edge in the overlapping edge image, D_i is the depth value of the i-th pixel, n is the number of such pixels, and i and n are natural numbers.
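A hedged sketch of S14-S15 follows. It assumes the depth-weighted average of confidences reconstructed above; since the original equation image is not recoverable, the exact weighting is an assumption, chosen so that X lies in [0, 1] like the per-pixel confidences.

```python
import numpy as np

def weighted_average_x(overlap: np.ndarray,
                       depth_bbox: np.ndarray,
                       conf_bbox: np.ndarray) -> float:
    """S14-S15: X = sum(C_i * D_i) / sum(D_i) over the overlap edge pixels.

    depth_bbox and conf_bbox hold per-pixel depth values and confidences
    (confidence in [0, 1]) and are pixel-aligned with the overlap image.
    """
    ys, xs = np.nonzero(overlap)
    d = depth_bbox[ys, xs].astype(np.float64)   # D_i
    c = conf_bbox[ys, xs].astype(np.float64)    # C_i
    denom = d.sum()
    return float((c * d).sum() / denom) if denom > 0 else 0.0
```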
S16, calculating the edge-pixel count ratio Y between the overlapping edge image and the second edge image.
S17, calculating the bokeh goodness G from the weighted average X and the pixel count ratio Y, where the bokeh goodness is proportional to both X and Y.
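A hedged sketch of S16-S17, using the product form G = X·Y that the embodiments below adopt; the text allows other functional relations as well.

```python
import numpy as np

def pixel_ratio_y(overlap: np.ndarray, second_edges: np.ndarray) -> float:
    """S16: ratio of edge-pixel counts, overlap image over second edge image."""
    e = np.count_nonzero(overlap)
    d = np.count_nonzero(second_edges)
    return e / d if d > 0 else 0.0

def bokeh_goodness(x: float, y: float) -> float:
    """S17: G grows with both X and Y; the product is one admissible choice."""
    return x * y
```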
The pixel count ratio Y may represent, at least in part, the degree of overlap between the edge pattern of the color image and that of the depth image. The weighted average X may represent, at least in part, the accuracy of the depth image. Accordingly, the quality of a bokeh image generated from the source image can be measured or predicted from X and Y. The quality of the bokeh image can therefore be known in advance, before bokeh processing is performed, which avoids having to take many pictures of the same framing to obtain a satisfactory result, improves the user-friendliness of the device, and helps preserve the phone's storage space and battery life. The scheme of the application can be applied to the image processing or photographing process of an electronic device.
In some embodiments, the bokeh goodness G may be the product of the weighted average X and the pixel count ratio Y, i.e., G = X·Y. In other embodiments, X and Y may be combined by other functional relations to obtain the bokeh goodness G; this embodiment does not limit those relations. Given the value range of the bokeh goodness G, other functional relations can be obtained by fitting.
In some embodiments, the bokeh goodness G may be the product of the weighted average X and the pixel count ratio Y, with the preset threshold set between 0.85 and 0.95.
In some embodiments, the bokeh goodness G may be the product of X and Y with the preset threshold set to 0.9. When G is greater than or equal to 0.9, the bokeh image generated from the source image is of high quality with a good bokeh effect.
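An illustrative gate using the 0.9 threshold of this embodiment; the function name is a stand-in, not patent terminology.

```python
PRESET_THRESHOLD = 0.9  # empirical value suggested by this embodiment

def should_run_bokeh(goodness: float, threshold: float = PRESET_THRESHOLD) -> bool:
    # Run the (expensive) bokeh processing only when good quality is predicted.
    return goodness >= threshold
```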
In some embodiments, before the first edge image is extracted from the bounding box region of the color image, the bokeh processing method further comprises: acquiring the bounding box region of the color image; and before the second edge image is extracted from the bounding box region of the depth image: acquiring the bounding box region of the depth image.
In some embodiments, before the bokeh goodness of the source image is obtained, the bokeh processing method further comprises: acquiring the color image and the depth image, and performing field-of-view matching and pixel correspondence matching on them.
In this step, distortion is corrected by the field-of-view matching and pixel correspondence matching of the color image and the depth image. The specific matching method does not affect the effect of the application and is therefore not limited here.
After field-of-view matching and pixel correspondence matching, the color image and the depth image have a one-to-one pixel correspondence: any pixel (x, y) in the color image stores color information, and the corresponding pixel in the depth image stores the depth information of that pixel.
In some embodiments, as shown in fig. 4, the bokeh processing method further includes: S3, when the bokeh goodness is smaller than the preset threshold, deleting the source image and sending information reminding the user to change the shooting angle.
The shooting angle includes the shooting height, the shooting direction and the shooting distance. The shooting height is divided into three types: level shooting, downward (high-angle) shooting and upward (low-angle) shooting. The shooting direction is divided into front angle, side angle, oblique side angle, back angle, and so on. The shooting angle affects the illumination distribution and depth distribution of the source image, and hence the quality of the bokeh image generated from it. When the bokeh goodness is below the preset threshold, the user is promptly reminded to change the shooting angle until the goodness meets the requirement. The reminder can be shown on a monitor or display, or announced by voice broadcast.
In some embodiments, when the bokeh goodness is smaller than the preset threshold, the bokeh processing method further includes:
detecting whether the shooting angle of the device has changed;
and, when a change in the shooting angle is detected, executing the step of acquiring the bokeh goodness of the source image again.
This step detects in real time whether the user has changed the shooting angle and re-evaluates when the angle changes. For example, whether the device has changed its shooting angle can be detected in real time by sensors mounted on it: position sensors, velocity sensors, gyroscopes, light sensors and the like can measure the device's position, tilt, and so on. Alternatively, a change of shooting angle can be detected by checking whether the newly received source image has changed.
In some embodiments, after evaluating the bokeh goodness of the source image, the bokeh processing method further comprises: providing the bokeh goodness to the user, for example by displaying it on a monitor or display, or announcing it by voice broadcast.
According to the image bokeh processing method of the application, before a bokeh image is generated from a source image, the bokeh goodness of the source image is evaluated, and bokeh processing is performed only when the goodness is greater than or equal to a preset threshold. Unnecessary bokeh processing of images with low bokeh goodness is thus avoided, shooting time is saved, the problem of the user having to take photos and run bokeh processing many times is solved, the computing power and storage space of the device are conserved, and battery life is improved.
As shown in fig. 7, an embodiment of the present application also provides an electronic device 50, including:
an acquiring unit 51, configured to acquire the bokeh goodness of a source image; and
a bokeh processing unit 52, configured to perform bokeh processing on the source image to obtain a bokeh image when the bokeh goodness is greater than or equal to a preset threshold, where the bokeh goodness is positively correlated with the quality of the bokeh image generated from the source image.
Electronic devices of the present application include, but are not limited to, cameras, mobile phones, tablet computers, mobile computers, notebook computers, e-book readers and the like with one or more attached camera sensors. Before performing bokeh processing on a source image, the electronic device predicts, via the bokeh goodness, the quality of the bokeh image that would be generated; bokeh processing proceeds only when the goodness is greater than or equal to the preset threshold, and otherwise the source image is deleted. This solves the problem of the user having to take many photos before capturing a finally satisfactory bokeh image.
An embodiment of the present application further provides an electronic device, including a memory and a processor, where the memory stores a program to be executed by the processor to perform the image bokeh processing method described in any one of the above embodiments.
Before performing bokeh processing on a source image, this electronic device likewise predicts the quality of the resulting bokeh image via the bokeh goodness, processes the image only when the goodness reaches the preset threshold, and otherwise deletes it, thereby solving the problem of the user having to take many photos before capturing a satisfactory bokeh image.
The electronic device may include a color camera for obtaining a color image and a depth camera for obtaining a depth image, the source image comprising at least part of the color image and at least part of the depth image. Optionally, the electronic device comprises a display.
If the electronic device obtains the depth image by a monocular depth estimation algorithm, it may have only a single color camera.
In some embodiments, the electronic device of the present application may be a handheld device with one or more attached camera sensors, a storage medium and a display, for capturing a bokeh image; preferably, it may additionally have a depth measurement sensor.
Embodiments of the present application also provide a readable storage medium storing a program to be executed by a processor to perform any of the image bokeh processing methods above.
Embodiments of the present application also provide a computer program, stored in a readable storage medium, to be executed by a processor to perform any of the image bokeh processing methods above.
In the following, the technical solutions of the present application are described using a mobile phone as an example, in combination with specific embodiments; those skilled in the art will understand that the mobile phone is only an example and any other suitable electronic device falls within the scope of the present application. The mobile phone of this embodiment comprises a color camera and a depth camera.
Fig. 8 is a schematic diagram of the hardware structure of a mobile phone related to bokeh processing according to an embodiment of the present disclosure. Fig. 9 is a flowchart of the mobile phone capturing a bokeh image according to the present application.
The color image is acquired by the color image acquisition means 61 and the depth image by the depth image acquisition means 62. The calibration module 63 then calibrates the two images; it includes a field-of-view matching module 631 and a pixel calibration module 632. Calibration is an offline process performed once and stored as mathematical equations. The field-of-view matching module 631 matches the fields of view of the color and depth images, and the pixel calibration module 632 establishes their pixel correspondence; after matching, the color image and the depth image have a one-to-one pixel correspondence. The dump module 64 dumps the matched color and depth images into the cache, after which the bokeh goodness detection module 65 estimates the bokeh goodness by any of the measurement methods described above and determines whether it is greater than or equal to the preset threshold. If so, the bokeh processing module 66 proceeds: the matting module 661 identifies the foreground and background of the color image (its training is done offline, with the model saved for later inference), the blurring module 662 gradually blurs the background while the image subject, the foreground, remains in sharp focus, and the output module 67 outputs the bokeh image. If the bokeh goodness is below the preset threshold, the user is reminded to change the shooting angle of the camera. The shooting angle detection module 68 detects whether the shooting angle has changed; when it has, the color image acquisition means 61 and depth image acquisition means 62 are instructed to re-acquire the images, and field-of-view matching, pixel correspondence matching and bokeh goodness detection continue as before.
In addition, while the bokeh goodness of the source image (i.e., the color image and depth image above) is being detected, the goodness can be shown on the display. If it is greater than or equal to the preset threshold, the image is stored in memory (not merely cached) and bokeh processing continues through to output; if it is below the threshold, the source image is deleted from the cache and the user is prompted to change the shooting angle.
In some embodiments, the image may be stored in memory by touching a "shoot" or "bokeh" button, after which the subsequent bokeh processing steps continue.
In some embodiments, deletion of the source image, i.e., the captured color and depth images, may be triggered by touching an "abandon" button, after which detection of whether the user changes the shooting angle continues.
In some embodiments, if the bokeh goodness is greater than or equal to the preset threshold, a message reminding the user to trigger a "proceed with bokeh processing" button may be announced by voice or presented on the interactive interface, e.g., "the bokeh effect looks good, please continue to capture the bokeh image". If the goodness is below the threshold, a voice broadcast or on-screen message reminds the user to change the shooting angle, e.g., "the bokeh effect may be poor, please try changing the shooting angle".
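The capture loop of figs. 8 and 9 can be summarized in the following hedged sketch; every callable parameter is a hypothetical stand-in for the numbered modules described above (61-63 capture and calibration, 65 goodness detection, 66-67 matting/blurring/output, 68 angle detection), not an actual API.

```python
from typing import Callable, Tuple
import numpy as np

PRESET_THRESHOLD = 0.9

def capture_bokeh_image(
    capture: Callable[[], Tuple[np.ndarray, np.ndarray, np.ndarray]],  # modules 61-63
    goodness: Callable[[np.ndarray, np.ndarray, np.ndarray], float],   # module 65
    render_bokeh: Callable[[np.ndarray], np.ndarray],                  # modules 66-67
    prompt_user: Callable[[str], None],                                # display/voice prompt
    angle_changed: Callable[[], bool],                                 # module 68
) -> np.ndarray:
    while True:
        color, depth, conf = capture()       # calibrated, pixel-aligned frames
        if goodness(color, depth, conf) >= PRESET_THRESHOLD:
            return render_bokeh(color)       # matting, blurring, output
        prompt_user("The bokeh effect may be poor - try changing the shooting angle")
        while not angle_changed():
            pass                             # wait until the user reframes
```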
The bokeh goodness detection module 65 may calculate the bokeh goodness by the following steps:
step one, receiving a boundary box area of a matched color image and a boundary box area of a matched depth image as input.
In practice, the input color image bounding box region and the depth image bounding box region are two image regions of the same size, the color image bounding box region is cut out from the color image, and the depth image bounding box region is from the depth image. The two image areas have a one-to-one correspondence of pixels, which is established off-line by the calibration module. Wherein, the pixels in the border frame area of the color image will store the color information, and the pixels in the border frame area of the depth image will store the corresponding depth information.
Step two: extract the edge information in the bounding box region of the depth image, and generate a first edge image from the extraction result. The first edge image is a binary image holding the edge information.
Step three: extract the edge information in the bounding box region of the color image, and generate a second edge image from the extraction result.
Step four: perform a logical AND on the first and second edge images obtained in steps two and three, and store the result as the overlapping edge image.
The logical AND here looks for pixels whose value is 1 (or 255) in both the first and second edge images, i.e., it selects the effective pixels that appear as edges in both. Whether 1 or 255 is used depends on which two values the binary images use and which value encodes an edge.
The overlapping edge image holds the information of the pixels that appear as edges in both the first edge image and the second edge image, and may itself be a binary image.
For example, suppose edge pixels have value 1 in both edge images and all other pixels have value 0; the pixels with value 1 are the effective pixels.
The logical AND then finds the pixels whose value is 1 in both images. If a given pixel has value 1 in the first edge image and value 1 in the second, it appears as an edge in both, and the corresponding pixel of the overlapping edge image records it, e.g., is assigned the effective value 1. If the pixel's value is 0 in either image, the condition of being an edge in both is not met, and nothing is recorded for it in the overlapping edge image, e.g., the corresponding pixel is assigned 0. Applying this operation to every pixel of the first and second edge images yields the overlapping edge image.
Step five: extract, from the depth image, the depth value and confidence corresponding to each edge pixel of the overlapping edge image from step four. The confidence is given by the depth sensor and lies between 0 and 1.
Step six: calculate the weighted average X of the confidence and depth value of each edge pixel:

X = \frac{\sum_{i=1}^{n} C_i D_i}{\sum_{i=1}^{n} D_i}

where C_i is the depth confidence of the i-th pixel (x_i, y_i) among the pixels corresponding to the edge positions of the overlapping edge image, D_i is the depth value of the i-th pixel, n is the number of such pixels, and i and n are natural numbers.
Step seven: calculate the ratio Y of the number of edge pixels in the overlapping edge image to the number of edge pixels in the second edge image: Y = E/D, where E is the number of edge pixels in the overlapping edge image and D is the number of edge pixels in the second edge image.
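Step seven in the same terms (the count D is renamed D_edges here to avoid clashing with the depth array above):

```python
# Step seven: fraction of color-region edge pixels that also appear as depth edges.
E = int(overlapping_edge_image.sum())   # edge pixels in the overlapping edge image
D_edges = int(second_edge_image.sum())  # edge pixels in the second (color) edge image
Y = E / D_edges if D_edges else 0.0
```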
Step eight: the shot goodness G is the product of X and Y, that is, G = X × Y.
Step nine: the preset threshold is set to 0.9 based on experience. If G is greater than or equal to 0.9, the shot effect of the self-portrait is expected to be good, and it is recommended to continue with the shot processing. If G < 0.9, the shot effect may not be good and the image viewpoint needs to change, so the user is advised to adjust the shooting angle.
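Putting steps eight and nine together in the running sketch:

```python
# Steps eight and nine: shot goodness and the keep-or-retake decision.
G = X * Y
if G >= 0.9:
    print(f"G = {G:.2f}: shot effect looks good, continue with shot processing")
else:
    print(f"G = {G:.2f}: suggest the user change the shooting angle")
```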
A shot image is an image in which the background is properly separated from the foreground. If the background and foreground of the source image are easy to separate and the focus is good, the shot image generated by performing shot processing on that source image will be good and satisfactory.
Both X and Y are required to judge the shot goodness; neither X nor Y alone can measure it. Consider the following cases:
Case one: Y = 0.9 and X = 0.3, so the shot goodness G = 0.27.

A high Y means that edge information is well represented in the color image, but a low X indicates that the subject is far from the camera. In this case, separating the background from the foreground is very difficult, since the two may lie at approximately the same distance.
Case two: Y = 0.3 and X = 0.9, so the shot goodness G = 0.27.

Here, although the subject is quite close to the camera (high X), the color image may not have been captured well and lacks edge detail (some blur). If shot processing were performed on this source image, the generated shot image might therefore not be satisfactory.
Case three: X = 0.99 and Y = 0.99, so the shot goodness G = 0.98.

This indicates that the subject is close to the camera and the image is well focused without blur, so the shooting result is good.
When both X and Y are low, the shot goodness G is low: the subject is far from the camera and the image edges are blurred, so performing shot processing on the source image will not produce a good shot image.
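These cases are plain arithmetic under the product rule and can be verified in a couple of lines:

```python
for X_val, Y_val in [(0.3, 0.9), (0.9, 0.3), (0.99, 0.99)]:
    print(f"X = {X_val}, Y = {Y_val} -> G = {X_val * Y_val:.2f}")
# X = 0.3, Y = 0.9 -> G = 0.27
# X = 0.9, Y = 0.3 -> G = 0.27
# X = 0.99, Y = 0.99 -> G = 0.98
```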
The embodiment of the application provides a method for judging whether a shot image will be good before actually processing the image, allowing the user to adjust the shooting angle of the mobile phone camera to obtain a better shot image. The method can also be applied to photographing or video capture on other electronic devices such as cameras, camcorders, and tablet computers, guiding the user to capture a good shot image and thereby saving computing power, storage space, and battery power.
Although the application has been shown and described with respect to one or more implementations, equivalent alterations and modifications will occur to others skilled in the art based upon a reading and understanding of this specification and the annexed drawings. This application is intended to embrace all such modifications and variations and is limited only by the scope of the appended claims. In particular regard to the various functions performed by the above described components, the terms used to describe such components are intended to correspond, unless otherwise indicated, to any component which performs the specified function of the described component (e.g., that is functionally equivalent), even though not structurally equivalent to the disclosed structure which performs the function in the herein illustrated exemplary implementations of the specification.
That is, the above description is only an embodiment of the present application, and not intended to limit the scope of the present application, and all equivalent structures or equivalent flow transformations made by using the contents of the specification and the drawings, such as mutual combination of technical features between various embodiments, or direct or indirect application to other related technical fields, are included in the scope of the present application.
In this application, "exemplary" and "in some embodiments" are used to mean "serving as an example, instance, or illustration. Any embodiment described herein as "exemplary" or "in some embodiments" is not necessarily to be construed as preferred or advantageous over other embodiments. The previous description is provided to enable any person skilled in the art to make and use the present application. In the foregoing description, various details have been set forth for the purpose of explanation. It will be apparent to one of ordinary skill in the art that the present application may be practiced without these specific details. In other instances, well-known structures and processes are not shown in detail to avoid obscuring the description of the present application with unnecessary detail. Thus, the present application is not intended to be limited to the embodiments shown, but is to be accorded the widest scope consistent with the principles and features disclosed herein.

Claims (13)

1. An image bokeh processing method, comprising:
acquiring the shot goodness of a source image, wherein the source image is an image to be subjected to shot processing, and the shot goodness is positively correlated with the quality of a shot image generated based on the source image;
and when the shot goodness is larger than or equal to a preset threshold value, performing shot processing on the source image to obtain a shot image.
2. The image bokeh processing method according to claim 1,
the source image comprises a color image and a depth image of the same scene acquired by an image capture device;
the step of obtaining the shot goodness of the source image comprises: generating the shot goodness in direct proportion to the accuracy of the depth information of the depth image and to the degree of overlap between the edge pattern between the foreground and the background in the color image and the edge pattern between the foreground and the background in the depth image.
3. The image bokeh processing method according to claim 2, wherein the source image comprises a bounding box region of the color image and a bounding box region of the depth image;
the step of generating the shot goodness comprises:
extracting an edge pattern of a border frame area of the color image, and generating a first edge image according to an extraction result;
extracting an edge pattern of a bounding box area of the depth image, and generating a second edge image according to an extraction result;
selecting pixels which appear as edges in both the first edge image and the second edge image, and generating an overlapping edge image according to the selection result;
extracting, in the bounding box region of the depth image, the depth values of a plurality of pixels corresponding to the edges of the overlapping edge image and the confidences of the depth values;
carrying out weighted average calculation on the depth values and the confidence degrees of the plurality of pixels to obtain a weighted average value;
calculating a ratio of the number of edge pixels in the overlapping edge image to the number of edge pixels in the second edge image;
and calculating the shot goodness according to the weighted average and the pixel number ratio, wherein the shot goodness is in direct proportion to the weighted average and to the pixel number ratio.
4. The image bokeh processing method according to claim 3, wherein the weighted average value X is:
X = (Σ_{i=1}^{n} C_i · D_i) / (Σ_{i=1}^{n} C_i)

wherein C_i is the confidence corresponding to the depth value of the i-th pixel (x_i, y_i) among the plurality of pixels corresponding to the edges in the overlapping edge image, D_i is the depth value of the i-th pixel, n is the number of the plurality of pixels in the overlapping edge image, and i and n are both natural numbers.
5. The image bokeh processing method according to claim 3, wherein the shot goodness is the product of the weighted average and the pixel number ratio.
6. The image bokeh processing method according to claim 4, wherein before the extracting of the first edge image from the bounding box region of the color image, the method further comprises: acquiring the bounding box region of the color image;

and before the extracting of the second edge image from the bounding box region of the depth image, the method further comprises: acquiring the bounding box region of the depth image.
7. The image bokeh processing method according to claim 3, wherein before the obtaining of bokeh goodness of the source image, the image bokeh processing method further comprises:
acquiring the color image and the depth image;
and performing field-of-view matching and pixel correspondence matching on the color image and the depth image.
8. The image bokeh processing method according to claim 1, further comprising: and when the shot goodness is smaller than the preset threshold value, deleting the source image and outputting information for reminding a user to change the shooting angle.
9. The image bokeh processing method according to any one of claims 1 to 7, wherein when the shot goodness is less than the preset threshold, the method further comprises:

detecting whether the shooting angle of the device has changed;

and when a change in the shooting angle of the device is detected, proceeding to execute the step of acquiring the shot goodness of the source image.
10. The image bokeh processing method according to any one of claims 1 to 7, wherein before the obtaining of the shot goodness of the source image, the method further comprises: storing the source image in a cache of a device;
when the shot goodness is greater than or equal to the preset threshold, the method further comprises: transferring the source image or the shot image generated based on the source image from the cache to a non-cache memory of the device;
and when the shot goodness is smaller than the preset threshold value, deleting the source image from the cache.
11. An electronic device, comprising:
an acquisition unit, configured to acquire the shot goodness of a source image, wherein the source image is an image to be subjected to shot processing, and the shot goodness is positively correlated with the quality of a shot image generated based on the source image;

and a shot processing unit, configured to perform shot processing on the source image to obtain a shot image when the shot goodness is greater than or equal to a preset threshold.
12. An electronic device comprising a memory and a processor, the memory storing a program for execution by the processor to perform the image bokeh processing method of any one of claims 1 to 10.
13. A readable storage medium, characterized in that a program is stored therein for execution by a processor to perform the image bokeh processing method of any one of claims 1 to 10.
CN202011306105.1A 2020-11-19 2020-11-19 Picture shot processing method, electronic device and storage medium Pending CN114520866A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011306105.1A CN114520866A (en) 2020-11-19 2020-11-19 Picture shot processing method, electronic device and storage medium

Publications (1)

Publication Number Publication Date
CN114520866A (en) 2022-05-20

Family

ID=81594868

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011306105.1A Pending CN114520866A (en) 2020-11-19 2020-11-19 Picture shot processing method, electronic device and storage medium

Country Status (1)

Country Link
CN (1) CN114520866A (en)

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5440197A (en) * 1993-10-05 1995-08-08 Tir Technologies, Inc. Backlighting apparatus for uniformly illuminating a display panel
US20090067707A1 (en) * 2007-09-11 2009-03-12 Samsung Electronics Co., Ltd. Apparatus and method for matching 2D color image and depth image
CN101968880A * 2010-09-10 2011-02-09 Institute of Software, Chinese Academy of Sciences Method for producing image bokeh effect
CN102999901A * 2012-10-17 2013-03-27 Institute of Computing Technology, Chinese Academy of Sciences Method and system for depth-sensor-based segmentation of online video
CN104657936A * 2013-11-15 2015-05-27 HTC Corporation Method, electronic device and medium for adjusting depth values
CN106898048A * 2017-01-19 2017-06-27 Dalian University of Technology Undistorted integral-imaging 3D display method suitable for complex scenes
KR20170082228A * 2016-01-06 2017-07-14 LG Electronics Inc. Mobile terminal and method for controlling the same
US20170230638A1 * 2015-02-26 2017-08-10 Dual Aperture International Co. Ltd. Depth Measurement Techniques for a Multi-Aperture Imaging System
CN108520241A * 2018-04-11 2018-09-11 Oneplus Technology Shenzhen Co Ltd Fingerprint collection method and device based on optical fingerprint technology, and user terminal
US20180357813A1 * 2017-06-09 2018-12-13 Foundation Of Soongsil University-Industry Cooperation Hole filling method using estimated spatio-temporal background information, and recording medium and apparatus for performing the same
CN110009556A * 2018-01-05 2019-07-12 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Image background blurring method and device, storage medium, and electronic device
CN110322411A * 2019-06-27 2019-10-11 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Depth image optimization method, terminal, and storage medium
CN111539960A * 2019-03-25 2020-08-14 Huawei Technologies Co., Ltd. Image processing method and related device

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Kushal Singla et al., "Bokeh Effect in Images on Objects Based on User Interest", 2019 IEEE International Conference on Electronics, Computing and Communication Technologies (CONECCT), 28 February 2020 *
Peng Yanhong, "Research on a Depth-of-Field Rendering System Based on Binocular Vision", China Master's Theses Full-text Database (Information Science and Technology), 15 December 2019 *

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination