CN107454332B - Image processing method and device and electronic equipment - Google Patents

Image processing method and device and electronic equipment

Info

Publication number
CN107454332B
Authority
CN
China
Prior art keywords
background
foreground
light source
strong light
obtaining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710752944.8A
Other languages
Chinese (zh)
Other versions
CN107454332A (en)
Inventor
侯剑堃
李骈臻
张长定
叶志鸿
许清泉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xiamen Meitu Technology Co Ltd
Original Assignee
Xiamen Meitu Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xiamen Meitu Technology Co Ltd filed Critical Xiamen Meitu Technology Co Ltd
Priority to CN201710752944.8A priority Critical patent/CN107454332B/en
Publication of CN107454332A publication Critical patent/CN107454332A/en
Application granted granted Critical
Publication of CN107454332B publication Critical patent/CN107454332B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/64Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Image Processing (AREA)
  • Studio Devices (AREA)

Abstract

The embodiment of the invention provides an image processing method and device and electronic equipment, and relates to the technical field of image processing. The image processing method comprises the following steps: obtaining the focusing position of the picture; calculating the front and back depth of field corresponding to the focusing position and the imaging clear range; obtaining the foreground, the middle scene and the background of the photo based on the clear range; and respectively carrying out fuzzy processing on the foreground and the background according to the distance between each pixel in the foreground and the background and the middle scene, wherein the fuzzy degree of the pixel closer to the middle scene in the foreground and the background is smaller than that of the pixel farther from the middle scene. By using the image processing method, the image processing device and the electronic equipment, the implementation is convenient and fast, and the blurring effect of the picture can be improved.

Description

Image processing method and device and electronic equipment
Technical Field
The present invention relates to the field of image processing technologies, and in particular, to an image processing method and apparatus, and an electronic device.
Background
In order to draw the focus of a shot image onto the photographed subject, a partial area of the captured picture needs to be blurred. In the prior art, blurring is a difficult problem in a dual-camera system: the calculation is complex, and the blurring effect needs to be improved.
Disclosure of Invention
In view of the above, embodiments of the present invention provide an image processing method, an image processing apparatus, and an electronic device, so as to solve the problems in the prior art that image blurring calculation is complex and an effect needs to be improved.
The preferred embodiment of the present invention provides an image processing method, including:
obtaining the focusing position of the picture;
calculating the front and back depth of field corresponding to the focusing position and the imaging clear range;
obtaining the foreground, the middle scene and the background of the photo based on the clear range;
and respectively carrying out fuzzy processing on the foreground and the background according to the distance between each pixel in the foreground and the background and the middle scene, wherein the fuzzy degree of the pixel closer to the middle scene in the foreground and the background is smaller than that of the pixel farther from the middle scene.
Optionally, the step of calculating the front-back depth of field and the clear range of the image corresponding to the in-focus position includes:
obtaining depth information of the focusing position through a depth map;
and calculating the front and back depth of field of the focusing position and the clear range of the imaging according to the depth information based on the focusing principle of the single lens reflex.
Optionally, the step of obtaining the foreground, the middle view and the background of the photo based on the clear range includes:
calculating to obtain an object distance range corresponding to the clear range based on the clear range;
taking the image part of the picture with the corresponding object distance smaller than the object distance range as a foreground;
taking the image part of the corresponding object distance in the picture in the object distance range as a medium scene;
and taking the image part of the picture with the corresponding object distance larger than the object distance range as a background.
Optionally, the step of performing blur processing on the foreground and the background respectively includes: respectively carrying out fuzzy processing on the foreground and the background in the YUV three channels;
before the background is blurred, the method further comprises:
searching the position of a strong light source in the image corresponding to the background in the photo;
and performing light source enhancement on the position of the strong light source in the background on a Y channel.
Optionally, the step of finding a position of a strong light source in the image corresponding to the background in the photo includes:
searching all strong light points in the picture, wherein the Y channel value of the strong light points is larger than a preset value;
and performing de-agglomeration and de-noising point processing on all the strong light points to obtain the positions of the strong light sources.
Optionally, the method further comprises:
obtaining different out-of-focus imaging intensities by changing the intensity of light source enhancement on the position of the strong light source in the background on a Y channel; and/or,
and obtaining different out-of-focus imaging shapes by changing the shape of a filter kernel for removing the lumps and the noise points of all the strong light spots.
Optionally, the step of performing blur processing on the foreground and the background respectively further includes:
obtaining the size of the photograph;
obtaining a maximum fuzzy radius corresponding to the size of the photo according to a preset rule, wherein the maximum fuzzy radius corresponds to the distance between the middle scene and the foreground or the background which is farthest;
obtaining the adjustment information of the maximum fuzzy radius;
and adjusting the maximum blur radius according to the adjusting information so as to enhance or weaken the blur degree of the photo.
Optionally, before blurring the foreground and background, the method further comprises:
zooming the photo according to a preset proportion;
after blurring the foreground and background, the method further comprises: and restoring the picture according to the proportion corresponding to the preset proportion.
Another preferred embodiment of the present invention provides an image processing apparatus, including:
the information acquisition module is used for acquiring the focusing position of the photo;
the analysis module is used for calculating the front and back depth of field corresponding to the focusing position and the imaging clear range;
the information acquisition module is used for acquiring the foreground, the middle scene and the background of the photo based on the clear range;
and the fuzzy processing module is used for respectively carrying out fuzzy processing on the foreground and the background according to the distance between each pixel in the foreground and the background and the middle scene, wherein the fuzzy degree of the pixel closer to the middle scene in the foreground and the background is less than that of the pixel farther from the middle scene.
A further preferred embodiment of the present invention provides an electronic device, including a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor executes the computer program to perform the following steps:
obtaining the focusing position of the picture;
calculating the front and back depth of field corresponding to the focusing position and the imaging clear range;
obtaining the foreground, the middle scene and the background of the photo based on the clear range;
and respectively carrying out fuzzy processing on the foreground and the background according to the distance between each pixel in the foreground and the background and the middle scene, wherein the fuzzy degree of the pixel closer to the middle scene in the foreground and the background is smaller than that of the pixel farther from the middle scene.
Still another preferred embodiment of the present invention provides a readable storage medium, where the readable storage medium includes a computer program, and the computer program controls, when running, an apparatus in which the readable storage medium is located to execute the image processing method provided in the embodiment of the present invention.
According to the image processing method, the image processing device and the electronic equipment, the fuzzy processing is respectively carried out on the foreground and the background according to the distance between each pixel in the foreground and the background and the middle scene, so that the fuzzy degree of the pixel closer to the middle scene in the foreground and the background is smaller than that of the pixel farther from the middle scene, the smooth transition among the foreground, the middle scene and the background is realized, the blurring effect of different degrees rich in layering is achieved, the implementation is convenient and fast, and the blurring effect is better.
In order to make the aforementioned and other objects, features and advantages of the present invention comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the embodiments will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present invention and therefore should not be considered as limiting the scope, and for those skilled in the art, other related drawings can be obtained according to the drawings without inventive efforts.
Fig. 1 is a block diagram of an electronic device 10 according to a preferred embodiment of the present invention.
Fig. 2 is a flowchart of an image processing method according to a preferred embodiment of the present invention.
FIG. 3 is a diagram illustrating the sub-steps included in the step S22 shown in FIG. 2 according to one embodiment.
Fig. 4 is a flowchart of another image processing method according to a preferred embodiment of the invention.
Fig. 5 is a flowchart of another image processing method according to a preferred embodiment of the invention.
Fig. 6 is a block diagram of an image processing apparatus 20 according to a preferred embodiment of the present invention.
Reference numerals: 10-an electronic device; 11-a memory; 12-a processor; 13-a network module; 20-an image processing device; 21-an information obtaining module; 22-an analysis module; 23-an information acquisition module; 24-fuzzy processing module.
Detailed Description
The double-camera system used in more and more products at present is a camera system composed of two cameras as the name implies. The main advantages of the dual-camera system are: the depth of field and the parallax of the image are calculated through the stereo vision, and the depth of field information is utilized to perform background blurring, object segmentation, 3D scanning, auxiliary focusing, motion recognition and other applications on the image. The information fusion is carried out on the pictures obtained by the two cameras, so that images with higher resolution, better color, larger dynamic range and less noise are expected to be obtained.
Different functional emphases also place different requirements on the hardware of the dual-camera system. One type of application requires the distance between the two cameras to be as large as possible in order to obtain higher depth accuracy, while another requires the images from the two cameras to be superimposed and synthesized, in which case a smaller distance between the two cameras is preferable in order to reduce the errors generated during image fusion.
In addition, the bi-camera system is not only capable of synthesizing two pictures, but also capable of calculating depth information of an image, i.e., an object distance, from the two pictures, as described above. The depth information can assist the image to complete the functions of 3D (three-dimensional) of a single image, intelligent cutout and the like, and can also realize background blurring.
Research shows that the blurring problem is a big problem in a double-shot system, a good blurring effect is not only simple low-pass filtering, and if the double-shot system can simulate the blurring effect of a single-lens reflex camera, the blurring effect of pictures shot by the double-shot system can be obviously improved, and user experience is improved.
Based on this, the embodiment of the invention provides an image processing method, an image processing device and an electronic device which are suitable for a dual-shooting system, can realize smooth transition among a foreground, a middle scene and a background, and achieve blurring effects with different degrees and rich in layering, so that the blurring effect of a picture shot by the dual-shooting system is conveniently improved.
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. The components of embodiments of the present invention generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present invention, presented in the figures, is not intended to limit the scope of the invention, as claimed, but is merely representative of selected embodiments of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present invention without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.
Fig. 1 is a block diagram of an electronic device 10 according to a preferred embodiment of the invention. The electronic device 10 in the embodiment of the present invention may be a device capable of image processing such as an image processor. As shown in fig. 1, the electronic device 10 includes: memory 11, processor 12, network module 13 and image processing device 20.
The memory 11, the processor 12 and the network module 13 are electrically connected to each other directly or indirectly to realize data transmission or interaction. For example, the components may be electrically connected to each other via one or more communication buses or signal lines. The memory 11 stores an image processing device 20, the image processing device 20 includes at least one software functional module which can be stored in the memory 11 in the form of software or firmware (firmware), and the processor 12 executes various functional applications and data processing, i.e. implements the image processing method in the embodiment of the present invention, by running the software programs and modules stored in the memory 11, such as the image processing device 20 in the embodiment of the present invention.
The Memory 11 may be, but is not limited to, a Random Access Memory (RAM), a Read-Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), and the like. The memory 11 is used for storing a program, and the processor 12 executes the program after receiving an execution instruction.
The processor 12 may be an integrated circuit chip having data processing capabilities. The Processor 12 may be a general-purpose Processor including a Central Processing Unit (CPU), a Network Processor (NP), and the like. The various methods, steps and logic blocks disclosed in embodiments of the present invention may be implemented or performed. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The network module 13 is used for establishing a communication connection between the electronic device 10 and an external communication terminal through a network, and implementing transceiving operations of network signals and data. The network signal may include a wireless signal or a wired signal.
It will be appreciated that the configuration shown in FIG. 1 is merely illustrative and that electronic device 10 may include more or fewer components than shown in FIG. 1 or may have a different configuration than shown in FIG. 1. The components shown in fig. 1 may be implemented in hardware, software, or a combination thereof.
An embodiment of the present invention also provides a readable storage medium, which includes a computer program. The computer program controls the electronic device 10, where the readable storage medium is run, to perform the following image processing method.
Fig. 2 is a flowchart illustrating an image processing method according to a preferred embodiment of the invention. The method steps defined by the method-related flow, as applied to the electronic device 10, may be implemented by the processor 12. The specific process shown in fig. 2 will be described in detail below.
In step S21, the focus position of the photograph is obtained.
In this embodiment, the object to be photographed may be focused to obtain a focusing position when the object to be photographed is photographed by the two-camera system, or a user may select one focusing position by himself/herself after the object to be photographed is photographed by the two-camera system.
And step S22, calculating the front-back depth of field and the imaging clear range corresponding to the focusing position.
Referring to fig. 3, the embodiment of the present invention provides one exemplary implementation flow of step S22, which includes two sub-steps of step S221 and step S222.
Step S221, obtaining depth information of the focusing position through a depth map.
And step S222, calculating the front and back depth of field of the focusing position and the clear range of the imaging according to the depth information based on the focusing principle of the single lens reflex.
Optionally, in this embodiment, based on the focusing principle of the single lens reflex, the implementation procedure of calculating the front-back depth of field of the focused position and the clear range of the imaged image is as follows.
According to the imaging formula of the convex lens, the front depth of field D1 and the back depth of field D2 when the object is imaged are, respectively,

D1 = N·C·U² / (f² + N·C·U),    D2 = N·C·U² / (f² - N·C·U)

wherein N is the f-number (focal ratio); C is the diameter of the circle of confusion on the image, usually a constant related to the pixel size; U is the object distance; and f is the focal length of the lens, with f = N·A, where A is the lens aperture diameter.

The depth range over which the object remains in focus is [U - D1, U + D2]. Substituting the above formulas gives:

[U - D1, U + D2] = [ f²·U / (f² + N·C·U),  f²·U / (f² - N·C·U) ]
Since

Depth × Disparity = BL × f

where BL is the baseline between the two cameras and f is the focal length of the main camera, the in-focus depth range is reflected on the disparity (parallax) as follows:

d ∈ [dU - dX, dU + dX]

wherein dU = BL·f/U is the disparity corresponding to an object distance of U, and dX = BL·C·N/f; among these quantities only N is a variable, and dX is proportional to N.
Therefore, the larger N is, the smaller the aperture is, the larger the front and rear field depth is, the larger the parallax span in the focusing range is, and the blurring is relatively inconspicuous. On the contrary, the smaller N is, the larger the aperture is, the smaller the front and rear field depth is, the smaller the parallax span range in the focusing range is, and the blurring is relatively more obvious at the moment.
Then, with dU fixed, the change process of the aperture can be simulated by directly adjusting the size of dX.
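As an illustration of the relations above, the following sketch computes D1, D2, the in-focus object-distance range and the corresponding disparity band [dU - dX, dU + dX]; all numeric parameter values are hypothetical examples chosen only to exercise the formulas, not values from this embodiment.

```python
# Minimal sketch of the depth-of-field / disparity relations above.
# All parameter values below are hypothetical examples.

def dof_and_disparity_band(U, f, N, C, BL):
    """U: object distance, f: focal length, N: f-number,
    C: circle-of-confusion diameter, BL: stereo baseline (same length unit)."""
    D1 = N * C * U**2 / (f**2 + N * C * U)   # front depth of field
    D2 = N * C * U**2 / (f**2 - N * C * U)   # back depth of field
    near, far = U - D1, U + D2               # in-focus object-distance range
    dU = BL * f / U                          # disparity at the focus distance
    dX = BL * C * N / f                      # half-width of the sharp disparity band
    return (near, far), (dU - dX, dU + dX)

# Example: 4 mm focal length, f/2.0, 2 um circle of confusion, 10 mm baseline,
# focused at 1 m (all lengths in millimetres).
sharp_depth, sharp_disp = dof_and_disparity_band(U=1000.0, f=4.0, N=2.0, C=0.002, BL=10.0)
print(sharp_depth, sharp_disp)
```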
And step S23, obtaining the foreground, the middle scene and the background of the photo based on the clear range.
In this embodiment, the clear range is used as a reference for subsequent blurring processing, and optionally, in this embodiment, based on the clear range, an object distance range corresponding to the clear range is obtained by calculation, and foreground, middle view and background division of the photo is completed according to the object distance range corresponding to the clear range obtained by calculation. The division rule is as follows.
And taking the image part of the picture with the corresponding object distance smaller than the object distance range as a foreground.
And taking the image part of the picture with the corresponding object distance in the object distance range as a medium scene.
And taking the image part of the picture with the corresponding object distance larger than the object distance range as a background.
Through the division, the area smaller than the object distance range corresponding to the clear range in the photo is used as the foreground, the area located in the object distance range corresponding to the clear range in the photo is used as the middle scene, and the area larger than the object distance range corresponding to the clear range in the photo is used as the background scene.
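As an illustrative sketch of this division, assuming a per-pixel object-distance map obtained from the depth map and an in-focus range [near, far] corresponding to the clear range, the three regions can be expressed as masks:

```python
import numpy as np

def split_foreground_midground_background(object_distance, near, far):
    """object_distance: HxW array of per-pixel object distances (from the depth map).
    near/far: object-distance range corresponding to the clear (in-focus) range."""
    foreground = object_distance < near             # closer than the sharp range
    midground = (object_distance >= near) & (object_distance <= far)
    background = object_distance > far              # farther than the sharp range
    return foreground, midground, background
```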
Step S24, performing blur processing on the foreground and the background according to the distance between each pixel in the foreground and the background and the middle scene, wherein the blur degree of the pixel closer to the middle scene in the foreground and the background is smaller than the blur degree of the pixel farther from the middle scene.
In the process of fuzzy processing, the middle scene is not subjected to fuzzy processing, the clarity is kept, and the foreground and the background are subjected to fuzzy processing. In the blurring process, the same kind of blurring process may be performed on the foreground and the background, or different kinds of blurring processes may be performed on the foreground and the background.
Optionally, in this embodiment, the foreground and the background are respectively blurred in the YUV three channels, and compared with the RGB color space, the decoupled YUV color space can ensure the image quality to the greatest extent. In addition, many digital devices also use YUV data streams, so that time consumption caused by data conversion can be reduced, the processing process is simplified, and the data processing complexity is reduced by respectively performing fuzzy processing on the foreground and the background by using three YUV channels.
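A minimal sketch of this color-space handling, assuming BGR input as used by OpenCV and a placeholder process_yuv callable standing in for the blurring described in this embodiment:

```python
import cv2

def to_yuv_and_back(image_bgr, process_yuv):
    """Convert to YUV, let process_yuv blur the foreground/background on the
    three channels, then convert back to BGR. process_yuv is a placeholder
    for the blurring routine described in this embodiment."""
    yuv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2YUV)
    yuv = process_yuv(yuv)
    return cv2.cvtColor(yuv, cv2.COLOR_YUV2BGR)
```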
There are multiple ways to implement blurring in which, within the foreground and the background, a pixel closer to the middle scene is blurred less than a pixel farther from it; one exemplary implementation flow is given in this embodiment of the present invention.
According to the physical model in which the disparity d is inversely proportional to the object distance D (d ∝ 1/D), a fixed focus distance D0 has a unique corresponding disparity d0. The foreground then corresponds to d > d0 + Δd, the middle scene corresponds to d ∈ [d0 - Δd, d0 + Δd], and the background corresponds to d < d0 - Δd.

Δd corresponds to the front and back depth of field, and adjusting it is similar to adjusting the aperture of a single lens reflex camera: a larger aperture corresponds to a smaller Δd and a shallower depth of field, while a smaller aperture corresponds to a larger Δd and a deeper depth of field.

Further, the foreground blur radius satisfies R1 ∝ (d - d0 - Δd) and the background blur radius satisfies R2 ∝ (d0 - Δd - d). Defining two positive constants C1 and C2 gives R1 = C1·(d - d0 - Δd) and R2 = C2·(d0 - Δd - d). The degree of blur of the foreground is generally weaker than that of the background, so generally C2 ≥ C1, for example C2 = 2·C1; the degree of blur of the foreground and the background can then be adjusted independently by adjusting the values of C1 and C2.
In this way, the object distance of each point can be obtained by the dual-camera system and its disparity d calculated; after C1 and C2 are defined, the blur radius of each point, that is, the dispersion radius of that point's energy, can be obtained.
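A per-pixel blur radius map following R1 = C1·(d - d0 - Δd) and R2 = C2·(d0 - Δd - d) might be computed as in the following sketch; the constants C1, C2, delta_d and the cap r_max are tuning parameters assumed for illustration, and the disparity map is assumed to come from the dual-camera system.

```python
import numpy as np

def blur_radius_map(disparity, d0, delta_d, C1=1.0, C2=2.0, r_max=60):
    """disparity: HxW disparity map; d0: disparity of the focus plane;
    delta_d: half-width of the sharp disparity band. C2 >= C1 blurs the
    background more strongly than the foreground."""
    r = np.zeros_like(disparity, dtype=np.float32)
    fg = disparity > d0 + delta_d                    # foreground pixels
    bg = disparity < d0 - delta_d                    # background pixels
    r[fg] = C1 * (disparity[fg] - d0 - delta_d)      # R1 = C1 * (d - d0 - delta_d)
    r[bg] = C2 * (d0 - delta_d - disparity[bg])      # R2 = C2 * (d0 - delta_d - d)
    return np.clip(r, 0, r_max)                      # cap at the maximum blur radius
```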
Optionally, in this embodiment, there may be multiple options for the foreground and background blur kernels, for example, a gaussian blur kernel may be taken at the same time. Also for example, a circular filter kernel may be taken simultaneously. For another example, the gaussian blur kernel may be selected in the foreground and the circular blur kernel may be selected in the background.
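For the kernel choices mentioned here, a Gaussian kernel is available directly in common libraries, while a circular (disk) kernel can be built as a normalized binary disk. The sketch below, using OpenCV and NumPy, shows one possible way to apply either kernel to a single channel; it makes no claim about the exact kernels used in this embodiment.

```python
import cv2
import numpy as np

def disk_kernel(radius):
    """Normalized circular (disk) filter kernel of the given radius."""
    size = 2 * radius + 1
    y, x = np.ogrid[-radius:radius + 1, -radius:radius + 1]
    kernel = (x**2 + y**2 <= radius**2).astype(np.float32)
    return kernel / kernel.sum()

def blur_region(channel, radius, kind="disk"):
    """Blur one image channel with either a disk or a Gaussian kernel."""
    if radius < 1:
        return channel
    if kind == "gaussian":
        return cv2.GaussianBlur(channel, (2 * radius + 1, 2 * radius + 1), 0)
    return cv2.filter2D(channel, -1, disk_kernel(radius))
```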
Based on the above research, please refer to fig. 4 in combination, the step of performing the blurring processing on the foreground and the background respectively includes four substeps of step S241 to step S244.
Step S241, obtaining the size of the photo.
Step S242, obtaining a maximum blur radius corresponding to the size of the photo according to a preset rule, where the maximum blur radius corresponds to a distance from the middle scene to the foreground or the back scene farthest.
In step S243, the adjustment information of the maximum blur radius is obtained.
Step S244, adjusting the maximum blur radius according to the adjustment information, so as to enhance or reduce the blur degree of the photo.
A maximum blur radius is defined during the calculation; for example, a maximum blur radius of 60 may be selected for a 12-megapixel image. The maximum blur radius corresponds to the foreground or background position farthest from the middle scene, the blur radii of other positions are calculated in proportion, and for images of other sizes the maximum blur radius is enlarged or reduced in proportion. This allows the same degree of blurring effect to be achieved for images of different sizes. In addition, the user may be given the option of adjusting this maximum blur radius to enhance or reduce the overall blur level.
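One possible way to express this size-dependent maximum blur radius, together with a user adjustment factor, is sketched below; the reference values (a 12-megapixel image and a radius of 60) follow the example above, while the square-root scaling rule and the function name are assumptions made only for illustration.

```python
def max_blur_radius(width, height, user_scale=1.0,
                    ref_pixels=12_000_000, ref_radius=60):
    """Maximum blur radius proportional to image size, assuming the radius
    should scale with the linear dimensions of the picture (an illustrative rule).
    user_scale lets the user strengthen or weaken the overall blur."""
    scale = ((width * height) / ref_pixels) ** 0.5   # linear size ratio
    return max(1, round(ref_radius * scale * user_scale))

# e.g. a 3-megapixel preview gets roughly half the radius of a 12 MP photo
print(max_blur_radius(2000, 1500))
```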
Referring to fig. 5, in order to enhance the out-of-focus (bokeh) imaging effect, the method optionally further includes step S25 and step S26 before the background is blurred.
And step S25, searching the position of the strong light source in the image corresponding to the background in the photo.
Optionally, in this embodiment, the position of the strong light source is found through the following method: and searching all strong light spots in the picture, wherein the Y channel value of the strong light spot is larger than a preset value, and obtaining the positions of the strong light sources after performing de-agglomeration and de-noising point processing on all the strong light spots.
Taking 8-bit YUV as an example, a strong light point may be defined as a point satisfying Y >= th, where th is a preset value and may be about 250; that is, a point whose Y channel value is greater than or equal to th is taken as a strong light point.
The strong light points fall into two classes: strong light sources, i.e. real light sources, and large-area reflectors such as glass or steel plates. Because of the 8-bit limit, this embodiment should recover energy only from strong light sources, not from reflectors. To distinguish a strong light source from a reflector, a search radius R is defined for each strong light point satisfying Y >= th obtained by the previous calculation; R can generally be taken as the dispersion radius of that point. If, among all other pixel points within the radius R, the proportion of strong light points does not exceed a given proportion, such as 25%, the point is considered an isolated strong light point and is defined as a strong light source; otherwise it is considered part of a reflector, that is, a connected region of strong light points.
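One way to implement this detection and de-agglomeration is sketched below: threshold the Y channel, then, for each strong light point, measure the density of strong light points in its neighbourhood and keep only isolated points as strong light sources. The use of a square (box) neighbourhood instead of a circular one, and the default radius and ratio values, are simplifying assumptions.

```python
import cv2
import numpy as np

def find_strong_light_sources(Y, th=250, search_radius=5, max_ratio=0.25):
    """Y: uint8 luminance channel. Returns a boolean mask of strong light sources
    (isolated strong light points), excluding large reflective areas."""
    strong = (Y >= th).astype(np.float32)            # all strong light points
    k = 2 * search_radius + 1
    # fraction of strong light points in the neighbourhood of each pixel
    neighbourhood_ratio = cv2.boxFilter(strong, -1, (k, k), normalize=True)
    # a strong point whose neighbourhood is mostly NOT strong is an isolated
    # light source; dense clusters are treated as reflectors and dropped
    return (strong > 0) & (neighbourhood_ratio <= max_ratio)
```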
And step S26, performing light source enhancement on the position of the strong light source in the background on a Y channel.
The strong light spot is divided into a strong light source and a reflector. The strong light source is then energy enhanced and then blurred to produce the bokeh effect. After the treatment, the bokeh effect only appears at the strong light source and does not appear at the reflector.
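A rough sketch of this enhancement step, under the assumption that the energy enhancement simply means scaling up the Y values at the detected light-source positions before the background blur is applied (the gain value is hypothetical):

```python
import numpy as np

def enhance_light_sources(Y, source_mask, gain=4.0):
    """Boost the luminance of detected strong light sources before blurring,
    so the subsequent blur spreads their energy into visible bokeh spots.
    gain is a hypothetical enhancement strength; clip back to [0, 255] only
    after the background blur has been applied."""
    Y_f = Y.astype(np.float32)
    Y_f[source_mask] *= gain                  # energy enhancement on the Y channel
    return Y_f
```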
Depending on user requirements, bokeh can be presented in both the foreground and the background, or only in the background. When it appears in both, the foreground and the background are processed in the same way as described above. When only the background has bokeh, the strong light sources located in the foreground are simply removed after the strong light source positions have been determined.
In the process of presenting bokeh, different bokeh intensities and shapes can be obtained by changing the corresponding light source enhancement intensity or the shape of the corresponding filter kernel. Common filter kernel shapes include heart shapes, star shapes, Christmas tree shapes and the like; in single lens reflex shooting, the corresponding effect is obtained by placing an aperture mask of the corresponding shape in front of the lens. The option of changing the bokeh strength and shape can also be opened for the user to adjust.
Therefore, the method of this embodiment may further include: obtaining different out-of-focus imaging intensities by changing the intensity of the light source enhancement performed, on the Y channel, on the position of the strong light source in the background; and/or obtaining different out-of-focus imaging shapes by changing the shape of the filter kernel used when processing all the strong light points.
To ensure image processing speed, optionally, before blurring the foreground and background, the method further comprises: and zooming the photo according to a preset proportion. After blurring the foreground and background, the method further comprises: and restoring the picture according to the proportion corresponding to the preset proportion.
From the perspective of the algorithm, by appropriately scaling the image (for example, compressing a 12-megapixel image to one eighth of its width and height, blurring it, and then magnifying it back), this processing method can still ensure that the final output foreground and background blurring effects are not damaged. With appropriate compression, an effect preview image matched to the image size can also be displayed while the user is shooting.
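The scale-down, process, scale-up flow described here can be sketched as follows with OpenCV; the 1/8 factor follows the example in the text, and blur_photo is a placeholder for whatever blurring routine is applied to the small image.

```python
import cv2

def blur_with_preview_scaling(image, blur_photo, scale=1.0 / 8.0):
    """Compress the photo before blurring and restore its size afterwards,
    trading resolution in the blurred regions for speed. blur_photo is assumed
    to be a callable that blurs the small image and returns it."""
    h, w = image.shape[:2]
    small = cv2.resize(image, (max(1, int(w * scale)), max(1, int(h * scale))),
                       interpolation=cv2.INTER_AREA)
    blurred_small = blur_photo(small)
    # restore to the original size; blurred areas tolerate upscaling well
    return cv2.resize(blurred_small, (w, h), interpolation=cv2.INTER_LINEAR)
```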
On the basis of the above, as shown in fig. 6, an embodiment of the present invention provides an image processing apparatus 20, which includes an information obtaining module 21, an analyzing module 22, an information obtaining module 23, and a blurring processing module 24.
The information obtaining module 21 is configured to obtain an in-focus position of the photo.
Since the information obtaining module 21 is similar to the implementation principle of step S21 in fig. 2, it will not be further described here.
The analysis module 22 is used for calculating the front-back depth of field corresponding to the in-focus position and the clear range of the imaging.
Since the analysis module 22 is similar to the implementation principle of step S22 in fig. 2, it will not be further described here.
The information obtaining module 23 is configured to obtain a foreground, a middle scene, and a background of the photo based on the clear range.
Since the information obtaining module 23 is similar to the implementation principle of step S23 in fig. 2, it will not be further described here.
The blurring processing module 24 is configured to perform blurring processing on the foreground and the background according to a distance between each pixel in the foreground and the background and the middle scene, where a blurring degree of a pixel closer to the middle scene in the foreground and the background is smaller than a blurring degree of a pixel farther from the middle scene.
Since the fuzzy processing module 24 is similar to the implementation principle of step S24 in fig. 2, it will not be further described here.
The image processing method, the image processing device and the electronic equipment 10 in the embodiment of the invention are based on the depth map of a double-shooting system, take the shooting principle of a single lens reflex as algorithm logic, process the shot image to obtain the effect of simulating single lens reflex true blurring, are simple to realize, have high operation speed, can ensure the high quality of a blurring picture, and are particularly suitable for realizing the blurring effect of the image with the depth map. In addition, multiple functions are opened for the user to select, so that the user can obtain different blurring effects, and the applicability is increased.
The image processing method, the image processing device and the electronic equipment 10 in the embodiment of the invention take the depth map as a reference, and automatically blur the image of the double-shot system or realize the function of shooting first and then focusing. The user can select not only the region but also the degree of blurring, presence or absence of bokeh, and shape and brightness of bokeh. The whole scheme can automatically select the compression multiple according to the size of the image, ensures the blurring effect and completes image processing at high speed at the same time, and enables a user to see the preview effect image in the shooting process. The whole processing logic is strictly in accordance with the shooting principle of the single lens reflex camera, and the final blurring effect is very close to that of the single lens reflex real shot photos.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus and method can be implemented in other ways. The apparatus and method embodiments described above are illustrative only, as the flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, the functional modules in the embodiments of the present invention may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention or a part thereof, which essentially contributes to the prior art, can be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, an electronic device 10, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes. It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
The above description is only an alternative embodiment of the present invention and is not intended to limit the present invention, and various modifications and variations of the present invention may occur to those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (9)

1. An image processing method, characterized in that the method comprises:
obtaining the focusing position of the picture;
calculating the front and back depth of field corresponding to the focusing position and the imaging clear range;
obtaining the foreground, the middle scene and the background of the photo based on the clear range;
respectively carrying out fuzzy processing on the foreground and the background according to the distance between each pixel in the foreground and the background and the middle scene, wherein the fuzzy degree of the pixel closer to the middle scene in the foreground and the background is smaller than that of the pixel farther from the middle scene;
wherein, the step of respectively carrying out fuzzy processing on the foreground and the background comprises the following steps: respectively carrying out fuzzy processing on the foreground and the background in the YUV three channels;
before the background is blurred, the method further comprises:
searching the position of a strong light source in the image corresponding to the background in the photo;
and performing light source enhancement on the position of the strong light source in the background on a Y channel.
2. The image processing method according to claim 1, wherein the step of calculating the depth of field and the clear range of the image corresponding to the in-focus position comprises:
obtaining depth information of the focusing position through a depth map;
and calculating the front and back depth of field of the focusing position and the clear range of the imaging according to the depth information based on the focusing principle of the single lens reflex.
3. The image processing method according to claim 1, wherein the step of obtaining the foreground, the middle and the background of the photograph based on the clear range comprises:
calculating to obtain an object distance range corresponding to the clear range based on the clear range;
taking the image part of the picture with the corresponding object distance smaller than the object distance range as a foreground;
taking the image part of the corresponding object distance in the picture in the object distance range as a medium scene;
and taking the image part of the picture with the corresponding object distance larger than the object distance range as a background.
4. The image processing method according to claim 1, wherein the step of finding the position of the strong light source in the image corresponding to the background in the photo comprises:
searching all strong light points in the picture, wherein the Y channel value of the strong light points is larger than a preset value;
and performing de-agglomeration and de-noising point processing on all the strong light points to obtain the positions of the strong light sources.
5. The image processing method according to claim 4, characterized in that the method further comprises:
obtaining different out-of-focus imaging intensities by changing the intensity of light source enhancement on the position of the strong light source in the background on a Y channel; and/or,
and obtaining different out-of-focus imaging shapes by changing the shape of a filter kernel for removing the lumps and the noise points of all the strong light spots.
6. The image processing method according to any one of claims 1 to 5, wherein the step of blurring the foreground and the background, respectively, further comprises:
obtaining the size of the photograph;
obtaining a maximum fuzzy radius corresponding to the size of the photo according to a preset rule, wherein the maximum fuzzy radius corresponds to the distance between the middle scene and the foreground or the background which is farthest;
obtaining the adjustment information of the maximum fuzzy radius;
and adjusting the maximum blur radius according to the adjusting information so as to enhance or weaken the blur degree of the photo.
7. The image processing method according to any one of claims 1 to 5, wherein before blurring the foreground and background, the method further comprises:
zooming the photo according to a preset proportion;
after blurring the foreground and background, the method further comprises: and restoring the picture according to the proportion corresponding to the preset proportion.
8. An image processing apparatus characterized by comprising:
the information acquisition module is used for acquiring the focusing position of the photo;
the analysis module is used for calculating the front and back depth of field corresponding to the focusing position and the imaging clear range;
the information acquisition module is used for acquiring the foreground, the middle scene and the background of the photo based on the clear range;
the fuzzy processing module is used for respectively carrying out fuzzy processing on the foreground and the background according to the distance between each pixel in the foreground and the background and the middle scene, wherein the fuzzy degree of the pixel closer to the middle scene in the foreground and the background is smaller than that of the pixel farther from the middle scene;
the fuzzy processing module is used for respectively carrying out fuzzy processing on the foreground and the background through the following steps: respectively carrying out fuzzy processing on the foreground and the background in the YUV three channels;
before the fuzzy processing module carries out fuzzy processing on the background, the fuzzy processing module is further configured to:
searching the position of a strong light source in the image corresponding to the background in the photo;
and performing light source enhancement on the position of the strong light source in the background on a Y channel.
9. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor when executing the program performs the steps of:
obtaining the focusing position of the picture;
calculating the front and back depth of field corresponding to the focusing position and the imaging clear range;
obtaining the foreground, the middle scene and the background of the photo based on the clear range;
respectively carrying out fuzzy processing on the foreground and the background according to the distance between each pixel in the foreground and the background and the middle scene, wherein the fuzzy degree of the pixel closer to the middle scene in the foreground and the background is smaller than that of the pixel farther from the middle scene;
wherein, the step of respectively carrying out fuzzy processing on the foreground and the background comprises the following steps: respectively carrying out fuzzy processing on the foreground and the background in the YUV three channels;
before the fuzzy processing is carried out on the background, the method further comprises the following steps:
searching the position of a strong light source in the image corresponding to the background in the photo;
and performing light source enhancement on the position of the strong light source in the background on a Y channel.
CN201710752944.8A 2017-08-28 2017-08-28 Image processing method and device and electronic equipment Active CN107454332B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710752944.8A CN107454332B (en) 2017-08-28 2017-08-28 Image processing method and device and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710752944.8A CN107454332B (en) 2017-08-28 2017-08-28 Image processing method and device and electronic equipment

Publications (2)

Publication Number Publication Date
CN107454332A CN107454332A (en) 2017-12-08
CN107454332B true CN107454332B (en) 2020-03-10

Family

ID=60494389

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710752944.8A Active CN107454332B (en) 2017-08-28 2017-08-28 Image processing method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN107454332B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11893668B2 (en) 2021-03-31 2024-02-06 Leica Camera Ag Imaging system and method for generating a final digital image via applying a profile to image information

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108234865A (en) * 2017-12-20 2018-06-29 深圳市商汤科技有限公司 Image processing method, device, computer readable storage medium and electronic equipment
CN110009555B (en) * 2018-01-05 2020-08-14 Oppo广东移动通信有限公司 Image blurring method and device, storage medium and electronic equipment
CN108234826B (en) * 2018-01-15 2021-03-02 厦门美图之家科技有限公司 Image processing method and device
CN108335323B (en) * 2018-03-20 2020-12-29 厦门美图之家科技有限公司 Blurring method of image background and mobile terminal
CN108564541B (en) * 2018-03-28 2022-04-15 麒麟合盛网络技术股份有限公司 Image processing method and device
CN108629745B (en) * 2018-04-12 2021-01-19 Oppo广东移动通信有限公司 Image processing method and device based on structured light and mobile terminal
CN109035167B (en) * 2018-07-17 2021-05-18 北京新唐思创教育科技有限公司 Method, device, equipment and medium for processing multiple faces in image
CN109559272A (en) * 2018-10-30 2019-04-02 深圳市商汤科技有限公司 A kind of image processing method and device, electronic equipment, storage medium
CN109819188B (en) * 2019-01-30 2022-02-08 维沃移动通信有限公司 Video processing method and terminal equipment
CN109919867A (en) * 2019-02-21 2019-06-21 成都品果科技有限公司 A kind of filter method and device
CN113055584B (en) * 2019-12-26 2023-06-30 深圳市海思半导体有限公司 Focusing method based on fuzzy degree, lens controller and camera module
CN113129207B (en) * 2019-12-30 2023-08-01 武汉Tcl集团工业研究院有限公司 Picture background blurring method and device, computer equipment and storage medium
CN111246093B (en) * 2020-01-16 2021-07-20 Oppo广东移动通信有限公司 Image processing method, image processing device, storage medium and electronic equipment

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101764925B (en) * 2008-12-25 2011-07-13 华晶科技股份有限公司 Simulation method for shallow field depth of digital image
JP5322995B2 (en) * 2010-05-10 2013-10-23 キヤノン株式会社 Imaging apparatus and control method thereof
CN103945210B (en) * 2014-05-09 2015-08-05 长江水利委员会长江科学院 A kind of multi-cam image pickup method realizing shallow Deep Canvas
CN106530241B (en) * 2016-10-31 2020-08-11 努比亚技术有限公司 Image blurring processing method and device
CN106530252B (en) * 2016-11-08 2019-07-16 北京小米移动软件有限公司 Image processing method and device

Also Published As

Publication number Publication date
CN107454332A (en) 2017-12-08

Similar Documents

Publication Publication Date Title
CN107454332B (en) Image processing method and device and electronic equipment
US10645368B1 (en) Method and apparatus for estimating depth of field information
Wadhwa et al. Synthetic depth-of-field with a single-camera mobile phone
KR102278776B1 (en) Image processing method, apparatus, and apparatus
US10015469B2 (en) Image blur based on 3D depth information
US9013477B2 (en) Method and system for producing a virtual output image from data obtained by an array of image capturing devices
TWI538512B (en) Method for adjusting focus position and electronic apparatus
CN105187722B (en) Depth of field adjusting method, device and terminal
TWI554106B (en) Method and image capturing device for generating image bokeh effect
KR102229811B1 (en) Filming method and terminal for terminal
CN108154514B (en) Image processing method, device and equipment
JP2009282979A (en) Image processor and image processing method
CN102957927B (en) Image processing apparatus and image processing method
EP3005286B1 (en) Image refocusing
CN111986129A (en) HDR image generation method and device based on multi-shot image fusion and storage medium
KR20200031169A (en) Image processing method and device
CN109151329A (en) Photographic method, device, terminal and computer readable storage medium
EP3340608B1 (en) Image processing method and device, and non-transitory computer-readable storage medium
CN105450943B (en) Generate method and image acquiring device that image dissipates scape effect
JP6006506B2 (en) Image processing apparatus, image processing method, program, and storage medium
JP2001157107A (en) Photographing apparatus

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant