CN110505406A - Background-blurring method, device, storage medium and terminal - Google Patents

Background-blurring method, device, storage medium and terminal

Info

Publication number
CN110505406A
CN110505406A (application CN201910790728.1A)
Authority
CN
China
Prior art keywords
image
camera
focus
instruction
foreground area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910790728.1A
Other languages
Chinese (zh)
Inventor
孙永刚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yulong Computer Telecommunication Scientific Shenzhen Co Ltd
Original Assignee
Yulong Computer Telecommunication Scientific Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yulong Computer Telecommunication Scientific Shenzhen Co Ltd
Priority to CN201910790728.1A priority Critical patent/CN110505406A/en
Publication of CN110505406A publication Critical patent/CN110505406A/en
Pending legal-status Critical Current

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • H04N23/54Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The embodiments of the present application disclose a background-blurring method, a device, a storage medium and a terminal, belonging to the field of image processing. The method includes: receiving a photographing instruction; in response to the photographing instruction, acquiring a first image through a camera while the camera is in a defocused state, and acquiring a second image through the camera while the camera is in a focused state; and replacing the foreground area of the first image with the foreground area of the second image to obtain a background-blurred image. The application avoids applying software blurring to every pixel of the background area of the image, thereby reducing the overhead of background-blurring processing.

Description

Background-blurring method, device, storage medium and terminal
Technical field
This application relates to the field of image processing, and in particular to a background-blurring method, device, storage medium and terminal.
Background art
Background blurring makes the depth of field shallow so that the focus falls on the subject. Because the performance of the camera of a mobile terminal (for example, a mobile phone) is limited, background blurring is currently implemented on the phone as follows: a clear image is captured, the foreground area and the background area of the image are identified, and the background area of the image is blurred with a blurring algorithm (for example, Gaussian blur or mean blur). The foreground area of the resulting image is clear while the background area is blurred, which achieves the background-blurring effect.
It has been found that implementing background blurring in the related art requires a large amount of computation over every pixel of the background area, and this computation consumes a large amount of the terminal's resources.
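By way of a non-limiting illustration only (this is not part of the claimed subject matter), the related-art approach described above could be sketched roughly as follows, assuming an OpenCV/NumPy environment and an already-computed binary foreground mask; the point is that the Gaussian blur must visit every pixel of the frame, which is the cost the present application seeks to avoid.

```python
import cv2
import numpy as np

def software_background_blur(image, foreground_mask, ksize=31):
    """Related-art approach: blur the whole frame in software and keep the
    original pixels only inside the foreground mask."""
    # The Gaussian blur touches every pixel of the frame, which is the
    # per-pixel overhead discussed above.
    blurred = cv2.GaussianBlur(image, (ksize, ksize), 0)
    mask = (foreground_mask > 0)[..., None]      # HxWx1 boolean mask
    return np.where(mask, image, blurred)        # foreground sharp, background blurred
```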
Summary of the invention
The background-blurring method, device, storage medium and terminal provided by the embodiments of the present application can solve the problem in the related art that the overhead of implementing the background-blurring function is large. The technical solution is as follows:
In a first aspect, an embodiment of the present application provides a background-blurring method, the method comprising:
receiving a photographing instruction; in response to the photographing instruction, acquiring a first image through a camera while the camera is in a defocused state, and acquiring a second image through the camera while the camera is in a focused state; and replacing the foreground area of the first image with the foreground area of the second image to obtain a background-blurred image.
In a second aspect, an embodiment of the present application provides a background-blurring device, the background-blurring device comprising:
a receiving unit, configured to receive a photographing instruction;
an acquisition unit, configured to, in response to the photographing instruction, acquire a first image through the camera while the camera is in a defocused state, and acquire a second image through the camera while the camera is in a focused state;
a blurring unit, configured to replace the foreground area of the first image with the foreground area of the second image to obtain a background-blurred image.
In a third aspect, an embodiment of the present application provides a computer storage medium storing a plurality of instructions, the instructions being adapted to be loaded by a processor to execute the above method steps.
In a fourth aspect, an embodiment of the present application provides a terminal, which may include a processor and a memory; the memory stores a computer program, and the computer program is adapted to be loaded by the processor to execute the above method steps.
The beneficial effects brought by the technical solutions provided by some embodiments of the application include at least the following:
When a photographing instruction is received, a blurred first image is acquired by the camera in the defocused state and a clear second image is acquired by the camera in the focused state, and the foreground area of the first image is then replaced with the foreground area of the second image to obtain a background-blurred image. This solves the problem in the related art that blurring the background area with a software algorithm incurs a large processing overhead: the embodiments of the present application only need to adjust the distance between the focusing lens and the photosensitive element in the camera to switch between the defocused state and the focused state and thereby obtain the blurred image and the clear image, avoiding the large overhead of blurring with a software blurring algorithm.
Brief description of the drawings
To explain the technical solutions in the embodiments of the present application or in the prior art more clearly, the accompanying drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only some embodiments of the application; for those of ordinary skill in the art, other drawings can be obtained from these drawings without creative effort.
Fig. 1 is a schematic structural diagram of a terminal provided by an embodiment of the present application;
Fig. 2 is a schematic structural diagram of the operating system space and user space provided by an embodiment of the present application;
Fig. 3 is an architecture diagram of the Android operating system in Fig. 1;
Fig. 4 is an architecture diagram of the iOS operating system in Fig. 1;
Fig. 5 is a schematic flowchart of a background-blurring method provided by an embodiment of the present application;
Fig. 6 is a schematic flowchart of a background-blurring method provided by an embodiment of the present application;
Fig. 7 is a schematic diagram of the defocused state provided by an embodiment of the present application;
Fig. 8 is a schematic diagram of the focused state provided by an embodiment of the present application;
Fig. 9 is a schematic structural diagram of a device provided by an embodiment of the present application.
Specific embodiment
To make the purposes, technical solutions and advantages of the application clearer, the embodiments of the present application are described in further detail below with reference to the accompanying drawings.
Referring to FIG. 1, it shows a structural block diagram of the terminal provided by an exemplary embodiment of the application. The terminal in the application may include one or more of the following components: a processor 110, a memory 120, an input device 130, an output device 140 and a bus 150. The processor 110, the memory 120, the input device 130 and the output device 140 may be connected through the bus 150.
The processor 110 may include one or more processing cores. The processor 110 connects the various parts of the terminal through various interfaces and lines, and performs the various functions of the terminal 100 and processes data by running or executing the instructions, programs, code sets or instruction sets stored in the memory 120 and calling the data stored in the memory 120. Optionally, the processor 110 may be implemented in hardware using at least one of digital signal processing (DSP), field-programmable gate array (FPGA) and programmable logic array (PLA). The processor 110 may integrate one or a combination of a central processing unit (CPU), a graphics processing unit (GPU), a modem and the like. The CPU mainly handles the operating system, the user interface, the application programs and so on; the GPU is responsible for rendering and drawing the display content; the modem handles wireless communication. It can be understood that the modem may also not be integrated into the processor 110 and may instead be implemented by a separate communication chip.
The memory 120 may include random access memory (RAM) or read-only memory (ROM). Optionally, the memory 120 includes a non-transitory computer-readable storage medium. The memory 120 may be used to store instructions, programs, code, code sets or instruction sets. The memory 120 may include a program storage area and a data storage area. The program storage area may store instructions for implementing the operating system, instructions for at least one function (such as a touch function, a sound playing function, an image playing function, etc.), and instructions for implementing the following method embodiments. The operating system may be the Android system (including systems developed in depth on the basis of the Android system), the iOS system developed by Apple Inc. (including systems developed in depth on the basis of the iOS system), or another system. The data storage area may also store data created by the terminal during use (such as a phone book, audio and video data, chat records) and the like.
As shown in FIG. 2, the memory 120 can be divided into an operating system space and a user space. The operating system runs in the operating system space, while native and third-party applications run in the user space. To ensure that different third-party applications can all run well, the operating system allocates corresponding system resources to them. However, different application scenarios within the same third-party application also place different demands on system resources. For example, when loading local resources a third-party application needs a high disk read speed, whereas when rendering animations it needs high GPU performance. Since the operating system and third-party applications are independent of each other, the operating system often cannot perceive the current application scenario of a third-party application in time, and therefore cannot adapt system resources specifically to the concrete application scenario of the third-party application.
In order for the operating system to distinguish the concrete application scenario of a third-party application, data communication between the third-party application and the operating system needs to be established, so that the operating system can obtain the current scenario information of the third-party application at any time and adapt system resources accordingly.
Taking the Android system as an example of the operating system, the programs and data stored in the memory 120 are shown in FIG. 3. The memory 120 may store a Linux kernel layer 320, a system runtime library layer 340, an application framework layer 360 and an application layer 380, where the Linux kernel layer 320, the system runtime library layer 340 and the application framework layer 360 belong to the operating system space and the application layer 380 belongs to the user space. The Linux kernel layer 320 provides low-level drivers for the various hardware of the terminal, such as the display driver, the audio driver, the camera driver, the Bluetooth driver, the Wi-Fi driver, power management and so on. The system runtime library layer 340 provides the main feature support for the Android system through a number of C/C++ libraries; for example, the SQLite library provides database support, the OpenGL/ES library provides 3D drawing support, and the Webkit library provides browser kernel support. The system runtime library layer 340 also provides the Android runtime library (Android runtime), which mainly provides core libraries and allows developers to write Android applications in the Java language. The application framework layer 360 provides the various APIs that may be used when building applications, and developers can build their own applications by using these APIs, such as activity management, window management, view management, notification management, content providers, package management, call management, resource management and location management. At least one application runs in the application layer 380. These applications can be the native applications that come with the operating system, such as the contacts program, the SMS program, the clock program, the camera application, etc.; they can also be third-party applications developed by third-party developers, such as game applications, instant messaging programs, photo beautification programs, shopping programs, etc.
Taking the iOS system as an example of the operating system, the programs and data stored in the memory 120 are shown in FIG. 4. The iOS system includes the kernel operating system layer 420 (Core OS layer), the kernel service layer 440 (Core Services layer), the media layer 460 (Media layer) and the touchable layer 480 (Cocoa Touch layer). The kernel operating system layer 420 includes the operating system kernel, drivers and low-level program frameworks; these low-level frameworks provide functions closer to the hardware for use by the program frameworks in the kernel service layer 440. The kernel service layer 440 provides the system services and/or program frameworks needed by applications, such as the Foundation framework, the accounts framework, the advertising framework, the data storage framework, the network connection framework, the geographic location framework, the motion framework, etc. The media layer 460 provides the audio- and video-related interfaces for applications, such as interfaces related to graphics and images, interfaces related to audio technology, interfaces related to video technology, and the wireless playback (AirPlay) interface for audio and video transmission. The touchable layer 480 provides the various commonly used interface-related frameworks for application development and is responsible for the user's touch interaction on the terminal, such as the local notification service, the remote push service, the advertising framework, the game tools framework, the message user interface (UI) framework, the UIKit framework, the map framework, etc.
Among the frameworks shown in FIG. 4, the frameworks related to most applications include, but are not limited to, the Foundation framework in the kernel service layer 440 and the UIKit framework in the touchable layer 480. The Foundation framework provides many basic object classes and data types and provides the most basic system services for all applications, independent of the UI. The classes provided by the UIKit framework are a basic UI class library for creating touch-based user interfaces; iOS applications can provide a UI based on the UIKit framework, so it provides the application's architecture for building user interfaces, drawing, handling user interaction events, responding to gestures and so on.
The way and principle of implementing data communication between a third-party application and the operating system in the iOS system can refer to the Android system, and details are not repeated here.
The input device 130 is used to receive input instructions or data, and includes but is not limited to a keyboard, a mouse, a camera, a microphone or a touch device. The output device 140 is used to output instructions or data, and includes but is not limited to a display device, a loudspeaker and the like. In one example, the input device 130 and the output device 140 can be combined: the input device 130 and the output device 140 are a touch display screen, which receives the user's touch operations on or near it with a finger, a stylus or any other suitable object, and displays the user interface of each application. The touch display screen is usually arranged on the front panel of the terminal. The touch display screen can be designed as a full screen, a curved screen or a specially shaped screen, or as a combination of a full screen and a curved screen or of a specially shaped screen and a curved screen, which is not limited in the embodiments of the present application.
In addition, those skilled in the art can understand that the structure of the terminal shown in the above drawings does not constitute a limitation on the terminal; the terminal may include more or fewer components than shown, combine certain components, or have a different arrangement of components. For example, the terminal also includes components such as a radio frequency circuit, an input unit, a sensor, an audio circuit, a wireless fidelity (WiFi) module, a power supply and a Bluetooth module, which are not described here.
In the embodiments of the present application, the executing subject of each step can be the terminal described above. Optionally, the executing subject of each step is the operating system of the terminal. The operating system can be the Android system, the iOS system or another operating system, which is not limited in the embodiments of the present application.
The terminal of the embodiments of the present application may also be equipped with a display device, which can be any device capable of displaying, such as a cathode ray tube (CRT) display, a light-emitting diode (LED) display, an electronic ink screen, a liquid crystal display (LCD), a plasma display panel (PDP), etc. The user can use the display device of the terminal 101 to view displayed information such as text, images and videos. The terminal may be a smartphone, a tablet computer, a game device, an AR (Augmented Reality) device, an automobile, a data storage device, an audio playback device, a video playback device, a notebook, a desktop computing device, or a wearable device such as an electronic watch, electronic glasses, an electronic helmet, an electronic bracelet, an electronic necklace or electronic clothing.
In the terminal shown in FIG. 1, the processor 110 can be used to call the application program stored in the memory 120 and specifically execute the background-blurring method of the embodiments of the present application.
In the technical solution provided by the embodiments of the present application, when a photographing instruction is received, a blurred first image is acquired by the camera in the defocused state and a clear second image is acquired by the camera in the focused state, and the foreground area of the first image is then replaced with the foreground area of the second image to obtain a background-blurred image. This solves the problem in the related art that blurring the background area with a software algorithm incurs a large processing overhead: the embodiments of the present application only need to adjust the distance between the focusing lens and the photosensitive element in the camera to switch between the defocused state and the focused state and thereby obtain the blurred image and the clear image, avoiding the overhead of blurring with a blurring algorithm.
In the following method embodiments, for ease of description, the terminal is taken as the executing subject of each step.
The background-blurring method provided by the embodiments of the present application is described in detail below with reference to FIGS. 5-9. The background-blurring device in the embodiments of the present application can be the terminal shown in FIGS. 5-9.
Referring to FIG. 5, a schematic flowchart of a background-blurring method is provided for an embodiment of the present application. As shown in FIG. 5, the method of the embodiment of the present application may include the following steps:
S501, receiving a photographing instruction.
The photographing instruction is generated by the terminal in response to the user's photographing operation, and the type of the photographing operation can be one or more of a touch operation, a button operation, a voice-control operation or a motion-sensing operation.
For example: the terminal receives an instruction to open the camera application and displays a shooting interface on which a virtual shooting button is provided; when the terminal receives a click operation on the shooting button, it captures an image through the camera.
For example: the terminal receives an instruction to open the camera application and displays the shooting interface; the terminal has a physical shooting button, and when the terminal receives a press of the shooting button while the shooting interface is displayed, it captures an image through the camera.
Another example: the terminal receives an instruction to open the camera application and displays the shooting interface; the terminal collects voice data through a microphone, and while the shooting interface is displayed, when the similarity between the received voice data and the stored reference voice data is greater than a preset threshold, the terminal captures an image through the camera.
S502, in response to the photographing instruction, acquiring a first image through the camera while the camera is in a defocused state, and acquiring a second image through the camera while the camera is in a focused state.
The first image acquired in the defocused state is a blurred image, and the image acquired in the focused state is a clear image. The camera includes a focusing lens and a photosensitive element (for example, a charge-coupled device, CCD). The focusing lens is a convex lens used to converge the optical signal, and the photosensitive element converts the optical signal into an electrical signal. One or more focusing lenses can be provided in the camera. A focusing lens has a focal point, and the distance between the focal point and the focusing lens is called the focal length. When the focal point of the focusing lens falls on the photosensitive element, the camera is in the focused state, and the image acquired through the camera at this time (the second image) is clear; when the focal point of the focusing lens falls outside the photosensitive element, the camera is in the defocused state, and the image acquired through the camera at this time (the first image) is blurred. The further the photosensitive element deviates from the focal point of the focusing lens, the more blurred the acquired image; the smaller the deviation, the clearer the acquired image.
For example: the focal length of the focusing lens is 4 cm. When the distance between the photosensitive element and the focusing lens is 4 cm ± 0.5 cm, the camera is in the focused state, and a clear photo is acquired through the camera; when the distance between the photosensitive element and the focusing lens is greater than 4.5 cm or less than 3.5 cm, the camera is in the defocused state, and the image acquired through the camera is blurred.
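As a small illustrative sketch of this example (not part of the claimed method), the focused/defocused decision for a given lens-to-sensor distance could be written as follows; the 4 cm focal length and ±0.5 cm tolerance are simply the example values above.

```python
def camera_state(lens_sensor_distance_cm, focal_length_cm=4.0, tolerance_cm=0.5):
    """Return 'focused' when the lens-to-sensor distance lies within
    focal_length ± tolerance, otherwise 'defocused'."""
    if abs(lens_sensor_distance_cm - focal_length_cm) <= tolerance_cm:
        return "focused"
    return "defocused"

# e.g. camera_state(4.2) -> 'focused', camera_state(5.0) -> 'defocused'
```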
In one or more embodiments, the terminal may be provided with multiple cameras, including a first camera and a second camera; the terminal can acquire the first image through the first camera in the defocused state and acquire the second image through the second camera in the focused state.
S503, replacing the foreground area of the first image with the foreground area of the second image to obtain a background-blurred image.
The terminal determines the foreground area of the first image and the foreground area of the second image. The foreground area of the first image is hereinafter referred to as the first foreground area, and the foreground area of the second image is hereinafter referred to as the second foreground area; the first foreground area and the second foreground area contain the same object, for example the same person, plant or animal. The foreground area is the pixel region where the object to be highlighted is located in the image. The terminal can determine the foreground area of the first image or the second image based on the image depth, and then replace the foreground area of the first image with the foreground area of the second image to obtain the background-blurred image. The first image and the second image are stored in the memory of the terminal and are not displayed; the terminal uses the background-blurred image as the preview image of the shot.
In one or more embodiments, the terminal can determine the foreground area in the first image and then determine the position of this foreground area; since the foreground areas of the first image and the second image are at the same position, the foreground area in the second image is determined based on this position. The foreground area of the first image is then replaced with the foreground area of the second image to obtain the background-blurred image, that is, the background-blurred image is composed of the foreground area of the second image and the background area of the first image.
In one or more embodiments, the terminal can determine the foreground area in the second image and then determine the position of this foreground area; since the foreground areas of the first image and the second image are at the same position, the foreground area in the first image is determined based on this position. The foreground area of the first image is then replaced with the foreground area of the second image to obtain the background-blurred image, that is, the background-blurred image is composed of the foreground area of the second image and the background area of the first image.
When the scheme of this embodiment of the present application is executed, upon receiving a photographing instruction, a blurred first image is acquired by the camera in the defocused state and a clear second image is acquired by the camera in the focused state, and the foreground area in the first image is then replaced with the foreground area in the second image to obtain a background-blurred image. This solves the problem in the related art that blurring the background area with a software algorithm incurs a large processing overhead: the embodiment only needs to adjust the distance between the focusing lens and the photosensitive element in the camera to switch between the defocused state and the focused state and thereby obtain the blurred image and the clear image, avoiding the large overhead of blurring with a blurring algorithm.
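Purely as an illustrative sketch of the S501-S503 flow (not the actual terminal implementation), the capture-and-replace logic could look as follows. The `camera` object with `set_lens_sensor_distance()` and `capture_frame()` and the `segment_foreground` callable are hypothetical stand-ins for the camera driver and the depth-based segmentation, and the frames are assumed to be NumPy arrays of the same resolution.

```python
def background_blur_shot(camera, defocus_distance_cm, focus_distance_cm, segment_foreground):
    """Sketch of S501-S503: capture a blurred (defocused) frame and a sharp
    (focused) frame, then paste the sharp foreground onto the blurred frame."""
    camera.set_lens_sensor_distance(defocus_distance_cm)   # hypothetical driver call: defocus the lens
    first_image = camera.capture_frame()                    # blurred first image

    camera.set_lens_sensor_distance(focus_distance_cm)      # hypothetical driver call: bring lens into focus
    second_image = camera.capture_frame()                   # clear second image

    mask = segment_foreground(second_image)                 # boolean HxW foreground mask (e.g. from depth)
    composite = first_image.copy()
    composite[mask] = second_image[mask]                    # sharp foreground, optically blurred background
    return composite
```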
Referring to FIG. 6, it is a schematic flowchart of a background-blurring method provided by an embodiment of the present application. This embodiment takes the background-blurring method applied to a terminal as an example; the background-blurring method may include the following steps:
S601, displaying a background-blurring setting interface.
The background-blurring setting interface is used by the user to turn on the background-blurring function and to set the background-blurring level. The background-blurring setting interface can be a user interface in the settings application; for example, the background-blurring setting interface includes a switch for turning the background-blurring function on or off and a slider bar for setting the background-blurring level. The slider bar indicates the adjustable range of the background-blurring level, and the minimum and maximum values of the background-blurring level are provided on the slider bar. The background-blurring level is positively correlated with the degree of blur of the first image: the lower the background-blurring level, the less blurred the first image; conversely, the higher the background-blurring level, the more blurred the first image. The background-blurring level can be represented by a numerical value, for example the values 1-5. When the background-blurring function is turned on, the terminal performs background-blurring processing on the captured image; when the background-blurring function is turned off, the terminal does not perform background-blurring processing on the captured image.
S602, receiving a first instruction generated by the user in the background-blurring setting interface.
S603, turning on the background-blurring function based on the first instruction.
The first instruction is generated by a trigger operation executed by the user in the background-blurring setting interface; the type of the trigger operation can be a touch operation or a button operation. For example: the background-blurring setting interface includes a virtual switch; when the terminal detects a leftward sliding operation on the virtual switch, the terminal turns off the background-blurring function, and when the terminal detects a rightward sliding operation on the virtual switch, the terminal turns on the background-blurring function.
S604, receiving a second instruction generated by the user in the background-blurring setting interface.
S605, determining the background-blurring level based on the second instruction.
The second instruction is generated by a trigger operation executed by the user in the background-blurring setting interface; the type of the trigger operation can be a touch operation or a button operation. For example: the background-blurring setting interface includes a slider bar; the left side of the slider bar shows the minimum value 1 of the background-blurring level, and the right side shows the maximum value. The user can drag the slider on the slider bar to the left or right, with a sliding step of 1. When the terminal detects that the user stops sliding the slider, it obtains the position of the slider on the slider bar and determines the current background-blurring level according to that position.
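As a non-limiting sketch only (the helper and its parameters are assumptions, not prescribed by this embodiment), mapping the slider position to a background-blurring level between 1 and 5 in steps of 1 could look like this:

```python
def level_from_slider(slider_position, slider_length, min_level=1, max_level=5):
    """Map a slider position in [0, slider_length] to an integer
    background-blurring level between min_level and max_level."""
    fraction = max(0.0, min(1.0, slider_position / slider_length))
    return min_level + round(fraction * (max_level - min_level))

# e.g. level_from_slider(60, 100) -> 3
```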
S606, determining the distance between the focusing lens and the photosensitive element according to the background-blurring level.
The terminal pre-stores or pre-configures a mapping between background-blurring levels and distances, with different background-blurring levels associated with different distances. The distance is the distance between the focusing lens and the photosensitive element, and at this distance the camera is in the defocused state. The higher the background-blurring level, the further the photosensitive element deviates from the focal point of the focusing lens and the more blurred the first image acquired through the camera; conversely, the lower the background-blurring level, the smaller the deviation and the less blurred the first image acquired through the camera.
For example: the focal length of the focusing lens is 5 cm. When the distance between the focusing lens and the photosensitive element is between 4.5 cm and 5 cm, the acquired image is clear; when the distance is outside the range of 4.5 cm to 5 cm, the acquired photo is blurred. The mapping between background-blurring levels and distances is shown in Table 1 below:
Table 1 — Mapping between background-blurring level and lens-to-sensor distance

  Background-blurring level | Distance
  1                         | 4 cm
  2                         | 3.5 cm
  3                         | 3 cm
  4                         | 2 cm
  5                         | 1 cm
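A minimal sketch of the S606 lookup, using the example values from Table 1 (the dictionary is simply the pre-configured mapping assumed above):

```python
# Table 1: background-blurring level -> lens-to-sensor distance (cm)
LEVEL_TO_DISTANCE_CM = {1: 4.0, 2: 3.5, 3: 3.0, 4: 2.0, 5: 1.0}

def defocus_distance_for_level(level):
    """Look up the pre-configured lens-to-sensor distance for the
    user-selected background-blurring level."""
    return LEVEL_TO_DISTANCE_CM[level]

# e.g. defocus_distance_for_level(3) -> 3.0
```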
S607, receiving a photographing instruction.
The photographing instruction is generated by the terminal in response to the user's photographing operation, and the type of the photographing operation can be one or more of a touch operation, a button operation, a voice-control operation or a motion-sensing operation.
For example: the terminal receives an instruction to open the camera application and displays the shooting interface, on which a virtual shooting button is provided; when the terminal receives a click operation on the shooting button, it captures an image through the camera.
For example: the terminal receives an instruction to open the camera application and displays the shooting interface; when the terminal receives a press of the physical shooting button, it captures an image through the camera.
For example: while displaying the shooting interface, the terminal collects voice data through a microphone and compares the voice data with the stored reference voice data; when the similarity between the two is greater than a preset threshold, the terminal captures an image through the camera.
S608, sending a defocus adjustment instruction to the focusing motor.
As shown in FIGS. 7 and 8, the camera includes a focusing motor, a focusing lens and a photosensitive element. The focusing motor is used to move the focusing lens to the left or right along the optical axis; the focusing motor can be a stepper motor, and it receives an instruction from the processor to determine the moving direction and the moving distance. The focusing lens is a convex lens that converges the incident parallel optical signal onto the focal point f, and the distance between the focal point f and the focusing lens is the focal length. One or more focusing lenses can be provided in the camera; when there are multiple focusing lenses, they can be arranged in parallel. The distance between the focusing lens and the photosensitive element must be within a specified range for the camera to acquire a clear image; if the distance is not within the specified range, the camera acquires a blurred image. The specified range floats around the focal length as a reference value. As shown in FIG. 7, when the distance between the focusing lens and the photosensitive element is not within the specified range, the camera is in the defocused state; as shown in FIG. 8, when the distance between the focusing lens and the photosensitive element is within the specified range, the camera is in the focused state.
The defocus adjustment instruction instructs the focusing motor to move so that the distance between the focusing lens and the photosensitive element is not within the specified range. The terminal queries Table 1 with the background-blurring level set by the user to obtain the target distance, determines the distance to be moved according to the current position of the focusing motor, and then instructs the focusing motor, through the defocus adjustment instruction, to adjust the distance between the focusing lens and the photosensitive element to the distance found in Table 1.
For example: the background-blurring level set by the user is 3; the terminal queries Table 1 and obtains a distance of 3 cm; the terminal finds that the current distance between the focusing lens and the photosensitive element is 5 cm; the terminal calculates that the focusing motor needs to move 2 cm to the left, and sends a defocus adjustment instruction carrying the moving direction and the distance to be moved to the focusing motor; the focusing motor adjusts the distance between the focusing lens and the photosensitive element to 3 cm according to the defocus adjustment instruction.
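A small sketch of the calculation in this example (illustrative only; the left/right convention follows the example above, where moving left decreases the lens-to-sensor distance):

```python
def motor_move(current_distance_cm, target_distance_cm):
    """Compute how far and in which direction the focusing motor must move
    the lens so that the lens-to-sensor distance reaches the target."""
    delta = target_distance_cm - current_distance_cm
    if delta == 0:
        return 0.0, "no movement"
    direction = "right (increase distance)" if delta > 0 else "left (decrease distance)"
    return abs(delta), direction

# e.g. motor_move(5.0, 3.0) -> (2.0, 'left (decrease distance)')
```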
S609, acquiring the first image through the camera.
After the focusing motor adjusts the distance between the focusing lens and the photosensitive element according to the defocus adjustment instruction, the camera is in the defocused state, and the first image acquired by the terminal through the camera is a blurred image.
S610, sending a focus adjustment instruction to the focusing motor.
The focus adjustment instruction instructs the focusing motor to move the focusing lens so that the distance between the focusing lens and the photosensitive element equals a preset distance, for example the focal length. The terminal determines the moving direction and the distance to be moved of the focusing motor according to the current position of the focusing lens, and then generates the focus adjustment instruction.
For example: the current distance between the focusing lens and the photosensitive element is 3 cm; the terminal calculates that the focusing motor needs to move 2 cm to the right, generates the focus adjustment instruction and sends it to the focusing motor; the focusing motor moves 2 cm to the right according to the focus adjustment instruction, so that the distance between the focusing lens and the photosensitive element becomes 5 cm, at which point the camera is in the focused state.
S611, acquiring the second image through the camera.
The second image acquired by the camera is a clear image.
It can be understood that the embodiments of the present application do not limit the order in which the first image and the second image are acquired: the first image can be acquired before or after the second image, or the two can be acquired at the same time. The acquisition interval between the first image and the second image must be less than a preset duration, so as to ensure that the first image and the second image contain the same object.
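As an illustrative sketch only, the interval check could be expressed as follows; `capture_first` and `capture_second` are hypothetical capture callables and the 0.2 s limit is an assumed preset duration, not a value given by this embodiment.

```python
import time

def acquire_pair(capture_first, capture_second, max_interval_s=0.2):
    """Capture both frames and verify that their acquisition interval stays
    below the preset duration, so both frames still show the same scene."""
    t0 = time.monotonic()
    first_image = capture_first()
    second_image = capture_second()
    interval = time.monotonic() - t0
    if interval > max_interval_s:
        raise RuntimeError(f"acquisition interval {interval:.3f}s exceeds the preset duration")
    return first_image, second_image
```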
S612, performing registration processing on the first image and the second image.
The first image and the second image have the same resolution, and the registration processing aligns the pixels at the same positions of the first image and the second image. For example: the resolution of both the first image and the second image is 1024 × 768; the pixel with coordinates (x, y) in the first image is aligned with the pixel with coordinates (x, y) in the second image, where x = [0, 1024], y = [0, 768], and x and y are integers.
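A rough sketch of such registration (illustrative only): when the two frames share a resolution, pixels with equal coordinates are treated as aligned; if the resolutions differ slightly, the frames can be aligned on their center points by cropping to a common size, echoing the center-point registration described later.

```python
def register_by_center(first_image, second_image):
    """Crop both frames around their center points to a common size so that
    pixels with the same (y, x) coordinates correspond to each other."""
    h = min(first_image.shape[0], second_image.shape[0])
    w = min(first_image.shape[1], second_image.shape[1])

    def center_crop(img):
        cy, cx = img.shape[0] // 2, img.shape[1] // 2
        return img[cy - h // 2: cy - h // 2 + h, cx - w // 2: cx - w // 2 + w]

    return center_crop(first_image), center_crop(second_image)
```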
S613, determining the foreground area of the second image based on image depth information.
The foreground area is the pixel region where the object to be highlighted is located in the image. The image depth information represents the distance between a pixel in the image and the photosensitive element of the camera; the foreground area may include the pixels whose image depth is within a specified distance range.
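A minimal sketch of this depth-based segmentation (illustrative only; the depth map and the distance threshold are assumed inputs, not values prescribed by this embodiment):

```python
def foreground_mask_from_depth(depth_map, max_foreground_depth):
    """Treat every pixel whose depth (distance to the photosensitive element)
    is within the specified range as foreground.

    depth_map is an HxW NumPy array of per-pixel distances; the returned
    boolean mask can be used to copy the sharp foreground into the blurred frame."""
    return depth_map <= max_foreground_depth
```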
S614, determining the position of the foreground area of the second image in the second image, and determining the foreground area in the first image based on that position.
S615, replacing the foreground area of the first image with the foreground area of the second image to obtain the background-blurred image.
The terminal can determine the foreground area in the first image or the second image based on the image depth, and then replace the foreground area of the first image with the foreground area of the second image to obtain the background-blurred image. The first image and the second image are stored in the memory of the terminal and are not displayed; the terminal uses the background-blurred image as the preview image of the shot.
In one or more embodiments, the terminal can determine the foreground area in the first image and then determine the position of this foreground area; since the foreground areas of the first image and the second image are at the same position, the foreground area in the second image is determined based on this position. The foreground area of the first image is then replaced with the foreground area of the second image to obtain the background-blurred image, that is, the background-blurred image is composed of the foreground area of the second image and the background area of the first image.
In one or more embodiments, the terminal can determine the foreground area in the second image and then determine the position of this foreground area; since the foreground areas of the first image and the second image are at the same position, the foreground area in the first image is determined based on this position. The foreground area of the first image is then replaced with the foreground area of the second image to obtain the background-blurred image, that is, the background-blurred image is composed of the foreground area of the second image and the background area of the first image.
When the scheme of this embodiment of the present application is executed, upon receiving a photographing instruction, a blurred first image is acquired by the camera in the defocused state and a clear second image is acquired by the camera in the focused state, and the foreground area in the first image is then replaced with the foreground area in the second image to obtain a background-blurred image. This solves the problem in the related art that blurring the background area with a software algorithm incurs a large processing overhead: the embodiment only needs to adjust the distance between the focusing lens and the photosensitive element in the camera to switch between the defocused state and the focused state and thereby obtain the blurred image and the clear image, avoiding the large overhead of blurring with a blurring algorithm.
The following are device embodiments of the present application, which can be used to execute the method embodiments of the present application. For details not disclosed in the device embodiments, please refer to the method embodiments of the present application.
Referring to FIG. 9, it shows a schematic structural diagram of the device provided by an exemplary embodiment of the application, hereinafter referred to as device 9. The device 9 can be implemented as all or part of a terminal through software, hardware or a combination of both. The device 9 includes a receiving unit 901, an acquisition unit 902 and a blurring unit 903.
The receiving unit 901 is configured to receive a photographing instruction.
The acquisition unit 902 is configured to, in response to the photographing instruction, acquire a first image through the camera while the camera is in a defocused state, and acquire a second image through the camera while the camera is in a focused state.
The blurring unit 903 is configured to replace the foreground area of the first image with the foreground area of the second image to obtain a background-blurred image.
In one or more embodiments, the camera includes a focusing motor, a focusing lens and a photosensitive element;
wherein acquiring the first image through the camera while the camera is in the defocused state, and acquiring the second image through the camera while the camera is in the focused state, comprises:
sending a defocus adjustment instruction to the focusing motor, wherein the defocus adjustment instruction instructs the focusing motor to adjust the distance between the focusing lens and the photosensitive element so that the camera is in the defocused state;
acquiring the first image through the camera;
sending a focus adjustment instruction to the focusing motor, wherein the focus adjustment instruction instructs the focusing motor to adjust the distance between the focusing lens and the photosensitive element so that the camera is in the focused state;
acquiring the second image through the camera.
In one or more embodiments,
the receiving unit 901 is further configured to receive a first instruction for turning on the background-blurring function, and to turn on the background-blurring function based on the first instruction.
In one or more embodiments, the receiving unit 901 is further configured to: receive a second instruction for adjusting the background-blurring level, and determine the background-blurring level based on the second instruction;
and determine the distance between the focusing lens and the photosensitive element according to the background-blurring level.
In one or more embodiments, replacing the foreground area of the first image with the foreground area of the second image to obtain the background-blurred image comprises:
performing registration processing on the first image and the second image;
determining the foreground area of the second image based on image depth information;
determining the position of the foreground area of the second image in the second image, and determining the foreground area in the first image based on that position;
replacing the foreground area of the first image with the foreground area of the second image to obtain the background-blurred image.
In one or more embodiments, the acquisition interval between the first image and the second image is less than a preset duration.
In one or more embodiments, performing registration processing on the first image and the second image comprises:
obtaining the center point of the first image and the center point of the second image;
performing registration processing on the center point of the first image and the center point of the second image.
It should be noted that when the device 9 provided by the above embodiment executes the background-blurring method, the division into the above functional modules is only an example; in practical applications, the above functions can be assigned to different functional modules as needed, that is, the internal structure of the device is divided into different functional modules to complete all or part of the functions described above. In addition, the device embodiment provided above and the background-blurring method embodiment belong to the same concept; for the detailed implementation process, refer to the method embodiment, which is not repeated here.
The serial numbers of the above embodiments of the present application are for description only and do not represent the superiority or inferiority of the embodiments.
When a photographing instruction is received, a blurred first image is acquired by the camera in the defocused state and a clear second image is acquired by the camera in the focused state, and the foreground area in the first image is then replaced with the foreground area in the second image to obtain a background-blurred image. This solves the problem in the related art that blurring the background area with a software algorithm incurs a large processing overhead: the embodiments of the present application only need to adjust the distance between the focusing lens and the photosensitive element in the camera to switch between the defocused state and the focused state and thereby obtain the blurred image and the clear image, avoiding the overhead of blurring with a blurring algorithm.
An embodiment of the present application also provides a computer storage medium. The computer storage medium can store a plurality of instructions, and the instructions are adapted to be loaded by a processor to execute the method steps of the embodiments shown in FIGS. 5-8; for the specific execution process, refer to the description of the embodiments shown in FIGS. 5-8, which is not repeated here.
The present application also provides a computer program product storing at least one instruction. The at least one instruction is loaded and executed by the processor to implement the background-blurring method described in each of the above embodiments.
Those skilled in the art should understand that the embodiments of the present application can be provided as a method, a system or a computer program product. Therefore, the present application can take the form of a complete hardware embodiment, a complete software embodiment, or an embodiment combining software and hardware. Moreover, the present application can take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical storage, etc.) containing computer-usable program code.
The present application is described with reference to flowcharts and/or block diagrams of the method, the device (system) and the computer program product according to the embodiments of the present application. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks in the flowcharts and/or block diagrams, can be implemented by computer program instructions. These computer program instructions can be provided to the processor of a general-purpose computer, a special-purpose computer, an embedded processor or another programmable data processing device to produce a machine, so that the instructions executed by the processor of the computer or the other programmable data processing device produce a device for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions can also be stored in a computer-readable memory that can direct a computer or another programmable data processing device to work in a specific manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction device that implements the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions can also be loaded onto a computer or another programmable data processing device, so that a series of operation steps are executed on the computer or the other programmable device to produce computer-implemented processing, whereby the instructions executed on the computer or the other programmable device provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.

Claims (10)

1. A background-blurring method, characterized in that the method comprises:
receiving a photographing instruction;
in response to the photographing instruction, acquiring a first image through a camera while the camera is in a defocused state, and acquiring a second image through the camera while the camera is in a focused state;
replacing the foreground area of the first image with the foreground area of the second image to obtain a background-blurred image.
2. The method according to claim 1, characterized in that the camera comprises a focusing motor, a focusing lens and a photosensitive element;
wherein acquiring the first image through the camera while the camera is in the defocused state, and acquiring the second image through the camera while the camera is in the focused state, comprises:
sending a defocus adjustment instruction to the focusing motor, wherein the defocus adjustment instruction instructs the focusing motor to adjust the distance between the focusing lens and the photosensitive element so that the camera is in the defocused state;
acquiring the first image through the camera;
sending a focus adjustment instruction to the focusing motor, wherein the focus adjustment instruction instructs the focusing motor to adjust the distance between the focusing lens and the photosensitive element so that the camera is in the focused state;
acquiring the second image through the camera.
3. The method according to claim 2, characterized in that before receiving the photographing instruction, the method further comprises:
receiving a first instruction for turning on the background-blurring function, and turning on the background-blurring function based on the first instruction.
4. The method according to claim 3, characterized in that after receiving the first instruction for turning on the background-blurring function and turning on the background-blurring function based on the first instruction, the method further comprises:
receiving a second instruction for adjusting the background-blurring level, and determining the background-blurring level based on the second instruction;
determining the distance between the focusing lens and the photosensitive element according to the background-blurring level.
5. The method according to claim 1, characterized in that replacing the foreground area of the first image with the foreground area of the second image to obtain the background-blurred image comprises:
performing registration processing on the first image and the second image;
determining the foreground area of the second image based on image depth information;
determining the position of the foreground area of the second image in the second image, and determining the foreground area in the first image based on that position;
replacing the foreground area of the first image with the foreground area of the second image to obtain the background-blurred image.
6. The method according to claim 5, characterized in that the acquisition interval between the first image and the second image is less than a preset duration.
7. The method according to claim 6, characterized in that performing registration processing on the first image and the second image comprises:
obtaining the center point of the first image and the center point of the second image;
performing registration processing on the center point of the first image and the center point of the second image.
8. A background-blurring device, characterized in that the device comprises:
a receiving unit, configured to receive a photographing instruction;
an acquisition unit, configured to, in response to the photographing instruction, acquire a first image through a camera while the camera is in a defocused state, and acquire a second image through the camera while the camera is in a focused state;
a blurring unit, configured to replace the foreground area of the first image with the foreground area of the second image to obtain a background-blurred image.
9. A computer storage medium, characterized in that the computer storage medium stores a plurality of instructions, the instructions being adapted to be loaded by a processor to execute the method steps of any one of claims 1 to 7.
10. A terminal, characterized by comprising a processor and a memory; wherein the memory stores a computer program, and the computer program is adapted to be loaded by the processor to execute the method steps of any one of claims 1 to 7.
CN201910790728.1A 2019-08-26 2019-08-26 Background-blurring method, device, storage medium and terminal Pending CN110505406A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910790728.1A CN110505406A (en) 2019-08-26 2019-08-26 Background-blurring method, device, storage medium and terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910790728.1A CN110505406A (en) 2019-08-26 2019-08-26 Background-blurring method, device, storage medium and terminal

Publications (1)

Publication Number Publication Date
CN110505406A true CN110505406A (en) 2019-11-26

Family

ID=68589503

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910790728.1A Pending CN110505406A (en) 2019-08-26 2019-08-26 Background-blurring method, device, storage medium and terminal

Country Status (1)

Country Link
CN (1) CN110505406A (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090033741A1 (en) * 2007-07-30 2009-02-05 Eun-Soo Kim 2d-3d convertible display device and method having a background of full-parallax integral images
CN107707809A (en) * 2017-08-17 2018-02-16 捷开通讯(深圳)有限公司 A kind of method, mobile device and the storage device of image virtualization
CN109151329A (en) * 2018-11-22 2019-01-04 Oppo广东移动通信有限公司 Photographic method, device, terminal and computer readable storage medium

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111246092A (en) * 2020-01-16 2020-06-05 Oppo广东移动通信有限公司 Image processing method, image processing device, storage medium and electronic equipment
CN111246092B (en) * 2020-01-16 2021-07-20 Oppo广东移动通信有限公司 Image processing method, image processing device, storage medium and electronic equipment
WO2021142711A1 (en) * 2020-01-16 2021-07-22 Oppo广东移动通信有限公司 Image processing method and apparatus, storage medium, and electronic device
CN114762313A (en) * 2020-01-16 2022-07-15 Oppo广东移动通信有限公司 Image processing method, image processing device, storage medium and electronic equipment
CN114762313B (en) * 2020-01-16 2024-03-01 Oppo广东移动通信有限公司 Image processing method, device, storage medium and electronic equipment
CN113965663A (en) * 2020-07-21 2022-01-21 深圳Tcl新技术有限公司 Image quality optimization method, intelligent terminal and storage medium
CN112954212A (en) * 2021-02-08 2021-06-11 维沃移动通信有限公司 Video generation method, device and equipment
CN114040099A (en) * 2021-10-29 2022-02-11 维沃移动通信有限公司 Image processing method and device and electronic equipment
CN114040099B (en) * 2021-10-29 2024-03-08 维沃移动通信有限公司 Image processing method and device and electronic equipment
CN115209051A (en) * 2022-07-08 2022-10-18 杭州海康威视数字技术股份有限公司 Focusing method and device of zoom camera
CN115209051B (en) * 2022-07-08 2024-02-13 杭州海康威视数字技术股份有限公司 Focusing method and device of zoom camera

Similar Documents

Publication Publication Date Title
CN110505406A (en) Background-blurring method, device, storage medium and terminal
CN112243583B (en) Multi-endpoint mixed reality conference
CN109754454B (en) Object model rendering method and device, storage medium and equipment
WO2021258994A1 (en) Method and apparatus for displaying virtual scene, and device and storage medium
US20170372449A1 (en) Smart capturing of whiteboard contents for remote conferencing
WO2017065935A1 (en) Methods and apparatuses for controlling video content displayed to a viewer
US11538211B2 (en) Puppeteering remote avatar by facial expressions
CN108292489A (en) Information processing unit and image generating method
CN107771391B (en) Method and apparatus for determining exposure time of image frame
CN112199016B (en) Image processing method, image processing device, electronic equipment and computer readable storage medium
CN106792147A (en) A kind of image replacement method and device
KR20170048137A (en) Method for transmitting media contents, apparatus for transmitting media contents, method for receiving media contents, apparatus for receiving media contents
CN112328091B (en) Barrage display method and device, terminal and storage medium
CN112053449A (en) Augmented reality-based display method, device and storage medium
CN111432245B (en) Multimedia information playing control method, device, equipment and storage medium
CN112053370A (en) Augmented reality-based display method, device and storage medium
CN111045511A (en) Gesture-based control method and terminal equipment
CN108986117B (en) Video image segmentation method and device
CN112884908A (en) Augmented reality-based display method, device, storage medium, and program product
CN110881104A (en) Photographing method, photographing device, storage medium and terminal
CN114531553B (en) Method, device, electronic equipment and storage medium for generating special effect video
CN114630057B (en) Method and device for determining special effect video, electronic equipment and storage medium
CN111866372A (en) Self-photographing method, device, storage medium and terminal
CN112906553B (en) Image processing method, apparatus, device and medium
CN107864333A (en) Image processing method, device, terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
Application publication date: 20191126