CN105163042B - A kind of apparatus and method for blurring processing depth image - Google Patents
- Publication number: CN105163042B (application number CN201510481927.6A)
- Authority
- CN
- China
- Legal status: Active
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
Abstract
The invention discloses an apparatus and method for blurring (bokeh) processing of a depth image, including: an acquisition module for obtaining the F-number, focal length, focus distance and permissible circle-of-confusion diameter; a calculation module for calculating the depth-of-field range from the acquired F-number, focal length, focus distance and permissible circle-of-confusion diameter; a determination module for obtaining the object distance of the focus point and determining the blur degree corresponding to each object distance from the acquired focus object distance, F-number and focal length; a correction module for correcting the Gaussian-blur radius according to the determined blur degree of each object distance; and a blur-processing module for applying Gaussian blur, with the corrected radius, to the pixels of the depth image outside the depth-of-field range that need blurring, and capturing the image. The technical solution of the present invention improves the shooting effect without making any change to the physical hardware structure of an existing mobile terminal.
Description
Technical field
The present invention relates to image-processing technology, and in particular to an apparatus and method for blurring processing of a depth image.
Background technology
Existing general digital devices (such as mobile terminals or ordinary cameras) cannot match a single-lens reflex (SLR) camera in their photosensitive elements, and therefore cannot capture the sharply layered background-blur effect, because the physical hardware structure of a mobile terminal can hardly approach that of an SLR camera. In addition, existing background-blur algorithms do not make full use of the SLR imaging model, so the degree of background blur looks unrealistic and lacks the physically accurate bokeh of an SLR.
Therefore, how to achieve pictures with an SLR-like shooting effect without changing the physical hardware structure of existing mobile terminals is a problem that urgently needs to be solved.
Summary of the invention
In order to solve the above technical problem, the invention provides an apparatus and method for blurring processing of a depth image, which can improve the shooting effect without making any change to the physical hardware structure of an existing mobile terminal.
To achieve the object of the invention, the invention provides an apparatus for blurring processing of a depth image, including: an acquisition module, a calculation module, a determination module, a correction module and a blur-processing module; wherein,
the acquisition module is configured to obtain the F-number, focal length, focus distance and permissible circle-of-confusion diameter;
the calculation module is configured to calculate the depth-of-field range from the acquired F-number, focal length, focus distance and permissible circle-of-confusion diameter;
the determination module is configured to obtain the object distance of the focus point and determine the blur degree corresponding to each object distance from the acquired focus object distance, F-number and focal length;
the correction module is configured to correct the Gaussian-blur radius according to the determined blur degree of each object distance;
the blur-processing module is configured to apply Gaussian blur, with the corrected radius, to the pixels of the depth image outside the depth-of-field range that need blurring, and to capture the image.
Further, the correction module corrects the Gaussian-blur radius according to the following equation:
Ri = Rmin + ((Bi − Bmin) / (Bmax − Bmin)) × (Rmax − Rmin)
where Ri is the Gaussian-blur radius for pixel i, Bi is the blur degree of pixel i, Bmax is the maximum of Bi, Bmin is the minimum of Bi, Rmax is the maximum Gaussian-blur radius, and Rmin is the minimum Gaussian-blur radius.
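The min-max mapping of blur degree Bi into the radius range [Rmin, Rmax] can be sketched as follows. The linear form is an assumption inferred from the variable definitions, since the patent's formula image is not reproduced in this text:

```python
def blur_radius(b_i, b_min, b_max, r_min, r_max):
    """Map a pixel's blur degree b_i in [b_min, b_max] to a
    Gaussian-blur radius in [r_min, r_max] (assumed linear mapping)."""
    if b_max == b_min:               # constant blur degree: use the smallest radius
        return r_min
    t = (b_i - b_min) / (b_max - b_min)   # normalize to [0, 1]
    return r_min + t * (r_max - r_min)

# Example: blur degrees spanning 0..8 mapped to radii 1..5 pixels
print(blur_radius(4.0, 0.0, 8.0, 1.0, 5.0))  # midpoint -> 3.0
```

With this mapping, in-focus pixels (smallest Bi) receive the smallest radius and the most defocused pixels receive the largest, which matches the correction module's stated purpose.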
Further, the determination module determines the blur degree corresponding to each object distance by the following equation:
Bi = (A·α)² · |ui − u0| / (F · ui · (u0 − A·α))
where Bi is the blur degree of pixel i, u0 is the object distance of the focus point, ui is the object distance of pixel i, F is the F-number, A is the film diagonal length, and α is a conversion coefficient; i is a positive integer, and the product of A and α equals the focal length f.
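The blur degree built from the variables above matches the standard thin-lens circle-of-confusion model. Since the patent's formula image is not reproduced in this text, the sketch below assumes the usual form c = f²·|u − u0| / (F·u·(u0 − f)), with the focal length f expressed as the product A·α:

```python
def blur_degree(u_i, u_0, f_number, a, alpha):
    """Blur degree of pixel i at object distance u_i when focused at u_0.
    a * alpha equals the focal length f (all distances in the same unit).
    Assumes the standard thin-lens circle-of-confusion formula."""
    f = a * alpha                                        # focal length
    return (f * f * abs(u_i - u_0)) / (f_number * u_i * (u_0 - f))

# Example: full-frame diagonal a = 43.27 mm, alpha chosen so f = 50 mm,
# aperture f/2.8, focused at 2 m; a point at 4 m (distances in mm)
b = blur_degree(4000.0, 2000.0, 2.8, 43.27, 50.0 / 43.27)
print(round(b, 4))  # ~0.2289 (blur-circle diameter in mm)
```

Note that Bi is zero at the focus distance (ui = u0) and grows for points in front of or behind it, which is the behavior the determination module relies on.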
Further, the calculation module calculates the depth-of-field range by the following equation:
ΔL = 2·f²·F·σ·L² / (f⁴ − F²·σ²·L²)
where ΔL is the depth-of-field range, f is the focal length, F is the F-number, σ is the permissible circle-of-confusion diameter, and L is the focus distance.
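The depth-of-field calculation can be sketched as below. The closed form ΔL = 2f²FσL² / (f⁴ − F²σ²L²) is the standard total depth-of-field formula implied by the variable definitions; the original equation image is not reproduced in this text, so treat the exact form as an assumption:

```python
def depth_of_field(f, f_number, sigma, focus_l):
    """Total depth-of-field range (same unit as the inputs).
    f: focal length, f_number: aperture value, sigma: permissible
    circle-of-confusion diameter, focus_l: focus distance."""
    denom = f**4 - (f_number * sigma * focus_l) ** 2
    if denom <= 0:                 # focused at or beyond the hyperfocal distance
        return float("inf")        # everything to infinity is acceptably sharp
    return 2 * f**2 * f_number * sigma * focus_l**2 / denom

# Example: 50 mm lens at f/2.8, CoC 0.03 mm, focused at 3 m (3000 mm)
print(round(depth_of_field(50.0, 2.8, 0.03, 3000.0), 1))  # -> 611.0 (mm)
```

The formula also reflects the relations shown in Figs. 7 and 8: a larger F-number or a longer focus distance both enlarge ΔL.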
The invention also discloses a method for blurring processing of a depth image, including:
obtaining the F-number, focal length, focus distance and permissible circle-of-confusion diameter;
calculating the depth-of-field range from the acquired F-number, focal length, focus distance and permissible circle-of-confusion diameter;
obtaining the object distance of the focus point and determining the blur degree corresponding to each object distance from the acquired focus object distance, F-number and focal length;
correcting the Gaussian-blur radius according to the determined blur degree of each object distance;
applying Gaussian blur, with the corrected radius, to the pixels of the depth image outside the depth-of-field range that need blurring, and capturing the image.
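Put together, the method steps above can be sketched end-to-end on a toy grayscale image and depth map. All of the numbers, the assumed closed forms of the formulas, and the simple per-pixel Gaussian filter below are illustrative assumptions, not the patent's implementation:

```python
import math

def gaussian_kernel(radius):
    """1-D Gaussian weights for an integer radius (sigma = radius/2)."""
    sigma = max(radius / 2.0, 1e-6)
    w = [math.exp(-(x * x) / (2 * sigma * sigma)) for x in range(-radius, radius + 1)]
    s = sum(w)
    return [v / s for v in w]

def blur_pixel(img, y, x, radius):
    """Blur one pixel with a 2-D Gaussian window (borders clamped)."""
    if radius < 1:
        return img[y][x]                     # in-focus pixel: left untouched
    k = gaussian_kernel(radius)
    h, w = len(img), len(img[0])
    acc = 0.0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            yy = min(max(y + dy, 0), h - 1)
            xx = min(max(x + dx, 0), w - 1)
            acc += img[yy][xx] * k[dy + radius] * k[dx + radius]
    return acc

def render(img, depth, f, f_number, u0, r_min=0, r_max=3):
    """Blur pixels according to how far their depth lies from the focus u0."""
    # blur degree per pixel (thin-lens circle of confusion, assumed form)
    b = [[f * f * abs(d - u0) / (f_number * d * (u0 - f)) for d in row]
         for row in depth]
    flat = [v for row in b for v in row]
    b_min, b_max = min(flat), max(flat)
    out = []
    for y, row in enumerate(img):
        out_row = []
        for x, _ in enumerate(row):
            # map blur degree to an integer Gaussian radius (assumed linear)
            t = 0.0 if b_max == b_min else (b[y][x] - b_min) / (b_max - b_min)
            radius = round(r_min + t * (r_max - r_min))
            out_row.append(blur_pixel(img, y, x, radius))
        out.append(out_row)
    return out

# Toy 4x4 image: bright stripe in focus at 2 m, dark background at 8 m (mm)
img = [[255 if x == 1 else 0 for x in range(4)] for y in range(4)]
depth = [[2000 if x == 1 else 8000 for x in range(4)] for y in range(4)]
out = render(img, depth, f=50.0, f_number=2.8, u0=2000.0)
print(out[0][1])  # -> 255 (focused stripe is unchanged; background gets blurred)
```

In-focus pixels get radius r_min = 0 and pass through unchanged, while out-of-focus pixels are softened, which is the qualitative behavior the method describes.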
Further, the Gaussian-blur radius is corrected according to the following equation:
Ri = Rmin + ((Bi − Bmin) / (Bmax − Bmin)) × (Rmax − Rmin)
where Ri is the Gaussian-blur radius for pixel i, Bi is the blur degree of pixel i, Bmax is the maximum of Bi, Bmin is the minimum of Bi, Rmax is the maximum Gaussian-blur radius, and Rmin is the minimum Gaussian-blur radius.
Further, the blur degree corresponding to each object distance is determined by the following equation:
Bi = (A·α)² · |ui − u0| / (F · ui · (u0 − A·α))
where Bi is the blur degree of pixel i, u0 is the object distance of the focus point, ui is the object distance of pixel i, F is the F-number, A is the film diagonal length, and α is a conversion coefficient; i is a positive integer, and the product of A and α equals the focal length f.
Further, the depth-of-field range is calculated by the following equation:
ΔL = 2·f²·F·σ·L² / (f⁴ − F²·σ²·L²)
where ΔL is the depth-of-field range, f is the focal length, F is the F-number, σ is the permissible circle-of-confusion diameter, and L is the focus distance.
The technical solution of the present invention includes: an acquisition module, a calculation module, a determination module, a correction module and a blur-processing module; wherein the acquisition module obtains the F-number, focal length, focus distance and permissible circle-of-confusion diameter; the calculation module calculates the depth-of-field range from the acquired F-number, focal length, focus distance and permissible circle-of-confusion diameter; the determination module obtains the object distance of the focus point and determines the blur degree corresponding to each object distance from the acquired focus object distance, F-number and focal length; the correction module corrects the Gaussian-blur radius according to the determined blur degree of each object distance; and the blur-processing module applies Gaussian blur, with the corrected radius, to the pixels of the depth image outside the depth-of-field range that need blurring, and captures the image. The technical solution of the present invention improves the shooting effect without making any change to the physical hardware structure of an existing mobile terminal.
Brief description of the drawings
The accompanying drawings described here provide a further understanding of the present invention and constitute a part of this application; the schematic embodiments of the present invention and their description serve to explain the present invention and do not unduly limit it. In the drawings:
Fig. 1 is a schematic diagram of the hardware structure of a mobile terminal implementing embodiments of the present invention;
Fig. 2 is a schematic diagram of a communication system supporting communication between mobile terminals according to the present invention;
Fig. 3 is a schematic diagram of the structure of the apparatus for blurring processing of a depth image according to the present invention;
Fig. 4 is a flowchart of the method for blurring processing of a depth image according to the present invention;
Fig. 5 is a schematic diagram of convex-lens imaging according to the present invention;
Fig. 6 is an example diagram of the depth-of-field range according to the present invention;
Fig. 7 is an example diagram of the relation between the depth-of-field range and the F-number according to the present invention;
Fig. 8 is an example diagram of the relation between the depth-of-field range and the focus distance according to the present invention;
Fig. 9 is a schematic diagram of the correspondence between blur degree and Gaussian-blur radius according to the present invention;
Fig. 10 is a first schematic diagram of the correspondence between object distance and blur degree according to the present invention;
Fig. 11 is a second schematic diagram of the correspondence between object distance and blur degree according to the present invention.
Embodiments
The technical solution of the present invention is described in detail below with reference to the drawings and embodiments.
A mobile terminal implementing the embodiments of the present invention is now described with reference to the drawings. In the following description, suffixes such as "module", "part" or "unit" used for elements only serve the explanation of the present invention and have no specific meaning in themselves; "module" and "part" may therefore be used interchangeably.
Mobile terminals may be implemented in various forms. For example, the terminal described in the present invention may include mobile terminals such as mobile phones, smart phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable media players) and navigation devices, as well as fixed terminals such as digital TVs and desktop computers. In the following it is assumed that the terminal is a mobile terminal. However, those skilled in the art will understand that, apart from elements used specifically for mobile purposes, the construction according to the embodiments of the present invention can also be applied to terminals of the fixed type.
Fig. 1 is the hardware architecture diagram for the mobile terminal for realizing each embodiment of the invention.
The mobile terminal 100 may include a wireless communication unit 110, an A/V (audio/video) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190, and so on. Fig. 1 shows a mobile terminal with various components, but it should be understood that not all of the illustrated components are required; more or fewer components may alternatively be implemented. The elements of the mobile terminal are described in detail below.
The wireless communication unit 110 typically includes one or more components allowing radio communication between the mobile terminal 100 and a wireless communication system or network. For example, the wireless communication unit may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114 and a location information module 115.
The broadcast receiving module 111 receives broadcast signals and/or broadcast-related information from an external broadcast management server via a broadcast channel. The broadcast channel may include a satellite channel and/or a terrestrial channel. The broadcast management server may be a server that generates and transmits broadcast signals and/or broadcast-related information, or a server that receives previously generated broadcast signals and/or broadcast-related information and transmits them to a terminal. The broadcast signals may include TV broadcast signals, radio broadcast signals, data broadcast signals and the like, and may further include broadcast signals combined with TV or radio broadcast signals. The broadcast-related information may also be provided via a mobile communication network, in which case it may be received by the mobile communication module 112. Broadcast-related information may exist in various forms, for example in the form of an electronic program guide (EPG) of digital multimedia broadcasting (DMB) or an electronic service guide (ESG) of digital video broadcasting-handheld (DVB-H). The broadcast receiving module 111 may receive signals broadcast by using various types of broadcast systems. In particular, the broadcast receiving module 111 may receive digital broadcasts using digital broadcasting systems such as digital multimedia broadcasting-terrestrial (DMB-T), digital multimedia broadcasting-satellite (DMB-S), digital video broadcasting-handheld (DVB-H), the MediaFLO™ forward-link-only data broadcast system, and integrated services digital broadcasting-terrestrial (ISDB-T). The broadcast receiving module 111 may be constructed to suit various broadcast systems providing broadcast signals in addition to the above digital broadcasting systems. Broadcast signals and/or broadcast-related information received via the broadcast receiving module 111 may be stored in the memory 160 (or another type of storage medium).
The mobile communication module 112 transmits radio signals to and/or receives radio signals from at least one of a base station (for example, an access point, a Node B, etc.), an external terminal and a server. Such radio signals may include voice call signals, video call signals, or various types of data transmitted and/or received according to text and/or multimedia messages.
The wireless Internet module 113 supports wireless Internet access of the mobile terminal and may be internally or externally coupled to the terminal. The wireless Internet access technologies involved may include WLAN (wireless LAN, Wi-Fi), WiBro (wireless broadband), WiMAX (worldwide interoperability for microwave access), HSDPA (high-speed downlink packet access) and the like.
The short-range communication module 114 is a module supporting short-range communication. Examples of short-range communication technology include Bluetooth™, radio frequency identification (RFID), Infrared Data Association (IrDA), ultra-wideband (UWB), ZigBee™ and the like.
The location information module 115 is a module for checking or obtaining position information of the mobile terminal. A typical example of the location information module is a GPS (global positioning system) module. According to current technology, the GPS module 115 calculates distance information from three or more satellites together with accurate time information, and applies triangulation to the calculated information, so that three-dimensional current position information can be accurately calculated by longitude, latitude and altitude. Currently, the method for calculating position and time information uses three satellites and corrects the errors of the calculated position and time information with another satellite. In addition, the GPS module 115 can calculate speed information by continuously computing current position information in real time.
The A/V input unit 120 is used to receive audio or video signals and may include a camera 121 and a microphone 122. The camera 121 processes image data of still pictures or video obtained by an image capture device in a video capture mode or an image capture mode. The processed image frames may be displayed on a display unit 151. The image frames processed by the camera 121 may be stored in the memory 160 (or another storage medium) or transmitted via the wireless communication unit 110; two or more cameras 121 may be provided according to the construction of the mobile terminal. The microphone 122 can receive sounds (audio data) in operating modes such as a phone call mode, a recording mode and a voice recognition mode, and can process such sounds into audio data. In the phone call mode, the processed audio (voice) data may be converted into a format transmittable to a mobile communication base station via the mobile communication module 112 for output. The microphone 122 may implement various types of noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated while receiving and transmitting audio signals.
The user input unit 130 may generate key input data according to commands input by the user to control various operations of the mobile terminal. The user input unit 130 allows the user to input various types of information and may include a keyboard, a dome switch, a touch pad (for example, a touch-sensitive component detecting changes of resistance, pressure, capacitance and the like caused by being touched), a jog wheel, a jog switch, and so on. In particular, when the touch pad is superimposed on the display unit 151 in the form of a layer, a touch screen may be formed.
The sensing unit 140 detects the current state of the mobile terminal 100 (for example, the open or closed state of the mobile terminal 100), the position of the mobile terminal 100, the presence or absence of the user's contact with the mobile terminal 100 (that is, touch input), the orientation of the mobile terminal 100, the acceleration or deceleration movement and direction of the mobile terminal 100, and so on, and generates commands or signals for controlling the operation of the mobile terminal 100. For example, when the mobile terminal 100 is implemented as a slide-type mobile phone, the sensing unit 140 can sense whether the slide-type phone is open or closed. In addition, the sensing unit 140 can detect whether the power supply unit 190 supplies power and whether the interface unit 170 is coupled with an external device. The sensing unit 140 may include a proximity sensor 141; this is described below in connection with the touch screen.
The interface unit 170 serves as an interface through which at least one external device can connect with the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and so on. The identification module may store various information for authenticating the user of the mobile terminal 100 and may include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM) and the like. In addition, the device having the identification module (hereinafter referred to as the "identifying device") may take the form of a smart card; accordingly, the identifying device may be connected with the mobile terminal 100 via a port or other connection means. The interface unit 170 may be used to receive input (for example, data, information, power, etc.) from an external device and transfer the received input to one or more elements within the mobile terminal 100, or may be used to transfer data between the mobile terminal and the external device.
In addition, when the mobile terminal 100 is connected with an external cradle, the interface unit 170 may serve as a path through which power is supplied from the cradle to the mobile terminal 100, or as a path through which various command signals input from the cradle are transferred to the mobile terminal. Various command signals or power input from the cradle may serve as signals for recognizing whether the mobile terminal is accurately mounted on the cradle. The output unit 150 is configured to provide output signals (for example, audio signals, video signals, alarm signals, vibration signals, etc.) in a visual, audible and/or tactile manner. The output unit 150 may include a display unit 151, an audio output module 152, an alarm unit 153, and so on.
The display unit 151 may display information processed in the mobile terminal 100. For example, when the mobile terminal 100 is in a phone call mode, the display unit 151 may display a user interface (UI) or graphical user interface (GUI) related to the call or other communication (for example, text messaging, multimedia file downloading, etc.). When the mobile terminal 100 is in a video call mode or an image capture mode, the display unit 151 may display captured and/or received images, a UI or GUI showing video or images and related functions, and so on.
Meanwhile, when the display unit 151 and the touch pad are superimposed on each other in the form of a layer to form a touch screen, the display unit 151 may serve as both an input device and an output device. The display unit 151 may include at least one of a liquid crystal display (LCD), a thin-film transistor LCD (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, a three-dimensional (3D) display, and the like. Some of these displays may be constructed to be transparent to allow viewing from outside; such displays may be called transparent displays, a typical example being a TOLED (transparent organic light-emitting diode) display. According to a particular desired embodiment, the mobile terminal 100 may include two or more display units (or other display devices); for example, the mobile terminal may include an external display unit (not shown) and an internal display unit (not shown). The touch screen may be used to detect touch input pressure as well as touch input position and touch input area.
The audio output module 152 may, when the mobile terminal is in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode or the like, convert audio data received by the wireless communication unit 110 or stored in the memory 160 into audio signals and output them as sound. Moreover, the audio output module 152 may provide audio output related to a particular function performed by the mobile terminal 100 (for example, a call signal reception sound, a message reception sound, etc.). The audio output module 152 may include a speaker, a buzzer, and so on.
The alarm unit 153 may provide output to notify the mobile terminal 100 of the occurrence of an event. Typical events may include call reception, message reception, key signal input, touch input, and so on. In addition to audio or video output, the alarm unit 153 may provide output in different ways to notify the occurrence of an event. For example, the alarm unit 153 may provide output in the form of vibration; when a call, a message or some other incoming communication is received, the alarm unit 153 may provide tactile output (that is, vibration) to notify the user. By providing such tactile output, the user can recognize the occurrence of various events even when the user's mobile phone is in the user's pocket. The alarm unit 153 may also provide output notifying the occurrence of an event via the display unit 151 or the audio output module 152.
The memory 160 may store software programs of the processing and control operations performed by the controller 180, or may temporarily store data that has been output or is to be output (for example, a phone book, messages, still images, video, etc.). Moreover, the memory 160 may store data on vibrations and audio signals of various modes output when a touch is applied to the touch screen.
The memory 160 may include at least one type of storage medium, including a flash memory, a hard disk, a multimedia card, a card-type memory (for example, SD or DX memory), a random access memory (RAM), a static random access memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, an optical disk, and so on. Moreover, the mobile terminal 100 may cooperate over a network connection with a network storage device that performs the storage function of the memory 160.
The controller 180 generally controls the overall operation of the mobile terminal. For example, the controller 180 performs control and processing related to voice calls, data communication, video calls, and so on. In addition, the controller 180 may include a multimedia module 181 for reproducing (or playing back) multimedia data; the multimedia module 181 may be constructed within the controller 180 or may be constructed separately from the controller 180. The controller 180 may perform pattern recognition processing to recognize handwriting input or picture drawing input performed on the touch screen as characters or images.
The power supply unit 190 receives external power or internal power under the control of the controller 180 and provides the appropriate power required for operating the various elements and components.
The various embodiments described here may be implemented in a computer-readable medium using, for example, computer software, hardware, or any combination thereof. For hardware implementation, the embodiments described here may be implemented using at least one of application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), processors, controllers, microcontrollers, microprocessors, and electronic units designed to perform the functions described here; in some cases, such embodiments may be implemented in the controller 180. For software implementation, embodiments such as processes or functions may be implemented with separate software modules that allow at least one function or operation to be performed. Software code may be implemented by a software application (or program) written in any appropriate programming language; the software code may be stored in the memory 160 and executed by the controller 180.
So far, the mobile terminal has been described in terms of its functions. Below, for the sake of brevity, a slide-type mobile terminal among various types of mobile terminals such as folder-type, bar-type, swing-type and slide-type mobile terminals will be described as an example. Accordingly, the present invention can be applied to any type of mobile terminal and is not limited to slide-type mobile terminals.
The mobile terminal 100 shown in Fig. 1 may be constructed to operate with wired and wireless communication systems that transmit data via frames or packets, as well as with satellite-based communication systems.
A communication system in which a mobile terminal according to the present invention can operate is now described with reference to Fig. 2.
Such communication systems may use different air interfaces and/or physical layers. For example, air interfaces used by communication systems include frequency division multiple access (FDMA), time division multiple access (TDMA), code division multiple access (CDMA), the universal mobile telecommunications system (UMTS) (in particular, long-term evolution (LTE)), the global system for mobile communications (GSM), and so on. As a non-limiting example, the following description relates to a CDMA communication system, but such teaching applies equally to other types of systems.
Referring to Fig. 2, a CDMA wireless communication system may include a plurality of mobile terminals 100, a plurality of base stations (BS) 270, base station controllers (BSC) 275, and a mobile switching center (MSC) 280. The MSC 280 is configured to form an interface with a public switched telephone network (PSTN) 290. The MSC 280 is also configured to form an interface with the BSCs 275, which can be coupled to the base stations 270 via backhaul links. The backhaul links may be constructed according to any of several known interfaces including, for example, E1/T1, ATM, IP, PPP, frame relay, HDSL, ADSL or xDSL. It will be understood that the system shown in Fig. 2 may include a plurality of BSCs 275.
Each BS 270 may serve one or more sectors (or regions), each sector covered by an omnidirectional antenna or an antenna pointed in a particular direction radially away from the BS 270. Alternatively, each sector may be covered by two or more antennas for diversity reception. Each BS 270 may be constructed to support a plurality of frequency assignments, with each frequency assignment having a particular spectrum (for example, 1.25 MHz, 5 MHz, etc.).
The intersection of a sector and a frequency assignment may be referred to as a CDMA channel. The BS 270 may also be referred to as a base transceiver subsystem (BTS) or another equivalent term. In such a case, the term "base station" may be used to broadly denote a single BSC 275 and at least one BS 270. A base station may also be referred to as a "cell site". Alternatively, individual sectors of a particular BS 270 may be referred to as cell sites.
As shown in Fig. 2, a broadcast transmitter (BT) 295 transmits broadcast signals to the mobile terminals 100 operating within the system. The broadcast receiving module 111 shown in Fig. 1 is provided at the mobile terminal 100 to receive the broadcast signals transmitted by the BT 295. Fig. 2 also shows several global positioning system (GPS) satellites 300. The satellites 300 help locate at least one of the plurality of mobile terminals 100.
In Fig. 2, a plurality of satellites 300 are depicted, but it will be understood that useful positioning information may be obtained with any number of satellites. The GPS module 115 shown in Fig. 1 is typically configured to cooperate with the satellites 300 to obtain the desired positioning information. Instead of, or in addition to, GPS tracking techniques, other techniques that can track the position of the mobile terminal may be used. In addition, at least one GPS satellite 300 may selectively or additionally handle satellite DMB transmission.
As one typical operation of the wireless communication system, the BSs 270 receive reverse-link signals from various mobile terminals 100. The mobile terminals 100 typically engage in calls, messaging and other types of communication. Each reverse-link signal received by a particular base station 270 is processed within that BS 270, and the resulting data is forwarded to the associated BSC 275. The BSC provides call resource allocation and mobility management functions, including coordination of soft handoff procedures between the BSs 270. The BSC 275 also routes the received data to the MSC 280, which provides additional routing services for forming an interface with the PSTN 290. Similarly, the PSTN 290 forms an interface with the MSC 280, the MSC forms an interface with the BSCs 275, and the BSCs 275 correspondingly control the BSs 270 to transmit forward-link signals to the mobile terminals 100.
Based on the above mobile terminal hardware structure and communication system, various embodiments of the method of the present invention are proposed.
Fig. 3 shows a device for blurring a depth image according to the present invention. As shown in Fig. 3, the device includes: an acquisition module, a computing module, a determining module, a correcting module and a Fuzzy Processing module. Wherein,
the acquisition module is configured to obtain the F-number, the focal length, the focal distance and the allowable disperse circle diameter;
the computing module is configured to calculate the field depth range according to the obtained F-number, focal length, focal distance and allowable disperse circle diameter.
The computing module calculates the field depth range by the following formula:
ΔL = 2f²FσL² / (f⁴ − F²σ²L²)
where ΔL is the field depth range, f is the focal length, F is the F-number, σ is the allowable disperse circle diameter, and L is the focal distance.
The determining module is configured to obtain the object distance of the focus point and, according to the obtained object distance of the focus point, the F-number and the focal length, determine the virtualization degrees corresponding to different object distances.
The determining module determines the virtualization degrees corresponding to different object distances by the following formula:
B_i = (α/F)·|[u_0/(u_0 − αA)] / [u_i/(u_i − αA)] − 1|
where B_i is the virtualization degree of pixel i, u_0 is the object distance of the focus point, u_i is the object distance of pixel i, F is the F-number, A is the film diagonal length, and α is a conversion coefficient; i is a positive integer, and the product of α and A equals the focal length f.
The correcting module is configured to correct the radius of the Gaussian blur according to the determined virtualization degrees corresponding to different object distances.
The correcting module corrects the radius of the Gaussian blur by the following formula:
R_i = [(Bmax − Bmin)/(Rmax − Rmin)]·log(B_i + 1)
where R_i is the corrected Gaussian blur radius of pixel i, B_i is the virtualization degree of pixel i, Bmax is the maximum of B_i, Bmin is the minimum of B_i, Rmax is the maximum Gaussian blur radius, and Rmin is the minimum Gaussian blur radius.
The Fuzzy Processing module is configured to blur, with the corrected Gaussian blur radius, the pixels of the depth image that lie outside the field depth range and need blurring, and then shoot.
Specifically, the Fuzzy Processing module blurs those pixels as follows: obtain the object distance of each pixel; judge whether the object distance of each obtained pixel is outside the field depth range; and apply Gaussian blur with the corrected radius to the pixels whose object distances are judged to be outside the field depth range.
Further, the above device may be arranged in a terminal, or in a general digital product.
It should be noted that, when the device is arranged in a terminal, the acquisition module of the device may be arranged in the A/V input unit 130 of Fig. 1, and the computing module, determining module, correcting module and Fuzzy Processing module may be arranged in the controller 180 of Fig. 1.
Fig. 4 shows a method for blurring a depth image according to the present invention. As shown in Fig. 4, the method includes:
Step 401: obtain the F-number, the focal length, the focal distance and the allowable disperse circle diameter.
Step 402: calculate the field depth range according to the obtained F-number, focal length, focal distance and allowable disperse circle diameter.
The field depth range is calculated by formula (1):
ΔL = 2f²FσL² / (f⁴ − F²σ²L²)   (1)
where ΔL is the field depth range, f is the focal length, F is the F-number, σ is the allowable disperse circle diameter, and L is the focal distance.
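As an illustration of formula (1), the field depth range can be computed directly. The sketch below is not part of the patent; the sample values (a 4.2 mm lens at f/2.2 with σ = 0.01 mm, focused at 0.5 m, all lengths in millimetres) are assumptions chosen only to exercise the formula.

```python
def field_depth(f, F, sigma, L):
    """Field depth range of formula (1): dL = 2*f^2*F*sigma*L^2 / (f^4 - F^2*sigma^2*L^2).

    All lengths must share one unit (here: millimetres). The formula is
    valid only while f^2 > F*sigma*L, i.e. L is closer than the hyperfocal distance.
    """
    denom = f**4 - F**2 * sigma**2 * L**2
    if denom <= 0:
        return float("inf")  # focused at or beyond the hyperfocal distance
    return 2 * f**2 * F * sigma * L**2 / denom

# Assumed sample values: f = 4.2 mm, F = 2.2, sigma = 0.01 mm, L = 500 mm.
dL = field_depth(4.2, 2.2, 0.01, 500.0)
print(f"field depth range = {dL:.1f} mm")
```

Note that for a focal distance at or beyond the hyperfocal distance the denominator is non-positive and the field depth range becomes unbounded, which the sketch reports as infinity.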
It should be noted that, in the convex lens model shown in Fig. 5, a focus point is selected: the distance from the focus point to the lens is the object distance u_0, the distance from its image point to the lens is the image distance v_0, and the lens focal length is f. For another object point, let the distance to the lens be u and the image distance be v. The relation among image distance, object distance and focal length is given by formula (2):
1/u + 1/v = 1/f   (2)
from which formulas (3) and (4) follow:
v_0 = fu_0/(u_0 − f)   (3)
v = fu/(u − f)   (4)
As can be seen from Fig. 5, an image point off the focal plane forms a spot on the imaging plane, called the blur circle; the larger its radius, the blurrier the image. The resolution of the human eye is limited, however: as long as the diameter of the blur circle stays within a certain range, the human eye cannot perceive the blur, which yields a range of distances that image sharply, called the depth of field. The maximum diameter of such a blur circle is the allowable disperse circle diameter. As shown in Fig. 6, let the shooting distance (also called the focal distance) be L, the front depth of field ΔL1 and the rear depth of field ΔL2; the field depth range is then ΔL. With the F-number F and the allowable disperse circle diameter σ, the field depth range ΔL and the hyperfocal distance L_F can be calculated with the depth-of-field formulas. The front and rear depths of field are given by formulas (5) and (6):
ΔL1 = FσL² / (f² + FσL)   (5)
ΔL2 = FσL² / (f² − FσL)   (6)
and the field depth range is formula (7):
ΔL = ΔL1 + ΔL2 = 2f²FσL² / (f⁴ − F²σ²L²)   (7)
Given the field depth range from formulas (5) to (7), the virtualization degree of every pixel within the field depth range can be determined to be 0, and the corresponding Gaussian blur radius is 0.
Step 403: obtain the object distance of the focus point, and determine the virtualization degrees corresponding to different object distances according to the obtained object distance of the focus point, the F-number and the focal length.
The virtualization degree corresponding to each object distance is determined by formula (8):
B_i = (α/F)·|[u_0/(u_0 − αA)] / [u_i/(u_i − αA)] − 1|   (8)
where B_i is the virtualization degree of pixel i, u_0 is the object distance of the focus point, u_i is the object distance of pixel i, F is the F-number, A is the film diagonal length, and α is a conversion coefficient; i is a positive integer, and the product of α and A equals the focal length f.
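Formula (8) can be transcribed directly. The sketch below is illustrative, not the patent's implementation; the parameter values (α = 0.1 and A = 42 mm so that f = αA = 4.2 mm, F = 2.2, focus point at u_0 = 1000 mm) are assumptions.

```python
def virtualization_degree(u_i, u_0, F, alpha, A):
    """Virtualization degree of formula (8):
    B_i = (alpha/F) * | [u_0/(u_0 - alpha*A)] / [u_i/(u_i - alpha*A)] - 1 |
    where the product alpha*A equals the focal length f. Distances in millimetres.
    """
    f = alpha * A  # the product of alpha and A equals the focal length f
    ratio = (u_0 / (u_0 - f)) / (u_i / (u_i - f))
    return (alpha / F) * abs(ratio - 1)

# Assumed values: alpha = 0.1, A = 42 mm, F = 2.2, focus point at 1000 mm.
for u in (1000.0, 2000.0, 4000.0):
    print(u, virtualization_degree(u, 1000.0, 2.2, 0.1, 42.0))
```

At the focus point (u_i = u_0) the ratio is 1 and B_i is 0; B_i grows as the pixel's object distance moves away from the focal plane, matching the analysis in the text.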
It should be noted that, from formula (8), when the other camera parameters are kept constant, e.g. when the focal distance u_0 and the object distance u_i of a pixel are fixed, the F-number F and B_i are inversely related: the larger the F-number F, the smaller the value of B_i and the smaller the degree of blur. Formula (7) also shows that a change of F changes the field depth range ΔL; the same analysis shows that the larger F is, the larger ΔL is. As shown in Fig. 7, at the same focal distance different apertures give different field depth ranges: a large aperture gives a small field depth range, with a large degree of blur outside it. In addition, as shown in Fig. 8, adjusting the focal distance L blurs the background relative to different subjects; different focal distances give different field depth ranges, and, by an analysis of the object distance similar to that of the aperture F on B_i, the degree of blur of a pixel also differs at different object distances. Here, k1 and k2 are distance limiting parameters: beyond a certain distance range, the degree of blur is saturated at its maximum.
It should be noted that, by the definition of the aperture, F = f/D, where F is the F-number and f is the focal length. From similar triangles, those skilled in the art can derive formula (9) for the disperse circle diameter:
δ_i = D·|v_i − v_0| / v_i   (9)
where δ_i is the disperse circle diameter of pixel i, v_i is the image distance of pixel i, v_0 is the image distance of the focus point, D is the effective aperture of the lens, and i is a positive integer. In addition, since photosensitive elements of different formats allow blur circles of different sizes (i.e. diameters), an equivalent full-frame film diagonal length A can be set, with the focal length converted to its equivalent accordingly; the ratio between the focal length and the diagonal length is then fixed, so one may set f = αA. Formula (8) can finally be derived from formulas (10) and (9).
Step 404: correct the radius of the Gaussian blur according to the determined virtualization degrees corresponding to different object distances.
The Gaussian blur radius is corrected according to formula (11):
R_i = [(Bmax − Bmin)/(Rmax − Rmin)]·log(B_i + 1)   (11)
where R_i is the corrected Gaussian blur radius of pixel i, B_i is the virtualization degree of pixel i, Bmax is the maximum of B_i, Bmin is the minimum of B_i, Rmax is the maximum Gaussian blur radius, and Rmin is the minimum Gaussian blur radius.
Bmax and Bmin can be obtained by formula (12), from which Bmin = 0.
A schematic diagram of correcting the Gaussian blur radius according to the determined virtualization degrees of different object distances is shown in Fig. 9.
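The radius correction R_i = [(Bmax − Bmin)/(Rmax − Rmin)]·log(B_i + 1) maps virtualization degrees through a logarithm. The sketch below transcribes it literally as written in the claims; the sample bounds (Bmin = 0, Bmax = 1, Rmin = 1, Rmax = 10) are assumptions.

```python
import math

def corrected_blur_radius(B_i, B_max, B_min, R_max, R_min):
    """Corrected Gaussian blur radius, transcribed literally from the claims:
    R_i = [(Bmax - Bmin) / (Rmax - Rmin)] * log(B_i + 1).
    With Bmin = 0 (as derived for in-focus pixels), a pixel with B_i = 0
    gets radius 0, i.e. no blur."""
    return (B_max - B_min) / (R_max - R_min) * math.log(B_i + 1)

# Assumed bounds: Bmin = 0, Bmax = 1, Rmin = 1, Rmax = 10.
print(corrected_blur_radius(0.0, 1.0, 0.0, 10.0, 1.0))  # 0.0
print(corrected_blur_radius(0.5, 1.0, 0.0, 10.0, 1.0))
```

The logarithm compresses large virtualization degrees, so the blur radius grows quickly near the focal plane boundary and then saturates, consistent with the curves of Fig. 9.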
Step 405: blur, with the corrected Gaussian blur radius, the pixels of the depth image that lie outside the field depth range and need blurring, and then shoot.
Blurring those pixels with the corrected Gaussian blur radius includes: obtaining the object distance of each pixel; judging whether the object distance of each obtained pixel is outside the field depth range; and applying Gaussian blur with the corrected radius to the pixels whose object distances are judged to be outside the field depth range.
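The three sub-steps of Step 405 (obtain each pixel's object distance, test it against the field depth range, blur the out-of-range pixels with the corrected radius) can be sketched as a single pass over a depth map. This is an illustrative reconstruction, not the patent's code: `image` and `depth_map` are assumed 2-D lists of equal shape, `near`/`far` bound the field depth range, `radius_of` stands in for the formula-(11) radius, and a simple box average substitutes for a true Gaussian kernel to keep the sketch dependency-free.

```python
def blur_pixels_outside_dof(image, depth_map, near, far, radius_of):
    """Blur every pixel whose object distance lies outside [near, far].

    image:      2-D list of grayscale values
    depth_map:  2-D list of object distances, same shape as image
    radius_of:  callable mapping an object distance to a blur radius
    A square box average stands in for the Gaussian blur of the patent.
    """
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for y in range(h):
        for x in range(w):
            d = depth_map[y][x]
            if near <= d <= far:
                continue  # inside the field depth range: keep the pixel sharp
            r = max(1, int(round(radius_of(d))))
            # average the neighbourhood of half-width r, clipped at the borders
            ys = range(max(0, y - r), min(h, y + r + 1))
            xs = range(max(0, x - r), min(w, x + r + 1))
            vals = [image[yy][xx] for yy in ys for xx in xs]
            out[y][x] = sum(vals) / len(vals)
    return out

# Tiny example: 3x3 image; every pixel is at 1 m except the centre at 5 m.
img = [[0.0, 0.0, 0.0], [0.0, 9.0, 0.0], [0.0, 0.0, 0.0]]
depth = [[1000.0] * 3 for _ in range(3)]
depth[1][1] = 5000.0
out = blur_pixels_outside_dof(img, depth, 900.0, 1100.0, lambda d: 1.0)
print(out[1][1])  # 1.0: the out-of-range centre is averaged with its neighbours
```

In-range pixels are left untouched, so only the region outside the depth of field is softened, which is the effect the method aims for.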
Further, the method also includes: setting the virtualization degree of each pixel within the field depth range to 0, with the corresponding Gaussian blur radius also 0.
In the method of the present invention, the pixels of the depth image outside the field depth range that need blurring are blurred with the corrected Gaussian blur radius, which improves the shooting effect without any change to the physical hardware structure of the existing mobile terminal.
It should be noted that, as shown in Fig. 10, R is the Gaussian blur radius of a pixel: the larger its value, the stronger the blur. Dist is the actual physical distance of a pixel in the camera scene, and L is the focal distance of the image. Given the F-number F, the focal length f and the maximum allowable disperse circle diameter, the field depth range can be calculated; Lp is the near critical point of the depth of field and Ln is the far critical point. In the RFL curve shown in Fig. 10, R is 0 inside the field depth range, meaning that the image within this focal range is sharp and needs no blurring. Outside the field depth range, the farther an image point is from the field depth range, the larger the value of R and the stronger the blur. When L satisfies f² − FσL = 0, Ln approaches infinity and the field depth range extends to infinity; this L is the hyperfocal distance L_F, and the corresponding blur-degree curve is shown in Fig. 11.
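The condition f² − FσL = 0 solves directly to L_F = f²/(Fσ). A one-line sketch under assumed sample values (not from the patent):

```python
def hyperfocal_distance(f, F, sigma):
    """Focal distance L_F at which f^2 - F*sigma*L = 0, so the far critical
    point Ln goes to infinity: L_F = f^2 / (F * sigma). Lengths in millimetres."""
    return f**2 / (F * sigma)

# Assumed values: f = 4.2 mm, F = 2.2, sigma = 0.01 mm.
print(hyperfocal_distance(4.2, 2.2, 0.01))
```

Focusing at or beyond this distance makes the rear depth of field unbounded, which is why the denominator of formula (6) vanishes there.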
It should be noted that, herein, the terms "comprise", "include" or any other variant thereof are intended to cover a non-exclusive inclusion, so that a process, method, article or device including a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article or device. Without further limitation, an element defined by the phrase "including a ..." does not exclude the existence of other identical elements in the process, method, article or device including that element.
The serial numbers of the embodiments of the present invention are for description only and do not represent the relative merits of the embodiments.
Through the above description of the embodiments, those skilled in the art can clearly understand that the methods of the above embodiments can be implemented by software plus a necessary general hardware platform, or of course by hardware, but in many cases the former is the better implementation. Based on such understanding, the technical solution of the present invention, in essence or in the part contributing to the prior art, can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as a ROM/RAM, magnetic disk or optical disc) and includes several instructions to cause a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to perform the methods described in the embodiments of the present invention.
The above are only preferred embodiments of the present invention and are not intended to limit the scope of the present invention. Any equivalent structural or equivalent flow transformation made using the contents of the specification and drawings of the present invention, whether used directly or indirectly in other related technical fields, is likewise included within the protection scope of the present invention.
Claims (2)
1. A device for blurring a depth image, characterized by comprising: an acquisition module, a computing module, a determining module, a correcting module and a Fuzzy Processing module; wherein
the acquisition module is configured to obtain the F-number, the focal length, the focal distance and the allowable disperse circle diameter;
the computing module is configured to calculate the field depth range according to the obtained F-number, focal length, focal distance and allowable disperse circle diameter;
the computing module calculates the field depth range by the following formula:
ΔL = 2f²FσL² / (f⁴ − F²σ²L²)
where ΔL is the field depth range, f is the focal length, F is the F-number, σ is the allowable disperse circle diameter, and L is the focal distance;
the determining module is configured to obtain the object distance of the focus point and determine the virtualization degrees corresponding to different object distances according to the obtained object distance of the focus point, the F-number and the focal length; the determining module determines the virtualization degrees corresponding to different object distances by the following formula:
B_i = (α/F)·|[u_0/(u_0 − αA)] / [u_i/(u_i − αA)] − 1|
where B_i is the virtualization degree of pixel i, u_0 is the object distance of the focus point, u_i is the object distance of pixel i, F is the F-number, A is the film diagonal length, and α is a conversion coefficient; i is a positive integer, and the product of α and A equals the focal length f;
the correcting module is configured to correct the radius of the Gaussian blur according to the determined virtualization degrees corresponding to different object distances;
the Fuzzy Processing module is configured to blur, with the corrected Gaussian blur radius, the pixels of the depth image that lie outside the field depth range and need blurring, and then shoot;
the correcting module corrects the radius of the Gaussian blur by the following formula:
R_i = [(Bmax − Bmin)/(Rmax − Rmin)]·log(B_i + 1)
where R_i is the corrected Gaussian blur radius of pixel i, B_i is the virtualization degree of pixel i, Bmax is the maximum of B_i, Bmin is the minimum of B_i, Rmax is the maximum Gaussian blur radius, and Rmin is the minimum Gaussian blur radius.
2. A method for blurring a depth image, characterized by comprising:
obtaining the F-number, the focal length, the focal distance and the allowable disperse circle diameter;
calculating the field depth range according to the obtained F-number, focal length, focal distance and allowable disperse circle diameter; the field depth range is calculated by the following formula:
ΔL = 2f²FσL² / (f⁴ − F²σ²L²)
where ΔL is the field depth range, f is the focal length, F is the F-number, σ is the allowable disperse circle diameter, and L is the focal distance;
obtaining the object distance of the focus point and determining the virtualization degrees corresponding to different object distances according to the obtained object distance of the focus point, the F-number and the focal length; the virtualization degree corresponding to each object distance is determined by the following formula:
B_i = (α/F)·|[u_0/(u_0 − αA)] / [u_i/(u_i − αA)] − 1|
where B_i is the virtualization degree of pixel i, u_0 is the object distance of the focus point, u_i is the object distance of pixel i, F is the F-number, A is the film diagonal length, and α is a conversion coefficient; i is a positive integer, and the product of α and A equals the focal length f;
correcting the radius of the Gaussian blur according to the determined virtualization degrees corresponding to different object distances;
blurring, with the corrected Gaussian blur radius, the pixels of the depth image that lie outside the field depth range and need blurring, and then shooting;
the Gaussian blur radius is corrected by the following formula:
R_i = [(Bmax − Bmin)/(Rmax − Rmin)]·log(B_i + 1)
where R_i is the corrected Gaussian blur radius of pixel i, B_i is the virtualization degree of pixel i, Bmax is the maximum of B_i, Bmin is the minimum of B_i, Rmax is the maximum Gaussian blur radius, and Rmin is the minimum Gaussian blur radius.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510481927.6A CN105163042B (en) | 2015-08-03 | 2015-08-03 | A kind of apparatus and method for blurring processing depth image |
PCT/CN2016/093087 WO2017020836A1 (en) | 2015-08-03 | 2016-08-03 | Device and method for processing depth image by blurring |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510481927.6A CN105163042B (en) | 2015-08-03 | 2015-08-03 | A kind of apparatus and method for blurring processing depth image |
Publications (2)
Publication Number | Publication Date |
---|---|
CN105163042A CN105163042A (en) | 2015-12-16 |
CN105163042B true CN105163042B (en) | 2017-11-03 |