CN107730460A - Image processing method and mobile terminal - Google Patents

Image processing method and mobile terminal

Info

Publication number
CN107730460A
CN107730460A (application CN201710881932.5A)
Authority
CN
China
Prior art keywords
image
blurring
image to be processed
background
depth map
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201710881932.5A
Other languages
Chinese (zh)
Other versions
CN107730460B (en)
Inventor
纪杨琨
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN201710881932.5A priority Critical patent/CN107730460B/en
Publication of CN107730460A publication Critical patent/CN107730460A/en
Application granted granted Critical
Publication of CN107730460B publication Critical patent/CN107730460B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G06T5/70
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/50 - Depth or shape recovery

Abstract

The present invention provides an image processing method and a mobile terminal. The method includes: obtaining an image to be processed captured by a dual camera, and obtaining a depth map corresponding to the image to be processed; performing blurring processing on the background region of the image to be processed according to the depth map, to obtain a blurred image; and synthesizing the blurred image with a luminance image corresponding to the image to be processed, to obtain an output image. By synthesizing the blurred image corresponding to the image to be processed with the luminance image corresponding to the image to be processed to obtain the output image, the present invention effectively reduces the brightness of highlight regions in the image background, simulates the bokeh effect of a large aperture, and improves the image processing effect.

Description

Image processing method and mobile terminal
Technical field
The present invention relates to the field of communication technology, and in particular to an image processing method and a mobile terminal.
Background technology
As users demand ever better photography from smartphones, dual cameras have become a standard smartphone configuration. A dual camera can use a synthesized depth map to simulate the large-aperture effect of a DSLR, improving the shooting result and enhancing the user experience.
At present, when background blurring is applied to an image captured by a dual camera, highlight portions of the background region are usually blurred by dilation. After dilation, however, larger highlight regions become excessively bright, which degrades the visual effect of the image.
It can be seen that conventional background blurring suffers from the problem of excessively bright highlight regions.
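As an illustration of the prior-art behaviour described above, the following Python sketch (a hypothetical example using OpenCV and NumPy, not part of the patent) shows how dilating a highlight mask enlarges bright spots in the background, which is the over-brightness problem this invention addresses. The file name, threshold value, and kernel size are assumptions for illustration only.

```python
import cv2
import numpy as np

# Hypothetical illustration of the prior-art approach: highlight regions of the
# background are processed by morphological dilation, which enlarges them.
img = cv2.imread("captured.jpg")                      # image from the dual camera
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

# Mask of highlight pixels (threshold value 220 is an arbitrary example).
_, highlight_mask = cv2.threshold(gray, 220, 255, cv2.THRESH_BINARY)

# Dilation spreads every highlight outward; large highlight regions grow even
# larger, so after background blurring they appear excessively bright.
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (15, 15))
dilated_mask = cv2.dilate(highlight_mask, kernel)

grown_area = int(dilated_mask.sum() - highlight_mask.sum()) // 255
print(f"highlight area grew by {grown_area} pixels after dilation")
```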
The content of the invention
Embodiments of the present invention provide an image processing method and a mobile terminal, to solve the problem that highlight regions remain excessively bright in conventional background blurring.
To solve the above technical problem, the present invention is implemented as follows. An image processing method, applied to a mobile terminal including a dual camera, includes: obtaining an image to be processed captured by the dual camera, and obtaining a depth map corresponding to the image to be processed; performing blurring processing on the background region of the image to be processed according to the depth map, to obtain a blurred image; and synthesizing the blurred image with a luminance image corresponding to the image to be processed, to obtain an output image.
In a first aspect, an embodiment of the present invention provides an image processing method applied to a mobile terminal including a dual camera, the method including:
obtaining an image to be processed captured by the dual camera, and obtaining a depth map corresponding to the image to be processed;
performing blurring processing on the background region of the image to be processed according to the depth map, to obtain a blurred image; and
synthesizing the blurred image with a luminance image corresponding to the image to be processed, to obtain an output image.
In a second aspect, an embodiment of the present invention further provides a mobile terminal including a dual camera, the mobile terminal further including:
an acquisition module, configured to obtain an image to be processed captured by the dual camera and obtain a depth map corresponding to the image to be processed;
a blurring module, configured to perform blurring processing on the background region of the image to be processed according to the depth map, to obtain a blurred image; and
a synthesis module, configured to synthesize the blurred image with a luminance image corresponding to the image to be processed, to obtain an output image.
In a third aspect, an embodiment of the present invention further provides a mobile terminal, including a processor, a memory, and a computer program stored on the memory and executable on the processor, where the computer program, when executed by the processor, implements the steps of the above image processing method.
In a fourth aspect, an embodiment of the present invention further provides a computer-readable storage medium storing a computer program, where the computer program, when executed by a processor, implements the steps of the above image processing method.
By synthesizing the blurred image corresponding to the image to be processed with the luminance image corresponding to the image to be processed to obtain the output image, the embodiments of the present invention effectively reduce the brightness of highlight regions in the image background, simulate the large-aperture bokeh effect, and improve the image processing effect.
Brief description of the drawings
To describe the technical solutions in the embodiments of the present invention more clearly, the accompanying drawings required for describing the embodiments are briefly introduced below. Apparently, the accompanying drawings in the following description show only some embodiments of the present invention, and a person of ordinary skill in the art may derive other drawings from them without creative effort.
Fig. 1 is a flowchart of an image processing method according to an embodiment of the present invention;
Fig. 2 is a flowchart of an image processing method according to another embodiment of the present invention;
Fig. 3 is a schematic diagram of a blurring curve according to the present invention;
Fig. 4 is a structural diagram of a mobile terminal according to an embodiment of the present invention;
Fig. 5 is a structural diagram of a blurring module in the mobile terminal according to an embodiment of the present invention;
Fig. 6 is a structural diagram of a synthesis module in the mobile terminal according to an embodiment of the present invention;
Fig. 7 is a structural diagram of a second generation submodule in the mobile terminal according to an embodiment of the present invention;
Fig. 8 is a structural diagram of a determination submodule in the mobile terminal according to an embodiment of the present invention;
Fig. 9 is a structural diagram of a first generation submodule in the mobile terminal according to an embodiment of the present invention;
Fig. 10 is a structural diagram of a mobile terminal according to another embodiment of the present invention.
Detailed description of the embodiments
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings in the embodiments. Apparently, the described embodiments are only some rather than all of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
Referring to Fig. 1, Fig. 1 is a flowchart of an image processing method according to an embodiment of the present invention. As shown in Fig. 1, the method includes the following steps.
Step 101: obtain an image to be processed captured by the dual camera, and obtain a depth map corresponding to the image to be processed.
In this step, the dual camera may consist of two color cameras. When an image is captured with the dual camera, a main image, i.e., the image to be processed, is output, and a depth map corresponding to the image to be processed can also be generated. To perform background blurring on the image to be processed, the depth map corresponding to the image to be processed needs to be obtained.
Step 102: perform blurring processing on the background region of the image to be processed according to the depth map, to obtain a blurred image.
In this step, a target object may first be determined, and the region of the image to be processed outside the target object is determined as the background region; alternatively, the region corresponding to the target object may be used as the background region of the image to be processed. The target object may be a preset image feature, such as a face feature or a feature of flowers, birds, fish, or insects; the target object may also be determined according to a touch operation of the user.
After the background region is determined, the positional relationship between each background object in the background region and the target object can be obtained from the depth map, the blurring level of each background object is determined, and each background object is blurred according to its blurring level, to obtain the blurred image of the image to be processed. For example, in the depth map, a background object farther from the target object is blurred more strongly than one closer to the target object, i.e., its blurring effect is more pronounced.
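A minimal sketch of this depth-dependent blurring is given below. It is an assumed Python/OpenCV illustration, not the patent's own implementation: the blur strength for each background pixel is chosen from its depth distance to the target object, so farther background objects are blurred more strongly. The number of levels and kernel sizes are arbitrary choices.

```python
import cv2
import numpy as np

def blur_background(image, depth, subject_mask, max_kernel=21):
    """Blur background pixels more strongly the farther they are (in depth)
    from the target object. `depth` is a float depth map and `subject_mask`
    a boolean mask of the target object; both are illustrative assumptions."""
    subject_depth = float(np.median(depth[subject_mask]))
    distance = np.abs(depth - subject_depth)            # relative distance per pixel
    levels = np.clip((distance / distance.max()) * 4, 0, 4).astype(int)

    # Pre-compute one blurred copy per blur level (odd Gaussian kernel sizes).
    blurred_copies = [
        cv2.GaussianBlur(image, (2 * k + 1, 2 * k + 1), 0) if k > 0 else image
        for k in np.linspace(0, max_kernel // 2, 5).astype(int)
    ]
    out = image.copy()
    for lvl in range(1, 5):
        sel = (levels == lvl) & (~subject_mask)          # only background pixels
        out[sel] = blurred_copies[lvl][sel]
    return out
```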
Step 103: synthesize the blurred image with a luminance image corresponding to the image to be processed, to obtain an output image.
In this step, the luminance image corresponding to the image to be processed may first be obtained, and the obtained luminance image is then synthesized with the blurred image, to obtain the output image.
In this embodiment, the original luminance image of the image to be processed may first be obtained and then blurred, to obtain the luminance image corresponding to the image to be processed. When the original luminance image is blurred, disk filtering may be used to turn the highlight regions in the part of the original luminance image corresponding to the background region into point-light spots, and the luminance image after disk filtering is then blurred, to obtain the luminance image to be synthesized with the blurred image.
In this way, after the image to be processed is blurred, the influence of excessively bright parts of the background region on the output image is effectively reduced, and the blurring effect on the image to be processed is improved.
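The following sketch illustrates one possible reading of this step: the luminance (Y) channel of the image is extracted, highlight areas of the background are softened into round light spots with a circular (disk) averaging kernel, the result is blurred, and it is then combined with the blurred image. The function names, the 0.5 mixing weight, the kernel radius, and the use of a weighted blend are assumptions for illustration only; the patent does not fix these choices.

```python
import cv2
import numpy as np

def disk_kernel(radius):
    """Circular averaging kernel, normalised to sum to 1."""
    k = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (2 * radius + 1, 2 * radius + 1))
    return k.astype(np.float32) / k.sum()

def luminance_image(image_bgr, background_mask, ksize=15):
    # Original luminance image (Y channel of YCrCb).
    y = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2YCrCb)[:, :, 0]
    # Disk filtering turns background highlights into soft, round light spots.
    y_disk = cv2.filter2D(y, -1, disk_kernel(radius=9))
    y = np.where(background_mask, y_disk, y)
    # Blur the filtered luminance image (ksize must be odd).
    return cv2.GaussianBlur(y, (ksize, ksize), 0)

def synthesize(blurred_bgr, lum, weight=0.5):
    """Blend the blurred image with the luminance image; a weighted blend is
    one of many possible synthesis operators."""
    lum3 = cv2.cvtColor(lum, cv2.COLOR_GRAY2BGR)
    return cv2.addWeighted(blurred_bgr, 1.0 - weight, lum3, weight, 0)
```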
In the embodiment of the present invention, the above method can be applied to any mobile terminal including a dual camera, where the mobile terminal may be a mobile phone, a tablet personal computer, a laptop computer, a personal digital assistant (PDA), a mobile Internet device (MID), a wearable device, or the like.
With the image processing method of the embodiment of the present invention, the image to be processed captured by the dual camera is obtained, and the depth map corresponding to the image to be processed is obtained; blurring processing is performed on the background region of the image to be processed according to the depth map, to obtain a blurred image; and the blurred image is synthesized with the luminance image corresponding to the image to be processed, to obtain an output image. By synthesizing the blurred image corresponding to the image to be processed with the luminance image corresponding to the image to be processed to obtain the output image, the brightness of highlight regions in the image background is effectively reduced, the large-aperture bokeh effect is simulated, and the image processing effect is improved.
Referring to Fig. 2, Fig. 2 is a flowchart of an image processing method according to another embodiment of the present invention. As shown in Fig. 2, the method includes the following steps.
Step 201: obtain an image to be processed captured by the dual camera, and obtain a depth map corresponding to the image to be processed.
In this step, the dual camera may consist of two color cameras. When an image is captured with the dual camera, a main image, i.e., the image to be processed, is output, and a depth map corresponding to the image to be processed can also be generated. To perform background blurring on the image to be processed, the depth map corresponding to the image to be processed needs to be obtained.
Step 202: determine the background region of the image to be processed.
In this step, a target object may first be determined, and the region of the image to be processed outside the target object is determined as the background region; alternatively, the region corresponding to the target object may be used as the background region of the image to be processed. The target object may be a preset image feature, such as a face feature or a feature of flowers, birds, fish, or insects; the target object may also be determined according to a touch operation of the user.
Optionally, step 202 may further include: detecting a touch operation on the image to be processed; if the touch operation is detected, determining an operation region corresponding to the touch operation; and taking the region of the image to be processed outside the operation region as the background region of the image to be processed.
In this implementation, the touch operation can be detected, the operation region corresponding to the touch operation can be determined according to the position of the touch operation, and the region of the image to be processed outside the operation region is taken as the background region of the image to be processed. In this way, the user can freely determine the target object, i.e., the image feature corresponding to the touch operation, which improves the user's operating experience.
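A small sketch of this optional touch-based selection follows: the tapped position defines an operation region, and everything outside that region is treated as background. The fixed window size is an arbitrary illustrative choice, not something specified by the patent.

```python
import numpy as np

def background_mask_from_touch(image_shape, touch_xy, half_size=100):
    """Return a boolean mask that is True for background pixels, i.e. pixels
    outside the operation region centred on the touch point."""
    h, w = image_shape[:2]
    x, y = touch_xy
    x0, x1 = max(0, x - half_size), min(w, x + half_size)
    y0, y1 = max(0, y - half_size), min(h, y + half_size)
    mask = np.ones((h, w), dtype=bool)        # start with everything as background
    mask[y0:y1, x0:x1] = False                # operation region is not background
    return mask
```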
Step 203: generate a blurring parameter of the image to be processed according to the depth map.
In this step, the positional relationship between each background object in the background region and the target object can be obtained from the depth map, the blurring level of each background object is determined, and the corresponding blurring parameter is generated according to the blurring level of each background object. In the depth map, a background object farther from the target object is blurred more strongly than one closer to the target object, i.e., its blurring effect is more pronounced.
Optionally, step 203 may further include: identifying the image feature of the operation region, and taking the identified image feature as the target object of the image to be processed; and generating a blurring curve of the image to be processed according to the positional relationship, in the depth map, between each background object in the background region and the target object; where the degree of blurring of each background object in the background region is proportional to its relative distance, the relative distance being the distance of each background object from the target object in the depth map.
In this implementation, the image feature of the operation region can be identified and taken as the target object of the image to be processed. For example, if the identified image feature is a face feature, the face feature is taken as the target object of the image to be processed, and the image region outside the face feature is taken as the background region of the image to be processed. The positional relationship between each background object and the target object is then determined from the depth map corresponding to the image to be processed, the degree of blurring of each background object is determined from that positional relationship, and a blurring curve of the image to be processed is formed. The degree of blurring of each background object is proportional to its relative distance, i.e., its distance from the target object in the depth map.
It should be noted that as shown in figure 3, abscissa " X " axle represents each reference object distance shooting in pending image The distance of camera lens, ordinate " Y " axle represent the virtualization degree of each reference object in pending image, wherein, the mesh that " S " is expressed as Mark object, and will be located at " S " front and rear sides reference object be background object, and can by virtualization dotted line as shown in Figure 3, Virtualization processing is carried out to the background object of pending image.
It should be noted that in order to preferably lift the background blurring effect of pending image, target pair can be pointed to As the background object of front and rear sides carries out different degrees of virtualization processing.Specifically, in the corresponding depth map of pending image In, the position based on destination object " S " in depth map, depth map is divided into front and rear sides;As shown in figure 3, it is located at target The virtualization degree of background object on front side of object is more fierce, and the virtualization degree phase of the background object on rear side of destination object To gentle;I.e. for positioned at destination object front and rear sides, and the equidistant background object of relative target object, the back of the body of front side The virtualization degree of scape object is more than the virtualization degree of the background object of rear side, wherein, the background object of front side is pending image Shooting process in, the background object between dual camera and destination object, i.e., close to dual camera background object, after The background object of side is the background object away from dual camera.
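The asymmetric blurring curve of Fig. 3 can be pictured with the sketch below: the blur degree grows with distance from the target object "S", and grows faster for objects in front of (closer to the camera than) the subject than for objects behind it. The slope values are assumptions chosen only to illustrate the shape of the curve.

```python
import numpy as np

def blur_degree(depth, subject_depth, front_slope=1.5, rear_slope=0.8, max_degree=4.0):
    """Degree of blurring as a function of object depth (distance from the lens).
    Depths smaller than subject_depth lie in front of the subject and are
    blurred more aggressively than equally distant objects behind it."""
    depth = np.asarray(depth, dtype=np.float32)
    relative = depth - subject_depth
    degree = np.where(relative < 0,
                      front_slope * np.abs(relative),   # front side: steep curve
                      rear_slope * relative)            # rear side: gentle curve
    return np.clip(degree, 0.0, max_degree)

# Example: for the same relative distance of 2.0, the front-side object is blurred more.
print(blur_degree([3.0, 7.0], subject_depth=5.0))   # -> [3.0, 1.6]
```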
Step 204: perform blurring processing on the background region using the blurring parameter, to obtain the blurred image.
In this step, the blurring parameter generated in step 203 can be used to blur the background region of the image to be processed, to obtain the blurred image.
Step 205: synthesize the blurred image with the luminance image corresponding to the image to be processed, to obtain an output image.
In this step, the luminance image corresponding to the image to be processed may first be obtained, and the obtained luminance image is then synthesized with the blurred image, to obtain the output image.
Optionally, step 205 may further include: generating the original luminance image of the image to be processed; dividing the region of the original luminance image corresponding to the background region into at least one luminance region, and taking, as target luminance regions, the luminance regions of the at least one luminance region whose brightness exceeds a preset brightness and whose area is larger than a preset area; removing the target luminance regions from the original luminance image, to obtain a target luminance image; performing blurring processing on the target luminance image using the blurring parameter, to obtain the luminance image; and synthesizing the blurred image with the luminance image, to obtain the output image.
In this implementation, the image to be processed can be converted into the original luminance image according to the principle of color luminescence. The region of the original luminance image corresponding to the background region is divided into at least one luminance region, and the luminance regions whose brightness exceeds the preset brightness and whose area is larger than the preset area are taken as target luminance regions, so as to screen out the highlight regions in the original luminance image. The screened target luminance regions are then removed from the original luminance image to obtain the target luminance image, which reduces the influence of highlight regions in the background region on the blurring effect of the image to be processed and improves that blurring effect. Next, blurring processing is performed on the target luminance image using the blurring parameter, to obtain the luminance image corresponding to the image to be processed. Finally, the blurred image obtained in step 204 is synthesized with the luminance image corresponding to the image to be processed, to obtain the output image.
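The screening of target luminance regions described above (brightness above a preset threshold and area above a preset area) can be sketched with connected-component analysis, as below. The thresholds and the use of cv2.connectedComponentsWithStats are assumptions for illustration; the patent requires only the two conditions, not a particular algorithm.

```python
import cv2
import numpy as np

def remove_target_luminance_regions(orig_lum, background_mask,
                                    preset_brightness=200, preset_area=400):
    """Remove background luminance regions that are both brighter than
    `preset_brightness` and larger than `preset_area` pixels, returning the
    target luminance image."""
    bright = ((orig_lum > preset_brightness) & background_mask).astype(np.uint8)
    n, labels, stats, _ = cv2.connectedComponentsWithStats(bright, connectivity=8)

    target_mask = np.zeros_like(bright, dtype=bool)
    for i in range(1, n):                                  # label 0 is the background label
        if stats[i, cv2.CC_STAT_AREA] > preset_area:
            target_mask |= labels == i

    target_lum = orig_lum.copy()
    target_lum[target_mask] = 0        # "remove" the screened highlight regions
    return target_lum
```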
In this way, after the image to be processed is blurred, the influence of excessively bright parts of the background region on the output image is effectively reduced, and the blurring effect on the image to be processed is improved.
With the image processing method of the embodiment of the present invention, the image to be processed captured by the dual camera is obtained, and the depth map corresponding to the image to be processed is obtained; the background region of the image to be processed is determined; the blurring parameter of the image to be processed is generated according to the depth map; blurring processing is performed on the background region using the blurring parameter, to obtain the blurred image; and the blurred image is synthesized with the luminance image corresponding to the image to be processed, to obtain an output image. By synthesizing the blurred image corresponding to the image to be processed with the luminance image corresponding to the image to be processed to obtain the output image, the brightness of highlight regions in the image background is effectively reduced, the large-aperture bokeh effect is simulated, and the image processing effect is improved.
Referring to Fig. 4, Fig. 4 is a structural diagram of a mobile terminal according to an embodiment of the present invention. As shown in Fig. 4, the mobile terminal 400 includes an acquisition module 401, a blurring module 402, and a synthesis module 403, where the acquisition module 401 is connected to the blurring module 402, and the blurring module 402 is further connected to the synthesis module 403:
the acquisition module 401, configured to obtain an image to be processed captured by the dual camera and obtain a depth map corresponding to the image to be processed;
the blurring module 402, configured to perform blurring processing on the background region of the image to be processed according to the depth map, to obtain a blurred image; and
the synthesis module 403, configured to synthesize the blurred image with a luminance image corresponding to the image to be processed, to obtain an output image.
Optionally, as shown in Fig. 5, the blurring module 402 includes:
a determination submodule 4021, configured to determine the background region of the image to be processed;
a first generation submodule 4022, configured to generate a blurring parameter of the image to be processed according to the depth map; and
a blurring submodule 4023, configured to perform blurring processing on the background region using the blurring parameter, to obtain the blurred image.
Optionally, as shown in Fig. 6, the synthesis module 403 includes:
a second generation submodule 4031, configured to generate the luminance image corresponding to the image to be processed; and
a synthesis submodule 4032, configured to synthesize the blurred image with the luminance image, to obtain the output image.
Optionally, as shown in Fig. 7, the second generation submodule 4031 includes:
a first generation unit 40311, configured to generate the original luminance image of the image to be processed;
a first determining unit 40312, configured to divide the region of the original luminance image corresponding to the background region into at least one luminance region, and take, as a target luminance region, a luminance region of the at least one luminance region whose brightness exceeds a preset brightness and whose area is larger than a preset area;
a removing unit 40313, configured to remove the target luminance region from the original luminance image, to obtain a target luminance image; and
a blurring unit 40314, configured to perform blurring processing on the target luminance image using the blurring parameter, to obtain the luminance image.
Optionally, as shown in Fig. 8, the determination submodule 4021 includes:
a detection unit 40211, configured to detect a touch operation on the image to be processed;
a second determining unit 40212, configured to, if the touch operation is detected, determine an operation region corresponding to the touch operation; and
a third determining unit 40213, configured to take the region of the image to be processed outside the operation region as the background region of the image to be processed.
Optionally, as shown in Fig. 9, the first generation submodule 4022 includes:
a recognition unit 40221, configured to identify the image feature of the operation region and take the identified image feature as the target object of the image to be processed; and
a second generation unit 40222, configured to generate a blurring curve of the image to be processed according to the positional relationship, in the depth map, between each background object in the background region and the target object;
where the degree of blurring of each background object in the background region is proportional to its relative distance, the relative distance being the distance of each background object from the target object in the depth map.
Optionally, in the depth map, the depth map is divided into a front side and a rear side based on the position of the target object in the depth map, and for background objects located in front of and behind the target object at equal distances from the target object, the degree of blurring of a front-side background object is greater than that of a rear-side background object;
where a front-side background object is a background object located between the dual camera and the target object during capture of the image to be processed.
The mobile terminal 400 can implement each process implemented by the mobile terminal in the method embodiments of Fig. 1 to Fig. 3; to avoid repetition, details are not described here again.
With the mobile terminal 400 of the embodiment of the present invention, the image to be processed captured by the dual camera is obtained, and the depth map corresponding to the image to be processed is obtained; blurring processing is performed on the background region of the image to be processed according to the depth map, to obtain a blurred image; and the blurred image is synthesized with the luminance image corresponding to the image to be processed, to obtain an output image. By synthesizing the blurred image corresponding to the image to be processed with the luminance image corresponding to the image to be processed to obtain the output image, the brightness of highlight regions in the image background is effectively reduced, the large-aperture bokeh effect is simulated, and the image processing effect is improved.
Figure 10 is a schematic diagram of the hardware structure of a mobile terminal implementing the embodiments of the present invention. As shown in Figure 10, the mobile terminal 1000 includes, but is not limited to, a radio frequency unit 1001, a network module 1002, an audio output unit 1003, an input unit 1004, a sensor 1005, a display unit 1006, a user input unit 1007, an interface unit 1008, a memory 1009, a processor 1010, and a power supply 1011. A person skilled in the art will understand that the mobile terminal structure shown in Figure 10 does not limit the mobile terminal; the mobile terminal may include more or fewer components than shown, combine some components, or arrange the components differently. In the embodiments of the present invention, the mobile terminal includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palmtop computer, an in-vehicle terminal, a wearable device, a pedometer, and the like.
The processor 1010 is configured to obtain an image to be processed captured by a dual camera, and obtain a depth map corresponding to the image to be processed; perform blurring processing on the background region of the image to be processed according to the depth map, to obtain a blurred image; and synthesize the blurred image with a luminance image corresponding to the image to be processed, to obtain an output image.
Optionally, the processor 1010 is further configured to: determine the background region of the image to be processed; generate a blurring parameter of the image to be processed according to the depth map; and perform blurring processing on the background region using the blurring parameter, to obtain the blurred image.
Optionally, the processor 1010 is further configured to: generate the luminance image corresponding to the image to be processed; and synthesize the blurred image with the luminance image, to obtain the output image.
Optionally, the processor 1010 is further configured to: generate the original luminance image of the image to be processed; divide the region of the original luminance image corresponding to the background region into at least one luminance region, and take, as a target luminance region, a luminance region of the at least one luminance region whose brightness exceeds a preset brightness and whose area is larger than a preset area; remove the target luminance region from the original luminance image, to obtain a target luminance image; and perform blurring processing on the target luminance image using the blurring parameter, to obtain the luminance image.
Optionally, the processor 1010 is further configured to: detect a touch operation on the image to be processed; if the touch operation is detected, determine an operation region corresponding to the touch operation; and take the region of the image to be processed outside the operation region as the background region of the image to be processed.
Optionally, the processor 1010 is further configured to: identify the image feature of the operation region, and take the identified image feature as the target object of the image to be processed; and generate a blurring curve of the image to be processed according to the positional relationship, in the depth map, between each background object in the background region and the target object; where the degree of blurring of each background object in the background region is proportional to its relative distance, the relative distance being the distance of each background object from the target object in the depth map.
Optionally, in the depth map, the depth map is divided into a front side and a rear side based on the position of the target object in the depth map, and for background objects located in front of and behind the target object at equal distances from the target object, the degree of blurring of a front-side background object is greater than that of a rear-side background object; where a front-side background object is a background object located between the dual camera and the target object during capture of the image to be processed.
The mobile terminal 1000 can implement each process implemented by the mobile terminal in the foregoing embodiments; to avoid repetition, details are not described here again.
With the mobile terminal 1000 of the embodiment of the present invention, the image to be processed captured by the dual camera is obtained, and the depth map corresponding to the image to be processed is obtained; blurring processing is performed on the background region of the image to be processed according to the depth map, to obtain a blurred image; and the blurred image is synthesized with the luminance image corresponding to the image to be processed, to obtain an output image. By synthesizing the blurred image corresponding to the image to be processed with the luminance image corresponding to the image to be processed to obtain the output image, the brightness of highlight regions in the image background is effectively reduced, the large-aperture bokeh effect is simulated, and the image processing effect is improved.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 1001 may be configured to receive and send signals during information transmission and reception or during a call. Specifically, after receiving downlink data from a base station, the radio frequency unit 1001 delivers the data to the processor 1010 for processing, and sends uplink data to the base station. Generally, the radio frequency unit 1001 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, and a duplexer. In addition, the radio frequency unit 1001 may also communicate with a network and other devices through a wireless communication system.
The mobile terminal provides wireless broadband Internet access for the user through the network module 1002, for example, helping the user to send and receive e-mails, browse web pages, and access streaming media.
The audio output unit 1003 may convert audio data received by the radio frequency unit 1001 or the network module 1002, or stored in the memory 1009, into an audio signal and output it as sound. Moreover, the audio output unit 1003 may also provide audio output related to a specific function performed by the mobile terminal 1000 (for example, a call signal reception sound or a message reception sound). The audio output unit 1003 includes a speaker, a buzzer, a receiver, and the like.
The input unit 1004 is configured to receive audio or video signals. The input unit 1004 may include a graphics processing unit (GPU) 10041 and a microphone 10042. The graphics processing unit 10041 processes image data of still pictures or video obtained by an image capture apparatus (such as a camera) in a video capture mode or an image capture mode. The processed image frames may be displayed on the display unit 1006. The image frames processed by the graphics processing unit 10041 may be stored in the memory 1009 (or another storage medium) or sent via the radio frequency unit 1001 or the network module 1002. The microphone 10042 may receive sound and can process such sound into audio data. In a telephone call mode, the processed audio data may be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 1001 for output.
The mobile terminal 1000 further includes at least one sensor 1005, such as an optical sensor, a motion sensor, and other sensors. Specifically, the optical sensor includes an ambient light sensor and a proximity sensor, where the ambient light sensor can adjust the brightness of the display panel 10061 according to the brightness of the ambient light, and the proximity sensor can turn off the display panel 10061 and/or the backlight when the mobile terminal 1000 is moved close to the ear. As a type of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in various directions (generally three axes), can detect the magnitude and direction of gravity when stationary, and can be used for identifying the posture of the mobile terminal (such as landscape/portrait switching, related games, and magnetometer posture calibration) and for vibration-recognition related functions (such as pedometer and tapping); the sensor 1005 may further include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, which are not described here again.
The display unit 1006 is configured to display information input by the user or information provided to the user. The display unit 1006 may include a display panel 10061, and the display panel 10061 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED), or the like.
The user input unit 1007 may be configured to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the mobile terminal. Specifically, the user input unit 1007 includes a touch panel 10071 and other input devices 10072. The touch panel 10071, also referred to as a touch screen, collects touch operations by the user on or near it (for example, operations by the user on or near the touch panel 10071 with a finger, a stylus, or any other suitable object or accessory). The touch panel 10071 may include a touch detection apparatus and a touch controller. The touch detection apparatus detects the touch position of the user, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives touch information from the touch detection apparatus, converts it into contact coordinates, sends them to the processor 1010, and receives and executes commands sent by the processor 1010. In addition, the touch panel 10071 may be implemented in multiple types such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 10071, the user input unit 1007 may further include other input devices 10072. Specifically, the other input devices 10072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and a power switch key), a trackball, a mouse, and a joystick, which are not described here again.
Further, the touch panel 10071 may cover the display panel 10061. After detecting a touch operation on or near it, the touch panel 10071 transmits the operation to the processor 1010 to determine the type of the touch event, and the processor 1010 then provides corresponding visual output on the display panel 10061 according to the type of the touch event. Although in Fig. 10 the touch panel 10071 and the display panel 10061 are two independent components implementing the input and output functions of the mobile terminal, in some embodiments the touch panel 10071 and the display panel 10061 may be integrated to implement the input and output functions of the mobile terminal, which is not specifically limited here.
The interface unit 1008 is an interface through which an external apparatus is connected to the mobile terminal 1000. For example, the external apparatus may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting an apparatus having an identification module, an audio input/output (I/O) port, a video I/O port, and an earphone port. The interface unit 1008 may be configured to receive input (for example, data information and power) from an external apparatus and transmit the received input to one or more elements within the mobile terminal 1000, or may be configured to transmit data between the mobile terminal 1000 and an external apparatus.
The memory 1009 may be configured to store software programs and various data. The memory 1009 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application required by at least one function (such as a sound playing function and an image playing function), and the like; and the data storage area may store data created according to use of the mobile phone (such as audio data and a phone book), and the like. In addition, the memory 1009 may include a high-speed random access memory, and may further include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another solid-state storage device.
The processor 1010 is the control center of the mobile terminal. It connects all parts of the entire mobile terminal through various interfaces and lines, and performs the various functions of the mobile terminal and processes data by running or executing the software programs and/or modules stored in the memory 1009 and invoking the data stored in the memory 1009, thereby monitoring the mobile terminal as a whole. The processor 1010 may include one or more processing units; preferably, the processor 1010 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, the user interface, applications, and the like, and the modem processor mainly handles wireless communication. It can be understood that the modem processor may alternatively not be integrated into the processor 1010.
The mobile terminal 1000 may further include a power supply 1011 (such as a battery) that supplies power to all the components. Preferably, the power supply 1011 may be logically connected to the processor 1010 through a power management system, so that functions such as charging, discharging, and power consumption management are implemented through the power management system.
In addition, the mobile terminal 1000 includes some functional modules not shown, which are not described here again.
Preferably, an embodiment of the present invention further provides a mobile terminal, including a processor 1010, a memory 1009, and a computer program stored on the memory 1009 and executable on the processor 1010, where the computer program, when executed by the processor 1010, implements each process of the above image processing method embodiments and can achieve the same technical effects; to avoid repetition, details are not described here again.
An embodiment of the present invention further provides a computer-readable storage medium storing a computer program, where the computer program, when executed by a processor, implements each process of the above image processing method embodiments and can achieve the same technical effects; to avoid repetition, details are not described here again. The computer-readable storage medium is, for example, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
It should be noted that, in this document, the terms "comprise", "include", or any other variant thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or apparatus including a series of elements includes not only those elements but also other elements not expressly listed, or further includes elements inherent to such a process, method, article, or apparatus. In the absence of further limitation, an element defined by the phrase "including a ..." does not exclude the existence of other identical elements in the process, method, article, or apparatus including that element.
Through the description of the above embodiments, a person skilled in the art can clearly understand that the methods of the above embodiments may be implemented by software plus a necessary general hardware platform, and certainly may also be implemented by hardware, but in many cases the former is the better implementation. Based on such an understanding, the technical solution of the present invention, or the part contributing to the prior art, may be embodied in the form of a software product. The computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disc) and includes several instructions for enabling a terminal (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to execute the methods described in the embodiments of the present invention.
The embodiments of the present invention are described above with reference to the accompanying drawings, but the present invention is not limited to the above specific embodiments. The above specific embodiments are merely illustrative rather than restrictive. Inspired by the present invention, a person of ordinary skill in the art can make many other forms without departing from the concept of the present invention and the scope of protection of the claims, all of which fall within the protection of the present invention.

Claims (16)

  1. An image processing method, applied to a mobile terminal including a dual camera, wherein the method includes:
    obtaining an image to be processed captured by the dual camera, and obtaining a depth map corresponding to the image to be processed;
    performing blurring processing on a background region of the image to be processed according to the depth map, to obtain a blurred image; and
    synthesizing the blurred image with a luminance image corresponding to the image to be processed, to obtain an output image.
  2. The method according to claim 1, wherein the step of performing blurring processing on the background region of the image to be processed according to the depth map to obtain a blurred image includes:
    determining the background region of the image to be processed;
    generating a blurring parameter of the image to be processed according to the depth map; and
    performing blurring processing on the background region using the blurring parameter, to obtain the blurred image.
  3. The method according to claim 2, wherein the step of synthesizing the blurred image with the luminance image corresponding to the image to be processed to obtain an output image includes:
    generating the luminance image corresponding to the image to be processed; and
    synthesizing the blurred image with the luminance image, to obtain the output image.
  4. The method according to claim 3, wherein the step of generating the luminance image corresponding to the image to be processed includes:
    generating an original luminance image of the image to be processed;
    dividing a region of the original luminance image corresponding to the background region into at least one luminance region, and taking, as a target luminance region, a luminance region of the at least one luminance region whose brightness exceeds a preset brightness and whose area is larger than a preset area;
    removing the target luminance region from the original luminance image, to obtain a target luminance image; and
    performing blurring processing on the target luminance image using the blurring parameter, to obtain the luminance image.
  5. The method according to claim 2, wherein the step of determining the background region of the image to be processed includes:
    detecting a touch operation on the image to be processed;
    if the touch operation is detected, determining an operation region corresponding to the touch operation; and
    taking a region of the image to be processed outside the operation region as the background region of the image to be processed.
  6. The method according to claim 5, wherein the step of generating the blurring parameter of the image to be processed according to the depth map includes:
    identifying an image feature of the operation region, and taking the identified image feature as a target object of the image to be processed; and
    generating a blurring curve of the image to be processed according to a positional relationship, in the depth map, between each background object in the background region and the target object;
    wherein a degree of blurring of each background object in the background region is proportional to a relative distance, the relative distance being a distance of each background object from the target object in the depth map.
  7. The method according to claim 6, wherein, in the depth map, the depth map is divided into a front side and a rear side based on a position of the target object in the depth map, and for background objects located in front of and behind the target object at equal distances from the target object, a degree of blurring of a front-side background object is greater than a degree of blurring of a rear-side background object;
    wherein the front-side background object is a background object located between the dual camera and the target object during capture of the image to be processed.
  8. A mobile terminal, including a dual camera, wherein the mobile terminal further includes:
    an acquisition module, configured to obtain an image to be processed captured by the dual camera and obtain a depth map corresponding to the image to be processed;
    a blurring module, configured to perform blurring processing on a background region of the image to be processed according to the depth map, to obtain a blurred image; and
    a synthesis module, configured to synthesize the blurred image with a luminance image corresponding to the image to be processed, to obtain an output image.
  9. The mobile terminal according to claim 8, wherein the blurring module includes:
    a determination submodule, configured to determine the background region of the image to be processed;
    a first generation submodule, configured to generate a blurring parameter of the image to be processed according to the depth map; and
    a blurring submodule, configured to perform blurring processing on the background region using the blurring parameter, to obtain the blurred image.
  10. The mobile terminal according to claim 9, wherein the synthesis module includes:
    a second generation submodule, configured to generate the luminance image corresponding to the image to be processed; and
    a synthesis submodule, configured to synthesize the blurred image with the luminance image, to obtain the output image.
  11. The mobile terminal according to claim 10, wherein the second generation submodule includes:
    a first generation unit, configured to generate an original luminance image of the image to be processed;
    a first determining unit, configured to divide a region of the original luminance image corresponding to the background region into at least one luminance region, and take, as a target luminance region, a luminance region of the at least one luminance region whose brightness exceeds a preset brightness and whose area is larger than a preset area;
    a removing unit, configured to remove the target luminance region from the original luminance image, to obtain a target luminance image; and
    a blurring unit, configured to perform blurring processing on the target luminance image using the blurring parameter, to obtain the luminance image.
  12. The mobile terminal according to claim 9, wherein the determination submodule includes:
    a detection unit, configured to detect a touch operation on the image to be processed;
    a second determining unit, configured to, if the touch operation is detected, determine an operation region corresponding to the touch operation; and
    a third determining unit, configured to take a region of the image to be processed outside the operation region as the background region of the image to be processed.
  13. The mobile terminal according to claim 12, wherein the first generation submodule includes:
    a recognition unit, configured to identify an image feature of the operation region and take the identified image feature as a target object of the image to be processed; and
    a second generation unit, configured to generate a blurring curve of the image to be processed according to a positional relationship, in the depth map, between each background object in the background region and the target object;
    wherein a degree of blurring of each background object in the background region is proportional to a relative distance, the relative distance being a distance of each background object from the target object in the depth map.
  14. The mobile terminal according to claim 13, wherein, in the depth map, the depth map is divided into a front side and a rear side based on a position of the target object in the depth map, and for background objects located in front of and behind the target object at equal distances from the target object, a degree of blurring of a front-side background object is greater than a degree of blurring of a rear-side background object;
    wherein the front-side background object is a background object located between the dual camera and the target object during capture of the image to be processed.
  15. A mobile terminal, including a processor, a memory, and a computer program stored on the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the image processing method according to any one of claims 1 to 7.
  16. A computer-readable storage medium storing a computer program, wherein the computer program, when executed by a processor, implements the steps of the image processing method according to any one of claims 1 to 7.
CN201710881932.5A 2017-09-26 2017-09-26 Image processing method and mobile terminal Active CN107730460B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710881932.5A CN107730460B (en) 2017-09-26 2017-09-26 Image processing method and mobile terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710881932.5A CN107730460B (en) 2017-09-26 2017-09-26 Image processing method and mobile terminal

Publications (2)

Publication Number Publication Date
CN107730460A (en) 2018-02-23
CN107730460B CN107730460B (en) 2020-02-14

Family

ID=61208093

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710881932.5A Active CN107730460B (en) 2017-09-26 2017-09-26 Image processing method and mobile terminal

Country Status (1)

Country Link
CN (1) CN107730460B (en)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11893668B2 (en) 2021-03-31 2024-02-06 Leica Camera Ag Imaging system and method for generating a final digital image via applying a profile to image information


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103366352A (en) * 2012-03-30 2013-10-23 北京三星通信技术研究有限公司 Device and method for producing image with background being blurred
CN103942755A (en) * 2013-01-23 2014-07-23 深圳市腾讯计算机系统有限公司 Image brightness adjusting method and device
CN106875356A (en) * 2017-01-22 2017-06-20 深圳市金立通信设备有限公司 The method and terminal of a kind of image procossing
CN106993112A (en) * 2017-03-09 2017-07-28 广东欧珀移动通信有限公司 Background-blurring method and device and electronic installation based on the depth of field

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
韩金辉 (Han Jinhui): "Research on infrared small-target detection against complex backgrounds based on human visual characteristics", China Doctoral Dissertations Full-text Database, Information Science and Technology *

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108449589A (en) * 2018-03-26 2018-08-24 德淮半导体有限公司 Handle the method, apparatus and electronic equipment of image
CN108989678A (en) * 2018-07-27 2018-12-11 维沃移动通信有限公司 A kind of image processing method, mobile terminal
CN108989678B (en) * 2018-07-27 2021-03-23 维沃移动通信有限公司 Image processing method and mobile terminal
CN110855876A (en) * 2018-08-21 2020-02-28 中兴通讯股份有限公司 Image processing method, terminal and computer storage medium
CN110855876B (en) * 2018-08-21 2022-04-05 中兴通讯股份有限公司 Image processing method, terminal and computer storage medium
WO2021155549A1 (en) * 2020-02-06 2021-08-12 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Method, system, and computer-readable medium for generating stabilized image compositing effects for image sequence
CN115066881A (en) * 2020-02-06 2022-09-16 Oppo广东移动通信有限公司 Method, system and computer readable medium for generating a stabilized image composition effect for an image sequence
CN115066881B (en) * 2020-02-06 2023-11-14 Oppo广东移动通信有限公司 Method, system and computer readable medium for generating stabilized image composition effects for image sequences
CN111626924A (en) * 2020-05-28 2020-09-04 维沃移动通信有限公司 Image blurring processing method and device, electronic equipment and readable storage medium
CN111626924B (en) * 2020-05-28 2023-08-15 维沃移动通信有限公司 Image blurring processing method and device, electronic equipment and readable storage medium

Also Published As

Publication number Publication date
CN107730460B (en) 2020-02-14

Similar Documents

Publication Publication Date Title
CN107730460A (en) A kind of image processing method and mobile terminal
CN107172364A (en) A kind of image exposure compensation method, device and computer-readable recording medium
CN108391053A (en) A kind of filming control method and terminal
CN107580184A (en) A kind of image pickup method and mobile terminal
CN107592471A (en) A kind of high dynamic range images image pickup method and mobile terminal
CN107817939A (en) A kind of image processing method and mobile terminal
CN107977144A (en) A kind of screenshotss processing method and mobile terminal
CN107948499A (en) A kind of image capturing method and mobile terminal
CN108566479A (en) screen state control method, mobile terminal and computer readable storage medium
CN108989678A (en) A kind of image processing method, mobile terminal
CN108682040A (en) A kind of sketch image generation method, terminal and computer readable storage medium
CN107330347A (en) A kind of display methods, terminal and computer-readable recording medium
CN107734260A (en) A kind of image processing method and mobile terminal
CN107231470A (en) Image processing method, mobile terminal and computer-readable recording medium
CN108320263A (en) A kind of method, device and mobile terminal of image procossing
CN107958161A (en) A kind of multitask display methods and mobile terminal
CN107886321A (en) A kind of method of payment and mobile terminal
CN107749046A (en) A kind of image processing method and mobile terminal
CN107831891A (en) A kind of brightness adjusting method and mobile terminal
CN108053371A (en) A kind of image processing method, terminal and computer readable storage medium
CN107765941A (en) A kind of icon display method, terminal and computer-readable recording medium
CN107767430A (en) One kind shooting processing method, terminal and computer-readable recording medium
CN107704812A (en) A kind of face identification method and mobile terminal
CN108040209A (en) A kind of image pickup method and mobile terminal
CN109300099A (en) A kind of image processing method, mobile terminal and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant