CN106851098A - Image processing method and mobile terminal - Google Patents
- Publication number
- CN106851098A CN106851098A CN201710042795.6A CN201710042795A CN106851098A CN 106851098 A CN106851098 A CN 106851098A CN 201710042795 A CN201710042795 A CN 201710042795A CN 106851098 A CN106851098 A CN 106851098A
- Authority
- CN
- China
- Prior art keywords
- to-be-processed image
- target object
- spatial information
- image
- parameter
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/62—Control of parameters via user interfaces
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/10—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
Abstract
The invention discloses an image processing method and a mobile terminal. The method includes: shooting, from at least two different viewing angles, to-be-processed images that include a predetermined target object, the to-be-processed images including at least a first to-be-processed image and a second to-be-processed image; obtaining, from the first to-be-processed image, the spatial information other than the target object; and removing the target object from the second to-be-processed image according to the spatial information, other than the target object, in the first to-be-processed image. The disclosed image processing method and mobile terminal can remove a target object even in a relatively complex scene while preserving the completeness and fidelity of the image content, thereby improving the user experience.
Description
Technical field
The present invention relates to the field of image processing technology, and in particular to an image processing method and a mobile terminal.
Background art
With the continuous development of science and technology, communication technology has also developed rapidly and the variety of mobile terminals keeps growing, so people enjoy the many conveniences this development brings. Nowadays people can enjoy a more comfortable life through all kinds of mobile terminals. For example, mobile terminals such as mobile phones, cameras and tablet computers have become an indispensable part of people's lives: people can take photos with them anytime and anywhere, and can also combine two or more captured images into one.
At present, when taking photos with a mobile terminal, objects that the user does not want to shoot are easily captured in the picture, and such background objects affect the composition of the main subject. Some applications therefore offer an object-removal function, usually based on traditional single-image inpainting, whose basic idea is: the unwanted content is first removed from the image, and the resulting hole is then filled from the pixels surrounding the removed region, taking the consistency of the image into account during filling. Such inpainting easily loses image content or muddles it. For example, in the 2D image space, the user first selects the region of the target to be removed by clicking on the target object; that target region is then filled using the correlation of the surrounding background image.
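The single-image fill described above can be sketched in a few lines. This is a minimal diffusion-style illustration under invented names and a toy image, not the method of any particular application; real inpainting algorithms (e.g. exemplar-based or PDE-based) are far more sophisticated:

```python
import numpy as np

def naive_inpaint(image, mask, iterations=50):
    """Iteratively replace each masked (removed) pixel with the mean of
    its four axis-aligned neighbors, diffusing the surrounding
    background into the hole."""
    filled = image.astype(float).copy()
    hole = mask.astype(bool)
    for _ in range(iterations):
        up    = np.roll(filled, -1, axis=0)
        down  = np.roll(filled,  1, axis=0)
        left  = np.roll(filled, -1, axis=1)
        right = np.roll(filled,  1, axis=1)
        avg = (up + down + left + right) / 4.0
        filled[hole] = avg[hole]        # update only the removed region
    return filled

# toy example: a flat gray scene (value 100) with a bright unwanted
# object (value 255) that the user has marked for removal
img = np.full((8, 8), 100.0)
img[3:5, 3:5] = 255.0
mask = np.zeros((8, 8), dtype=bool)
mask[3:5, 3:5] = True
result = naive_inpaint(img, mask)
```

On this flat background the hole converges to the surrounding value, but on a structured background this kind of fill is exactly what produces the content loss and confusion the description criticizes.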
During the implementation of the present invention, the inventors found at least the following problem in the prior art: in a relatively simple scene, the existing image processing methods can meet the user's needs, but in a relatively complex scene they easily lose image content or cause confusion in the image content.
Summary of the invention
In view of this, embodiments of the present invention provide an image processing method and a mobile terminal that can remove a target object even in a relatively complex scene while preserving the completeness and fidelity of the image content, thereby improving the user experience.
The technical solution of the embodiments of the present invention is implemented as follows:
An embodiment of the present invention provides an image processing method, the method including:
shooting, from at least two different viewing angles, to-be-processed images that include a predetermined target object, wherein the to-be-processed images include at least a first to-be-processed image and a second to-be-processed image;
obtaining, from the first to-be-processed image, the spatial information other than the target object; and
removing the target object from the second to-be-processed image according to the spatial information, other than the target object, in the first to-be-processed image.
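As a rough illustration of the claimed idea (not an implementation of the patent: it assumes the two views have already been registered onto a common pixel grid, which is not spelled out at this point, and all names are invented), the background that the target hides in the second image can be recovered from the first image shot at a different angle:

```python
import numpy as np

def remove_with_second_view(second_img, target_mask, first_img):
    """Replace the target-object pixels of the second image with the
    spatial information (background pixels) taken from the first image,
    assumed here to be already aligned with the second image."""
    out = second_img.copy()
    out[target_mask] = first_img[target_mask]   # fill from the other view
    return out

# toy example: in view 2 the target (255) hides background (100) that
# is visible in the registered view 1
second = np.full((6, 6), 100)
second[2:4, 2:4] = 255                          # target object
first = np.full((6, 6), 100)                    # background visible here
mask = second == 255
restored = remove_with_second_view(second, mask, first)
```

Because the fill comes from real pixels of the same scene rather than from guessed surroundings, the hole is restored without inventing content, which is the advantage claimed over single-image inpainting.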
In the above embodiment, obtaining, from the first to-be-processed image, the spatial information other than the target object includes:
determining, in the first to-be-processed image, first spatial information corresponding to the target object; and
obtaining, from the first to-be-processed image, second spatial information other than the first spatial information, wherein the first spatial information includes at least a first pixel set composing the target object, and the second spatial information includes at least a second pixel set other than the first pixel set.
In the above embodiment, determining, in the first to-be-processed image, the first spatial information corresponding to the target object includes:
obtaining, in the first to-be-processed image, an operation trace that designates the target object;
determining the target object in the first to-be-processed image according to the operation trace; and
determining, in the first to-be-processed image, the first spatial information corresponding to the target object according to the target object.
In the above embodiment, removing the target object from the second to-be-processed image includes:
identifying, in the second to-be-processed image and according to the second spatial information, the to-be-filled spatial information corresponding to the first spatial information; and
removing the first spatial information from the second to-be-processed image and filling in the to-be-filled spatial information.
In the above embodiment, filling in the to-be-filled spatial information in the second to-be-processed image includes:
obtaining image parameters of the target object according to the first spatial information; and
filling in the to-be-filled spatial information in the second to-be-processed image according to the image parameters, wherein the image parameters include at least one of a resolution parameter, a speed parameter, a color saturation parameter, a white balance parameter, a contrast parameter and a sharpness parameter of the target object.
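The parameter-guided filling above can be hinted at with a simple sketch. Only brightness (mean) and contrast (standard deviation) matching are shown, as a stand-in for the full list of parameters (saturation, white balance, sharpness, etc.); function names and the toy numbers are invented for this illustration:

```python
import numpy as np

def match_fill_statistics(fill_patch, reference_region):
    """Rescale a candidate fill patch so that its brightness (mean) and
    contrast (standard deviation) match a reference region, a
    simplified form of filling according to image parameters."""
    fill = fill_patch.astype(float)
    ref = reference_region.astype(float)
    f_std = fill.std()
    scale = ref.std() / f_std if f_std > 0 else 1.0
    adjusted = (fill - fill.mean()) * scale + ref.mean()
    return np.clip(adjusted, 0, 255)

patch = np.array([[60.0, 80.0], [100.0, 120.0]])    # candidate fill
ref   = np.array([[140.0, 150.0], [160.0, 170.0]])  # pixels around the hole
out = match_fill_statistics(patch, ref)
```

Without such adjustment, pixels copied from an image shot under different exposure would leave a visible seam; matching the statistics keeps the fill consistent with its surroundings.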
An embodiment of the present invention also discloses a mobile terminal, the mobile terminal including a shooting unit, an acquiring unit and a processing unit, wherein:
the shooting unit is configured to shoot, from at least two different viewing angles, to-be-processed images that include a predetermined target object, wherein the to-be-processed images include at least a first to-be-processed image and a second to-be-processed image;
the acquiring unit is configured to obtain, from the first to-be-processed image, the spatial information other than the target object; and
the processing unit is configured to remove the target object from the second to-be-processed image according to the spatial information, other than the target object, in the first to-be-processed image.
In the above embodiment, the acquiring unit includes a determination subunit and an acquisition subunit, wherein:
the determination subunit is configured to determine, in the first to-be-processed image, first spatial information corresponding to the target object; and
the acquisition subunit is configured to obtain, from the first to-be-processed image, second spatial information other than the first spatial information, wherein the first spatial information includes at least a first pixel set composing the target object, and the second spatial information includes at least a second pixel set other than the first pixel set.
In the above embodiment, the determination subunit is specifically configured to obtain, in the first to-be-processed image, an operation trace that designates the target object; determine the target object in the first to-be-processed image according to the operation trace; and determine, in the first to-be-processed image, the first spatial information corresponding to the target object according to the target object.
In the above embodiment, the processing unit includes an identification subunit and a processing subunit, wherein:
the identification subunit is configured to identify, in the second to-be-processed image and according to the second spatial information, the to-be-filled spatial information corresponding to the first spatial information; and
the processing subunit is configured to remove the first spatial information from the second to-be-processed image and fill in the to-be-filled spatial information.
In the above embodiment, the processing subunit is specifically configured to obtain image parameters of the target object according to the first spatial information, and fill in the to-be-filled spatial information in the second to-be-processed image according to the image parameters, wherein the image parameters include at least one of a resolution parameter, a speed parameter, a color saturation parameter, a white balance parameter, a contrast parameter and a sharpness parameter of the target object.
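The shooting/acquiring/processing decomposition of the mobile terminal could be sketched as three cooperating classes. This is only an illustrative structure under invented names (the real units are hardware/firmware components, and the camera is stubbed with synthetic arrays):

```python
import numpy as np

class ShootingUnit:
    """Stands in for the camera: returns the first and second
    to-be-processed images shot from two viewing angles."""
    def capture(self):
        first = np.full((6, 6), 100)        # background fully visible
        second = np.full((6, 6), 100)
        second[2:4, 2:4] = 255              # target occludes background
        return first, second

class AcquiringUnit:
    """Obtains, from the first image, the spatial information that the
    target hides in the second image (background pixel values)."""
    def acquire(self, first, target_mask):
        return first[target_mask]

class ProcessingUnit:
    """Removes the target from the second image by filling its pixels
    with the acquired background information."""
    def remove(self, second, target_mask, background_pixels):
        out = second.copy()
        out[target_mask] = background_pixels
        return out

shooter, acquirer, processor = ShootingUnit(), AcquiringUnit(), ProcessingUnit()
first, second = shooter.capture()
mask = second == 255                        # target region in view 2
info = acquirer.acquire(first, mask)
result = processor.remove(second, mask, info)
```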
It can thus be seen that in the technical solution of the embodiments of the present invention, to-be-processed images including a predetermined target object are first shot from at least two different viewing angles, the to-be-processed images including at least a first to-be-processed image and a second to-be-processed image; the spatial information other than the target object is then obtained from the first to-be-processed image; and finally the target object is removed from the second to-be-processed image according to that spatial information. In other words, in the technical solution of the embodiments of the present invention, the target object can be removed from the second to-be-processed image according to the spatial information, other than the target object, in the first to-be-processed image, whereas the image processing methods of the prior art easily lose image content or cause confusion in it. Therefore, compared with the prior art, the image processing method and mobile terminal proposed by the embodiments of the present invention can remove a target object even in a relatively complex scene while preserving the completeness and fidelity of the image content, thereby improving the user experience; moreover, the technical solution of the embodiments of the present invention is simple and convenient to implement, easy to popularize, and widely applicable.
Brief description of the drawings
Fig. 1 is a schematic diagram of the hardware structure of an optional mobile terminal 100 implementing the embodiments of the present invention;
Fig. 2 is a schematic diagram of the wireless communication system of the mobile terminal 100 shown in Fig. 1;
Fig. 3 is a schematic flowchart of the image processing method provided by Embodiment 1 of the present invention;
Fig. 4 is a schematic structural diagram of the first to-be-processed image and the second to-be-processed image in the embodiments of the present invention;
Fig. 5 is a schematic structural diagram of to-be-processed images shot from different viewing angles in the embodiments of the present invention;
Fig. 6 is a schematic flowchart of a method for obtaining all the spatial information other than the target object in the embodiments of the present invention;
Fig. 7 is a schematic structural diagram of a first display interface of the mobile terminal in the embodiments of the present invention;
Fig. 8 is a schematic structural diagram of a second display interface of the mobile terminal in the embodiments of the present invention;
Fig. 9 is a schematic flowchart of a method for removing the target object in the embodiments of the present invention;
Fig. 10 is a first schematic structural diagram of the mobile terminal provided by Embodiment 2 of the present invention;
Fig. 11 is a second schematic structural diagram of the mobile terminal provided by Embodiment 3 of the present invention.
Specific embodiments
It should be appreciated that the specific embodiments described herein are merely illustrative of the technical solution of the present invention and are not intended to limit its scope of protection.
The mobile terminal of the embodiments of the present invention is now described with reference to the drawings. In the following description, suffixes used to denote elements, such as "module", "part" or "unit", are used only to facilitate the description of the invention and have no specific meaning in themselves; "module" and "part" can therefore be used interchangeably.
Mobile terminals can be implemented in various forms. For example, the terminals described in the present invention can include mobile terminals such as mobile phones, smart phones, notebook computers, digital broadcast receivers, personal digital assistants (PDA), tablet computers (PAD), portable multimedia players (PMP) and navigation devices, as well as fixed terminals such as digital TVs and desktop computers. Hereinafter, it is assumed that the terminal is a mobile terminal; however, those skilled in the art will understand that, except for elements used particularly for mobile purposes, the construction according to the embodiments of the present invention can also be applied to terminals of the fixed type.
Fig. 1 is a schematic diagram of the hardware structure of a mobile terminal 100 implementing the embodiments of the present invention. As shown in Fig. 1, the mobile terminal 100 can include a wireless communication unit 110, an audio/video (A/V) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180 and a power supply unit 190, among others. Fig. 1 shows a mobile terminal 100 with various components, but it should be understood that not all the illustrated components are required; more or fewer components can be implemented instead. The elements of the mobile terminal 100 are discussed in detail below.
The wireless communication unit 110 typically includes one or more components that allow radio communication between the mobile terminal 100 and a wireless communication system or network. For example, the wireless communication unit 110 can include at least one of a mobile communication module 111 and a wireless internet module 112.
The mobile communication module 111 sends radio signals to and/or receives radio signals from at least one of a base station (for example, an access point, a node B, etc.), an external terminal and a server. Such radio signals can include voice call signals, video call signals, or various types of data sent and/or received according to text and/or multimedia messages.
The wireless internet module 112 supports wireless internet access for the mobile terminal 100 and can be internally or externally coupled to the terminal. The wireless internet access technologies involved can include wireless LAN (WLAN), wireless fidelity (Wi-Fi), wireless broadband (WiBro), worldwide interoperability for microwave access (WiMAX), high-speed downlink packet access (HSDPA), etc.
The A/V input unit 120 is used to receive audio or video signals and can include a camera 121, which processes the image data of still pictures or video obtained by the image capture device in a video capture mode or an image capture mode. The processed image frames can be displayed on a display unit 151, stored in the memory 160 (or another storage medium), or transmitted via the wireless communication unit 110; two or more cameras 121 can be provided according to the construction of the mobile terminal 100. The processed audio (voice) data can, in a telephone call mode, be converted into a format that can be sent to a mobile communication base station via the mobile communication module 111 for output.
The user input unit 130 can generate key input data according to commands input by the user to control various operations of the mobile terminal 100. The user input unit 130 allows the user to input various types of information and can include a keyboard, a dome switch, a touch pad (for example, a touch-sensitive component that detects changes in resistance, pressure, capacitance, etc. caused by touch), a jog wheel, a jog stick, etc. In particular, when the touch pad is superimposed on the display unit 151 in the form of a layer, a touch screen can be formed.
The sensing unit 140 detects the current state of the mobile terminal 100 (for example, its open or closed state), the position of the mobile terminal 100, the presence or absence of user contact with the mobile terminal 100 (that is, touch input), the orientation of the mobile terminal 100, the acceleration or deceleration movement and direction of the mobile terminal 100, etc., and generates commands or signals for controlling the operation of the mobile terminal 100. For example, when the mobile terminal 100 is implemented as a slide-type mobile phone, the sensing unit 140 can sense whether the slide-type phone is open or closed. In addition, the sensing unit 140 can detect whether the power supply unit 190 supplies power or whether the interface unit 170 is coupled to an external device.
The interface unit 170 serves as an interface through which at least one external device can be connected to the mobile terminal 100. For example, the external device can include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port (typically a universal serial bus (USB) port), a port for connecting a device with an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, etc. The identification module can store various information for verifying a user's use of the mobile terminal 100 and can include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), etc. In addition, a device with an identification module (hereinafter referred to as an "identifying device") can take the form of a smart card; the identifying device can therefore be connected to the mobile terminal 100 via a port or other connection means.
The interface unit 170 can be used to receive input (for example, data information, power, etc.) from an external device and transfer the received input to one or more elements in the mobile terminal 100, or to transmit data between the mobile terminal 100 and an external device.
In addition, when the mobile terminal 100 is connected to an external cradle, the interface unit 170 can serve as a path through which power is supplied from the cradle to the mobile terminal 100, or as a path through which various command signals input from the cradle are transferred to the mobile terminal 100. The various command signals or the power input from the cradle can serve as signals for recognizing whether the mobile terminal 100 is accurately seated on the cradle.
The output unit 150 is configured to provide output signals in a visual, audible and/or tactile manner (for example, audio signals, video signals, alarm signals, vibration signals, etc.) and can include the display unit 151 and an audio output module 152, among others.
The display unit 151 can display the information processed in the mobile terminal 100. For example, when the mobile terminal 100 is in a telephone call mode, the display unit 151 can display a user interface (UI) or graphical user interface (GUI) related to the call or other communication (for example, text messaging, multimedia file downloading, etc.). When the mobile terminal 100 is in a video call mode or an image capture mode, the display unit 151 can display captured and/or received images, and a UI or GUI showing video or images and related functions, etc.
Meanwhile, when the display unit 151 and the touch pad are superimposed on each other in the form of layers to form a touch screen, the display unit 151 can serve as both an input device and an output device. The display unit 151 can include at least one of a liquid crystal display (LCD), a thin film transistor LCD (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, a three-dimensional (3D) display, etc. Some of these displays can be constructed to be transparent so as to allow viewing from the outside; these can be called transparent displays, a typical example being a TOLED (transparent organic light-emitting diode) display. Depending on the desired implementation, the mobile terminal 100 can include two or more display units (or other display devices); for example, the mobile terminal 100 can include an external display unit (not shown) and an internal display unit (not shown). The touch screen can be used to detect touch input pressure as well as touch input position and touch input area.
The audio output module 152 can, when the mobile terminal 100 is in a call signal reception mode, a call mode, a recording mode, a speech recognition mode, a broadcast reception mode or the like, convert the audio data received by the wireless communication unit 110 or stored in the memory 160 into an audio signal and output it as sound. Moreover, the audio output module 152 can provide audio output related to a specific function performed by the mobile terminal 100 (for example, a call signal reception sound, a message reception sound, etc.). The audio output module 152 can include a loudspeaker, a buzzer, etc.
The memory 160 can store software programs for the processing and control operations performed by the controller 180, or temporarily store data that has been or will be output (for example, a phone book, messages, still images, video, etc.). Moreover, the memory 160 can store data on the various modes of vibration and audio signals output when a touch is applied to the touch screen.
The memory 160 can include at least one type of storage medium, including flash memory, hard disk, multimedia card, card-type memory (for example, SD or DX memory, etc.), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, magnetic disk, optical disk, etc. Moreover, the mobile terminal 100 can cooperate, through a network connection, with a network storage device that performs the storage function of the memory 160.
The controller 180 typically controls the overall operation of the mobile terminal 100. For example, the controller 180 performs the control and processing related to voice calls, data communication, video calls, etc. In addition, the controller 180 can include a multimedia module 181 for reproducing or playing back multimedia data; the multimedia module 181 can be constructed within the controller 180 or constructed separately from it. The controller 180 can also perform pattern recognition processing to recognize handwriting input or picture drawing input performed on the touch screen as characters or images.
The power supply unit 190 receives external power or internal power under the control of the controller 180 and provides the appropriate power required to operate each element and component.
The various embodiments described herein can be implemented in a computer-readable medium using, for example, computer software, hardware or any combination thereof. For hardware implementation, the embodiments described herein can be implemented by using at least one of an application-specific integrated circuit (ASIC), a digital signal processor (DSP), a digital signal processing device (DSPD), a programmable logic device (PLD), a field-programmable gate array (FPGA), a processor, a controller, a microcontroller, a microprocessor, and an electronic unit designed to perform the functions described herein; in some cases, such embodiments can be implemented in the controller 180. For software implementation, embodiments such as processes or functions can be implemented with separate software modules that allow at least one function or operation to be performed. The software code can be implemented by a software application (or program) written in any appropriate programming language, and can be stored in the memory 160 and executed by the controller 180.
So far, the mobile terminal 100 has been described in terms of its functions. Hereinafter, for the sake of brevity, a slide-type mobile terminal 100 among the various types of mobile terminal 100, such as the folding type, bar type, swing type and slide type, is taken as an example. The present invention can therefore be applied to any type of mobile terminal 100 and is not limited to the slide-type mobile terminal 100.
Fig. 2 is a schematic diagram of the wireless communication system of the mobile terminal 100 shown in Fig. 1. The communication system in which the mobile terminal 100 according to the present invention can operate is now described with reference to Fig. 2.
Such a communication system can use different air interfaces and/or physical layers. For example, the air interfaces used by the communication system include frequency division multiple access (FDMA), time division multiple access (TDMA), code division multiple access (CDMA) and the universal mobile telecommunications system (UMTS) (in particular, long term evolution (LTE)), the global system for mobile communications (GSM), etc. As a non-limiting example, the following description relates to a CDMA communication system, but such teaching applies equally to other types of system.
Referring to Fig. 2, the CDMA wireless communication system can include a plurality of mobile terminals 100, a plurality of base stations (BS) 270, base station controllers (BSC) 275 and a mobile switching center (MSC) 280. The MSC 280 is configured to form an interface with a public switched telephone network (PSTN) 290. The MSC 280 is also configured to form an interface with the BSCs 275, which can be coupled to the base stations 270 via backhaul links. The backhaul links can be constructed according to any of several known interfaces, including, for example, E1/T1, ATM, IP, PPP, frame relay, HDSL, ADSL or xDSL. It will be understood that the system as shown in Fig. 2 can include a plurality of BSCs 275.
Each BS 270 can serve one or more sectors (or regions), each sector covered by an omnidirectional antenna or an antenna pointing in a specific direction radially away from the BS 270. Alternatively, each sector can be covered by two or more antennas for diversity reception. Each BS 270 can be constructed to support a plurality of frequency assignments, each frequency assignment having a specific spectrum (for example, 1.25 MHz, 5 MHz, etc.).
The intersection of a sector and a frequency assignment can be referred to as a CDMA channel. A BS 270 can also be referred to as a base transceiver subsystem (BTS) or by another equivalent term. In this case, the term "base station" can be used to broadly denote a single BSC 275 and at least one BS 270. A base station can also be referred to as a "cell site". Alternatively, each sector of a specific BS 270 can be referred to as a plurality of cell sites.
As a typical operation of the wireless communication system, the BSs 270 receive reverse link signals from various mobile terminals 100. The mobile terminals 100 typically participate in calls, messaging and other types of communication. Each reverse link signal received by a specific base station 270 is processed within that BS 270, and the resulting data is forwarded to the associated BSC 275. The BSC provides call resource allocation and mobility management functions, including the coordination of soft handover procedures between the BSs 270. The BSC 275 also routes the received data to the MSC 280, which provides additional routing services for forming an interface with the PSTN 290. Similarly, the PSTN 290 forms an interface with the MSC 280, the MSC forms an interface with the BSCs 275, and the BSCs 275 correspondingly control the BSs 270 to send forward link signals to the mobile terminals 100.
The mobile communication module 111 of the wireless communication unit 110 accesses the mobile communication network (such as a 2G/3G/4G mobile communication network) on the basis of the necessary data of the mobile communication network built into the mobile terminal (including user identification information and authentication information), so as to transmit mobile communication data (including uplink and downlink mobile communication data) for services such as web browsing and network multimedia playing for the mobile terminal user.
The wireless internet module 112 of the wireless communication unit 110 implements the function of a wireless hotspot by running the hotspot-related protocol functions. The hotspot supports access by a plurality of mobile terminals (any mobile terminal other than this one) and, by multiplexing the mobile communication connection between the mobile communication module 111 and the mobile communication network, transmits mobile communication data (including uplink and downlink mobile communication data) for services such as web browsing and network multimedia playing for those users. Since the mobile terminal essentially multiplexes the mobile communication connection between itself and the communication network to transmit this data, the mobile communication traffic consumed by the mobile terminal is counted toward the mobile terminal's communication tariff by the charging entity on the network side, thereby consuming the data traffic included in the tariff to which the mobile terminal has subscribed.
Based on the hardware configuration of the above-described mobile terminal 100 and the communication system, the embodiments of the method of the present invention are proposed.
Embodiment One
Fig. 3 is a schematic flowchart of the image processing method provided by Embodiment One of the present invention. As shown in Fig. 3, the image processing method may include the following steps:
Step 301: shoot, at at least two different viewing angles, pending images including a predetermined destination object; wherein the pending images at least include a first pending image and a second pending image.
In a specific embodiment of the present invention, the user may use the mobile terminal to shoot, at at least two different viewing angles, pending images including a predetermined destination object; wherein the pending images at least include a first pending image and a second pending image. Specifically, the first pending image refers to a pending image from which, as predetermined by the user, the destination object does not need to be removed; the second pending image refers to a pending image from which, as predetermined by the user, the destination object needs to be removed. Preferably, the first pending image may be a pending image including the predetermined destination object shot by the user with the mobile terminal at a first viewing angle; the second pending image may be a pending image including the predetermined destination object shot by the user with the mobile terminal at a second viewing angle.
In general, the shooting parameters used by the mobile terminal determine, to a certain extent, the image parameters of the resulting pending image. Therefore, to make the first pending image and the second pending image as similar as possible in effect, the mobile terminal may adjust the shooting parameters according to the image parameters of the first pending image when shooting the second pending image; that is, in a specific embodiment of the present invention, the shooting parameters for shooting the second pending image are obtained according to the image parameters of the first pending image. For example, the image parameters of a pending image may include its color saturation parameter, sensitivity parameter and white balance parameter, e.g. a color saturation parameter of 70%, a sensitivity parameter of 200 and a white balance parameter of daylight. The mobile terminal may then obtain the shooting parameters from the image parameters of the first pending image and shoot the second pending image with them; for example, the color saturation parameter is set to 70%, the sensitivity parameter is set to 200 and the white balance parameter is set to daylight. In this way, by setting the shooting parameters, the mobile terminal can keep the image parameters of the second pending image substantially consistent with those of the first pending image. Moreover, the embodiment of the present invention supports setting various image parameters: the more image parameters the mobile terminal sets, the higher the similarity between the second pending image and the first pending image is likely to be, which benefits the image quality after the destination object is removed from the second pending image and also improves the image processing capability of the device.
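The parameter hand-off described above can be sketched as follows. This is a minimal illustration only; the parameter names and values are invented, and the embodiment merely requires that the parameters used to shoot the second pending image be derived from those measured on the first.

```python
# Hedged sketch: derive shooting parameters for the second pending image
# from the image parameters measured on the first pending image.

def derive_shooting_params(first_image_params):
    """Carry the settable image parameters of the first pending image
    over into the shooting parameters for the second pending image."""
    # Only parameters a camera can actually set are carried over
    # (names are illustrative, not from the patent).
    settable = ("color_saturation", "sensitivity", "white_balance")
    return {k: first_image_params[k] for k in settable if k in first_image_params}

# Example from the text: 70% saturation, sensitivity 200, daylight white balance.
first_params = {"color_saturation": 0.70, "sensitivity": 200, "white_balance": "daylight"}
second_shot_params = derive_shooting_params(first_params)
```

The more of these parameters are matched, the closer the two shots should look, which is exactly the similarity argument the paragraph above makes.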
Preferably, in a specific embodiment of the present invention, the first pending image and the second pending image may be images of any format, such as the JPEG (Joint Photographic Experts Group) format, the BMP (Bitmap) format, etc.
Fig. 4 is a schematic structural diagram of the first pending image and the second pending image in the embodiment of the present invention, where Fig. 4(a) shows the first pending image and Fig. 4(b) shows the second pending image. As shown in Fig. 4(a) and Fig. 4(b), suppose the destination object is a head statue. If the head statue were removed directly from the first pending image, the removal often could not be performed well, because the background objects behind the statue are relatively complex. Therefore, in a specific embodiment of the present invention, the user may use the mobile terminal to shoot the first pending image and the second pending image at a first viewing angle and a second viewing angle respectively, then obtain the spatial information other than the head statue in the first pending image, and finally remove the head statue from the second pending image according to that spatial information. Thus, the technical solution of the embodiment of the present invention can remove the destination object even in a relatively complex scene while preserving the completeness and fidelity of the image content, thereby improving user experience.
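The two-view idea can be illustrated numerically: the statue occludes different background pixels in each view, so pixels hidden in the second view are visible in the first and can be used to fill the hole. The toy arrays below are invented, and the sketch assumes the two views are already registered pixel-for-pixel; a real implementation would first have to align them.

```python
import numpy as np

# Invented 5x5 "background" scene; each cell holds a distinct value.
background = np.arange(25, dtype=float).reshape(5, 5)

view1 = background.copy()
mask1 = np.zeros((5, 5), dtype=bool); mask1[1:3, 1:3] = True   # statue in view 1
view1[mask1] = -1.0                                            # -1 marks statue pixels

view2 = background.copy()
mask2 = np.zeros((5, 5), dtype=bool); mask2[2:4, 2:4] = True   # statue shifted in view 2
view2[mask2] = -1.0

# Fill view 2's statue region from view 1 wherever view 1 sees background.
recoverable = mask2 & ~mask1
result = view2.copy()
result[recoverable] = view1[recoverable]
```

Here three of the four occluded pixels in view 2 are recovered from view 1; the pixel hidden in both views (at (2, 2)) stays unrecovered, which is why shooting at sufficiently different viewing angles matters.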
Step 302: obtain the spatial information other than the destination object in the first pending image.
In a specific embodiment of the present invention, after shooting, at at least two different viewing angles, the pending images including the predetermined destination object, the mobile terminal may obtain the spatial information other than the destination object in the first pending image. Specifically, the mobile terminal may shoot the first pending image at a single viewing angle or at multiple viewing angles. Fig. 5 is a schematic structural diagram of pending images shot at different viewing angles in the embodiment of the present invention, where Fig. 5(a) shows the pending image shot at the first viewing angle and Fig. 5(b) shows the pending image shot at the second viewing angle. As shown in Fig. 5(a) and Fig. 5(b), in a complex scene the background objects behind the destination object are often relatively complex. Assuming the destination object in the pending images shown in Fig. 5(a) and Fig. 5(b) is a head statue, the mobile terminal may obtain the spatial information other than the head statue in each first pending image.
Fig. 6 is a schematic flowchart of an implementation for obtaining all the spatial information other than the destination object in the embodiment of the present invention. As shown in Fig. 6, the method for obtaining all the spatial information other than the destination object in the first pending image includes the following steps:
Step 302a: determine first spatial information corresponding to the destination object in the first pending image.
In a specific embodiment of the present invention, when obtaining all the spatial information other than the destination object in the first pending image, the mobile terminal may first determine the first spatial information corresponding to the destination object in the first pending image. Specifically, the mobile terminal may first determine the destination object in the first pending image, and then determine the first spatial information corresponding to the destination object in the first pending image. More specifically, the mobile terminal may obtain an operation trace for determining the destination object in the first pending image, determine the destination object in the first pending image according to the operation trace, and then determine the first spatial information corresponding to the destination object in the first pending image according to the destination object. Preferably, the mobile terminal may determine the destination object in either of the following two ways: manually determining the destination object, or automatically determining the destination object.
1. Manually selecting the destination object
Fig. 7 is a schematic structural diagram of a first display interface of the mobile terminal in the embodiment of the present invention. As shown in Fig. 7, after the mobile terminal shoots, at at least two different viewing angles, the pending images including the predetermined destination object, the mobile terminal may first determine a target area according to the operation trace of a first operation sent by the user; it may then identify the destination object within the target area and present the contour of the identified destination object on the display interface of the mobile terminal. In addition, the mobile terminal may also determine the destination object in the first pending image according to the operation trace of a second operation sent by the user. For example, when the second operation received by the mobile terminal is a touch operation, the mobile terminal first detects the position of the touch point, then identifies the object indicated by the touch operation based on that position, takes the object indicated by the touch operation as the destination object, and finally determines the first spatial information corresponding to the destination object in the first pending image according to the destination object. In this way, when the user taps within the contour of an object, the mobile terminal can identify the tapped object as the destination object.
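The tap-to-select path above amounts to a point-in-contour test against the contours already identified in the target area. The sketch below implements this with a standard ray-casting point-in-polygon test; the contour coordinates are invented, and a production implementation would more likely use a library routine (e.g. OpenCV's `pointPolygonTest`).

```python
# Hedged sketch of manual selection: a tap at (x, y) selects the first
# identified contour that contains the touch point.

def point_in_polygon(pt, poly):
    """Ray-casting test: count edge crossings to the right of pt."""
    x, y = pt
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):                       # edge spans the tap's row
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def select_destination_object(touch_point, contours):
    """Return the index of the contour containing the tap, or None."""
    for idx, contour in enumerate(contours):
        if point_in_polygon(touch_point, contour):
            return idx
    return None

# Invented contours: the first could stand in for the head statue.
contours = [[(0, 0), (4, 0), (4, 4), (0, 4)],
            [(10, 10), (14, 10), (14, 14), (10, 14)]]
selected = select_destination_object((2, 2), contours)   # tap inside the first contour
```

A tap outside every contour selects nothing, matching the behaviour where only a tap inside an object's contour marks it as the destination object.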
2. Automatically selecting the destination object
Fig. 8 is a schematic structural diagram of a second display interface of the mobile terminal in the embodiment of the present invention. As shown in Fig. 8, after the mobile terminal shoots, at at least two different viewing angles, the pending images including the predetermined destination object, the user may first indicate a target area on the mobile terminal through a first operation, and the mobile terminal then identifies the objects within that area. Alternatively, when the user does not indicate a target area, the mobile terminal may automatically identify objects over the entire display interface, present the contours of the identified objects on the display interface, and then take some or all of the identified objects as the destination object. Or, when no second operation from the user is detected within a preset time after the contours of the identified objects are presented, the mobile terminal takes some or all of the identified objects as the destination object. Specifically, in a specific embodiment of the present invention, when automatically identifying the destination object, the mobile terminal may perform object matching based on a preset shape library to determine a destination object of interest, such as a person; or the mobile terminal may filter out, based on object area, objects that do not need attention, such as cartons, desks and other items.
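The two automatic filters just described, shape-library matching and area-based filtering, can be sketched as a simple candidate filter. The detector output format, shape labels and area threshold below are all invented; the patent only names the two filtering criteria.

```python
# Hedged sketch of automatic selection: keep detector candidates whose shape
# matches a preset shape library and whose area is large enough to matter.

SHAPE_LIBRARY = {"person", "statue"}    # assumed preset shapes of interest
MIN_AREA = 500                          # assumed area threshold, in pixels

def auto_select(candidates):
    """Filter out off-library shapes (cartons, desks, ...) and small objects."""
    return [c for c in candidates
            if c["shape"] in SHAPE_LIBRARY and c["area"] >= MIN_AREA]

candidates = [
    {"shape": "statue", "area": 3200},   # the head statue: kept
    {"shape": "carton", "area": 2000},   # wrong shape: filtered out
    {"shape": "person", "area": 120},    # too small: filtered out
]
selected_objects = auto_select(candidates)
```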
Step 302b: obtain second spatial information other than the first spatial information in the first pending image; wherein the first spatial information at least includes a first pixel point set constituting the destination object, and the second spatial information at least includes a second pixel point set other than the first pixel point set.
In a specific embodiment of the present invention, after the mobile terminal determines the first spatial information corresponding to the destination object in the first pending image, it may obtain the second spatial information other than the first spatial information in the first pending image; wherein the first spatial information at least includes the first pixel point set constituting the destination object, and the second spatial information at least includes the second pixel point set other than the first pixel point set. For example, assuming the destination object in the first pending image is a head statue, after the mobile terminal obtains the first spatial information corresponding to the head statue in the first pending image, it may obtain, in the first pending image, the second spatial information other than that first spatial information.
Step 303: remove the destination object from the second pending image according to the spatial information other than the destination object in the first pending image.
In a specific embodiment of the present invention, after the mobile terminal obtains the spatial information other than the destination object in the first pending image, it may remove the destination object from the second pending image according to that spatial information. Fig. 9 is a schematic flowchart of an implementation of removing the destination object in the embodiment of the present invention. As shown in Fig. 9, the method for removing the destination object from the second pending image may include the following steps:
Step 303a: identify, according to the second spatial information, the to-be-filled spatial information corresponding to the first spatial information in the second pending image.
In a specific embodiment of the present invention, after the mobile terminal determines the first spatial information corresponding to the destination object in the first pending image and obtains the second spatial information other than the first spatial information in the first pending image, the mobile terminal may identify, according to the second spatial information, the to-be-filled spatial information corresponding to the first spatial information in the second pending image.
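The patent does not fix how the correspondence between the two views is established. One hedged sketch is to assume the views are registered up to a known integer pixel offset, so the object mask from the first pending image can be mapped into the second to mark the to-be-filled region; a real implementation would estimate the alignment itself, for example by feature matching and a homography.

```python
import numpy as np

# Hedged sketch of step 303a under an assumed, known (dx, dy) registration.

def map_mask(mask, dx, dy):
    """Shift a boolean object mask by (dx, dy) to mark the corresponding
    to-be-filled region in the second image; out-of-frame pixels are dropped."""
    h, w = mask.shape
    ys, xs = np.nonzero(mask)
    ys2, xs2 = ys + dy, xs + dx
    keep = (ys2 >= 0) & (ys2 < h) & (xs2 >= 0) & (xs2 < w)
    out = np.zeros_like(mask)
    out[ys2[keep], xs2[keep]] = True
    return out

first_mask = np.zeros((5, 5), dtype=bool)
first_mask[1:3, 1:3] = True                   # object in the first pending image
to_fill = map_mask(first_mask, dx=1, dy=1)    # object appears shifted in view 2
```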
Step 303b: remove the first spatial information from the second pending image and fill in the to-be-filled spatial information.
In a specific embodiment of the present invention, after the mobile terminal identifies, according to the second spatial information, the to-be-filled spatial information corresponding to the first spatial information in the second pending image, the mobile terminal may remove the first spatial information from the second pending image and fill in the to-be-filled spatial information.
Specifically, in a specific embodiment of the present invention, the mobile terminal may obtain the image parameters of the destination object according to the first spatial information, and then fill in the to-be-filled spatial information in the second pending image according to the image parameters; wherein the image parameters include at least one of: a resolution parameter of the destination object, a sensitivity parameter, a color saturation parameter, a white balance parameter, a contrast parameter and a sharpness parameter. Of course, other possible parameters may also be included; any parameter an image possesses may fall within the scope of the image parameters, and the present invention does not limit this.
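A minimal sketch of the remove-and-fill step: the object pixels in the second pending image are replaced with the co-located background pixels from the first, scaled so a simple image statistic stays consistent. Mean brightness here stands in for the listed saturation/contrast/sharpness parameters; the data, the scaling rule and the pixel-aligned views are all assumptions.

```python
import numpy as np

# Hedged sketch of step 303b, assuming the two views are pixel-aligned.

def remove_and_fill(second_img, hole_mask, first_img):
    """Replace the object pixels of the second image with the corresponding
    background pixels of the first, matching mean brightness to the surround."""
    out = second_img.astype(float).copy()
    patch = first_img.astype(float)[hole_mask]
    surround = out[~hole_mask]                      # background around the hole
    if patch.mean() > 0:
        patch = patch * (surround.mean() / patch.mean())
    out[hole_mask] = patch
    return out

second = np.full((4, 4), 100.0)         # invented second pending image
first = np.full((4, 4), 50.0)           # first view is darker overall
hole = np.zeros((4, 4), dtype=bool)
hole[1:3, 1:3] = True                   # region where the object was removed
filled = remove_and_fill(second, hole, first)
```

The brightness match illustrates why the image parameters are obtained first: without it, the transplanted patch would visibly differ from its surround even though its content is correct.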
According to the image processing method proposed by the embodiment of the present invention, pending images including a predetermined destination object are first shot at at least two different viewing angles, wherein the pending images at least include a first pending image and a second pending image; the spatial information other than the destination object is then obtained in the first pending image; finally, the destination object is removed from the second pending image according to the spatial information other than the destination object in the first pending image. That is, in the technical solution of the embodiment of the present invention, the destination object can be removed from the second pending image according to the spatial information other than the destination object in the first pending image, whereas image processing methods in the prior art easily cause loss or confusion of image content. Therefore, compared with the prior art, the image processing method proposed by the embodiment of the present invention can remove the destination object even in a complex scene while preserving the completeness and fidelity of the image content, thereby improving user experience; moreover, the technical solution of the embodiment of the present invention is simple and convenient to implement, easy to popularize, and widely applicable.
Embodiment Two
Figure 10 is a first schematic structural diagram of the mobile terminal provided by Embodiment Two of the present invention. As shown in Figure 10, the mobile terminal includes: a shooting unit 1001, an acquiring unit 1002 and a processing unit 1003; wherein,
the shooting unit 1001 is configured to shoot, at at least two different viewing angles, pending images including a predetermined destination object; wherein the pending images at least include a first pending image and a second pending image.
In a specific embodiment of the present invention, the user may use the shooting unit 1001 of the mobile terminal to shoot, at at least two different viewing angles, pending images including a predetermined destination object; wherein the pending images at least include a first pending image and a second pending image. Specifically, the first pending image refers to a pending image from which, as predetermined by the user, the destination object does not need to be removed; the second pending image refers to a pending image from which, as predetermined by the user, the destination object needs to be removed. Preferably, the first pending image may be a pending image including the predetermined destination object shot by the user with the mobile terminal at a first viewing angle; the second pending image may be a pending image including the predetermined destination object shot by the user with the mobile terminal at a second viewing angle.
The acquiring unit 1002 is configured to obtain the spatial information other than the destination object in the first pending image.
In a specific embodiment of the present invention, after the shooting unit 1001 shoots, at at least two different viewing angles, the pending images including the predetermined destination object, the acquiring unit 1002 may obtain the spatial information other than the destination object in the first pending image.
The processing unit 1003 is configured to remove the destination object from the second pending image according to the spatial information other than the destination object in the first pending image.
In a specific embodiment of the present invention, after the acquiring unit 1002 obtains the spatial information other than the destination object in the first pending image, the processing unit 1003 may remove the destination object from the second pending image according to that spatial information.
Specifically, in a specific embodiment of the present invention, the mobile terminal may obtain the image parameters of the destination object according to the first spatial information, and then fill in the to-be-filled spatial information in the second pending image according to the image parameters; wherein the image parameters include at least one of: a resolution parameter of the destination object, a sensitivity parameter, a color saturation parameter, a white balance parameter, a contrast parameter and a sharpness parameter. Of course, other possible parameters may also be included; any parameter an image possesses may fall within the scope of the image parameters, and the present invention does not limit this.
According to the mobile terminal proposed by the embodiment of the present invention, pending images including a predetermined destination object are first shot at at least two different viewing angles, wherein the pending images at least include a first pending image and a second pending image; the spatial information other than the destination object is then obtained in the first pending image; finally, the destination object is removed from the second pending image according to the spatial information other than the destination object in the first pending image. That is, in the technical solution of the embodiment of the present invention, the destination object can be removed from the second pending image according to the spatial information other than the destination object in the first pending image, whereas image processing methods in the prior art easily cause loss or confusion of image content. Therefore, compared with the prior art, the mobile terminal proposed by the embodiment of the present invention can remove the destination object even in a complex scene while preserving the completeness and fidelity of the image content, thereby improving user experience; moreover, the technical solution of the embodiment of the present invention is simple and convenient to implement, easy to popularize, and widely applicable.
Embodiment Three
Figure 11 is a second schematic structural diagram of the mobile terminal provided by Embodiment Three of the present invention. As shown in Figure 11, the acquiring unit 1002 includes: a determination subunit 10021 and an acquisition subunit 10022; wherein,
the determination subunit 10021 is configured to determine the first spatial information corresponding to the destination object in the first pending image.
In a specific embodiment of the present invention, the determination subunit 10021 may first determine the first spatial information corresponding to the destination object in the first pending image. Specifically, the determination subunit 10021 may first determine the destination object in the first pending image, and then determine the first spatial information corresponding to the destination object in the first pending image. More specifically, the determination subunit 10021 may obtain an operation trace for determining the destination object in the first pending image, determine the destination object in the first pending image according to the operation trace, and determine the first spatial information corresponding to the destination object in the first pending image according to the destination object. Preferably, the determination subunit 10021 may determine the destination object in either of the following two ways: manually determining the destination object, or automatically determining the destination object.
The acquisition subunit 10022 is configured to obtain the second spatial information other than the first spatial information in the first pending image; wherein the first spatial information at least includes the first pixel point set constituting the destination object, and the second spatial information at least includes the second pixel point set other than the first pixel point set.
In a specific embodiment of the present invention, after the determination subunit 10021 determines the first spatial information corresponding to the destination object in the first pending image, the acquisition subunit 10022 may obtain the second spatial information other than the first spatial information in the first pending image. For example, assuming the destination object in the first pending image is a head statue, after the acquisition subunit 10022 obtains the first spatial information corresponding to the head statue in the first pending image, it may obtain, in the first pending image, the second spatial information other than that first spatial information; wherein the first spatial information at least includes the first pixel point set constituting the destination object, and the second spatial information at least includes the second pixel point set other than the first pixel point set.
Further, the determination subunit 10021 is specifically configured to obtain an operation trace for determining the destination object in the first pending image; determine the destination object in the first pending image according to the operation trace; and determine the first spatial information corresponding to the destination object in the first pending image according to the destination object.
Further, the processing unit 1003 includes: an identification subunit 10031 and a processing subunit 10032; wherein,
the identification subunit 10031 is configured to identify, according to the second spatial information, the to-be-filled spatial information corresponding to the first spatial information in the second pending image.
In a specific embodiment of the present invention, after the determination subunit 10021 determines the first spatial information corresponding to the destination object in the first pending image and the acquisition subunit 10022 obtains the second spatial information other than the first spatial information in the first pending image, the identification subunit 10031 may identify, according to the second spatial information, the to-be-filled spatial information corresponding to the first spatial information in the second pending image.
The processing subunit 10032 is configured to remove the first spatial information from the second pending image and fill in the to-be-filled spatial information.
Further, the processing subunit 10032 is specifically configured to obtain the image parameters of the destination object according to the first spatial information, and to fill in the to-be-filled spatial information in the second pending image according to the image parameters; wherein the image parameters include at least one of: a resolution parameter of the destination object, a sensitivity parameter, a color saturation parameter, a white balance parameter, a contrast parameter and a sharpness parameter.
In a specific embodiment of the present invention, after the identification subunit 10031 identifies, according to the second spatial information, the to-be-filled spatial information corresponding to the first spatial information in the second pending image, the processing subunit 10032 may remove the first spatial information from the second pending image and fill in the to-be-filled spatial information.
Specifically, in a specific embodiment of the present invention, the processing subunit 10032 may obtain the image parameters of the destination object according to the first spatial information, and then fill in the to-be-filled spatial information in the target region according to the image parameters; wherein the image parameters include at least one of: a resolution parameter of the destination object, a sensitivity parameter, a color saturation parameter, a white balance parameter, a contrast parameter and a sharpness parameter. Of course, other possible parameters may also be included; any parameter an image possesses may fall within the scope of the image parameters, and the present invention does not limit this.
According to the mobile terminal proposed by the embodiment of the present invention, pending images including a predetermined destination object are first shot at at least two different viewing angles, wherein the pending images at least include a first pending image and a second pending image; the spatial information other than the destination object is then obtained in the first pending image; finally, the destination object is removed from the second pending image according to the spatial information other than the destination object in the first pending image. That is, in the technical solution of the embodiment of the present invention, the destination object can be removed from the second pending image according to the spatial information other than the destination object in the first pending image, whereas image processing methods in the prior art easily cause loss or confusion of image content. Therefore, compared with the prior art, the mobile terminal proposed by the embodiment of the present invention can remove the destination object even in a complex scene while preserving the completeness and fidelity of the image content, thereby improving user experience; moreover, the technical solution of the embodiment of the present invention is simple and convenient to implement, easy to popularize, and widely applicable.
The shooting unit 1001, the acquiring unit 1002 and the processing unit 1003 in the mobile terminal provided by the embodiment of the present invention may be implemented by a processor in the mobile terminal, and of course may also be implemented by specific logic circuits. In a specific embodiment, the processor may be a central processing unit (CPU), a microprocessor (MPU), a digital signal processor (DSP), a field programmable gate array (FPGA), or the like.
It should be understood that references throughout the specification to "one embodiment" or "an embodiment" mean that a particular feature, structure or characteristic related to the embodiment is included in at least one embodiment of the present invention. Thus, occurrences of "in one embodiment" or "in an embodiment" throughout the specification do not necessarily refer to the same embodiment. Furthermore, these particular features, structures or characteristics may be combined in any suitable manner in one or more embodiments. It should also be understood that, in the various embodiments of the present invention, the sequence numbers of the above processes do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation of the embodiments of the present invention. The sequence numbers of the embodiments of the present invention are for description only and do not represent the relative merits of the embodiments.
It should be noted that, herein, the terms "include", "comprise" or any other variant thereof are intended to cover non-exclusive inclusion, so that a process, method, article or device including a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article or device. In the absence of further limitation, an element defined by the phrase "including a ..." does not exclude the existence of other identical elements in the process, method, article or device that includes that element.
In the several embodiments provided in this application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The apparatus embodiments described above are only illustrative; for example, the division of the units is only a division of logical functions, and there may be other ways of division in actual implementation: multiple units or components may be combined or integrated into another system, or some features may be ignored or not performed. In addition, the couplings, direct couplings or communication connections between the components shown or discussed may be indirect couplings or communication connections between devices or units through some interfaces, and may be electrical, mechanical or in other forms.
The units described above as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed over multiple network units; some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of the present invention may all be integrated into one processing unit, or each unit may serve separately as one unit, or two or more units may be integrated into one unit; the integrated unit may be implemented in the form of hardware, or in the form of hardware plus software functional units.
A person of ordinary skill in the art will understand that all or part of the steps of the above method embodiments may be completed by program instructions and related hardware. The foregoing program may be stored in a computer-readable storage medium; when the program is executed, the steps of the above method embodiments are performed. The foregoing storage medium includes various media capable of storing program code, such as a removable storage device, a read-only memory (Read-Only Memory, ROM), a magnetic disk, or an optical disc.
Alternatively, when the above integrated unit of the present invention is implemented in the form of a software function module and sold or used as an independent product, it may also be stored in a computer-readable storage medium. Based on such an understanding, the technical solutions of the embodiments of the present invention, in essence, or the part thereof that contributes to the prior art, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the methods described in the embodiments of the present invention. The foregoing storage medium includes various media capable of storing program code, such as a removable storage device, a ROM, a magnetic disk, or an optical disc.
The above are merely specific embodiments of the present invention, but the protection scope of the present invention is not limited thereto. Any change or replacement that can be readily conceived by a person familiar with the technical field, within the technical scope disclosed by the present invention, shall be covered by the protection scope of the present invention. Therefore, the protection scope of the present invention shall be based on the protection scope of the claims.
Claims (10)
1. An image processing method, characterized in that the method comprises:
capturing, at at least two different shooting angles, to-be-processed images that include a predetermined target object; wherein the to-be-processed images include at least a first to-be-processed image and a second to-be-processed image;
obtaining, in the first to-be-processed image, spatial information other than the target object;
removing the target object in the second to-be-processed image according to the spatial information other than the target object in the first to-be-processed image.
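The overall flow of claim 1 can be illustrated with a minimal sketch. This is not the patented implementation: it assumes the two views are already registered, so that the background hidden behind the object in the second image is visible at the same coordinates in the first; the function name `remove_object` and the toy arrays are hypothetical.

```python
import numpy as np

def remove_object(img1, img2, mask2):
    """Replace the target-object pixels in img2 (selected by mask2) with
    the corresponding background pixels observed in img1.

    Assumes, for this sketch, that img1 and img2 are aligned; a real
    implementation would first register the two views (e.g. feature
    matching and a homography).
    """
    out = img2.copy()
    out[mask2] = img1[mask2]          # fill the hole from the other view
    return out

# toy example: a flat background, with an occluding object in img2 only
img1 = np.full((4, 4), 10, dtype=np.uint8)   # first image: clean background
img2 = np.full((4, 4), 10, dtype=np.uint8)
img2[1:3, 1:3] = 200                         # the target object
mask2 = img2 == 200                          # object pixels to remove
result = remove_object(img1, img2, mask2)
```

After the call, every object pixel in `result` carries the background value from `img1`, which is the essence of shooting the scene twice from different angles.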
2. The method according to claim 1, characterized in that the obtaining, in the first to-be-processed image, spatial information other than the target object comprises:
determining, in the first to-be-processed image, first spatial information corresponding to the target object;
obtaining, in the first to-be-processed image, second spatial information other than the first spatial information; wherein the first spatial information includes at least a first pixel point set constituting the target object, and the second spatial information includes at least a second pixel point set other than the first pixel point set.
3. The method according to claim 2, characterized in that the determining, in the first to-be-processed image, first spatial information corresponding to the target object comprises:
obtaining, in the first to-be-processed image, an operation trace for determining the target object;
determining the target object in the first to-be-processed image according to the operation trace;
determining the first spatial information corresponding to the target object in the first to-be-processed image according to the target object.
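Claims 2 and 3 amount to growing a user's stroke (the operation trace) into the object's pixel set and taking its complement. One hypothetical way to do this, sketched below, is to collect every pixel whose intensity is close to the mean intensity sampled along the trace; the threshold rule, the function name `sets_from_trace`, and the toy data are illustrative assumptions, not the claimed method.

```python
import numpy as np

def sets_from_trace(image, trace_points, threshold=30):
    """Partition the image into the first pixel set (the target object,
    grown from the operation trace) and the second pixel set (all
    remaining pixels, i.e. the spatial information outside the object).

    Hypothetical rule: a pixel belongs to the object if its intensity is
    within `threshold` of the mean intensity along the trace.
    """
    samples = np.array([image[y, x] for y, x in trace_points], dtype=float)
    ref = samples.mean()
    first_set = np.abs(image.astype(float) - ref) <= threshold  # object mask
    second_set = ~first_set                                     # background mask
    return first_set, second_set

img = np.zeros((5, 5), dtype=np.uint8)
img[2:4, 2:4] = 180                   # a bright 2x2 target object
trace = [(2, 2), (3, 3)]              # user's stroke over the object
obj, bg = sets_from_trace(img, trace)
```

The two boolean masks are complementary by construction, matching the claim's split into a first and a second pixel point set.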
4. The method according to claim 3, characterized in that the removing the target object in the second to-be-processed image comprises:
identifying, in the second to-be-processed image according to the second spatial information, to-be-filled spatial information corresponding to the first spatial information;
removing the first spatial information in the second to-be-processed image and filling in the to-be-filled spatial information.
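The identification step of claim 4 maps the object's pixel set from the first image into the second, yielding the region to be filled. A minimal sketch under an assumption the claim leaves open: the camera is translated by a known integer offset between the two shots (an offset that in practice would be estimated by matching the second pixel set, i.e. the shared background, across both views); `to_fill_region` is a hypothetical name.

```python
import numpy as np

def to_fill_region(first_set, shift):
    """Shift the object mask (first spatial information) from view 1
    into view 2's coordinates, producing the to-be-filled region.

    `shift` = (dy, dx) is assumed known for this sketch; estimating it
    from the shared background is the harder part of a real system.
    """
    dy, dx = shift
    return np.roll(first_set, (dy, dx), axis=(0, 1))

mask = np.zeros((4, 4), dtype=bool)
mask[0, 0] = True                      # object occupies one pixel in view 1
region = to_fill_region(mask, (1, 2))  # same pixel, offset in view 2
```

`np.roll` wraps at the borders, which is acceptable for this toy; a production version would clip the shifted mask to the image bounds instead.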
5. The method according to claim 4, characterized in that the filling in the to-be-filled spatial information in the second to-be-processed image comprises:
obtaining an image parameter of the target object according to the first spatial information;
filling in the to-be-filled spatial information in the second to-be-processed image according to the image parameter; wherein the image parameter includes at least one of a resolution parameter, a sensitivity parameter, a color saturation parameter, a white balance parameter, a contrast parameter, and a sharpness parameter of the target object.
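The parameter-aware filling of claim 5 can be sketched with the simplest stand-in for exposure/white-balance/contrast matching: normalising the mean and standard deviation of the first image to those of the second, measured on the shared background, before copying pixels into the hole. The function `match_and_fill` and the two-statistic model are illustrative assumptions, not the claimed parameter set.

```python
import numpy as np

def match_and_fill(img1, img2, hole_mask):
    """Fill the hole in img2 with pixels from img1, after re-lighting
    img1 so its simple image parameters match img2's.

    Sketch: mean and contrast (std) are matched on the background the
    two views share; real shots would also differ in white balance,
    sharpness, etc., per the claim.
    """
    bg = ~hole_mask
    src = img1.astype(float)
    dst = img2.astype(float)
    eps = 1e-6                                    # guard against flat regions
    scale = (dst[bg].std() + eps) / (src[bg].std() + eps)
    shift = dst[bg].mean() - src[bg].mean() * scale
    adjusted = src * scale + shift                # img1 re-lit to img2's look
    out = dst.copy()
    out[hole_mask] = adjusted[hole_mask]          # fill from the other view
    return np.clip(np.rint(out), 0, 255).astype(np.uint8)

# toy example: img2 is a brighter, higher-contrast shot of img1's scene
img1 = np.tile(np.array([10, 30, 10, 30], dtype=np.uint8), (4, 1))
img2 = np.tile(np.array([20, 60, 20, 60], dtype=np.uint8), (4, 1))
hole = np.zeros((4, 4), dtype=bool)
hole[1:3, 1:3] = True                 # where the object sits in img2
img2[hole] = 255                      # the target object
out = match_and_fill(img1, img2, hole)
```

After the fill, the hole carries img1's background pixels rescaled to img2's brightness and contrast, so the patch blends with its surroundings instead of looking like a darker copy.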
6. A mobile terminal, characterized in that the mobile terminal includes a shooting unit, an acquiring unit, and a processing unit; wherein,
the shooting unit is configured to capture, at at least two different shooting angles, to-be-processed images that include a predetermined target object; wherein the to-be-processed images include at least a first to-be-processed image and a second to-be-processed image;
the acquiring unit is configured to obtain, in the first to-be-processed image, spatial information other than the target object;
the processing unit is configured to remove the target object in the second to-be-processed image according to the spatial information other than the target object in the first to-be-processed image.
7. The mobile terminal according to claim 6, characterized in that the acquiring unit includes a determination subunit and an acquisition subunit; wherein,
the determination subunit is configured to determine, in the first to-be-processed image, first spatial information corresponding to the target object;
the acquisition subunit is configured to obtain, in the first to-be-processed image, second spatial information other than the first spatial information; wherein the first spatial information includes at least a first pixel point set constituting the target object, and the second spatial information includes at least a second pixel point set other than the first pixel point set.
8. The mobile terminal according to claim 7, characterized in that the determination subunit is specifically configured to: obtain, in the first to-be-processed image, an operation trace for determining the target object; determine the target object in the first to-be-processed image according to the operation trace; and determine the first spatial information corresponding to the target object in the first to-be-processed image according to the target object.
9. The mobile terminal according to claim 8, characterized in that the processing unit includes an identification subunit and a processing subunit; wherein,
the identification subunit is configured to identify, in the second to-be-processed image according to the second spatial information, to-be-filled spatial information corresponding to the first spatial information;
the processing subunit is configured to remove the first spatial information in the second to-be-processed image and fill in the to-be-filled spatial information.
10. The mobile terminal according to claim 9, characterized in that the processing subunit is specifically configured to: obtain an image parameter of the target object according to the first spatial information; and fill in the to-be-filled spatial information in the second to-be-processed image according to the image parameter; wherein the image parameter includes at least one of a resolution parameter, a sensitivity parameter, a color saturation parameter, a white balance parameter, a contrast parameter, and a sharpness parameter of the target object.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710042795.6A CN106851098A (en) | 2017-01-20 | 2017-01-20 | A kind of image processing method and mobile terminal |
Publications (1)
Publication Number | Publication Date |
---|---|
CN106851098A true CN106851098A (en) | 2017-06-13 |
Family
ID=59119902
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710042795.6A Pending CN106851098A (en) | 2017-01-20 | 2017-01-20 | A kind of image processing method and mobile terminal |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106851098A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107832795A (en) * | 2017-11-14 | 2018-03-23 | 深圳码隆科技有限公司 | Item identification method, system and electronic equipment |
CN111556278A (en) * | 2020-05-21 | 2020-08-18 | 腾讯科技(深圳)有限公司 | Video processing method, video display device and storage medium |
Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101482968A (en) * | 2008-01-07 | 2009-07-15 | 日电(中国)有限公司 | Image processing method and equipment |
CN102779270A (en) * | 2012-06-21 | 2012-11-14 | 西南交通大学 | Target clothing image extraction method aiming at shopping image search |
CN104038700A (en) * | 2014-06-26 | 2014-09-10 | Tcl集团股份有限公司 | Picture taking method and device |
CN104145479A (en) * | 2012-02-07 | 2014-11-12 | 诺基亚公司 | Object removal from an image |
US20150022698A1 (en) * | 2013-07-16 | 2015-01-22 | Samsung Electronics Co., Ltd. | Removing unwanted objects from photographed image |
CN104349045A (en) * | 2013-08-09 | 2015-02-11 | 联想(北京)有限公司 | Image collecting method and electronic equipment |
CN104580910A (en) * | 2015-01-09 | 2015-04-29 | 宇龙计算机通信科技(深圳)有限公司 | Image synthesis method and system based on front camera and rear camera |
CN104735364A (en) * | 2013-12-19 | 2015-06-24 | 中兴通讯股份有限公司 | Photo shooting method and device |
CN105763812A (en) * | 2016-03-31 | 2016-07-13 | 北京小米移动软件有限公司 | Intelligent photographing method and device |
CN105827952A (en) * | 2016-02-01 | 2016-08-03 | 维沃移动通信有限公司 | Photographing method for removing specified object and mobile terminal |
CN106170976A (en) * | 2014-04-14 | 2016-11-30 | 阿尔卡特朗讯公司 | For the method and apparatus obtaining the image with motion blur |
CN106210542A (en) * | 2016-08-16 | 2016-12-07 | 深圳市金立通信设备有限公司 | The method of a kind of photo synthesis and terminal |
CN106204423A (en) * | 2016-06-28 | 2016-12-07 | 广东欧珀移动通信有限公司 | A kind of picture-adjusting method based on augmented reality, device and terminal |
2017-01-20: Application filed in China (CN201710042795.6A, patent/CN106851098A/en); status: active, Pending
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
RJ01 | Rejection of invention patent application after publication |
Application publication date: 20170613 |