CN106331482A - Photo processing device and method - Google Patents


Info

Publication number
CN106331482A
Authority
CN
China
Prior art keywords
processed
processing
area
photo
module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201610708502.9A
Other languages
Chinese (zh)
Inventor
张晓东
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nubia Technology Co Ltd
Original Assignee
Nubia Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nubia Technology Co Ltd filed Critical Nubia Technology Co Ltd
Priority to CN201610708502.9A priority Critical patent/CN106331482A/en
Publication of CN106331482A publication Critical patent/CN106331482A/en
Pending legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/62Control of parameters via user interfaces
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/63Control of cameras or camera modules by using electronic viewfinders

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Telephone Function (AREA)

Abstract

The invention discloses a photo processing device and method. The device comprises an acquisition module, a first determining module, a second determining module and a processing module. The acquisition module is used for acquiring a photo to be processed, where the photo to be processed is a clone photo containing a plurality of identical scenes and/or figures. The first determining module is used for determining one or more of the identical scenes and/or figures in the photo to be processed as an object to be processed. The second determining module is used for determining one or more regions to be processed of the object to be processed. The processing module is used for processing the regions to be processed with one or more preset processing effects. With the scheme of the embodiments of the invention, a user with ordinary shooting and processing skills can obtain a more distinctive photo, so that the novelty of the photo is increased and the user experience is improved.

Description

Photo processing device and method
Technical Field
The invention relates to the technical field of terminal application and picture processing, in particular to a photo processing device and method.
Background
Photography, especially cell phone photography, has become a major leisure activity, but distinctive shooting techniques and highly skilled post-processing are out of reach for most users, who nevertheless want to take more striking or distinctive photographs than shooting technique alone currently allows. How users with ordinary shooting and processing skills can obtain more distinctive photos is therefore a problem that remains unsolved.
Disclosure of Invention
The invention mainly aims to provide a photo processing device and a photo processing method that enable users with ordinary shooting and processing skills to obtain more distinctive photos, increasing the novelty of the photos and improving the user experience.
To achieve the above object, the present invention provides a photograph processing apparatus comprising: the device comprises an acquisition module, a first determination module, a second determination module and a processing module.
The acquisition module is used for acquiring a photo to be processed, wherein the photo to be processed is a clone photo containing a plurality of identical scenes and/or persons.
The first determining module is used for determining one or more of the identical scenes and/or persons in the photo to be processed as the object to be processed.
The second determining module is used for determining one or more to-be-processed areas of the to-be-processed object.
The processing module is used for processing the area to be processed with one or more preset processing effects.
Optionally, the preset processing effects include: pencil sketch, color sketch, watercolor, oil painting, blue art style, old film style, smoothing, and wax painting effects.
Optionally, the acquisition module acquiring the photo to be processed includes:
detecting a photographing operation of a preset clone camera;
when the preset clone camera completes the photographing operation, issuing a prompt asking whether special-effect processing should be applied to the newly taken photo; and
when a confirmation instruction to apply special-effect processing to the newly taken photo is received, placing the newly taken photo into a preset special-effect editing interface and taking it as the photo to be processed.
Optionally, the second determining module determining one or more to-be-processed regions of the to-be-processed object includes:
detecting a sliding track of a finger on the object to be processed, and taking the area enclosed by the sliding track as the area to be processed; or
detecting a smearing area of a finger on the object to be processed, and taking the smearing area as the area to be processed.
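The enclosed-track branch above amounts to a point-in-polygon test: sample points along the finger's sliding track, treat the track as closed, and take every pixel that falls inside it as part of the area to be processed. A minimal sketch in Python (the function name and the ray-casting choice are illustrative assumptions, not taken from the patent):

```python
def point_in_track(px, py, track):
    """Ray-casting test: is pixel (px, py) inside the closed sliding track?

    track is a list of (x, y) sample points recorded while the finger slides;
    the track is treated as closed (last point connects back to the first).
    """
    inside = False
    n = len(track)
    for i in range(n):
        x1, y1 = track[i]
        x2, y2 = track[(i + 1) % n]
        # Count crossings of a horizontal ray extending right from (px, py).
        if (y1 > py) != (y2 > py):
            x_cross = x1 + (py - y1) * (x2 - x1) / (y2 - y1)
            if px < x_cross:
                inside = not inside
    return inside

# Every pixel inside the track becomes part of the area to be processed.
track = [(0, 0), (10, 0), (10, 10), (0, 10)]
area = [(x, y) for x in range(12) for y in range(12) if point_in_track(x, y, track)]
```

The smearing branch is simpler still: the set of touched pixels is itself the area to be processed, with no enclosure test needed.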
Optionally, the processing module processing the region to be processed with one or more preset processing effects includes:
determining an edge area and a non-edge area of the area to be processed;
processing the non-edge area with a selected first processing effect; and
processing the edge area with a gradual transition of the first processing effect, from no effect at the edge line to the full effect.
The edge area is the area formed by the points in the area to be processed whose distance from the edge line of the area to be processed is less than or equal to a preset distance threshold.
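The gradual transition can be modeled as a per-pixel blend between the original value and the fully processed value, with a weight that grows with the pixel's distance from the edge line until the preset distance threshold is reached. A minimal single-channel sketch (the function names and the linear ramp are assumptions, not specified by the patent):

```python
def blend_alpha(dist_to_edge, threshold):
    """Effect strength for a pixel at the given distance from the edge line.

    Inside the edge band (dist <= threshold) the first processing effect ramps
    from nothing (alpha 0) at the edge line to full strength (alpha 1); in the
    non-edge area it applies at full strength.
    """
    if dist_to_edge >= threshold:
        return 1.0
    return dist_to_edge / threshold

def blend_pixel(original, processed, dist_to_edge, threshold):
    """Mix the original and fully processed values of one channel."""
    a = blend_alpha(dist_to_edge, threshold)
    return round((1 - a) * original + a * processed)
```

At the edge line itself (`dist_to_edge == 0`) the pixel keeps its original value, and from the threshold inward the first processing effect applies unchanged, giving the "from nothing to full" transition.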
Optionally, the apparatus further comprises a detection module and a cancellation module.
The detection module is used for detecting whether there is a cancel instruction, or a selection instruction for a processing effect different from the currently selected processing effect.
The cancellation module is used for, when a cancel instruction is detected, discarding the processing data for the area enclosed by the previous sliding track, or discarding the processing data corresponding to the previous smearing action.
The processing module is further used for, when a selection instruction for a different processing effect is detected, processing the area to be processed with the reselected third processing effect on top of the current second processing effect.
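The cancel and reselect behaviour described by these modules amounts to a per-stroke undo stack: each sliding-track or smearing action records the pixel values it overwrote, so a cancel instruction restores exactly the previous action's data, while selecting a new effect applies it on top of the current pixel values. A minimal sketch (class and method names are illustrative, not the patent's):

```python
class EffectEditor:
    """Per-stroke undo for region processing (a sketch, not the patent's code)."""

    def __init__(self, pixels):
        self.current = dict(pixels)   # (x, y) -> channel value
        self.history = []             # overwritten values, one snapshot per stroke

    def apply_stroke(self, region, effect_fn):
        # Save the values this stroke overwrites so the stroke can be cancelled.
        self.history.append({p: self.current[p] for p in region})
        for p in region:
            self.current[p] = effect_fn(self.current[p])

    def undo(self):
        # Cancel only the most recent stroke's processing data.
        if self.history:
            for p, value in self.history.pop().items():
                self.current[p] = value

editor = EffectEditor({(0, 0): 10, (1, 0): 20})
editor.apply_stroke([(0, 0)], lambda v: v + 5)    # second effect
editor.apply_stroke([(0, 0)], lambda v: v * 2)    # third effect, applied on top
```

After the two strokes `editor.current[(0, 0)]` is 30 (the third effect stacked on the second); one `undo()` returns it to 15 and a second restores the original 10.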
To achieve the above object, the present invention further provides a photo processing method, including:
acquiring a photo to be processed, wherein the photo to be processed is a clone photo containing a plurality of identical scenes and/or persons;
determining one or more of the identical scenes and/or persons in the photo to be processed as an object to be processed;
determining one or more regions to be processed of the object to be processed; and
processing the region to be processed with one or more preset processing effects.
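Read as data flow, the four method steps form a small pipeline: pick which of the identical clones is the object, pick regions within it, then rewrite only those pixels. A compact sketch under assumed representations (pixel dict, clone pixel sets; all names are illustrative):

```python
def process_clone_photo(photo, clones, pick_objects, pick_regions, effect_fn):
    """Sketch of the claimed method under assumed data representations.

    photo        : {(x, y): value} pixels of the clone photo       (step 1)
    clones       : list of pixel sets, one per duplicated figure
    pick_objects : selects the clone(s) to treat as the object     (step 2)
    pick_regions : selects pixel subsets of an object to process   (step 3)
    effect_fn    : the preset processing effect, applied per pixel (step 4)
    """
    out = dict(photo)
    for obj in pick_objects(clones):
        for region in pick_regions(obj):
            for p in region:
                out[p] = effect_fn(out[p])
    return out

photo = {(0, 0): 1, (1, 0): 2, (2, 0): 3}
clones = [{(0, 0)}, {(1, 0)}]                       # two identical figures
result = process_clone_photo(
    photo, clones,
    pick_objects=lambda cs: [cs[0]],                # process only the first clone
    pick_regions=lambda obj: [obj],                 # whole object as one region
    effect_fn=lambda v: v * 10,
)
```

Only the selected clone's pixels change, which is what lets one copy of a figure be stylized while its duplicates keep the original look.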
Optionally, the preset processing effects include: pencil sketch, color sketch, watercolor, oil painting, blue art style, old film style, smoothing, and wax painting effects.
Optionally, acquiring the photo to be processed includes:
detecting a photographing operation of a preset clone camera;
when the preset clone camera completes the photographing operation, issuing a prompt asking whether special-effect processing should be applied to the newly taken photo; and
when a confirmation instruction to apply special-effect processing to the newly taken photo is received, placing the newly taken photo into a preset special-effect editing interface and taking it as the photo to be processed.
Optionally, determining one or more regions to be processed of the object to be processed includes:
detecting a sliding track of a finger on the object to be processed, and taking the area enclosed by the sliding track as the area to be processed; or
detecting a smearing area of a finger on the object to be processed, and taking the smearing area as the area to be processed.
Optionally, processing the area to be processed with one or more preset processing effects includes:
determining an edge area and a non-edge area of the area to be processed;
processing the non-edge area with a selected first processing effect; and
processing the edge area with a gradual transition of the first processing effect, from no effect at the edge line to the full effect.
The edge area is the area formed by the points in the area to be processed whose distance from the edge line of the area to be processed is less than or equal to a preset distance threshold.
Optionally, the method further comprises:
detecting whether there is a cancel instruction, or a selection instruction for a processing effect different from the currently selected processing effect;
when a cancel instruction is detected, discarding the processing data for the area enclosed by the previous sliding track, or discarding the processing data corresponding to the previous smearing action; and
when a selection instruction for a different processing effect is detected, processing the area to be processed with the reselected third processing effect on top of the current second processing effect.
The present invention provides a photo processing apparatus including an acquisition module, a first determination module, a second determination module and a processing module. The acquisition module is used for acquiring a photo to be processed, wherein the photo to be processed is a clone photo containing a plurality of identical scenes and/or persons. The first determining module is used for determining one or more of the identical scenes and/or persons in the photo to be processed as the object to be processed. The second determination module is used for determining one or more to-be-processed areas of the to-be-processed object. The processing module is used for processing the area to be processed with one or more preset processing effects. Through the scheme of the embodiments of the invention, a user with ordinary shooting and processing skills can obtain a more distinctive picture, so that the novelty of the picture is increased and the user experience is improved.
Drawings
FIG. 1 is a schematic diagram of the hardware structure of an alternative mobile terminal for implementing various embodiments of the present invention;
FIG. 2 is a diagram of a wireless communication system for the mobile terminal shown in FIG. 1;
FIG. 3 is a block diagram of a photo processing apparatus according to an embodiment of the present invention;
FIG. 4 is a flowchart of a photo processing method according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of a photo processing method according to an embodiment of the present invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
An alternative mobile terminal implementing various embodiments of the present invention will now be described with reference to the accompanying drawings. In the following description, suffixes such as "module", "component", or "unit" used to denote elements are used only to facilitate the explanation of the present invention and have no specific meaning in themselves. Thus, "module" and "component" may be used interchangeably.
The mobile terminal may be implemented in various forms. For example, the terminal described in the present invention may include mobile terminals such as a mobile phone, a smart phone, a notebook computer, a digital broadcast receiver, a PDA (personal digital assistant), a PAD (tablet computer), a PMP (portable multimedia player), a navigation device, and the like, and stationary terminals such as a digital TV, a desktop computer, and the like. In the following, it is assumed that the terminal is a mobile terminal. However, it will be understood by those skilled in the art that, apart from elements used specifically for mobile purposes, the configuration according to the embodiments of the present invention can also be applied to fixed-type terminals.
Fig. 1 is a schematic diagram of the hardware configuration of a mobile terminal implementing various embodiments of the present invention.
The mobile terminal 100 may include a wireless communication unit 110, an a/V (audio/video) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, and a power supply unit 190, etc. Fig. 1 shows a mobile terminal having various components, but it is to be understood that the mobile terminal does not require the implementation of all of the illustrated components and that the present solution may implement more or fewer components. Elements of the mobile terminal will be described in detail below.
The wireless communication unit 110 typically includes one or more components that allow radio communication between the mobile terminal 100 and a wireless communication system or network. For example, the wireless communication unit 110 may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless internet module 113, a short-range communication module 114, and a location information module 115.
The broadcast receiving module 111 receives a broadcast signal and/or broadcast associated information from an external broadcast management server via a broadcast channel. The broadcast channel may include a satellite channel and/or a terrestrial channel. The broadcast management server may be a server that generates and transmits a broadcast signal and/or broadcast associated information, or a server that receives a previously generated broadcast signal and/or broadcast associated information and transmits it to a terminal. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and the like. Moreover, the broadcast signal may further include a broadcast signal combined with a TV or radio broadcast signal. The broadcast associated information may also be provided via a mobile communication network, in which case the broadcast associated information may be received by the mobile communication module 112. The broadcast signal may exist in various forms; for example, it may exist in the form of an Electronic Program Guide (EPG) of Digital Multimedia Broadcasting (DMB), an Electronic Service Guide (ESG) of Digital Video Broadcasting-Handheld (DVB-H), and the like. The broadcast receiving module 111 may receive signals broadcast by various types of broadcasting systems. In particular, the broadcast receiving module 111 may receive digital broadcasts from digital broadcasting systems such as Digital Multimedia Broadcasting-Terrestrial (DMB-T), Digital Multimedia Broadcasting-Satellite (DMB-S), Digital Video Broadcasting-Handheld (DVB-H), Media Forward Link Only (MediaFLO®), Integrated Services Digital Broadcasting-Terrestrial (ISDB-T), and the like. The broadcast receiving module 111 may be constructed to be suitable for various broadcasting systems that provide broadcast signals in addition to the above-mentioned digital broadcasting systems.
The broadcast signal and/or broadcast associated information received via the broadcast receiving module 111 may be stored in the memory 160 (or other type of storage medium).
The mobile communication module 112 transmits and/or receives radio signals to and/or from at least one of a base station (e.g., access point, node B, etc.), an external terminal, and a server. Such radio signals may include voice call signals, video call signals, or various types of data transmitted and/or received according to text and/or multimedia messages.
The wireless internet module 113 supports wireless internet access of the mobile terminal. The module may be internally or externally coupled to the terminal. The wireless internet access technology to which the module relates may include WLAN (wireless LAN) (Wi-Fi), Wibro (wireless broadband), Wimax (worldwide interoperability for microwave access), HSDPA (high speed downlink packet access), and the like.
The short-range communication module 114 is a module for supporting short-range communication. Some examples of short-range communication technologies include Bluetooth™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee™, and so on.
The location information module 115 is a module for checking or acquiring location information of the mobile terminal. A typical example of the location information module is a GPS (global positioning system). According to the current technology, the GPS module 115 calculates distance information and accurate time information from three or more satellites and applies triangulation to the calculated information, thereby accurately calculating three-dimensional current location information according to longitude, latitude, and altitude. Currently, a method for calculating position and time information uses three satellites and corrects an error of the calculated position and time information by using another satellite. In addition, the GPS module 115 can calculate speed information by continuously calculating current position information in real time.
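The three-satellite calculation mentioned above reduces, in a flat 2-D toy version, to intersecting three range circles: subtracting the circle equations pairwise yields a linear system in the receiver position. A simplified sketch (real GPS solves in 3-D with a receiver clock-bias term; the function name and representation are illustrative):

```python
def trilaterate_2d(sats):
    """Locate a point from three (x, y, measured range) readings.

    Subtracting the first circle equation from the other two cancels the
    quadratic terms, leaving two linear equations in the unknowns (x, y).
    """
    (x1, y1, r1), (x2, y2, r2), (x3, y3, r3) = sats
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 - x1**2 + x2**2 - y1**2 + y2**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = r1**2 - r3**2 - x1**2 + x3**2 - y1**2 + y3**2
    det = a1 * b2 - a2 * b1          # zero if the three satellites are collinear
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)
```

A fourth measurement, as the text notes, is used in practice to correct the residual error in the computed position and time.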
The A/V input unit 120 is used to receive an audio or video signal. The A/V input unit 120 may include a camera 121 and a microphone 122. The camera 121 processes image data of still pictures or video obtained by an image capturing apparatus in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 151. The image frames processed by the camera 121 may be stored in the memory 160 (or other storage medium) or transmitted via the wireless communication unit 110, and two or more cameras 121 may be provided according to the construction of the mobile terminal. The microphone 122 may receive sounds (audio data) in a phone call mode, a recording mode, a voice recognition mode, or the like, and can process such sounds into audio data. In the case of a phone call mode, the processed audio (voice) data may be converted into a format transmittable to a mobile communication base station via the mobile communication module 112. The microphone 122 may implement various types of noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated while receiving and transmitting audio signals.
The user input unit 130 may generate key input data according to a command input by a user to control various operations of the mobile terminal. The user input unit 130 allows a user to input various types of information, and may include a keyboard, dome sheet, touch pad (e.g., a touch-sensitive member that detects changes in resistance, pressure, capacitance, and the like due to being touched), scroll wheel, joystick, and the like. In particular, when the touch pad is superimposed on the display unit 151 in the form of a layer, a touch screen may be formed.
The sensing unit 140 detects a current state of the mobile terminal 100 (e.g., an open or closed state of the mobile terminal 100), a position of the mobile terminal 100, presence or absence of contact (i.e., touch input) by a user with the mobile terminal 100, an orientation of the mobile terminal 100, acceleration or deceleration movement and direction of the mobile terminal 100, and the like, and generates a command or signal for controlling an operation of the mobile terminal 100. For example, when the mobile terminal 100 is implemented as a slide-type mobile phone, the sensing unit 140 may sense whether the slide-type phone is opened or closed. In addition, the sensing unit 140 can detect whether the power supply unit 190 supplies power or whether the interface unit 170 is coupled with an external device. The sensing unit 140 may include a proximity sensor 1410 as will be described below in connection with a touch screen.
The interface unit 170 serves as an interface through which at least one external device is connected to the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The identification module may store various information for authenticating a user using the mobile terminal 100 and may include a User Identity Module (UIM), a Subscriber Identity Module (SIM), a Universal Subscriber Identity Module (USIM), and the like. In addition, a device having an identification module (hereinafter, referred to as an "identification device") may take the form of a smart card, and thus, the identification device may be connected with the mobile terminal 100 via a port or other connection means. The interface unit 170 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the mobile terminal 100 or may be used to transmit data between the mobile terminal and the external device.
In addition, when the mobile terminal 100 is connected with an external cradle, the interface unit 170 may serve as a path through which power is supplied from the cradle to the mobile terminal 100 or may serve as a path through which various command signals input from the cradle are transmitted to the mobile terminal. Various command signals or power input from the cradle may be used as signals for recognizing whether the mobile terminal is accurately mounted on the cradle. The output unit 150 is configured to provide output signals (e.g., audio signals, video signals, alarm signals, vibration signals, etc.) in a visual, audio, and/or tactile manner. The output unit 150 may include a display unit 151, an audio output module 152, an alarm unit 153, and the like.
The display unit 151 may display information processed in the mobile terminal 100. For example, when the mobile terminal 100 is in a phone call mode, the display unit 151 may display a User Interface (UI) or a Graphical User Interface (GUI) related to a call or other communication (e.g., text messaging, multimedia file downloading, etc.). When the mobile terminal 100 is in a video call mode or an image capturing mode, the display unit 151 may display a captured image and/or a received image, a UI or GUI showing a video or an image and related functions, and the like.
Meanwhile, when the display unit 151 and the touch pad are overlapped with each other in the form of a layer to form a touch screen, the display unit 151 may serve as an input device and an output device. The display unit 151 may include at least one of a Liquid Crystal Display (LCD), a thin film transistor LCD (TFT-LCD), an Organic Light Emitting Diode (OLED) display, a flexible display, a three-dimensional (3D) display, and the like. Some of these displays may be configured to be transparent to allow a user to view from the outside, which may be referred to as transparent displays, and a typical transparent display may be, for example, a TOLED (transparent organic light emitting diode) display or the like. Depending on the particular desired implementation, the mobile terminal 100 may include two or more display units (or other display devices), for example, the mobile terminal may include an external display unit (not shown) and an internal display unit (not shown). The touch screen may be used to detect a touch input pressure as well as a touch input position and a touch input area.
The audio output module 152 may convert audio data received by the wireless communication unit 110 or stored in the memory 160 into an audio signal and output as sound when the mobile terminal is in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, or the like. Also, the audio output module 152 may provide audio output related to a specific function performed by the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output module 152 may include a speaker, a buzzer, and the like.
The alarm unit 153 may provide an output to notify the mobile terminal 100 of the occurrence of an event. Typical events may include call reception, message reception, key signal input, touch input, and the like. In addition to audio or video output, the alarm unit 153 may provide output in different ways to notify of the occurrence of an event. For example, the alarm unit 153 may provide an output in the form of vibration; when a call, a message, or some other incoming communication is received, the alarm unit 153 may provide a tactile output (i.e., vibration) to inform the user. By providing such a tactile output, the user can recognize the occurrence of various events even when the mobile phone is in the user's pocket. The alarm unit 153 may also provide an output notifying the occurrence of an event via the display unit 151 or the audio output module 152.
The memory 160 may store software programs and the like for processing and controlling operations performed by the controller 180, or may temporarily store data (e.g., a phonebook, messages, still images, videos, and the like) that has been or will be output. Also, the memory 160 may store data regarding various ways of vibration and audio signals output when a touch is applied to the touch screen.
The memory 160 may include at least one type of storage medium including a flash memory, a hard disk, a multimedia card, a card-type memory (e.g., SD or DX memory, etc.), a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, an optical disk, and the like. Also, the mobile terminal 100 may cooperate with a network storage device that performs a storage function of the memory 160 through a network connection.
The controller 180 generally controls the overall operation of the mobile terminal. For example, the controller 180 performs control and processing related to voice calls, data communications, video calls, and the like. In addition, the controller 180 may include a multimedia module 1810 for reproducing (or playing back) multimedia data, and the multimedia module 1810 may be constructed within the controller 180 or may be constructed separately from the controller 180. The controller 180 may perform a pattern recognition process to recognize a handwriting input or a picture drawing input performed on the touch screen as a character or an image.
The power supply unit 190 receives external power or internal power and provides appropriate power required to operate various elements and components under the control of the controller 180.
The various embodiments described herein may be implemented in a computer-readable medium using, for example, computer software, hardware, or any combination thereof. For a hardware implementation, the embodiments described herein may be implemented using at least one of an Application Specific Integrated Circuit (ASIC), a Digital Signal Processor (DSP), a Digital Signal Processing Device (DSPD), a Programmable Logic Device (PLD), a Field Programmable Gate Array (FPGA), a processor, a controller, a microcontroller, a microprocessor, an electronic unit designed to perform the functions described herein, and in some cases, such embodiments may be implemented in the controller 180. For a software implementation, the implementation such as a process or a function may be implemented with a separate software module that allows performing at least one function or operation. The software codes may be implemented by software applications (or programs) written in any suitable programming language, which may be stored in the memory 160 and executed by the controller 180.
Up to this point, mobile terminals have been described in terms of their functionality. Hereinafter, among the various types of mobile terminals, such as folder-type, bar-type, swing-type and slide-type mobile terminals, a slide-type mobile terminal will be described as an example for the sake of brevity. However, the present invention can be applied to any type of mobile terminal, and is not limited to a slide-type mobile terminal.
The mobile terminal 100 as shown in fig. 1 may be configured to operate with communication systems such as wired and wireless communication systems and satellite-based communication systems that transmit data via frames or packets.
A communication system in which a mobile terminal according to the present invention is operable will now be described with reference to fig. 2.
Such communication systems may use different air interfaces and/or physical layers. For example, the air interface used by the communication system includes, for example, Frequency Division Multiple Access (FDMA), Time Division Multiple Access (TDMA), Code Division Multiple Access (CDMA), and Universal Mobile Telecommunications System (UMTS) (in particular, Long Term Evolution (LTE)), global system for mobile communications (GSM), and the like. By way of non-limiting example, the following description relates to a CDMA communication system, but such teachings are equally applicable to other types of systems.
Referring to fig. 2, the CDMA wireless communication system may include a plurality of mobile terminals 100, a plurality of Base Stations (BSs) 270, Base Station Controllers (BSCs) 275, and a Mobile Switching Center (MSC) 280. The MSC 280 is configured to interface with a Public Switched Telephone Network (PSTN) 290. The MSC 280 is also configured to interface with the BSCs 275, which may be coupled to the base stations 270 via backhaul lines. The backhaul lines may be constructed according to any of several known interfaces including, for example, E1/T1, ATM, IP, PPP, Frame Relay, HDSL, ADSL, or xDSL. It will be understood that a system as shown in fig. 2 may include multiple BSCs 275.
Each BS 270 may serve one or more sectors (or regions), each sector covered by an omni-directional antenna or by an antenna pointing in a particular direction radially away from the BS 270. Alternatively, each sector may be covered by two or more antennas for diversity reception. Each BS 270 may be configured to support a plurality of frequency assignments, with each frequency assignment having a particular spectrum (e.g., 1.25 MHz, 5 MHz, etc.).
The intersection of a sector and a frequency assignment may be referred to as a CDMA channel. The BS 270 may also be referred to as a base transceiver subsystem (BTS) or by other equivalent terms. In such a case, the term "base station" may be used to refer collectively to a single BSC 275 and at least one BS 270. A base station may also be referred to as a "cell". Alternatively, each sector of a particular BS 270 may be referred to as a cell site.
As shown in fig. 2, a Broadcast Transmitter (BT)295 transmits a broadcast signal to the mobile terminal 100 operating within the system. A broadcast receiving module 111 as shown in fig. 1 is provided at the mobile terminal 100 to receive a broadcast signal transmitted by the BT 295. In fig. 2, several Global Positioning System (GPS) satellites 300 are shown. The satellite 300 assists in locating at least one of the plurality of mobile terminals 100.
In fig. 2, a plurality of satellites 300 are depicted, but it will be understood that useful positioning information may be obtained with any number of satellites. The GPS module 115 as shown in fig. 1 is typically configured to cooperate with the satellites 300 to obtain the desired positioning information. Other techniques capable of tracking the location of the mobile terminal may be used instead of, or in addition to, GPS tracking techniques. In addition, at least one GPS satellite 300 may selectively or additionally handle satellite DMB transmissions.
As a typical operation of the wireless communication system, the BS270 receives reverse link signals from various mobile terminals 100. The mobile terminal 100 is generally engaged in conversations, messaging, and other types of communications. Each reverse link signal received by a particular base station 270 is processed within the particular BS 270. The obtained data is forwarded to the associated BSC 275. The BSC provides call resource allocation and mobility management functions including coordination of soft handoff procedures between BSs 270. The BSCs 275 also route the received data to the MSC280, which provides additional routing services for interfacing with the PSTN 290. Similarly, the PSTN290 interfaces with the MSC280, the MSC interfaces with the BSCs 275, and the BSCs 275 accordingly control the BS270 to transmit forward link signals to the mobile terminal 100.
Based on the above optional hardware structure of the mobile terminal and the communication system, various embodiments of the method of the present invention are provided.
As shown in fig. 3, a first embodiment of the present invention proposes a photograph processing apparatus 1, which includes: the device comprises an acquisition module 11, a first determination module 12, a second determination module 13 and a processing module 14.
The acquisition module 11 is used for acquiring a photo to be processed; wherein, the photo to be processed is a clone photo containing a plurality of same scenes and/or persons.
In the embodiment of the invention, as terminal photographing functions have become widespread, anyone can take photos through a terminal; however, as users' entertainment requirements grow, an ordinary photo can no longer satisfy them, and the clone camera has therefore been derived to make photographing more interesting.
The clone camera is implemented in software. It can copy the people and objects in photos and create a plurality of identical images in the same scene, and a user can easily use it with nothing but creativity, without learning complex picture-processing skills. A specific implementation method is as follows: after the photos are shot, editing starts from the second shot, and the user draws the outline of each person or object to be copied with a finger; the software automatically selects the figure inside the outline, and the drawn outline may be larger than the person or object but must not cut any of it off. What is matted out is the second through the last photos: after the outline of each person or object needing clone processing has been drawn on the second through the last photos in sequence, the clone-camera software automatically generates the effect picture of the clone photo. The processing effect of the clone camera is very natural; the cloned people and objects fuse naturally with the background, and even the shadows of the objects can be seen, which brings people much fun.
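The compositing step described above can be pictured as follows (a minimal sketch, not the patent's actual algorithm): the final clone photo is the first shot with each outlined region from the later shots copied over it. The function and array names here are illustrative assumptions.

```python
import numpy as np

def composite_clone(base, shots_and_masks):
    """Composite a clone photo: start from the first shot and, for each
    later shot, copy onto it only the pixels inside the user-drawn
    outline (given here as a boolean mask).

    base            -- H x W x 3 uint8 array, the first shot
    shots_and_masks -- list of (shot, mask) pairs; shot is H x W x 3,
                       mask is H x W bool, True inside the outline
    """
    result = base.copy()
    for shot, mask in shots_and_masks:
        result[mask] = shot[mask]  # overwrite only the outlined region
    return result
```

Because each outlined region is copied together with its immediate surroundings (shadow included, if the outline encloses it), the pasted figures blend naturally with the shared background.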
On the basis of the clone-camera technique, in order to further increase interest for the user, the obtained clone photo can be subjected to further special effect processing to obtain a novel visual effect. For example, when the clone photo is shot, one figure can be posed as if painting and another figure can be posed as a statue; after the clone photo is composited, the figure in the statue pose is given special effect processing by the embodiment of the invention, such as a simple stroke effect, so that the whole clone photo appears to be a self-portrait of a person painting himself in simple-stroke form.
Of course, different effects may be applied in different embodiments, and the processing is not limited to the simple stroke effect described above. The processing effects are preset, and the photo or picture to be processed can be processed into the corresponding effect through the user's selection. Optionally, the preset processing effects include: pencil sketch, color sketch, watercolor, oil painting, blue art style sketch, old-film style sketch, smooth sketch, watercolor painting, and wax painting. The scheme of this embodiment enables different effects of the same person or scene to be presented in one photo, which increases the interest of the photo and improves the user's entertainment and experience.
In the embodiment of the present invention, in order to achieve the above effect, first, a photo to be processed is obtained through the obtaining module 11; wherein, the photo to be processed is a clone photo containing a plurality of same scenes and/or persons. For example, the clone photo may include multiple images of the same person in different postures, multiple images of an animal in different scenes or postures, or different sceneries of a sight spot at different times (e.g., morning, noon, evening, or spring, summer, autumn, winter).
It should be noted that the photo to be processed is not limited to a clone photo; any photo may be taken as the photo to be processed in this embodiment. For example, if an ordinary landscape photo is processed in a canvas style, part or the whole of the photo can be given the flavor of a painting on canvas, and the user can also process only a part to be strongly emphasized so that the emphasis has more flavor, achieving an unexpected presentation result.
In the embodiment of the present invention, the photo to be processed may be acquired by the following scheme.
Optionally, the acquiring module 11 acquires the photo to be processed, including steps S101 to S103, where a clone camera is taken as an example:
S101, detecting the photographing operation of a preset clone camera.
In the embodiment of the present invention, a new photo can be obtained immediately by capturing the photographing action on the current terminal, and the new photo is taken as the photo to be processed; the following description still takes a clone photo as an example. For example, the obtaining module 11 may detect the photographing operation of the clone camera. The specific detection may be real-time or periodic; to save memory, it may also be implemented by notification, that is, when the clone camera is started, a start notification is sent to the obtaining module 11 to inform it that the current terminal is taking a picture through the clone camera. For ordinary photographing, the start of the terminal camera, or the start of the shutter, may be detected instead. The embodiment of the present invention does not limit the specific detection methods, algorithms, and apparatuses; any method, algorithm, or apparatus capable of detecting the photographing operation is within the protection scope of the embodiment of the present invention.
In the embodiment of the invention, the method of acquiring the photo to be processed is not limited to a photo newly shot by the camera; a photo or picture already stored in the terminal may be directly acquired as the photo or picture to be processed. In a particular implementation, a confirmation instruction on the current photo or picture may be detected to determine whether it needs further special effect processing as the photo or picture to be processed. For example, after the obtaining module 11 detects that the user has selected a photo or picture from the picture storage space, a preset selection window pops up; the selection window includes an input window, or confirm and deny keys, for deciding whether to perform special effect processing on the current photo or picture, and the obtaining module 11 determines whether the user wishes to perform special effect processing on the currently selected photo or picture according to the content of the input window or the selection of the confirm or deny key. If the user decides not to perform special effect processing on the current photo or picture, the obtaining module 11 may ignore the user's selection operation and take no action; if the user decides to perform special effect processing, the obtaining module 11 may take the current photo or picture as the object to be processed and pass it to the subsequent processing.
And S102, when the preset cloning camera finishes the photographing operation, sending out a prompt for judging whether special effect processing needs to be carried out on the newly photographed picture.
In the embodiment of the present invention, when a new shot is detected through step S101, the photo may be passed directly into the special effect processing scheme of the embodiment, or a prompt asking whether to perform special effect processing may first be sent to the user. For example, as with the method of confirming an object to be processed described above, this step may also pop up a preset selection window; the selection window includes an input window, or confirm and deny keys, for deciding whether to perform special effect processing on the current new shot, and the obtaining module 11 determines whether the user wishes to perform special effect processing on it according to the content of the input window or the selection of the confirm or deny key. If the user decides not to perform special effect processing on the newly taken photo, the obtaining module 11 may ignore it and take no action; if the user decides to perform special effect processing, the obtaining module 11 may take the newly taken photo as the object to be processed and pass it to the subsequent processing.
S103, when a confirmation instruction for carrying out special effect processing on the newly shot photo is received, putting the newly shot photo into a preset special effect editing interface, and taking the newly shot photo as a photo to be processed.
In the embodiment of the present invention, when it is confirmed in step S103 that special effect processing is performed on a new shot, the new shot may be placed in a preset special effect editing interface, and it should be noted that the preset special effect editing interface may include one or more editing mode options required for editing, and when a certain option is selected, a corresponding editing effect is implemented, for example, selection of a processing effect, selection of a processing area, a cancel command option, a resume command option, and the like. It should be noted that the editing type, the interface presentation form, the number of interfaces, the appearance position on the terminal screen, and the like included in the special effect editing interface are not specifically limited.
A first determining module 12, configured to determine one or more of the same scene and/or person in the photo to be processed as an object to be processed.
In the embodiment of the present invention, after the photo to be processed is obtained by the obtaining module 11, one or more processing objects may be selected from it, such as one or more of the identical scenes and/or persons in the clone photo mentioned above. In the subsequent processes, the selected identical scenes and/or figures can each be processed into different effects. Still taking the clone photo as an example: when the clone photo is shot, one figure can be posed as if painting and several other figures can be posed in different postures; after the clone photo is composited, the figures in the different postures are given special effect processing by the embodiment of the invention, for example one figure is processed with an oil-painting effect, another with a watercolor effect, another as a pencil sketch, and so on. Of course, the figure in the painting pose can be left without special effect processing, so that the processed photo presents an effect picture of one person painting self-portraits in different painting styles. This increases the interest of the photo and the entertainment of the user.
Of course, in different embodiments, whether in a clone photo or a non-clone photo, different persons or objects can be taken as the processing objects, and special effect processing can be applied to them to obtain various picture effects. For example, if all the animals, people, and scenery except a first person in the photo to be processed are taken as the objects to be processed, and all of them are processed into the old-film style sketch, then the first person and the other objects in the photo are presented in different roles: the first person retains a modern sense of reality, while all the other specially processed objects become the first person's background, or present an effect resembling the first person's recollections. This scheme gives the specially processed photo new meaning or a story-like quality.
A second determining module 13, configured to determine one or more regions to be processed of the object to be processed.
In the embodiment of the present invention, after the first determining module 12 determines the processing objects in the photo to be processed, the second determining module 13 may determine an area to be processed for each processing object, where the area to be processed may include part or all of the processing object. To achieve different processing effects, not every object to be processed needs to be processed into the selected effect in its entirety. For example, in the embodiment of the painting effect, in order to present the effect of a person painting himself, only part of the object to be processed may be processed into pencil sketch, color sketch, or watercolor; for instance, only the edge area of the object is processed into the above effect, so that a being-painted effect is presented and the processed picture looks more vivid.
In the embodiment of the present invention, the region to be processed of the object to be processed may be determined by the following scheme.
Optionally, the determining, by the second determining module 13, one or more to-be-processed regions of the to-be-processed object includes a first mode and a second mode:
the first method is to detect a sliding track of a finger on an object to be processed, and to use an area surrounded by the sliding track as an area to be processed.
In the embodiment of the present invention, the area to be processed on each object to be processed may be determined by the user outlining it, and the second determining module 13 determines the area to be processed by detecting the outlined region. A specific detection method may be as follows: in the area-determination mode, the sliding trajectory of the user's finger on the object to be processed is detected; the sliding gesture is the user's act of selecting the area to be processed, so the region enclosed by the sliding trajectory is the area the user has outlined. Therefore, as long as the region enclosed by the sliding trajectory of the user's finger is determined, the second determining module 13 has determined the area to be processed selected by the user.
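The "region enclosed by the sliding trajectory" can be recovered with a standard even-odd polygon rasterisation; the sketch below (hypothetical helper name, pure Python for clarity) treats the sampled trajectory points as a closed polygon and marks each pixel whose rightward ray crosses the trajectory an odd number of times:

```python
def polygon_mask(track, width, height):
    """Rasterise a closed finger track (list of (x, y) sample points)
    into a per-pixel boolean mask using the even-odd rule: a pixel is
    inside the to-be-processed region if a horizontal ray from it
    crosses the track an odd number of times."""
    n = len(track)
    mask = [[False] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            inside = False
            for i in range(n):
                x1, y1 = track[i]
                x2, y2 = track[(i + 1) % n]  # close the loop back to the start
                if (y1 > y) != (y2 > y):
                    # x coordinate where this edge crosses the scanline
                    cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
                    if x < cross:
                        inside = not inside
            mask[y][x] = inside
    return mask
```

A production implementation would rasterise per scanline rather than per pixel, but the selection semantics are the same.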
And secondly, detecting a smearing area of the finger on the object to be processed, and taking the smearing area as the area to be processed.
In the embodiment of the present invention, the area to be processed on each object to be processed may also be determined by the user smearing it, and the second determining module 13 determines the area to be processed by detecting the smeared region. A specific detection method may be as follows: in the area-determination mode, the smearing area of the user's finger on the object to be processed, that is, the sliding contact area of the finger on the terminal interface, is detected; the smearing gesture is the user's act of selecting the area to be processed, so the region covered by the smearing operation is the area the user has selected. Therefore, as long as the smearing area of the user's finger is determined, the second determining module 13 has determined the area to be processed selected by the user.
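One plausible model of the smearing gesture (an assumption; the patent does not specify how touch samples map to a region) is a brush: every pixel within some radius of any sampled finger-contact point is selected. The radius models the finger's contact patch:

```python
def smear_mask(touch_points, radius, width, height):
    """Turn the sampled contact points of a smear gesture into a mask:
    every pixel within `radius` of any sampled point is selected.
    `radius` is an assumed brush size for the finger's contact patch."""
    r2 = radius * radius
    mask = [[False] * width for _ in range(height)]
    for px, py in touch_points:
        # only scan the bounding box around each touch sample
        for y in range(max(0, py - radius), min(height, py + radius + 1)):
            for x in range(max(0, px - radius), min(width, px + radius + 1)):
                if (x - px) ** 2 + (y - py) ** 2 <= r2:
                    mask[y][x] = True
    return mask
```

Consecutive samples of a fast swipe may leave gaps; an implementation would typically also stamp the brush along the line between successive samples.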
It should be noted that the two methods for determining the area to be processed are both specific embodiments of the present invention, and other determination methods may be selected in other embodiments, which are not limited to the above methods, and any method capable of determining the area to be processed in the embodiment of the present invention is within the protection scope of the embodiment of the present invention.
And the processing module 14 is used for processing the area to be processed into one or more preset processing effects.
In the embodiment of the present invention, after the obtaining module 11, the first determining module 12, and the second determining module 13 have finally determined the area to be processed of the object to be processed in the photo to be processed, the area can be processed according to the selected preset processing effect so as to achieve that effect. As noted above, the processing effects may include, but are not limited to, one or more of: pencil sketch, color sketch, watercolor, oil painting, blue art style sketch, old-film style sketch, smooth sketch, watercolor painting, and wax painting. Since such effect processing is widely applied in various image processing technologies, the specific implementation of each effect is not described again here; each processing effect may be implemented in any one or more of the existing ways, and the specific processing methods, algorithms, and tools are not limited.
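As the text says, the effect recipes themselves are standard image-processing fare. For concreteness, here is one common way to approximate a "pencil sketch" preset (an illustrative recipe, not necessarily the one used here): invert a grayscale copy, blur the inversion, then color-dodge the blur over the original, which leaves flat areas white and edges dark:

```python
import numpy as np

def pencil_sketch(gray):
    """Approximate a pencil-sketch effect on a grayscale image (float
    array in 0..255) via the classic invert / blur / color-dodge recipe."""
    inv = 255.0 - gray
    # 3x3 box blur via a padded neighbourhood average
    h, w = gray.shape
    padded = np.pad(inv, 1, mode='edge')
    blur = sum(padded[dy:dy + h, dx:dx + w]
               for dy in range(3) for dx in range(3)) / 9.0
    # color dodge: brighten the original where the blurred negative is bright
    with np.errstate(divide='ignore', invalid='ignore'):
        out = np.where(blur >= 255.0, 255.0,
                       np.minimum(255.0, gray * 255.0 / (255.0 - blur)))
    return out
```

A larger Gaussian blur in place of the 3x3 box gives softer, more pencil-like strokes.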
In the embodiment of the present invention, how to process a region to be processed into a preset processing effect is described below by taking the region to be processed as a part of an object to be processed as an example.
Optionally, the processing module 14 processes the region to be processed into one or more preset processing effects, including steps S201 to S203:
S201, determining an edge area and a non-edge area of the area to be processed.
In the embodiment of the invention, in order to ensure a good transition between the processed partial region and the unprocessed region and to make the result more vivid, transition processing, that is, gradual-change processing, may be applied to the edge area of the area to be processed. Before that, it must first be determined which part of the area to be processed is the edge area and which part is the non-edge area.
Optionally, the edge region refers to a region formed by points in the to-be-processed region, where a distance from an edge line of the to-be-processed region is smaller than or equal to a preset distance threshold. Conversely, the non-edge area refers to an area formed by points in the to-be-processed area, the distance between which and the edge line of the to-be-processed area is greater than a preset distance threshold.
In the embodiment of the present invention, the preset distance threshold may be defined by itself according to different application scenarios, and is not limited specifically herein.
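The edge/non-edge split defined above (points at most a preset distance threshold from the region's edge line) can be computed with a breadth-first distance transform seeded from all pixels outside the region. The sketch below assumes a city-block (4-connected) distance metric; other metrics work the same way:

```python
from collections import deque

def split_edge_region(mask, threshold):
    """Split a to-be-processed mask into edge and non-edge parts.
    A pixel is 'edge' if its 4-connected (city-block) distance to the
    nearest outside pixel (the edge line) is <= threshold."""
    h, w = len(mask), len(mask[0])
    INF = h * w
    dist = [[INF] * w for _ in range(h)]
    q = deque()
    for y in range(h):
        for x in range(w):
            if not mask[y][x]:
                dist[y][x] = 0          # outside pixels seed the BFS
                q.append((x, y))
    while q:
        x, y = q.popleft()
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < w and 0 <= ny < h and dist[ny][nx] == INF:
                dist[ny][nx] = dist[y][x] + 1
                q.append((nx, ny))
    edge = [[mask[y][x] and dist[y][x] <= threshold
             for x in range(w)] for y in range(h)]
    non_edge = [[mask[y][x] and dist[y][x] > threshold
                 for x in range(w)] for y in range(h)]
    return edge, non_edge
```

On real images a vectorised Euclidean distance transform would be used, but the thresholding logic is identical.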
S202, processing the non-edge area into the selected first processing effect.
In the embodiment of the present invention, after the edge area and the non-edge area of the area to be processed are determined in step S201, corresponding processing may be performed on different areas. The normal effect processing can be directly performed on the non-edge area, for example, if the first processing effect, such as a wax painting effect, is selected, the non-edge area can be directly processed into the wax painting effect in its entirety.
And S203, processing the edge area into a gradual change of the first processing effect from none to full.
In the embodiment of the present invention, for the edge area, in order to achieve a realistic processing effect and a smooth transition between the area to be processed and the area not to be processed, the first processing effect, such as the wax-painting effect described above, may be applied to this area as a gradual change from none to full. Since gradual-change processing is widely applied in various image processing technologies, the detailed description of the gradual-change processing method is omitted here.
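One plausible reading of the "from none to full" gradual change is an alpha blend whose weight grows with distance from the edge line, so the effect is absent at the boundary and at full strength once the preset distance threshold is reached. The sketch below assumes per-pixel distances to the edge line are already available:

```python
import numpy as np

def fade_in_effect(original, effected, dist, threshold):
    """Blend the processed pixels over the originals so the first
    processing effect fades in from none at the region's edge line to
    full strength `threshold` pixels inside it.

    original, effected -- H x W x 3 float arrays
    dist               -- H x W distances to the edge line (0 outside)
    """
    alpha = np.clip(dist / float(threshold), 0.0, 1.0)[..., None]
    return (1.0 - alpha) * original + alpha * effected
```

Non-edge pixels (distance above the threshold) get alpha 1, so the same call can process the whole area in one pass.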
It should be noted that, since step S202 and step S203 process different areas, they may be performed in either order; no particular processing sequence is required.
Optionally, the apparatus further comprises: a detection module 15 and a revocation module 16.
In the embodiment of the present invention, in consideration that a user may have a modification action in a picture processing process, there may be a cancel command for a certain operation, and therefore, in the embodiment of the present invention, a detection module 15 and a cancel module 16 are further provided.
A detecting module 15, configured to detect whether there is a cancel instruction or a selection instruction with a processing effect different from the currently selected processing effect.
In the embodiment of the present invention, the detection module 15 may detect a cancel instruction sent by a user, and the specific detection method may be to detect a preset action of a cancel key or detect a preset operation of the user, where the preset operation corresponds to the cancel instruction.
In the embodiment of the present invention, the detecting module 15 may further detect a selection instruction sent by the user, where the selection instruction corresponds to the selected different processing effects. The specific detection method may be to detect actions of preset selection keys for various processing effects, or to detect one or more preset operations of a user, where each preset operation corresponds to a selection instruction for a different processing effect.
It should be noted that the above detection manner is only an optional embodiment of the present invention, and other detection methods or algorithms may be selected in other embodiments, and a specific implementation method is not limited.
And the cancelling module 16 is configured to cancel, when a cancelling instruction is detected, the processing data for processing the area surrounded by the previous sliding track, or cancel the processing data corresponding to the previous smearing action.
In the embodiment of the present invention, when the detection module 15 detects a cancel instruction, then if the user is currently selecting the desired area to be processed, the processing data for the area enclosed by the previous sliding track, or the processing data corresponding to the previous smearing action, may be cancelled so as to undo the previous area-selection operation. If the user is currently determining the object to be processed, the data established when the last object to be processed was determined may be revoked so as to undo that determination. If the user is currently applying an effect to the area to be processed, the data established by the previous effect-processing step may be cancelled so as to undo that step.
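One way to realise such step-by-step cancellation (an assumption; the patent does not name a data structure) is a snapshot stack of editing states, where each committed step, whether an object selection, an area selection, or an effect application, pushes a state and each cancel instruction pops one:

```python
class EditHistory:
    """Snapshot-style undo for the editing session: every committed
    step pushes the current editing state; a cancel instruction pops
    back exactly one step."""

    def __init__(self, initial_state):
        self._states = [initial_state]

    def commit(self, new_state):
        self._states.append(new_state)

    def cancel(self):
        if len(self._states) > 1:   # never drop the original photo
            self._states.pop()
        return self._states[-1]

    @property
    def current(self):
        return self._states[-1]
```

For large photos one would store per-step deltas (the "processing data" of each operation) instead of full snapshots, but the pop-one-step semantics are the same.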
The processing module 14 is further configured to, when a selection instruction of a different processing effect is detected, process the to-be-processed area according to the reselected third processing effect on the basis of the current second processing effect.
In the embodiment of the present invention, when the detection module 15 detects a selection instruction for a new processing effect, that is, a third processing effect (for example, wax painting), the wax-painting processing may be continued on the basis of the current second processing effect, for example a watercolor effect, so that after the subsequent processing the current photo or picture also presents the wax-painting effect.
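Layering a newly selected effect on top of the current one amounts to composing the effect functions over the region, each effect taking the previous effect's output as its input. A minimal sketch (the effect callables are assumed to be supplied elsewhere):

```python
def apply_effect_stack(region, effects):
    """Apply each selected effect in order, each operating on the
    output of the previous one, so a newly selected third effect
    (e.g. wax painting) is layered on top of the current second
    effect (e.g. watercolor)."""
    for effect in effects:
        region = effect(region)
    return region
```

Keeping the list of applied effects also pairs naturally with undo: cancelling the last effect just drops the last entry and replays the rest.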
All features of the embodiments of the present invention have now been described. It should be noted that the foregoing is only one or more embodiments of the present invention; other embodiments may be chosen in other contexts, and any embodiment similar or identical to an embodiment of the present invention, and any combination of the features of the present invention, is within the scope of the embodiments of the present invention.
To achieve the above object, a second embodiment of the present invention further provides a photo processing method, as shown in fig. 4 and fig. 5, it should be noted that any embodiment of the photo processing apparatus described above is applicable to the method embodiment of the present invention, and is not described herein again, and the method includes steps S301 to S304:
S301, acquiring a photo to be processed; wherein, the photo to be processed is a clone photo containing a plurality of same scenes and/or persons.
S302, one or more same scenes and/or persons in the photo to be processed are determined as objects to be processed.
S303, determining one or more to-be-processed areas of the to-be-processed object.
S304, processing the area to be processed into one or more preset processing effects.
Optionally, the preset processing effects include: pencil sketch, color sketch, watercolor, oil painting, blue art style sketch, old-film style sketch, smooth sketch, watercolor painting, and wax painting.
Optionally, the acquiring the to-be-processed photo includes:
and detecting the photographing operation of the preset clone camera.
And when the preset clone camera finishes the photographing operation, issuing a prompt asking whether special effect processing needs to be carried out on the newly taken photo.
And when a confirmation instruction for carrying out special effect processing on the newly shot photo is received, putting the newly shot photo into a preset special effect editing interface, and taking the newly shot photo as a photo to be processed.
Optionally, determining one or more regions to be processed of the object to be processed comprises:
detecting a sliding track of a finger on an object to be processed, and taking an area surrounded by the sliding track as an area to be processed; or,
and detecting an smearing area of the finger on the object to be processed, and taking the smearing area as the area to be processed.
Optionally, the processing the area to be processed to a preset one or more processing effects comprises:
and determining an edge area and a non-edge area of the area to be processed.
The non-edge area is processed to a selected first processing effect.
And processing the edge area into a gradual change of the first processing effect from none to full.
The edge area is an area formed by points in the area to be processed, wherein the distance between the points and the edge line of the area to be processed is smaller than or equal to a preset distance threshold.
Optionally, the method further comprises:
it is detected whether there is a cancel instruction or a select instruction for a different processing effect than the currently selected processing effect.
And when a cancel instruction is detected, canceling the processing data for processing the area surrounded by the previous sliding track, or canceling the processing data corresponding to the previous smearing action.
And when a selection instruction of different processing effects is detected, processing the area to be processed according to the reselected third processing effect on the basis of the current second processing effect.
The present invention provides a photo processing apparatus including: an acquisition module, a first determination module, a second determination module, and a processing module. The acquisition module is used for acquiring a photo to be processed, where the photo to be processed is a clone photo containing a plurality of identical scenes and/or persons. The first determination module is used for determining one or more of the identical scenes and/or persons in the photo to be processed as the object to be processed. The second determination module is used for determining one or more areas to be processed of the object to be processed. The processing module is used for processing the area to be processed into one or more preset processing effects. Through the scheme of the embodiment of the invention, a user with ordinary shooting and processing skills can obtain a more distinctive picture, which improves the novelty of the picture and the user experience; moreover, applying special effect processing to photos lets the artificial pass for the real, which can stimulate the creative interest of terminal photography enthusiasts and increase the attraction of the product.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed, or elements inherent to such a process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the methods of the above embodiments can be implemented by software plus a necessary general-purpose hardware platform, and certainly can also be implemented by hardware; in many cases, however, the former is the better implementation. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product stored in a storage medium (such as ROM/RAM, a magnetic disk, or an optical disk) and including instructions for enabling a terminal device (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the methods according to the embodiments of the present invention.
The above description is only a preferred embodiment of the present invention and is not intended to limit its scope; all modifications of equivalent structures and equivalent processes made using the contents of this specification and drawings, whether used directly or indirectly in other related fields, are included within the scope of the present invention.

Claims (10)

1. A photo processing apparatus, characterized in that the apparatus comprises: the device comprises an acquisition module, a first determination module, a second determination module and a processing module;
the acquisition module is used for acquiring a photo to be processed;
the first determination module is used for determining one or more identical scenes and/or persons in the photo to be processed as objects to be processed;
the second determination module is used for determining one or more to-be-processed areas of the to-be-processed object;
the processing module is used for processing the area to be processed into one or more preset processing effects.
2. The photo processing apparatus of claim 1, wherein the preset processing effects comprise: pencil sketch, color sketch, watercolor, oil painting, blue art style, old film style, smoothing, watercolor painting, and wax painting effects.
3. The photo processing apparatus of claim 1, wherein the acquisition module acquiring the photo to be processed comprises:
detecting the photographing operation of a preset clone camera;
when the preset clone camera finishes the photographing operation, issuing a prompt asking whether special effect processing is needed for the newly taken photo;
and when a confirmation instruction to perform special effect processing on the newly taken photo is received, placing the newly taken photo into a preset special effect editing interface and taking it as the photo to be processed.
4. The photo processing apparatus of claim 1, wherein the second determination module determining one or more to-be-processed areas of the to-be-processed object comprises:
detecting a sliding track of a finger on the object to be processed, and taking the area enclosed by the sliding track as the area to be processed; or,
detecting a smearing area of a finger on the object to be processed, and taking the smearing area as the area to be processed.
5. The photo processing apparatus of claim 1, wherein the processing module processing the area to be processed into one or more preset processing effects comprises:
determining an edge area and a non-edge area of the area to be processed;
processing the non-edge region into a selected first processing effect;
processing the edge area into a gradient effect in which the first processing effect transitions from absent to fully applied;
the edge area is the set of points in the area to be processed whose distance to the edge line of the area to be processed is less than or equal to a preset distance threshold.
6. A method of processing a photograph, the method comprising:
acquiring a photo to be processed;
determining one or more same scenes and/or persons in the photo to be processed as objects to be processed;
determining one or more to-be-processed areas of the to-be-processed object;
and processing the area to be processed into one or more preset processing effects.
7. The photo processing method of claim 6, wherein the preset processing effects comprise: pencil sketch, color sketch, watercolor, oil painting, blue art style, old film style, smoothing, watercolor painting, and wax painting effects.
8. The photo processing method of claim 6, wherein acquiring the photo to be processed comprises:
detecting the photographing operation of a preset clone camera;
when the preset clone camera finishes the photographing operation, issuing a prompt asking whether special effect processing is needed for the newly taken photo;
and when a confirmation instruction to perform special effect processing on the newly taken photo is received, placing the newly taken photo into a preset special effect editing interface and taking it as the photo to be processed.
9. The photo processing method of claim 6, wherein the determining one or more to-be-processed areas of the to-be-processed object comprises:
detecting a sliding track of a finger on the object to be processed, and taking the area enclosed by the sliding track as the area to be processed; or,
detecting a smearing area of a finger on the object to be processed, and taking the smearing area as the area to be processed.
10. The photo processing method of claim 6, wherein processing the area to be processed into one or more preset processing effects comprises:
determining an edge area and a non-edge area of the area to be processed;
processing the non-edge region into a selected first processing effect;
processing the edge area into a gradient effect in which the first processing effect transitions from absent to fully applied;
the edge area is the set of points in the area to be processed whose distance to the edge line of the area to be processed is less than or equal to a preset distance threshold.
CN201610708502.9A 2016-08-23 2016-08-23 Photo processing device and method Pending CN106331482A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610708502.9A CN106331482A (en) 2016-08-23 2016-08-23 Photo processing device and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610708502.9A CN106331482A (en) 2016-08-23 2016-08-23 Photo processing device and method

Publications (1)

Publication Number Publication Date
CN106331482A true CN106331482A (en) 2017-01-11

Family

ID=57742341

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610708502.9A Pending CN106331482A (en) 2016-08-23 2016-08-23 Photo processing device and method

Country Status (1)

Country Link
CN (1) CN106331482A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108010106A (en) * 2017-11-22 2018-05-08 努比亚技术有限公司 A kind of method for displaying image, terminal and computer-readable recording medium
CN108449590A (en) * 2018-03-30 2018-08-24 盎锐(上海)信息科技有限公司 Image processing method and device
CN111862349A (en) * 2019-04-26 2020-10-30 北京字节跳动网络技术有限公司 Virtual brush implementation method and device and computer readable storage medium
CN112445398A (en) * 2019-09-04 2021-03-05 上海掌门科技有限公司 Method, electronic device and computer readable medium for editing pictures
CN113497898A (en) * 2020-04-02 2021-10-12 北京字节跳动网络技术有限公司 Video special effect configuration file generation method, video rendering method and device

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1530825A (en) * 2003-03-12 2004-09-22 英业达股份有限公司 Image displaying method
CN101431616A (en) * 2007-11-06 2009-05-13 奥林巴斯映像株式会社 Image synthesis device and method
US20090201382A1 (en) * 2008-02-13 2009-08-13 Casio Computer Co., Ltd. Imaging apparatus for generating stroboscopic image
CN102075680A (en) * 2009-11-20 2011-05-25 索尼公司 Image processing apparatus, image processing method and program
CN102075682A (en) * 2009-11-20 2011-05-25 索尼公司 Image capturing apparatus, image processing apparatus, control method thereof and program
CN103856726A (en) * 2010-09-03 2014-06-11 卡西欧计算机株式会社 Image processing device and image processing method
CN104104798A (en) * 2014-07-23 2014-10-15 深圳市中兴移动通信有限公司 Method for shooting light painting video and mobile terminal
CN104394313A (en) * 2014-10-27 2015-03-04 成都理想境界科技有限公司 Special effect video generating method and device
CN105025223A (en) * 2015-07-03 2015-11-04 广东欧珀移动通信有限公司 Method of controlling shooting of multiple cameras and shooting terminal
CN105554364A (en) * 2015-07-30 2016-05-04 宇龙计算机通信科技(深圳)有限公司 Image processing method and terminal
CN105847728A (en) * 2016-04-13 2016-08-10 腾讯科技(深圳)有限公司 Information processing method and terminal


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
PETA-VISION: "Clone Camera (克隆相机) v2.0.5, Chinese localized edition", XiXi Software Park (WWW.CR173.COM) *

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108010106A (en) * 2017-11-22 2018-05-08 努比亚技术有限公司 A kind of method for displaying image, terminal and computer-readable recording medium
CN108449590A (en) * 2018-03-30 2018-08-24 盎锐(上海)信息科技有限公司 Image processing method and device
CN108449590B (en) * 2018-03-30 2020-02-11 盎锐(上海)信息科技有限公司 Image processing method and device
CN111862349A (en) * 2019-04-26 2020-10-30 北京字节跳动网络技术有限公司 Virtual brush implementation method and device and computer readable storage medium
US11561651B2 (en) 2019-04-26 2023-01-24 Beijing Bytedance Network Technology Co., Ltd. Virtual paintbrush implementing method and apparatus, and computer readable storage medium
CN112445398A (en) * 2019-09-04 2021-03-05 上海掌门科技有限公司 Method, electronic device and computer readable medium for editing pictures
CN113497898A (en) * 2020-04-02 2021-10-12 北京字节跳动网络技术有限公司 Video special effect configuration file generation method, video rendering method and device
US11856152B2 (en) 2020-04-02 2023-12-26 Beijing Bytedance Network Technology Co., Ltd. Video special effect configuration file generation method and apparatus, and video rendering method and apparatus

Similar Documents

Publication Publication Date Title
CN106454121B (en) Double-camera shooting method and device
CN106909274B (en) Image display method and device
CN105100491B Apparatus and method for processing photos
CN106911850B (en) Mobile terminal and screen capturing method thereof
CN106657782B (en) Picture processing method and terminal
CN106097284B Night scene image processing method and mobile terminal
CN106534693B Photo processing method, device and terminal
CN104932697B (en) Gesture unlocking method and device
CN106331482A (en) Photo processing device and method
CN106791480A Terminal and video skimming creation method
CN106851113A Photographing method based on dual cameras, and mobile terminal
CN106131327B (en) Terminal and image acquisition method
CN105791541B (en) Screenshot method and mobile terminal
CN107018334A Application program processing method and device based on dual cameras
CN106993134B (en) Image generation device and method and terminal
CN105302899A (en) Mobile terminal and picture processing method
CN105049612A Method and device for realizing recording
CN106454074A (en) Mobile terminal and shooting processing method
CN106375610B (en) Photo processing method and terminal
CN106791449B (en) Photo shooting method and device
CN105744508B (en) Game data backup method and mobile terminal
CN105898158B Data processing method and electronic equipment
CN105045508B (en) The setting device and method of system model
CN105827981A (en) Mobile terminal photographing method and device
CN105138235A (en) Picture processing apparatus and method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20170111