CN106254643B - Mobile terminal and picture processing method - Google Patents

Mobile terminal and picture processing method

Info

Publication number
CN106254643B
Authority
CN
China
Prior art keywords
image
sub
mobile terminal
unit
picture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201610618953.3A
Other languages
Chinese (zh)
Other versions
CN106254643A (en)
Inventor
马亮
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Hanquan Information Technology Co ltd
Original Assignee
Ruian Zhizao Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ruian Zhizao Technology Co ltd filed Critical Ruian Zhizao Technology Co ltd
Priority to CN201610618953.3A priority Critical patent/CN106254643B/en
Publication of CN106254643A publication Critical patent/CN106254643A/en
Application granted granted Critical
Publication of CN106254643B publication Critical patent/CN106254643B/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16Sound input; Sound output

Abstract

The invention discloses a mobile terminal and a picture processing method. The mobile terminal includes: a dividing unit, configured to divide a picture to be processed into M image areas based on a first operation and notify the selection unit of the M image areas, where M is a natural number greater than or equal to 2; a selection unit, configured to select N image areas from the M image areas based on a second operation and notify the display unit of the N image areas, where N is a natural number less than or equal to M; and a display unit, configured to determine the image attributes of the N image areas based on a third operation and display the N image areas according to the image attributes.

Description

Mobile terminal and picture processing method
Technical Field
The invention relates to the technical field of image processing, in particular to a mobile terminal and a picture processing method.
Background
In recent years, with the rapid development of Internet technology and mobile communication network technology, many intelligent terminals such as mobile phones and tablet computers have a photographing function. After taking photos with an intelligent terminal, users often modify them: for example, a user may adjust image attributes such as contrast, saturation, and color through image processing software so that the photos are more attractive.
In the process of implementing the invention, the inventor finds that at least the following problems exist in the prior art:
in the existing image processing method, only the whole picture can be displayed according to the same picture attribute, but different picture attributes cannot be set for different objects in one picture.
Disclosure of Invention
In view of this, embodiments of the present invention provide a mobile terminal and a picture processing method, which can display different image areas in a picture according to different image attributes, so as to improve the display effect of the picture.
The technical scheme of the embodiment of the invention is realized as follows:
an embodiment of the present invention provides a mobile terminal, including:
the dividing unit is used for dividing the picture to be processed into M image areas based on a first operation and notifying the selection unit of the M image areas, wherein M is a natural number greater than or equal to 2;
the selection unit is used for selecting N image areas from the M image areas based on a second operation and informing the display unit of the N image areas, wherein N is a natural number less than or equal to M;
and the display unit is used for determining the image attributes of the N image areas based on the third operation and displaying the N image areas according to the image attributes.
In the above embodiment, the dividing unit includes:
the first display subunit is used for displaying a first selection control of the picture to be processed based on a first sub-operation of the first operation and notifying the first selection control to the dividing subunit;
the dividing subunit is configured to divide the to-be-processed picture into M image regions based on a second sub-operation acting on the first selection control.
In the above embodiment, the selection unit includes:
the second display subunit is used for displaying second selection controls of the M image areas based on a third sub-operation of the second operation, and notifying the second selection controls to the selection subunit;
the selecting subunit is configured to select, based on the fourth sub-operation of the second selection control, N image regions from among the M image regions.
In the above embodiment, the display unit includes:
a third display subunit, configured to display a third selection control for an image attribute based on a fifth sub-operation of the third operation, and notify the determination subunit of the third selection control;
the determining subunit is configured to determine, based on a sixth sub-operation that acts on the third selection control, image attributes corresponding to the N image regions.
In the above embodiment, the mobile terminal further includes: a splicing unit;
the display unit is further configured to notify the splicing unit of the to-be-processed picture after the N image regions are displayed according to the image attributes;
and the splicing unit is used for splicing a plurality of predetermined pictures to be processed into a Flash animation.
The embodiment of the invention also provides a picture processing method, which comprises the following steps:
dividing a picture to be processed into M image areas based on a first operation, wherein M is a natural number greater than or equal to 2;
selecting N image areas from the M image areas based on a second operation, wherein N is a natural number less than or equal to M;
and determining the image attributes of the N image areas based on the third operation, and displaying the N image areas according to the image attributes.
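As a concrete illustration of these three steps, the sketch below models a picture as a 2D list of grayscale values, divides it into a rectangular grid of M regions, and brightens the N selected regions. The function names and the grid-based division are illustrative assumptions only; the patent does not prescribe a data representation.

```python
# Minimal sketch of the three-step method: divide, select, display.
# The picture is a 2D list of grayscale pixel values; the "image
# attribute" applied here is a simple brightness scale.
# All names are illustrative, not taken from the patent.

def divide_into_regions(picture, rows, cols):
    """Split a picture into rows*cols rectangular regions (M = rows*cols)."""
    h, w = len(picture), len(picture[0])
    rh, rw = h // rows, w // cols
    regions = []
    for r in range(rows):
        for c in range(cols):
            regions.append([(y, x)
                            for y in range(r * rh, (r + 1) * rh)
                            for x in range(c * rw, (c + 1) * rw)])
    return regions

def apply_brightness(picture, regions, selected, scale):
    """Scale pixel values inside the N selected regions only."""
    out = [row[:] for row in picture]
    for idx in selected:
        for (y, x) in regions[idx]:
            out[y][x] = min(255, int(out[y][x] * scale))
    return out

picture = [[100] * 4 for _ in range(4)]        # 4x4 uniform picture
regions = divide_into_regions(picture, 2, 2)   # M = 4 regions
shown = apply_brightness(picture, regions, selected=[0], scale=1.5)
print(shown[0][0], shown[3][3])                # region 0 brightened, rest unchanged
```

Only the selected region's pixels change; the other M−N regions keep their original attributes, which is the effect the method claims.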
In the above embodiment, the dividing the picture to be processed into M image areas based on the first operation includes:
displaying a first selection control of the picture to be processed based on a first sub-operation of the first operation;
dividing the picture to be processed into M image areas based on a second sub-operation acting on the first selection control.
In the above embodiment, the selecting N image regions among the M image regions based on the second operation includes:
displaying a second selection control of the M image regions based on a third sub-operation of the second operation;
selecting N image regions among the M image regions based on a fourth sub-operation of the second selection control.
In the above embodiment, the determining the image attributes of the N image regions based on the third operation includes:
displaying a third selection control for an image attribute based on a fifth sub-operation of the third operation;
and determining image attributes corresponding to the N image areas based on a sixth sub-operation acting on the third selection control.
In the above embodiment, the method further comprises:
and based on the fourth operation, splicing a plurality of predetermined pictures to be processed into a Flash animation.
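The splicing step can be sketched as assembling the processed pictures into an ordered frame sequence. The Flash container format itself is out of scope here, so this minimal sketch (with an invented `splice_into_animation` helper and a per-frame duration) shows only the frame-ordering logic.

```python
# Illustrative sketch only: the patent targets a Flash animation, but the
# container format is unspecified, so this just assembles the predetermined
# processed pictures into an ordered list of (frame, duration_ms) pairs.

def splice_into_animation(pictures, frame_ms=200):
    """Return an ordered frame sequence with a uniform per-frame duration."""
    return [(pic, frame_ms) for pic in pictures]

frames = splice_into_animation(["pic_a", "pic_b", "pic_c"], frame_ms=150)
print(len(frames), frames[0])
```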
Therefore, in the technical scheme of the embodiment of the invention, the picture to be processed is divided into M image areas based on the first operation, then N image areas are determined in the M image areas based on the second operation, and the N image areas are displayed according to the corresponding image attributes. That is, in the proposed technical solution of the present invention, after dividing the picture to be processed into M image areas based on the first operation, the mobile terminal may display different image areas according to different image attributes. Therefore, compared with the prior art, the mobile terminal and the picture processing method provided by the embodiment of the invention can display different image areas in one picture according to different image attributes, so that the display effect of the picture can be improved; moreover, the method is simple and convenient to realize, convenient to popularize and wide in application range.
Drawings
Fig. 1 is a schematic hardware configuration diagram of an alternative mobile terminal 100 for implementing various embodiments of the present invention;
FIG. 2 is a diagram of a wireless communication system for the mobile terminal 100 shown in FIG. 1;
fig. 3 is a schematic structural diagram of a mobile terminal according to an embodiment of the present invention;
fig. 4 is a schematic structural diagram of a mobile terminal according to a second embodiment of the present invention;
FIG. 5 is a schematic diagram illustrating an implementation flow of a picture processing method according to an embodiment of the present invention;
FIG. 6 is a flowchart illustrating an implementation method for dividing image regions according to an embodiment of the present invention;
FIG. 7 is a flowchart illustrating an implementation method for selecting an image region according to an embodiment of the present invention;
fig. 8 is a flowchart illustrating an implementation method for determining an image attribute according to an embodiment of the present invention.
Detailed Description
It should be understood that the embodiments described herein are only for explaining the technical solutions of the present invention, and are not intended to limit the scope of the present invention.
A mobile terminal implementing various embodiments of the present invention will now be described with reference to the accompanying drawings. In the following description, suffixes such as "module", "component", or "unit" used to denote elements are used only for facilitating the explanation of the present invention, and have no specific meaning in themselves. Thus, "module" and "component" may be used interchangeably.
The mobile terminal may be implemented in various forms. For example, the terminal described in the present invention may include a mobile terminal such as a mobile phone, a smart phone, a notebook computer, a digital broadcast receiver, a Personal Digital Assistant (PDA), a tablet computer (PAD), a Portable Multimedia Player (PMP), a navigation device, and the like.
Fig. 1 is a schematic hardware structure of a mobile terminal 100 for implementing various embodiments of the present invention, and as shown in fig. 1, the mobile terminal 100 may include: a wireless communication unit 110, a user input unit 120, a sensing unit 130, an output unit 140, a memory 150, an interface unit 160, a controller 170, and a power supply unit 180, etc. Fig. 1 illustrates the mobile terminal 100 having various components, but it is to be understood that not all illustrated components are required to be implemented. More or fewer components may alternatively be implemented. The elements of the mobile terminal 100 will be described in detail below.
The wireless communication unit 110 typically includes one or more components that allow radio communication between the mobile terminal 100 and a wireless communication system or network. For example, the wireless communication unit 110 may include: at least one of a mobile communication module 111, a wireless internet module 112, and a short-range communication module 113.
The mobile communication module 111 transmits and/or receives radio signals to and/or from at least one of a base station (e.g., access point, node B, etc.), an external terminal, and a server. Such radio signals may include voice call signals, video call signals, or various types of data transmitted and/or received according to text and/or multimedia messages.
The wireless internet module 112 supports wireless internet access of the mobile terminal 100. The wireless internet module 112 may be internally or externally coupled to the terminal. The wireless internet access technology referred to by the wireless internet module 112 may include Wireless Local Area Network (WLAN), wireless compatibility authentication (Wi-Fi), wireless broadband (Wibro), worldwide interoperability for microwave access (Wimax), High Speed Downlink Packet Access (HSDPA), and the like.
The short-range communication module 113 is a module for supporting short-range communication. Some examples of short-range communication technologies include Bluetooth™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee™, and the like.
The user input unit 120 may generate key input data to control various operations of the mobile terminal 100 according to a command input by a user. The user input unit 120 allows a user to input various types of information, and may include a keyboard, a dome switch, a touch pad (e.g., a touch-sensitive member that detects changes in resistance, pressure, capacitance, etc. due to being touched), a jog wheel, a jog stick, and the like. In particular, when the touch pad is superimposed on the display unit 141 in the form of a layer, a touch screen may be formed.
The sensing unit 130 detects a current state of the mobile terminal 100 (e.g., an open or closed state of the mobile terminal 100), a position of the mobile terminal 100, presence or absence of contact (i.e., touch input) by a user with the mobile terminal 100, an orientation of the mobile terminal 100, acceleration or deceleration movement and direction of the mobile terminal 100, and the like, and generates a command or signal for controlling an operation of the mobile terminal 100. For example, when the mobile terminal 100 is implemented as a slide-type mobile phone, the sensing unit 130 may sense whether the slide-type phone is opened or closed. In addition, the sensing unit 130 can detect whether the power supply unit 180 supplies power or whether the interface unit 160 is coupled with an external device.
The interface unit 160 serves as an interface through which at least one external device is connected to the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port (a typical example is a Universal Serial Bus (USB) port), a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like.
The interface unit 160 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the mobile terminal 100 or may be used to transmit data between the mobile terminal 100 and the external device.
In addition, when the mobile terminal 100 is connected with an external cradle, the interface unit 160 may serve as a path through which power is supplied from the cradle to the mobile terminal 100 or may serve as a path through which various command signals input from the cradle are transmitted to the mobile terminal 100. Various command signals or power input from the cradle may be used as signals for recognizing whether the mobile terminal 100 is accurately mounted on the cradle.
The output unit 140 is configured to provide output signals (e.g., audio signals, video signals, alarm signals, vibration signals, etc.) in a visual, audio, and/or tactile manner. The output unit 140 may include a display unit 141, an audio output module 142, and the like.
The display unit 141 may display information processed in the mobile terminal 100. For example, when the mobile terminal 100 is in a phone call mode, the display unit 141 may display a User Interface (UI) or a Graphical User Interface (GUI) related to a call or other communication (e.g., text messaging, multimedia file downloading, etc.). When the mobile terminal 100 is in a video call mode or an image capturing mode, the display unit 141 may display a captured image and/or a received image, a UI or GUI showing a video or an image and related functions, or the like.
Meanwhile, when the display unit 141 and the touch pad are stacked on each other in the form of layers to form a touch screen, the display unit 141 may function as an input device and an output device. The display unit 141 may include at least one of a Liquid Crystal Display (LCD), a thin film transistor LCD (TFT-LCD), an Organic Light Emitting Diode (OLED) display, a flexible display, a three-dimensional (3D) display, and the like. Some of these displays may be configured to be transparent to allow a user to view from the outside, which may be referred to as transparent displays, and a typical transparent display may be, for example, a TOLED (transparent organic light emitting diode) display or the like. Depending on the particular desired implementation, mobile terminal 100 may include two or more display units (or other display devices), for example, mobile terminal 100 may include an external display unit (not shown) and an internal display unit (not shown). The touch screen may be used to detect a touch input pressure as well as a touch input position and a touch input area.
The audio output module 142 may convert audio data received by the wireless communication unit 110 or stored in the memory 150 into an audio signal and output as sound when the mobile terminal 100 is in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, or the like. Also, the audio output module 142 may provide audio output related to a specific function performed by the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output module 142 may include a speaker, a buzzer, and the like.
The memory 150 may store software programs or the like for processing and controlling operations performed by the controller 170, or may temporarily store data (e.g., a phonebook, messages, still images, videos, etc.) that has been output or is to be output. Also, the memory 150 may store data regarding various ways of vibration and audio signals output when a touch is applied to the touch screen.
The memory 150 may include at least one type of storage medium including a flash memory, a hard disk, a multimedia card, a card type memory (e.g., SD or DX memory, etc.), a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a Read Only Memory (ROM), an Electrically Erasable Programmable Read Only Memory (EEPROM), a Programmable Read Only Memory (PROM), a magnetic memory, a magnetic disk, an optical disk, etc. Also, the mobile terminal 100 may cooperate with a network storage device that performs a storage function of the memory 150 through a network connection.
The controller 170 generally controls the overall operation of the mobile terminal 100. For example, the controller 170 performs control and processing related to voice calls, data communications, video calls, and the like. In addition, the controller 170 may include a multimedia module 171 for reproducing or playing back multimedia data, and the multimedia module 171 may be constructed within the controller 170 or may be constructed separately from the controller 170. The controller 170 may perform a pattern recognition process to recognize a handwriting input or a picture drawing input performed on the touch screen as a character or an image.
The power supply unit 180 receives external power or internal power and provides appropriate power required to operate the respective elements and components under the control of the controller 170.
The various embodiments described herein may be implemented in a computer-readable medium using, for example, computer software, hardware, or any combination thereof. For a hardware implementation, the embodiments described herein may be implemented using at least one of an Application Specific Integrated Circuit (ASIC), a Digital Signal Processor (DSP), a Digital Signal Processing Device (DSPD), a Programmable Logic Device (PLD), a Field Programmable Gate Array (FPGA), a processor, a controller, a microcontroller, a microprocessor, or an electronic unit designed to perform the functions described herein; in some cases, such embodiments may be implemented in the controller 170. For a software implementation, implementations such as procedures or functions may be realized with separate software modules, each performing at least one function or operation. The software code may be implemented by a software application (or program) written in any suitable programming language, stored in the memory 150, and executed by the controller 170.
Up to this point, the mobile terminal 100 has been described in terms of its functions. Hereinafter, for the sake of brevity, the slide-type mobile terminal 100 will be described as an example among the various types of mobile terminals 100, such as folder-type, bar-type, swing-type, and slide-type. However, the present invention is not limited to the slide-type mobile terminal 100 and can be applied to any type of mobile terminal 100.
The mobile terminal 100 as shown in fig. 1 may be configured to operate with communication systems such as wired and wireless communication systems and satellite-based communication systems that transmit data via frames or packets.
A communication system in which the mobile terminal 100 according to the present invention is capable of operating will now be described with reference to fig. 2.
Such communication systems may use different air interfaces and/or physical layers. For example, the air interface used by the communication system includes, for example, Frequency Division Multiple Access (FDMA), Time Division Multiple Access (TDMA), Code Division Multiple Access (CDMA), and Universal Mobile Telecommunications System (UMTS) (in particular, Long Term Evolution (LTE)), global system for mobile communications (GSM), and the like. By way of non-limiting example, the following description relates to a CDMA communication system, but such teachings are equally applicable to other types of systems.
Referring to fig. 2, the CDMA wireless communication system may include a plurality of mobile terminals 100, a plurality of Base Stations (BSs) 270, Base Station Controllers (BSCs) 275, and a Mobile Switching Center (MSC) 280. The MSC 280 is configured to interface with a Public Switched Telephone Network (PSTN) 290. The MSC 280 is also configured to interface with the BSCs 275, which may be coupled to the base stations 270 via backhaul lines. The backhaul lines may be constructed according to any of several known interfaces including, for example, E1/T1, ATM, IP, PPP, Frame Relay, HDSL, ADSL, or xDSL. It will be understood that a system as shown in fig. 2 may include multiple BSCs 275.
Each BS 270 may serve one or more sectors (or regions), each sector covered by an omnidirectional antenna or an antenna pointed in a particular direction radially away from the BS 270. Alternatively, each sector may be covered by two or more antennas for diversity reception. Each BS 270 may be configured to support multiple frequency allocations, with each frequency allocation having a particular spectrum (e.g., 1.25 MHz, 5 MHz, etc.).
The intersection of a sector and a frequency allocation may be referred to as a CDMA channel. The BS 270 may also be referred to as a Base Transceiver Subsystem (BTS) or by other equivalent terminology. In such a case, the term "base station" may be used to refer generically to a single BSC 275 and at least one BS 270. A base station may also be referred to as a "cell". Alternatively, each sector of a particular BS 270 may be referred to as a cell site.
As a typical operation of the wireless communication system, the BS 270 receives reverse link signals from various mobile terminals 100, which are typically engaged in calls, messaging, and other types of communication. Each reverse link signal received by a particular BS 270 is processed within that BS 270, and the resulting data is forwarded to the associated BSC 275. The BSC 275 provides call resource allocation and mobility management functions, including coordination of soft handoff procedures between BSs 270. The BSC 275 also routes the received data to the MSC 280, which provides additional routing services for interfacing with the PSTN 290. Similarly, the PSTN 290 interfaces with the MSC 280, the MSC 280 interfaces with the BSCs 275, and the BSCs 275 in turn control the BSs 270 to transmit forward link signals to the mobile terminals 100.
The mobile communication module 111 of the wireless communication unit 110 in the mobile terminal accesses the mobile communication network based on the necessary data (including the user identification information and the authentication information) of the mobile communication network (such as the mobile communication network of 2G/3G/4G, etc.) built in the mobile terminal to transmit the mobile communication data (including the uplink mobile communication data and the downlink mobile communication data) for the services of web browsing, network multimedia playing, etc. of the mobile terminal user.
The wireless internet module 112 of the wireless communication unit 110 implements a wireless hotspot function by running the relevant hotspot protocols. The hotspot supports the access of multiple other mobile terminals and, by multiplexing the mobile communication connection between the mobile communication module 111 and the mobile communication network, transmits mobile communication data (including uplink and downlink data) for services such as web browsing and network multimedia playing. Because the terminal essentially reuses its own mobile communication connection for this traffic, the consumed mobile communication data is charged by a charging entity on the communication network side to the terminal's communication tariff, drawing on the data allowance included in the tariff contracted for the terminal.
Based on the above hardware structure of the mobile terminal 100 and the communication system, various embodiments of the method of the present invention are proposed.
Fig. 3 is a schematic structural diagram of a mobile terminal according to an embodiment of the present invention. As shown in fig. 3, the mobile terminal provided in this embodiment may specifically include: a dividing unit 301, a selecting unit 302, and a display unit 303; wherein:
a dividing unit 301, configured to divide a picture to be processed into M image regions based on a first operation, and notify the M image regions to a selecting unit 302, where M is a natural number greater than or equal to 2.
In a specific embodiment of the present invention, the dividing unit 301 may first display a first selection control of the to-be-processed picture based on a first sub-operation of the first operation; the dividing unit 301 may then divide the picture to be processed into M image regions based on the second sub-operation acting on the first selection control.
Preferably, in an embodiment of the present invention, the dividing unit 301 may extract an operation parameter of the first sub-operation, where the operation parameter may include at least one of a time parameter, a pressure parameter, and a position parameter of the first sub-operation acting on the mobile terminal. Preferably, the dividing unit 301 may further determine whether the operation parameter of the first sub-operation meets a preset condition, and when it does, the dividing unit 301 may display the first selection control of the to-be-processed picture.
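The gating on operation parameters can be sketched as a simple predicate over the extracted parameters. The thresholds below (press duration, pressure) are invented for illustration; the patent leaves the preset condition unspecified.

```python
# Hedged sketch: decide whether a sub-operation's parameters (time,
# pressure, position) satisfy a preset condition before showing a
# selection control. Threshold values are illustrative assumptions.

def meets_preset_condition(op, min_press_ms=500, min_pressure=0.3):
    """Example condition: a sufficiently long and firm press qualifies."""
    return (op.get("time_ms", 0) >= min_press_ms
            and op.get("pressure", 0.0) >= min_pressure)

print(meets_preset_condition({"time_ms": 800, "pressure": 0.6}))  # long firm press
print(meets_preset_condition({"time_ms": 100, "pressure": 0.6}))  # too short
```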
Preferably, in an embodiment of the present invention, the dividing unit 301 may divide the picture to be processed into M image regions based on the second sub-operation acting on the first selection control. Specifically, the second sub-operation may include: selecting operation of the first division mode and selecting operation of the second division mode. Wherein, the first division mode may be: dividing a picture to be processed into M image areas according to preset segmentation lines; the second division manner may be: dividing the picture to be processed into M image areas according to the shooting object. Specifically, the dividing unit 301 may divide the picture to be processed into M image areas by using an existing picture dividing method.
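The two division manners can be sketched as follows: mode 1 splits the picture along preset vertical and horizontal segmentation lines, and mode 2 groups pixels by a caller-supplied object mask, standing in for the "existing picture dividing method" the patent defers to. All names here are illustrative.

```python
# Hedged sketch of the two division modes. Mode 2 assumes an object mask
# produced by some external segmentation step, since the patent relies on
# existing picture-dividing methods for per-object division.

def divide_by_lines(width, height, v_lines, h_lines):
    """Mode 1: rectangles bounded by preset segmentation lines."""
    xs = [0] + sorted(v_lines) + [width]
    ys = [0] + sorted(h_lines) + [height]
    return [(xs[i], ys[j], xs[i + 1], ys[j + 1])
            for j in range(len(ys) - 1)
            for i in range(len(xs) - 1)]

def divide_by_objects(object_mask):
    """Mode 2: group pixel coordinates by object label."""
    regions = {}
    for y, row in enumerate(object_mask):
        for x, label in enumerate(row):
            regions.setdefault(label, []).append((y, x))
    return regions

rects = divide_by_lines(100, 80, v_lines=[50], h_lines=[40])
print(len(rects))                      # one vertical + one horizontal line -> M = 4
objs = divide_by_objects([[0, 0], [1, 1]])
print(sorted(objs))                    # two labeled objects -> two regions
```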
A selecting unit 302 configured to select N image regions among the M image regions based on a second operation, and notify the display unit 303 of the N image regions, where N is a natural number equal to or less than M.
In an embodiment of the present invention, the selecting unit 302 may first display a second selection control of the M image regions based on a third sub-operation of the second operation; the selecting unit 302 may then select N image regions among the M image regions based on a fourth sub-operation of the second selection control. The fourth sub-operation may be a selection operation on the second selection control.
Preferably, in an embodiment of the present invention, the selecting unit 302 may extract an operation parameter of the third sub-operation, where the operation parameter may include at least one of a time parameter, a pressure parameter, and a position parameter of the third sub-operation acting on the mobile terminal. The selecting unit 302 may then select N image regions among the M image regions based on a fourth sub-operation of the second selection control.
A display unit 303, configured to determine image attributes of the N image regions based on the third operation, and display the N image regions according to the image attributes.
In a specific embodiment of the present invention, the display unit 303 may display a third selection control for the image attribute based on a fifth sub-operation of the third operation; the display unit 303 may then determine the image attributes corresponding to the N image regions based on a sixth sub-operation acting on the third selection control.
Preferably, in an embodiment of the present invention, the display unit 303 may extract an operation parameter of the fifth sub-operation, where the operation parameter may include at least one of a time parameter, a pressure parameter, and a position parameter of the fifth sub-operation acting on the mobile terminal. Preferably, the display unit 303 may further determine whether the operation parameter of the fifth sub-operation satisfies a preset condition, and when it does, the display unit 303 may display the third selection control for the image attribute.
Preferably, in an embodiment of the present invention, the display unit 303 may determine the image attributes corresponding to the N image areas based on a sixth sub-operation applied to the third selection control. Specifically, the image attributes may include: at least one of color, brightness, contrast, sharpness, and saturation.
Fig. 4 is a schematic structural diagram of a mobile terminal according to a second embodiment of the present invention. As shown in Fig. 4, the dividing unit 301 includes:
the first display subunit 3011 is configured to display a first selection control of the to-be-processed picture based on a first sub-operation of the first operation, and notify the dividing subunit 3012 of the first selection control.
In an embodiment of the present invention, the first display subunit 3011 may extract an operation parameter of the first sub-operation, where the operation parameter may include at least one of a time parameter, a pressure parameter, and a position parameter of the first sub-operation acting on the mobile terminal. Preferably, the first display subunit 3011 may further determine whether the operation parameter of the first sub-operation satisfies a preset condition, and when it does, the first display subunit 3011 may display the first selection control for the picture to be processed.
A dividing subunit 3012, configured to divide the picture to be processed into M image regions based on a second sub-operation acting on the first selection control.
In a specific embodiment of the present invention, the second sub-operation may include a selection operation of a first division manner and a selection operation of a second division manner. The first division manner may be dividing the picture to be processed into M image regions according to preset segmentation lines; the second division manner may be dividing the picture to be processed into M image regions according to the photographed objects. Specifically, the dividing subunit 3012 may divide the picture to be processed into the M image regions by using an existing picture segmentation method.
Further, the selecting unit 302 includes:
A second display subunit 3021, configured to display a second selection control for the M image regions based on a third sub-operation of the second operation, and to notify the selection subunit 3022 of the second selection control.
In a specific embodiment of the present invention, the second display subunit 3021 may display the second selection control for the M image regions based on a third sub-operation of the second operation. Preferably, the second display subunit 3021 may extract an operation parameter of the third sub-operation, where the operation parameter may include at least one of a time parameter, a pressure parameter, and a position parameter of the third sub-operation acting on the mobile terminal.
A selection subunit 3022, configured to select N image regions from among the M image regions based on a fourth sub-operation acting on the second selection control.
In a specific embodiment of the present invention, the selection subunit 3022 may determine N image regions among the M image regions based on the fourth sub-operation acting on the second selection control. The fourth sub-operation may be a selection operation in the second selection control.
Further, the display unit 303 includes:
A third display subunit 3031, configured to display a third selection control for the image attribute based on a fifth sub-operation of the third operation, and to notify the determining subunit 3032 of the third selection control.
In a specific embodiment of the present invention, the third display subunit 3031 may display a third selection control for the image attribute based on a fifth sub-operation of the third operation. Preferably, the third display subunit 3031 may extract an operation parameter of the fifth sub-operation, where the operation parameter may include at least one of a time parameter, a pressure parameter, and a position parameter of the fifth sub-operation acting on the mobile terminal. Preferably, the third display subunit 3031 may further determine whether the operation parameter of the fifth sub-operation satisfies a preset condition, and when it does, the third display subunit 3031 may display the third selection control for the image attribute.
The determining subunit 3032 is configured to determine, based on the sixth sub-operation acting on the third selection control, image attributes corresponding to the N image regions.
In a specific embodiment of the present invention, the determining subunit 3032 may determine the image attributes corresponding to the N image regions based on a sixth sub-operation acting on the third selection control. Specifically, the image attributes may include: at least one of color, brightness, contrast, sharpness, and saturation.
Further, the mobile terminal further includes: a splicing unit 304;
the display unit 303 is further configured to notify the splicing unit 304 of the to-be-processed picture after the N image regions are displayed according to the image attributes.
A splicing unit 304, configured to splice a plurality of predetermined pictures to be processed into a Flash animation.
In an embodiment of the present invention, the splicing unit 304 may splice a plurality of predetermined pictures to be processed into a Flash animation based on the fourth operation.
According to the mobile terminal provided by the embodiment of the present invention, after the picture to be processed is divided into M image regions based on the first operation, different image regions can be displayed with different image attributes. Therefore, compared with the prior art, the mobile terminal provided by the embodiment of the present invention can display different image regions of a single picture with different image attributes, thereby improving the display effect of the picture. Moreover, the scheme is simple to implement, easy to popularize, and widely applicable.
Fig. 5 is a schematic flow chart illustrating an implementation of the picture processing method in the embodiment of the present invention, and as shown in fig. 5, the picture processing method may include the following steps:
step 501, dividing the picture to be processed into M image areas based on the first operation.
In the embodiment of the present invention, the mobile terminal may divide the picture to be processed into M image regions in any of several ways, where M is a natural number greater than or equal to 2.
Fig. 6 is a schematic flow chart of an implementation method for dividing image areas in the embodiment of the present invention, and as shown in fig. 6, dividing a picture to be processed into M image areas may include the following steps:
step 501a, displaying a first selection control of the picture to be processed based on a first sub-operation of the first operation.
In an embodiment of the present invention, the mobile terminal may extract an operation parameter of the first sub-operation, where the operation parameter may include at least one of a time parameter, a pressure parameter, and a position parameter of the first sub-operation acting on the mobile terminal. Preferably, the mobile terminal may further determine whether the operation parameter of the first sub-operation satisfies a preset condition, and when it does, the mobile terminal may display the first selection control for the picture to be processed.
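The disclosure does not specify the preset condition on the extracted parameters. Purely for illustration, one possible check treats the sub-operation as qualifying when it is a long, firm press; the threshold values and parameter names below are assumptions, not part of the patent:

```python
def meets_preset_condition(params, min_duration_s=0.5, min_pressure=0.3):
    """Illustrative preset-condition check: the sub-operation qualifies
    when its extracted time and pressure parameters both exceed assumed
    thresholds (i.e., a sufficiently long and firm press)."""
    return (params.get("duration_s", 0.0) >= min_duration_s
            and params.get("pressure", 0.0) >= min_pressure)

# A long, firm press qualifies; a brief tap does not.
print(meets_preset_condition({"duration_s": 0.8, "pressure": 0.6}))  # True
print(meets_preset_condition({"duration_s": 0.1, "pressure": 0.6}))  # False
```

When the condition holds, the terminal would proceed to display the first selection control; otherwise the touch is treated as an ordinary gesture.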
Step 501b, dividing the picture to be processed into M image regions based on a second sub-operation acting on the first selection control.
In a specific embodiment of the present invention, the second sub-operation may include a selection operation of a first division manner and a selection operation of a second division manner. The first division manner may be dividing the picture to be processed into M image regions according to preset segmentation lines; the second division manner may be dividing the picture to be processed into M image regions according to the photographed objects. Specifically, the mobile terminal may divide the picture to be processed into the M image regions by using an existing picture segmentation method.
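For illustration only, the first division manner (preset segmentation lines forming a regular grid) could be sketched as follows; the grid shape and the (left, top, right, bottom) coordinate convention are assumptions, not part of the disclosure:

```python
def divide_by_grid(width, height, rows, cols):
    """Divide a width x height picture into rows * cols rectangular
    image regions (M = rows * cols), returned as (left, top, right,
    bottom) pixel boxes. Integer division spreads any remainder
    pixels across the rows and columns."""
    boxes = []
    for r in range(rows):
        top, bottom = r * height // rows, (r + 1) * height // rows
        for c in range(cols):
            left, right = c * width // cols, (c + 1) * width // cols
            boxes.append((left, top, right, bottom))
    return boxes

# A 100x60 picture split into M = 4 regions along preset center lines.
print(divide_by_grid(100, 60, 2, 2))
```

The second division manner (dividing by photographed object) would instead rely on an existing segmentation algorithm and is not sketched here.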
According to the above analysis, through the steps 501a to 501b, the mobile terminal can divide the picture to be processed into M image areas, so that N image areas can be selected from the M image areas.
Step 502, based on the second operation, selecting N image areas from the M image areas.
In a specific embodiment of the present invention, the mobile terminal may select N image areas among the M image areas based on the second operation.
Fig. 7 is a flowchart illustrating an implementation method for selecting image regions according to an embodiment of the present invention, and as shown in fig. 7, the method for determining N image regions in M image regions may include the following steps:
Step 502a, displaying a second selection control for the M image regions based on a third sub-operation of the second operation.
In a specific embodiment of the present invention, the mobile terminal may display a second selection control for the M image regions based on a third sub-operation of the second operation. Preferably, the mobile terminal may extract an operation parameter of the third sub-operation, where the operation parameter may include at least one of a time parameter, a pressure parameter, and a position parameter of the third sub-operation acting on the mobile terminal.
Step 502b, selecting N image regions from among the M image regions based on a fourth sub-operation acting on the second selection control.
In a specific embodiment of the present invention, the mobile terminal may determine N image regions among the M image regions based on the fourth sub-operation acting on the second selection control. The fourth sub-operation may be a selection operation in the second selection control.
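The patent leaves the selection mechanics open. One hypothetical realization of the fourth sub-operation maps each touch position reported by the second selection control to the image region that contains it:

```python
def region_at(boxes, x, y):
    """Return the index of the image region whose (left, top, right,
    bottom) box contains the touch point (x, y), or None if the point
    falls outside every region."""
    for i, (left, top, right, bottom) in enumerate(boxes):
        if left <= x < right and top <= y < bottom:
            return i
    return None

# With four 50x30 regions, a tap at (60, 40) selects region index 3.
boxes = [(0, 0, 50, 30), (50, 0, 100, 30), (0, 30, 50, 60), (50, 30, 100, 60)]
print(region_at(boxes, 60, 40))  # 3
```

Repeating this lookup for each tap and collecting the distinct indices would yield the N selected regions.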
As can be seen from the above analysis, through the above steps 502a to 502b, the mobile terminal can select N image regions from among the M image regions, so that the mobile terminal can then display the N image regions according to the corresponding image attributes.
Step 503, based on the third operation, determining the image attributes of the N image areas, and displaying the N image areas according to the image attributes.
In particular embodiments of the present invention, the mobile terminal may determine the image attributes of the N image regions in a variety of ways. Fig. 8 is a flowchart illustrating an implementation method for determining image attributes in the embodiment of the present invention, and as shown in fig. 8, the method for determining image attributes of N image regions may include the following steps:
Step 503a, displaying a third selection control for the image attribute based on a fifth sub-operation of the third operation.
In a specific embodiment of the present invention, the mobile terminal may display a third selection control for the image attribute based on a fifth sub-operation of the third operation. Preferably, the mobile terminal may extract an operation parameter of the fifth sub-operation, where the operation parameter may include at least one of a time parameter, a pressure parameter, and a position parameter of the fifth sub-operation acting on the mobile terminal. Preferably, the mobile terminal may further determine whether the operation parameter of the fifth sub-operation satisfies a preset condition, and when it does, the mobile terminal may display the third selection control for the image attribute.
Step 503b, determining image attributes corresponding to the N image areas based on the sixth sub-operation acting on the third selection control.
In a specific embodiment of the present invention, the mobile terminal may determine the image attributes corresponding to the N image areas based on a sixth sub-operation acting on the third selection control. Specifically, the image attributes may include: at least one of color, brightness, contrast, sharpness, and saturation.
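As a minimal sketch of displaying selected regions with their own attributes, the following applies a brightness offset (one of the listed attributes) to the N selected regions only, using a grayscale picture held as nested lists; a real terminal would operate on full-color bitmaps, and the representation here is an assumption:

```python
def apply_brightness(pixels, boxes, selected, delta):
    """Return a copy of `pixels` (rows of 0-255 grayscale values) with
    a brightness offset `delta` applied only inside the regions whose
    indices appear in `selected`; other regions are left unchanged."""
    out = [row[:] for row in pixels]  # copy so the source picture survives
    for i in selected:
        left, top, right, bottom = boxes[i]
        for y in range(top, bottom):
            for x in range(left, right):
                out[y][x] = max(0, min(255, out[y][x] + delta))
    return out

# Brighten only the second of two side-by-side regions.
pixels = [[100] * 4 for _ in range(2)]
boxes = [(0, 0, 2, 2), (2, 0, 4, 2)]
print(apply_brightness(pixels, boxes, [1], 50))
```

Contrast, saturation, sharpness, or color adjustments would follow the same per-region pattern with a different per-pixel transform.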
Preferably, in the embodiment of the present invention, the mobile terminal may further splice a plurality of predetermined pictures to be processed into a Flash animation based on the fourth operation.
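The splicing step is likewise unspecified. A hypothetical sketch that merely schedules the predetermined processed pictures into an animation timeline (an actual encoder, such as a GIF or Flash writer, would consume this schedule) could be:

```python
def splice_animation(frames, durations_ms=None, default_ms=200):
    """Arrange the processed pictures into an animation timeline of
    (frame, start_ms, duration_ms) entries, plus the total running
    time in milliseconds. Frame duration defaults are assumptions."""
    if durations_ms is None:
        durations_ms = [default_ms] * len(frames)
    timeline, t = [], 0
    for frame, duration in zip(frames, durations_ms):
        timeline.append((frame, t, duration))
        t += duration
    return timeline, t

# Three processed pictures shown 200 ms each form a 600 ms animation.
print(splice_animation(["pic1", "pic2", "pic3"]))
```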
According to the picture processing method provided by the embodiment of the present invention, after the mobile terminal divides the picture to be processed into M image regions based on the first operation, different image regions can be displayed with different image attributes. Therefore, compared with the prior art, the picture processing method provided by the embodiment of the present invention can display different image regions of a single picture with different image attributes, thereby improving the display effect of the picture. Moreover, the method is simple to implement, easy to popularize, and widely applicable.
The dividing unit 301, the selecting unit 302, and the display unit 303 in the mobile terminal provided by the embodiment of the present invention may be implemented by a processor in the mobile terminal, or, of course, by a specific logic circuit. In a specific embodiment, the processor may be a Central Processing Unit (CPU), a Micro Processing Unit (MPU), a Digital Signal Processor (DSP), a Field Programmable Gate Array (FPGA), or the like.
It should be appreciated that reference throughout this specification to "one embodiment" or "an embodiment" means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, the appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. It should be understood that, in various embodiments of the present invention, the sequence numbers of the above-mentioned processes do not mean the execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation on the implementation process of the embodiments of the present invention. The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The above-described device embodiments are merely illustrative, for example, the division of the unit is only a logical functional division, and there may be other division ways in actual implementation, such as: multiple units or components may be combined, or may be integrated into another system, or some features may be omitted, or not implemented. In addition, the coupling, direct coupling or communication connection between the components shown or discussed may be through some interfaces, and the indirect coupling or communication connection between the devices or units may be electrical, mechanical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units; can be located in one place or distributed on a plurality of network units; some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, all the functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may be separately regarded as one unit, or two or more units may be integrated into one unit; the integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
Those of ordinary skill in the art will understand that all or part of the steps of the above method embodiments may be implemented by program instructions executed on relevant hardware; the program may be stored in a computer-readable storage medium and, when executed, performs the steps of the above method embodiments. The aforementioned storage medium includes various media that can store program code, such as a removable storage device, a Read-Only Memory (ROM), a magnetic disk, or an optical disk.
Alternatively, the integrated unit of the present invention may be stored in a computer-readable storage medium if it is implemented in the form of a software functional module and sold or used as a separate product. Based on such understanding, the technical solutions of the embodiments of the present invention may be essentially implemented or a part contributing to the prior art may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the methods described in the embodiments of the present invention. And the aforementioned storage medium includes: a removable storage device, a ROM, a magnetic or optical disk, or other various media that can store program code.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and all the changes or substitutions should be covered within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the appended claims.

Claims (6)

1. A mobile terminal, characterized in that the mobile terminal comprises:
the dividing unit is used for dividing the picture to be processed into M image areas based on a first operation and informing the M image areas to the selecting unit, wherein M is a natural number which is more than or equal to 2; the dividing unit includes: the first display subunit is used for displaying a first selection control of the picture to be processed based on a first sub-operation of the first operation and notifying the first selection control to the dividing subunit; the dividing subunit is configured to divide the to-be-processed picture into M image regions based on a second sub-operation acting on the first selection control;
the selection unit is used for selecting N image areas from the M image areas based on a second operation and informing the display unit of the N image areas, wherein N is a natural number less than or equal to M; the selection unit includes: the second display subunit is used for displaying second selection controls of the M image areas based on a third sub-operation of the second operation, and notifying the second selection controls to the selection subunit; the selecting subunit is configured to select N image regions from among the M image regions based on a fourth sub-operation of the second selection control;
and the display unit is used for determining the image attributes of the N image areas based on the third operation and displaying the N image areas according to the image attributes.
2. The mobile terminal according to claim 1, wherein the display unit comprises:
a third display subunit, configured to display a third selection control for an image attribute based on a fifth sub-operation of the third operation, and notify the determination subunit of the third selection control;
the determining subunit is configured to determine, based on a sixth sub-operation that acts on the third selection control, image attributes corresponding to the N image regions.
3. The mobile terminal of claim 1, wherein the mobile terminal further comprises: a splicing unit;
the display unit is further configured to notify the splicing unit of the to-be-processed picture after the N image regions are displayed according to the image attributes;
and the splicing unit is used for splicing a plurality of predetermined pictures to be processed into a Flash animation.
4. A picture processing method, characterized in that the method comprises:
dividing a picture to be processed into M image areas based on a first operation, wherein M is a natural number greater than or equal to 2; the dividing, based on the first operation, the picture to be processed into M image regions includes: displaying a first selection control of the picture to be processed based on a first sub-operation of the first operation; dividing the picture to be processed into M image areas based on a second sub-operation acting on the first selection control;
selecting N image areas from the M image areas based on a second operation, wherein N is a natural number less than or equal to M; the selecting, based on the second operation, N image regions among the M image regions includes: displaying a second selection control of the M image regions based on a third sub-operation of the second operation; selecting N image regions among the M image regions based on a fourth sub-operation of the second selection control;
and determining the image attributes of the N image areas based on the third operation, and displaying the N image areas according to the image attributes.
5. The method of claim 4, wherein determining image attributes for the N image regions based on the third operation comprises:
displaying a third selection control for an image attribute based on a fifth sub-operation of the third operation;
and determining image attributes corresponding to the N image areas based on a sixth sub-operation acting on the third selection control.
6. The method of claim 4, further comprising:
and based on the fourth operation, splicing a plurality of predetermined pictures to be processed into a Flash animation.
CN201610618953.3A 2016-07-29 2016-07-29 Mobile terminal and picture processing method Active CN106254643B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610618953.3A CN106254643B (en) 2016-07-29 2016-07-29 Mobile terminal and picture processing method

Publications (2)

Publication Number Publication Date
CN106254643A CN106254643A (en) 2016-12-21
CN106254643B true CN106254643B (en) 2020-04-24

Family

ID=57606841

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610618953.3A Active CN106254643B (en) 2016-07-29 2016-07-29 Mobile terminal and picture processing method

Country Status (1)

Country Link
CN (1) CN106254643B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110109568B (en) * 2019-04-29 2023-09-22 努比亚技术有限公司 Image processing method, device and computer readable storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102609902A (en) * 2010-12-10 2012-07-25 卡西欧计算机株式会社 Image processing apparatus, image processing method, and storage medium
CN104935787A (en) * 2015-05-29 2015-09-23 努比亚技术有限公司 Image processing method and device
CN105574866A (en) * 2015-12-15 2016-05-11 努比亚技术有限公司 Image processing method and apparatus

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9704059B2 (en) * 2014-02-12 2017-07-11 International Business Machines Corporation Anomaly detection in medical imagery

Similar Documents

Publication Publication Date Title
CN106909274B (en) Image display method and device
CN106775391B (en) Interface switching device and method
CN105468158B (en) Color adjustment method and mobile terminal
CN106648369A (en) Switching device and method for application program
CN106803860B (en) Storage processing method and device for terminal application
CN107071263B (en) Image processing method and terminal
CN105376411A (en) Mobile terminal flashlight control method and device
CN105681582A (en) Control color adjusting method and terminal
WO2018050080A1 (en) Mobile terminal, picture processing method and computer storage medium
CN106547439B (en) Method and device for processing message
CN106131285B (en) Call method and terminal
CN105760055A (en) Mobile terminal and control method thereof
CN106161790B (en) Mobile terminal and control method thereof
CN105554382A (en) Mobile terminal and shooting control method thereof
CN104866095A (en) Mobile terminal, and method and apparatus for managing desktop thereof
CN106792878B (en) Data traffic monitoring method and device
CN105722142A (en) Mobile terminal and multilink-based data streaming method
CN106201482A (en) A kind of data processing method and electronic equipment
CN106856542B (en) Method and device for automatically adjusting icon color according to wallpaper color
CN105722246A (en) Network speed superposition device and method
CN106897044B (en) Screen color temperature consistency fault-tolerant method and terminal
CN106020693B (en) Desktop page entering method and electronic equipment
CN107690178B (en) Method for reducing power consumption, mobile terminal and computer readable storage medium
CN106254643B (en) Mobile terminal and picture processing method
CN106900037B (en) Display method and terminal

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20200331

Address after: 325200, Wenzhou, Zhejiang, Ruian province Anyang Times Building A street, building four northeast head

Applicant after: RUIAN ZHIZAO TECHNOLOGY Co.,Ltd.

Address before: 518000 Guangdong Province, Shenzhen high tech Zone of Nanshan District City, No. 9018 North Central Avenue's innovation building A, 6-8 layer, 10-11 layer, B layer, C District 6-10 District 6 floor

Applicant before: NUBIA TECHNOLOGY Co.,Ltd.

GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20230404

Address after: 402-1/402-2, No. 7 Caibin Road, Science City, Huangpu District, Guangzhou City, Guangdong Province, 510700

Patentee after: Guangzhou Hanquan Information Technology Co.,Ltd.

Address before: 325200 northeast first floor, building a, times building, Anyang street, Ruian City, Wenzhou City, Zhejiang Province

Patentee before: RUIAN ZHIZAO TECHNOLOGY Co.,Ltd.