CN109242768B - Image processing method and terminal equipment - Google Patents

Image processing method and terminal equipment

Info

Publication number
CN109242768B
CN109242768B CN201811270121.2A
Authority
CN
China
Prior art keywords
image
feature
processing
input
module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811270121.2A
Other languages
Chinese (zh)
Other versions
CN109242768A (en)
Inventor
郑达川
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN201811270121.2A priority Critical patent/CN109242768B/en
Publication of CN109242768A publication Critical patent/CN109242768A/en
Application granted granted Critical
Publication of CN109242768B publication Critical patent/CN109242768B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G06T3/04
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00 General purpose image data processing

Abstract

The embodiment of the invention discloses an image processing method and a terminal device, which are applied to the field of communication technology and can solve the problem of low image processing efficiency in the prior art. The method comprises: acquiring a first feature of a first image to be processed, where the first feature includes at least one of a content feature, a size feature, a brightness feature, and a color feature; when the first feature matches a second feature pre-stored in the terminal device, processing the first feature of the first image according to a first processing parameter corresponding to the second feature; and displaying the processed first image. The method is applicable to image processing scenarios.

Description

Image processing method and terminal equipment
Technical Field
The embodiment of the invention relates to the technical field of communication, in particular to an image processing method and terminal equipment.
Background
With the development of terminal technology, terminals are used more and more widely. In daily life, a user often triggers the terminal to process images, for example, to beautify a face in an image, to adjust the contrast of an image, or to adjust the brightness of an image.
At present, when a user needs to process a large number of images, the user may have to trigger the terminal repeatedly to perform a large number of processing operations, which makes image processing inefficient.
Disclosure of Invention
The embodiment of the invention provides an image processing method and a terminal device, which are used to solve the problem of low image processing efficiency in the prior art.
In order to solve the technical problems, the embodiment of the invention is realized as follows:
in a first aspect, an image processing method is provided, applied to a terminal device, and the method includes: acquiring a first characteristic of a first image to be processed, wherein the first characteristic comprises at least one of a content characteristic, a size characteristic, a brightness characteristic and a color characteristic; processing the first feature of the first image according to a first processing parameter corresponding to a second feature when the first feature accords with the second feature pre-stored in the terminal equipment; displaying the processed first image.
In a second aspect, there is provided a terminal device comprising: the device comprises an acquisition module, a processing module and a display module; the acquisition module is used for acquiring first characteristics of a first image to be processed, wherein the first characteristics comprise at least one of content characteristics, size characteristics, brightness characteristics and color characteristics; the processing module is used for processing the first feature of the first image according to a first processing parameter corresponding to the second feature under the condition that the first feature acquired by the acquisition module accords with the second feature pre-stored in the terminal equipment; the display module is used for displaying the first image processed by the processing module.
In a third aspect, there is provided a terminal device comprising a processor, a memory and a computer program stored on the memory and executable on the processor, which when executed by the processor implements the steps of the image processing method according to the first aspect.
In a fourth aspect, there is provided a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the image processing method according to the first aspect.
In the embodiment of the invention, the terminal device can acquire a first feature of a first image to be processed, where the first feature includes at least one of a content feature, a size feature, a brightness feature, and a color feature; when the first feature matches a second feature pre-stored in the terminal device, process the first feature of the first image according to a first processing parameter corresponding to the second feature; and display the processed first image. With this scheme, once the terminal device acquires the first feature of the first image and determines that it matches a second feature pre-stored in the terminal device, it automatically processes the first feature according to the corresponding first processing parameter, so the image can be processed without the user having to trigger the terminal repeatedly, which improves image processing efficiency.
Drawings
Fig. 1 is a schematic diagram of a possible architecture of an android operating system according to an embodiment of the present invention;
fig. 2 is a schematic diagram of an image processing method according to an embodiment of the present invention;
fig. 3 is a schematic diagram of a second image processing method according to an embodiment of the present invention;
fig. 4 is a schematic diagram III of an image processing method according to an embodiment of the present invention;
fig. 5 is a schematic diagram of a fourth image processing method according to an embodiment of the present invention;
fig. 6 is a schematic structural diagram of a terminal device according to an embodiment of the present invention;
fig. 7 is a schematic structural diagram II of a terminal device according to an embodiment of the present invention;
fig. 8 is a schematic hardware diagram of a terminal device according to an embodiment of the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are some, but not all embodiments of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
The terms "first", "second" and the like in the description and in the claims are used to distinguish between different objects, not to describe a particular order of the objects. For example, the first input and the second input are used to distinguish between different inputs, not to describe a particular order of inputs.
It should be noted that, in the embodiments of the present invention, words such as "exemplary" or "such as" are used to mean serving as an example, instance, or illustration. Any embodiment or design described as "exemplary" or "such as" in an embodiment should not be construed as preferred or advantageous over other embodiments or designs. Rather, such words are intended to present related concepts in a concrete fashion.
The terminal device in the embodiment of the invention can be a terminal device with an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiment of the present invention.
The software environment to which the image processing method provided by the embodiment of the invention is applied is described below by taking an android operating system as an example.
Fig. 1 is a schematic diagram of an architecture of a possible android operating system according to an embodiment of the present invention. In fig. 1, the architecture of the android operating system includes 4 layers, respectively: an application program layer, an application program framework layer, a system runtime layer and a kernel layer (specifically, a Linux kernel layer).
The application program layer comprises various application programs (including system application programs and third party application programs) in the android operating system.
The application framework layer is the framework of applications; developers can develop applications based on the application framework layer while following the development principles of the application framework.
The system runtime layer includes libraries (also referred to as system libraries) and android operating system runtime environments. The library mainly provides various resources required by the android operating system. The android operating system running environment is used for providing a software environment for the android operating system.
The kernel layer is an operating system layer of the android operating system, and belongs to the bottommost layer of the software hierarchy of the android operating system. The kernel layer provides core system services and a driver related to hardware for the android operating system based on a Linux kernel.
Taking the android operating system as an example, in the embodiment of the present invention, a developer may develop, based on the system architecture of the android operating system shown in fig. 1, a software program that implements the image processing method provided in the embodiment of the present invention, so that the image processing method can run on the android operating system shown in fig. 1. That is, the processor or the terminal device can implement the image processing method provided in the embodiment of the present invention by running the software program in the android operating system.
The embodiment of the invention provides an image processing method and a terminal device. The terminal device can acquire a first feature of a first image to be processed, where the first feature includes at least one of a content feature, a size feature, a brightness feature, and a color feature; when the first feature matches a second feature pre-stored in the terminal device, process the first feature of the first image according to a first processing parameter corresponding to the second feature; and display the processed first image. With this scheme, once the terminal device acquires the first feature of the first image and determines that it matches a second feature pre-stored in the terminal device, it automatically processes the first feature according to the corresponding first processing parameter, so the image can be processed without the user having to trigger the terminal repeatedly, which improves image processing efficiency.
The terminal device in the embodiment of the present invention may be a mobile terminal device or a non-mobile terminal device. The mobile terminal device may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted terminal device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (PDA), or the like; the non-mobile terminal device may be a personal computer (PC), a television (TV), a teller machine, a self-service machine, or the like; the embodiment of the present invention is not specifically limited.
The image processing method provided in the embodiment of the present invention may be performed by the terminal device (including a mobile terminal device and a non-mobile terminal device), or by a functional module and/or a functional entity in the terminal device capable of implementing the image processing method; this may be determined according to actual use requirements, and the embodiment of the present invention is not limited thereto. The image processing method provided in the embodiment of the present invention is described below by way of example with the terminal device as the executing body.
As shown in fig. 2, the image processing method provided in the embodiment of the present invention may include S11-S13.
S11, the terminal equipment acquires first characteristics of a first image to be processed.
Wherein the first feature comprises at least one of a content feature, a size feature, a brightness feature, and a color feature.
Alternatively, the content features may include face features in the image or font features in the image, and so on.
Alternatively, the above-mentioned size feature may be the aspect ratio of the image, i.e., the ratio of the width of the image to its height.
Alternatively, the brightness feature may be the brightness distribution of the image (i.e., a count of the number of pixels at each brightness value), which is typically represented in the form of a histogram.
Alternatively, the color feature may be a color in the image.
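For illustration only, the following Kotlin sketch shows one way the first features described above (a size feature as a reduced aspect ratio and a brightness feature as a histogram) could be computed from raw pixel data; the ImageData container and all function names are assumptions made for this example and are not defined in the patent.

```kotlin
// Hypothetical illustration only: a minimal pixel container standing in for a platform
// image type; the class and function names below are not defined in the patent.
data class ImageData(val width: Int, val height: Int, val argbPixels: IntArray)

data class FirstFeature(
    val aspectRatio: Pair<Int, Int>,   // size feature: width : height, reduced to lowest terms
    val brightnessHistogram: IntArray  // brightness feature: pixel count per luma value 0..255
)

private tailrec fun gcd(a: Int, b: Int): Int = if (b == 0) a else gcd(b, a % b)

fun extractFirstFeature(image: ImageData): FirstFeature {
    // Size feature: reduce width:height, e.g. 1080 x 2160 becomes 1:2.
    val g = gcd(image.width, image.height)
    val ratio = (image.width / g) to (image.height / g)

    // Brightness feature: count pixels per luma value (ITU-R BT.601 weighting).
    val histogram = IntArray(256)
    for (argb in image.argbPixels) {
        val r = (argb shr 16) and 0xFF
        val gr = (argb shr 8) and 0xFF
        val b = argb and 0xFF
        val luma = (0.299 * r + 0.587 * gr + 0.114 * b).toInt().coerceIn(0, 255)
        histogram[luma]++
    }
    return FirstFeature(ratio, histogram)
}
```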
S12, the terminal device processes the first feature of the first image according to the first processing parameter corresponding to the second feature when the first feature accords with the second feature pre-stored in the terminal device.
The terminal device may pre-store multiple sets of correspondences between image features and processing parameters (in the embodiment of the present invention, the stored correspondences include a correspondence between the second feature and the first processing parameter). The terminal device compares the acquired first feature with the stored image features (i.e., the image features in the pre-stored correspondences between image features and processing parameters); if the first feature matches the second feature among the stored image features, the terminal device processes the first feature of the first image according to the first processing parameter corresponding to the second feature.
Alternatively, taking the aspect ratio of the image as an example, the correspondence between the saved second feature and the first processing parameter may be a correspondence between the aspect ratio of the image and the cropping ratio of the image.
For example, as shown in Table 1 below, multiple sets of correspondences between image aspect ratios and image cropping ratios may be stored in the terminal device.
TABLE 1
Aspect ratio of image | Image cropping ratio
1:1 | 1:1
1:2 | 2:3
2:3 | 3:2
In table 1, when the aspect ratio of the image is 1:1, the image may be cropped according to the corresponding cropping ratio of 1:1 (i.e., the size of the image is not changed); when the aspect ratio of the image is 1:2, the image may be cropped according to the corresponding cropping ratio of 2:3, i.e., the image is cropped to an aspect ratio of 2:3; when the aspect ratio of the image is 2:3, the image may be cropped according to the corresponding cropping ratio of 3:2, i.e., the image is cropped to an aspect ratio of 3:2.
For example, assuming that the second feature in the embodiment of the present invention is an aspect ratio of 1:2 and the first processing parameter is a cropping ratio of 2:3, then when the acquired first feature of the first image is an aspect ratio of 1:2, the first feature may be considered to match the second feature, and the first image is cropped to an aspect ratio of 2:3.
Alternatively, taking the first feature as the brightness distribution of the image as an example, the stored correspondence between the second feature and the first processing parameter may be a correspondence between the brightness distribution of the image and a brightness adjustment parameter, where the brightness adjustment parameter may include at least one of a brightness value, an exposure parameter, and a contrast of the image.
Alternatively, taking the first feature as a face in the image as an example, the stored correspondence between the second feature and the first processing parameter may be a correspondence between the face and beautification parameters such as skin smoothing, nose slimming, and eye enlargement.
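As a rough sketch of the matching step in S12 under the assumptions above, the Kotlin snippet below stores the correspondences of Table 1 in a map keyed by aspect ratio and, when the acquired first feature matches a stored second feature, returns the crop dimensions to apply; the function names and the fallback behaviour (returning null when there is no match) are illustrative choices, not requirements of the patent.

```kotlin
private tailrec fun gcd(a: Int, b: Int): Int = if (b == 0) a else gcd(b, a % b)

// Pre-stored correspondences between a second feature (image aspect ratio) and a
// first processing parameter (cropping ratio), mirroring Table 1 above.
val cropRatioByAspect: Map<Pair<Int, Int>, Pair<Int, Int>> = mapOf(
    (1 to 1) to (1 to 1),
    (1 to 2) to (2 to 3),
    (2 to 3) to (3 to 2)
)

// Returns the width x height to crop the image to, or null when the first feature does
// not match any stored second feature (in which case the image is left as it is).
fun cropSizeFor(width: Int, height: Int): Pair<Int, Int>? {
    val g = gcd(width, height)
    val aspect = (width / g) to (height / g)
    val target = cropRatioByAspect[aspect] ?: return null
    // Largest region with the target ratio that fits inside the original image.
    val scale = minOf(width / target.first, height / target.second)
    return (target.first * scale) to (target.second * scale)
}
```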
S13, the terminal equipment displays the processed first image.
The embodiment of the invention provides an image processing method, wherein a terminal device can acquire first characteristics of a first image to be processed, wherein the first characteristics comprise at least one of content characteristics, size characteristics, brightness characteristics and color characteristics; processing the first feature of the first image according to a first processing parameter corresponding to a second feature when the first feature accords with the second feature pre-stored in the terminal equipment; displaying the processed first image. According to the scheme, the terminal equipment can acquire the first characteristic of the first image to be processed, and automatically processes the first characteristic of the first image according to the first processing parameter corresponding to the second characteristic under the condition that the first characteristic accords with the second characteristic pre-stored in the terminal equipment, so that the terminal can complete the processing of the image without triggering the terminal under the condition that the user processes the image, and the image processing efficiency can be improved.
Optionally, in conjunction with fig. 2, as shown in fig. 3, before the terminal device obtains the first feature of the first image to be processed, the image processing method provided in the embodiment of the present invention further includes the following S14.
S14, the terminal equipment receives a first input of a user.
Wherein the first input is for selecting the first image. Alternatively, the first input may be a click input to the first image.
Alternatively, S11 may be replaced with S11a described below.
S11a, the terminal equipment responds to a first input to acquire a first characteristic of a first image to be processed.
Optionally, with reference to fig. 3, as shown in fig. 4, before S14, the image processing method provided in the embodiment of the present invention further includes the following S15 to S18.
S15, the terminal equipment receives a second input of the user.
Wherein the second input is used to select a second image to be processed. Alternatively, the second input may be a click input to the second image.
S16, the terminal equipment responds to the second input and displays a second image, wherein the second image comprises the second characteristic.
S17, the terminal equipment receives a third input of the user on the second image.
S18, the terminal equipment responds to the third input, processes the second characteristic according to the first processing parameter corresponding to the third input, and stores the first corresponding relation.
The first corresponding relation is a corresponding relation between the first processing parameter and the second characteristic.
In S18, the user's input manually triggers the terminal device to process the image, and the correspondence between the feature of the processed image and the processing parameter (i.e., the first correspondence) is saved.
In the embodiment of the invention, the user can save the correspondence between image features and processing parameters through S15 to S18.
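A minimal in-memory sketch of how the first correspondence saved in S18 (and overwritten in S18a below) might be kept follows; the CorrespondenceStore class and its generic key/value types are assumptions for illustration, and a real terminal device would presumably persist the table (for example in a database or an exported configuration file) rather than hold it only in memory.

```kotlin
// Illustrative store for correspondences between an image feature (second feature) and
// the processing parameters the user last applied to it (first processing parameter).
class CorrespondenceStore<Feature, Params> {
    private val table = mutableMapOf<Feature, Params>()

    // S18: after the third input manually edits the second image, save the correspondence
    // for that image's feature; calling save() again overwrites it, as in S18a below.
    fun save(feature: Feature, params: Params) {
        table[feature] = params
    }

    // S12: look up the processing parameters for a newly acquired first feature.
    fun lookup(feature: Feature): Params? = table[feature]
}

fun main() {
    val store = CorrespondenceStore<Pair<Int, Int>, Pair<Int, Int>>()
    store.save(1 to 2, 2 to 3)     // first correspondence: aspect ratio 1:2 -> crop ratio 2:3
    store.save(1 to 2, 3 to 4)     // a later manual edit replaces the stored entry
    println(store.lookup(1 to 2))  // prints (3, 4)
}
```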
Optionally, in the embodiment of the present invention, the terminal device may further receive a correspondence between an image feature and a processing parameter sent by a professional user and save that correspondence. The professional user may be a user with professional image-processing skills, and the correspondence sent by the professional user may be in the form of a data table or an exported configuration file.
Optionally, in conjunction with fig. 4, as shown in fig. 5, before S15, the image processing method provided in the embodiment of the present invention further includes S19 described below.
S19, storing the second corresponding relation.
The second corresponding relation is a corresponding relation between the second processing parameter and the second feature.
It should be noted that, in the embodiment of the present invention, the manner of saving the second corresponding relationship is similar to the manner of saving the first corresponding relationship shown in fig. 4, and is not repeated here.
As shown in fig. 5, S18 may be replaced with S18a described below.
S18a, the terminal equipment responds to the third input, processes the second characteristic according to the first processing parameter corresponding to the third input, and updates the stored second corresponding relation to the first corresponding relation.
Optionally, in the embodiment of the present invention, the user may trigger the terminal to update the stored correspondence according to the image processing method shown in fig. 5, so that the user can adjust the stored correspondence in time and image processing becomes more flexible.
Optionally, in an embodiment of the present invention, the interface for displaying the first image may include options for m processing parameters, and each processing-parameter option includes at least one sub-processing-parameter option. The sub-processing-parameter options included in a processing-parameter option are those whose frequency of use by the user is greater than or equal to a preset frequency threshold, and m is an integer greater than or equal to 1.
It can be appreciated that the preset frequency threshold may be set according to the actual situation, which is not specifically limited in the embodiments of the present invention.
By way of example, the m processing-parameter options may be graffiti color, brush size, border, font, and the like. Taking the font option as an example, its full set of sub-processing parameters may be all available font types, and the sub-processing-parameter options it actually includes may be the font types with the highest frequency of use, such as regular script, Song Ti, and bold.
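To make the frequency rule concrete, here is a small Kotlin sketch that counts how often the user picks each sub-processing-parameter option (for example each font type) and returns only those whose usage frequency is greater than or equal to a preset threshold; the class name and the default threshold value are assumptions, since the patent leaves the threshold to practical settings.

```kotlin
// Illustrative tracker for how often the user picks each sub-processing-parameter option,
// e.g. how often each font type is chosen under the "font" processing-parameter option.
// The threshold value 3 is an arbitrary placeholder for the preset frequency threshold.
class SubOptionUsage(private val frequencyThreshold: Int = 3) {
    private val useCount = mutableMapOf<String, Int>()

    fun recordUse(subOption: String) {
        useCount[subOption] = (useCount[subOption] ?: 0) + 1
    }

    // Sub-options to show in the interface for displaying the first image: only those whose
    // usage frequency is greater than or equal to the preset threshold, most used first.
    fun optionsToShow(): List<String> =
        useCount.filterValues { it >= frequencyThreshold }
            .entries
            .sortedByDescending { it.value }
            .map { it.key }
}
```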
According to the embodiment of the present invention, the image can be processed automatically according to the stored correspondences between image features and processing parameters, which reduces the time the user spends selecting processing-parameter options when processing an image manually. In addition, the options for the m processing parameters can be displayed in the interface for displaying the first image, so that, on top of the automatic processing, the user can manually select processing-parameter options to fine-tune the image, achieving refined and personalized image processing.
It should be noted that each of the image processing methods shown in the foregoing drawings is described by way of example with reference to one drawing of the embodiments of the present invention. In specific implementations, the image processing method shown in any drawing may also be implemented in combination with any other drawing illustrated in the foregoing embodiments, and details are not repeated here.
As shown in fig. 6, an embodiment of the present invention provides a terminal device 130, where the terminal device 130 includes an obtaining module 131, a processing module 132, and a display module 133.
The obtaining module 131 is configured to obtain a first feature of the first image to be processed, where the first feature includes at least one of a content feature, a size feature, a brightness feature, and a color feature.
The processing module 132 is configured to, when the first feature acquired by the acquiring module 131 matches the second feature stored in the terminal device, process the first feature of the first image according to a first processing parameter corresponding to the second feature.
The display module 133 is configured to display the first image processed by the processing module 132.
Optionally, in conjunction with fig. 6, as shown in fig. 7, the terminal device 130 further includes a receiving module 134. The receiving module 134 is configured to receive a first input from a user, the first input being used to select the first image; the acquiring module 131 is specifically configured to acquire the first feature of the first image in response to the first input received by the receiving module 134.
Optionally, the receiving module 134 is further configured to receive a second input from the user before receiving the first input from the user, where the second input is used to select the second image to be processed. A display module 133 further for displaying a second image in response to the second input received by the receiving module 134, the second image comprising the second feature; the receiving module 134 is further configured to receive a third input of the second image displayed by the display module from the user; the processing module 132 is further configured to process the second feature according to the first processing parameter corresponding to the third input in response to the third input received by the receiving module 134, and store a first correspondence, where the first correspondence is a correspondence between the first processing parameter and the second feature.
Optionally, the processing module 132 is further configured to store a second correspondence, where the second correspondence is a correspondence between a second processing parameter and a second feature, before the receiving module 134 receives a second input of the user; the processing module 132 is specifically configured to update the saved second correspondence to the first correspondence.
Optionally, the interface for displaying the first image includes options for m processing parameters, and each processing-parameter option includes at least one sub-processing-parameter option; the sub-processing-parameter options included in a processing-parameter option are those whose frequency of use by the user is greater than or equal to a preset frequency threshold, and m is an integer greater than or equal to 1.
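As a structural illustration only, the modules described above could be wired together roughly as follows; the interface names, the imageId handle and the callback style are assumptions for this sketch and do not reflect any API defined by the patent.

```kotlin
// Rough sketch of the module composition described above; all names are illustrative and
// the Long imageId handle is an assumption, not something defined by the patent.
interface AcquiringModule { fun acquireFirstFeature(imageId: Long): Pair<Int, Int>? }
interface ProcessingModule { fun process(imageId: Long, firstFeature: Pair<Int, Int>) }
interface DisplayModule { fun display(imageId: Long) }
interface ReceivingModule { fun onFirstInput(handler: (selectedImageId: Long) -> Unit) }

class TerminalDevice(
    private val acquiring: AcquiringModule,
    private val processing: ProcessingModule,
    private val display: DisplayModule,
    receiving: ReceivingModule
) {
    init {
        // The first input selects the first image; the device then acquires its first feature,
        // processes it when it matches a stored second feature, and displays the result.
        receiving.onFirstInput { imageId ->
            acquiring.acquireFirstFeature(imageId)?.let { processing.process(imageId, it) }
            display.display(imageId)
        }
    }
}
```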
The terminal device provided by the embodiment of the present invention can implement each process shown in the foregoing method embodiment, and in order to avoid repetition, details are not repeated here.
The embodiment of the invention provides a terminal device that can acquire a first feature of a first image to be processed, where the first feature includes at least one of a content feature, a size feature, a brightness feature, and a color feature; when the first feature matches a second feature pre-stored in the terminal device, process the first feature of the first image according to a first processing parameter corresponding to the second feature; and display the processed first image. With this scheme, once the terminal device acquires the first feature of the first image and determines that it matches a second feature pre-stored in the terminal device, it automatically processes the first feature according to the corresponding first processing parameter, so the image can be processed without the user having to trigger the terminal repeatedly, which improves image processing efficiency.
Fig. 8 is a hardware schematic of a terminal device implementing various embodiments of the present invention. The terminal device 100 includes, but is not limited to: a radio frequency unit 101, a network module 102, an audio output unit 103, an input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110, and a power supply 111. It will be appreciated by those skilled in the art that the terminal device structure shown in fig. 8 does not constitute a limitation of the terminal device, and the terminal device may comprise more or fewer components than shown, combine certain components, or have a different arrangement of components. In the embodiment of the invention, the terminal device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted terminal device, a wearable device, a pedometer, and the like.
Wherein the processor 110 is configured to obtain a first feature of a first image to be processed, where the first feature includes at least one of a content feature, a size feature, a brightness feature, and a color feature; processing the first feature of the first image according to a first processing parameter corresponding to a second feature when the first feature accords with the second feature pre-stored in the terminal equipment; the control display unit 106 displays the processed first image.
The embodiment of the invention provides a terminal device that can acquire a first feature of a first image to be processed, where the first feature includes at least one of a content feature, a size feature, a brightness feature, and a color feature; when the first feature matches a second feature pre-stored in the terminal device, process the first feature of the first image according to a first processing parameter corresponding to the second feature; and display the processed first image. With this scheme, once the terminal device acquires the first feature of the first image and determines that it matches a second feature pre-stored in the terminal device, it automatically processes the first feature according to the corresponding first processing parameter, so the image can be processed without the user having to trigger the terminal repeatedly, which improves image processing efficiency.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 101 may be configured to receive and send information or signals during a call, specifically, receive downlink data from a base station, and then process the received downlink data with the processor 110; and, the uplink data is transmitted to the base station. Typically, the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 may also communicate with networks and other devices through a wireless communication system.
The terminal device provides wireless broadband internet access to the user through the network module 102, such as helping the user to send and receive e-mail, browse web pages, access streaming media, etc.
The audio output unit 103 may convert audio data received by the radio frequency unit 101 or the network module 102 or stored in the memory 109 into an audio signal and output as sound. Also, the audio output unit 103 may also provide audio output (e.g., a call signal reception sound, a message reception sound, etc.) related to a specific function performed by the terminal device 100. The audio output unit 103 includes a speaker, a buzzer, a receiver, and the like.
The input unit 104 is used for receiving an audio or video signal. The input unit 104 may include a graphics processor (Graphics Processing Unit, GPU) 1041 and a microphone 1042. The graphics processor 1041 processes image data of a still image or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 106. The image frames processed by the graphics processor 1041 may be stored in the memory 109 (or other storage medium) or transmitted via the radio frequency unit 101 or the network module 102. The microphone 1042 may receive sound and can process such sound into audio data. In the case of a telephone call mode, the processed audio data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 101 and then output.
The terminal device 100 further comprises at least one sensor 105, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor and a proximity sensor: the ambient light sensor can adjust the brightness of the display panel 1061 according to the brightness of ambient light, and the proximity sensor can turn off the display panel 1061 and/or the backlight when the terminal device 100 moves close to the ear. As one kind of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in all directions (generally three axes), can detect the magnitude and direction of gravity when stationary, and can be used for recognizing the attitude of the terminal device (such as switching between landscape and portrait screens, related games, and magnetometer attitude calibration), vibration-recognition related functions (such as a pedometer and tapping), and the like; the sensor 105 may further include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, etc., which are not described herein.
The display unit 106 is used to display information input by a user or information provided to the user. The display unit 106 may include a display panel 1061, and the display panel 1061 may be configured in the form of a liquid crystal display (Liquid Crystal Display, LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 107 is operable to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the terminal device. Specifically, the user input unit 107 includes a touch panel 1071 and other input devices 1072. The touch panel 1071, also referred to as a touch screen, may collect touch operations thereon or thereabout by a user (e.g., operations of the user on the touch panel 1071 or thereabout using any suitable object or accessory such as a finger, stylus, etc.). The touch panel 1071 may include two parts of a touch detection device and a touch controller. The touch detection device detects the touch azimuth of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch detection device, converts the touch information into touch point coordinates, and sends the touch point coordinates to the processor 110, and receives and executes commands sent by the processor 110. Further, the touch panel 1071 may be implemented in various types such as resistive, capacitive, infrared, and surface acoustic wave. The user input unit 107 may include other input devices 1072 in addition to the touch panel 1071. In particular, other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein.
Further, the touch panel 1071 may be overlaid on the display panel 1061, and when the touch panel 1071 detects a touch operation thereon or nearby, the touch operation is transmitted to the processor 110 to determine the type of touch event, and then the processor 110 provides a corresponding visual output on the display panel 1061 according to the type of touch event. Although in fig. 8, the touch panel 1071 and the display panel 1061 are two independent components for implementing the input and output functions of the terminal device, in some embodiments, the touch panel 1071 may be integrated with the display panel 1061 to implement the input and output functions of the terminal device, which is not limited herein.
The interface unit 108 is an interface to which an external device is connected to the terminal apparatus 100. For example, the external devices may include a wired or wireless headset port, an external power (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the terminal apparatus 100 or may be used to transmit data between the terminal apparatus 100 and an external device.
Memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a storage program area that may store an operating system, application programs required for at least one function (such as a sound playing function, an image playing function, etc.), and a storage data area; the storage data area may store data (such as audio data, phonebook, etc.) created according to the use of the handset, etc. In addition, memory 109 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid-state storage device.
The processor 110 is a control center of the terminal device, connects respective parts of the entire terminal device using various interfaces and lines, and performs various functions of the terminal device and processes data by running or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, thereby performing overall monitoring of the terminal device. Processor 110 may include one or more processing units; alternatively, the processor 110 may integrate an application processor that primarily handles operating systems, user interfaces, applications, etc., with a modem processor that primarily handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 110.
The terminal device 100 may further include a power source 111 (e.g., a battery) for supplying power to the respective components, and optionally, the power source 111 may be logically connected to the processor 110 through a power management system, so as to implement functions of managing charging, discharging, and power consumption management through the power management system. In addition, the terminal device 100 includes some functional modules, which are not shown, and will not be described herein.
The embodiment of the invention also provides a terminal device, which may include a processor, a memory, and a computer program stored in the memory and executable on the processor. When executed by the processor, the computer program implements each process performed by the terminal device in the foregoing method embodiments and can achieve the same technical effects; to avoid repetition, details are not described here again.
The embodiment of the invention further provides a computer-readable storage medium storing a computer program. When executed by a processor, the computer program implements each process performed by the terminal device in the foregoing method embodiments and can achieve the same technical effects; to avoid repetition, details are not described here again. The computer-readable storage medium may be a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), a magnetic disk, an optical disk, or the like.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
From the above description of the embodiments, it will be clear to those skilled in the art that the methods of the above embodiments may be implemented by means of software plus a necessary general hardware platform, and of course may also be implemented by hardware, but in many cases the former is the preferred implementation. Based on such understanding, the technical solution of the present invention, or the part contributing to the prior art, may essentially be embodied in the form of a software product stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) and comprising instructions for causing a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to perform the methods according to the embodiments of the present invention.
The embodiments of the present invention have been described above with reference to the accompanying drawings, but the present invention is not limited to the above-described embodiments, which are merely illustrative rather than restrictive. In light of the present invention, those of ordinary skill in the art may devise many other forms without departing from the spirit of the present invention and the scope of the claims, all of which fall within the protection of the present invention.

Claims (7)

1. An image processing method applied to a terminal device, the method comprising:
acquiring a first characteristic of a first image to be processed, wherein the first characteristic comprises at least one of a content characteristic, a size characteristic, a brightness characteristic and a color characteristic;
processing the first feature of the first image according to a first processing parameter corresponding to a second feature when the first feature accords with the second feature stored in the terminal equipment;
displaying the processed first image;
before the acquiring the first feature of the first image to be processed, the method further includes:
receiving a second input from a user, the second input being used to select a second image to be processed;
in response to the second input, displaying a second image, the second image including the second feature;
receiving a third input of a user to the second image;
responding to the third input, processing the second feature according to the first processing parameter corresponding to the third input, and storing a first corresponding relation between the first processing parameter and the second feature;
wherein the interface for displaying the first image comprises m options of processing parameters, and each option of the processing parameters comprises at least one option of sub-processing parameters; the option of at least one sub-process parameter in the options of one process parameter is at least one sub-option with highest user use frequency in the options of all sub-process parameters in the options of one process parameter, and m is an integer greater than or equal to 1.
2. The method of claim 1, wherein prior to the acquiring the first characteristic of the first image to be processed, the method further comprises:
receiving a first input from a user, the first input for selecting the first image;
the acquiring a first feature of a first image to be processed includes:
in response to the first input, a first feature of a first image to be processed is acquired.
3. The method of claim 1, wherein prior to the receiving the second input from the user, the method further comprises:
storing a second corresponding relation, wherein the second corresponding relation is a corresponding relation between a second processing parameter and the second feature;
the storing the first correspondence includes:
updating the stored second corresponding relation to the first corresponding relation.
4. A terminal device, comprising: the device comprises an acquisition module, a processing module and a display module;
the acquisition module is used for acquiring first characteristics of a first image to be processed, wherein the first characteristics comprise at least one of content characteristics, size characteristics, brightness characteristics and color characteristics;
the processing module is configured to process, when the first feature acquired by the acquiring module meets a second feature stored in the terminal device, the first feature of the first image according to a first processing parameter corresponding to the second feature;
the display module is used for displaying the first image processed by the processing module;
the receiving module is further configured to receive a second input of a user before the obtaining module obtains the first feature of the first image to be processed, where the second input is used to select the second image to be processed;
the display module is further configured to display a second image in response to the second input received by the receiving module, the second image including the second feature;
the receiving module is further used for receiving a third input of a second image displayed by the display module by a user;
the processing module is further configured to process the second feature according to the first processing parameter corresponding to the third input in response to the third input received by the receiving module, and store a first correspondence, where the first correspondence is a correspondence between the first processing parameter and the second feature;
wherein the interface for displaying the first image comprises m options of processing parameters, and each option of the processing parameters comprises at least one option of sub-processing parameters; the option of at least one sub-process parameter in the options of one process parameter is at least one sub-option with highest user use frequency in the options of all sub-process parameters in the options of one process parameter, and m is an integer greater than or equal to 1.
5. The terminal device of claim 4, wherein the terminal device further comprises a receiving module;
the receiving module is used for receiving a first input of a user, wherein the first input is used for selecting the first image;
the acquisition module is specifically configured to acquire the first feature of the first image in response to the first input received by the receiving module.
6. The terminal device of claim 4, wherein the terminal device,
the processing module is further configured to store a second correspondence, where the second correspondence is a correspondence between a second processing parameter and the second feature, before the receiving module receives a second input from the user;
the processing module is specifically configured to update the stored second correspondence to the first correspondence.
7. A terminal device comprising a processor, a memory and a computer program stored on the memory and executable on the processor, which when executed by the processor implements the steps of the image processing method according to any one of claims 1 to 3.
CN201811270121.2A 2018-10-29 2018-10-29 Image processing method and terminal equipment Active CN109242768B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811270121.2A CN109242768B (en) 2018-10-29 2018-10-29 Image processing method and terminal equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811270121.2A CN109242768B (en) 2018-10-29 2018-10-29 Image processing method and terminal equipment

Publications (2)

Publication Number Publication Date
CN109242768A CN109242768A (en) 2019-01-18
CN109242768B true CN109242768B (en) 2023-09-22

Family

ID=65079084

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811270121.2A Active CN109242768B (en) 2018-10-29 2018-10-29 Image processing method and terminal equipment

Country Status (1)

Country Link
CN (1) CN109242768B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111930979A (en) * 2020-07-29 2020-11-13 广州华多网络科技有限公司 Image processing method, device, equipment and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103413268A (en) * 2013-08-06 2013-11-27 厦门美图移动科技有限公司 Photographing method capable of automatically optimizing facial form
CN103632165A (en) * 2013-11-28 2014-03-12 小米科技有限责任公司 Picture processing method, device and terminal equipment
CN106657793A (en) * 2017-01-11 2017-05-10 维沃移动通信有限公司 Image processing method and mobile terminal
CN107704292A (en) * 2017-10-18 2018-02-16 维沃移动通信有限公司 Picture method to set up and mobile terminal in a kind of application program
WO2018045961A1 (en) * 2016-09-06 2018-03-15 努比亚技术有限公司 Image processing method, and terminal and storage medium
CN107835367A (en) * 2017-11-14 2018-03-23 维沃移动通信有限公司 A kind of image processing method, device and mobile terminal
CN108616682A (en) * 2018-05-18 2018-10-02 维沃移动通信有限公司 A kind of camera head protecting method and terminal

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103413268A (en) * 2013-08-06 2013-11-27 厦门美图移动科技有限公司 Photographing method capable of automatically optimizing facial form
CN103632165A (en) * 2013-11-28 2014-03-12 小米科技有限责任公司 Picture processing method, device and terminal equipment
WO2018045961A1 (en) * 2016-09-06 2018-03-15 努比亚技术有限公司 Image processing method, and terminal and storage medium
CN106657793A (en) * 2017-01-11 2017-05-10 维沃移动通信有限公司 Image processing method and mobile terminal
CN107704292A (en) * 2017-10-18 2018-02-16 维沃移动通信有限公司 Picture method to set up and mobile terminal in a kind of application program
CN107835367A (en) * 2017-11-14 2018-03-23 维沃移动通信有限公司 A kind of image processing method, device and mobile terminal
CN108616682A (en) * 2018-05-18 2018-10-02 维沃移动通信有限公司 A kind of camera head protecting method and terminal

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Analysis of ARM-based embedded digital image processing methods; 赵云鹏 (Zhao Yunpeng); 《同行》; 2016-02-23 (No. 04); full text *

Also Published As

Publication number Publication date
CN109242768A (en) 2019-01-18

Similar Documents

Publication Publication Date Title
US20220276909A1 (en) Screen projection control method and electronic device
CN108513070B (en) Image processing method, mobile terminal and computer readable storage medium
CN110062105B (en) Interface display method and terminal equipment
CN107886321B (en) Payment method and mobile terminal
CN109085968B (en) Screen capturing method and terminal equipment
CN108898555B (en) Image processing method and terminal equipment
CN109710349B (en) Screen capturing method and mobile terminal
CN110750189B (en) Icon display method and device
CN107749046B (en) Image processing method and mobile terminal
CN111562896B (en) Screen projection method and electronic equipment
CN107734172B (en) Information display method and mobile terminal
CN109816759B (en) Expression generating method and device
CN111061450B (en) Parameter adjusting method and electronic equipment
CN110928407B (en) Information display method and device
CN110536005B (en) Object display adjustment method and terminal
CN110096203B (en) Screenshot method and mobile terminal
CN110012151B (en) Information display method and terminal equipment
CN109669656B (en) Information display method and terminal equipment
CN109343811B (en) Display adjustment method and terminal equipment
CN107729100B (en) Interface display control method and mobile terminal
CN110740265B (en) Image processing method and terminal equipment
CN110780751B (en) Information processing method and electronic equipment
CN110851098B (en) Video window display method and electronic equipment
CN109819331B (en) Video call method, device and mobile terminal
CN109491631B (en) Display control method and terminal

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant