CN107864333B - Image processing method, device, terminal and storage medium
- Publication number: CN107864333B
- Application number: CN201711090044.8A
- Authority: CN (China)
- Legal status: Expired - Fee Related
Classifications
- H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80: Camera processing pipelines; Components thereof
- H04N23/60: Control of cameras or camera modules
- H04N23/667: Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
Abstract
The application discloses an image processing method, an image processing device, a terminal and a storage medium, belonging to the technical field of terminals. Applied to a terminal, the method includes: when the terminal is in a shooting state, acquiring image information of an image being shot; determining an image processing mode of the image according to the image information; and processing the image according to the image processing mode. With the method and device, the terminal can determine the image processing mode of an image from the image information of the image being shot and then process the image automatically, so frequent manual operation by the user is not needed, which makes the terminal convenient to use.
Description
Technical Field
The present application relates to the field of terminal technologies, and in particular, to an image processing method, an image processing apparatus, a terminal, and a storage medium.
Background
With the rapid development of terminal technology, terminals such as mobile phones and tablet computers have become increasingly powerful and are gradually becoming indispensable tools in people's work and life. A camera is usually installed in a terminal to implement a shooting function, and to improve the aesthetic quality of shot images, the terminal often processes a shot image to achieve the beautification effect the user desires.
In the related art, when a user shoots with a terminal, the user can turn on the terminal's image beautification function if the shot image is to be beautified, and the terminal then processes the shot image; if the user does not want the shot image beautified, the image beautification function can be turned off, and the terminal leaves the shot image unprocessed.
Disclosure of Invention
The embodiments of the present invention provide an image processing method, an image processing apparatus, a terminal and a storage medium, which solve the problem that a terminal in the related art cannot automatically process a shot image. The technical solutions are as follows:
according to a first aspect of the embodiments of the present invention, there is provided an image processing method applied to a terminal, the method including:
when the terminal is in a shooting state, acquiring image information of an image being shot;
determining an image processing mode of the image according to the image information;
and processing the image according to the image processing mode.
Optionally, the determining an image processing mode of the image according to the image information includes:
acquiring an image processing mode corresponding to the image information from a stored correspondence between image information and image processing modes;
and determining an image processing mode corresponding to the image information as the image processing mode of the image.
Optionally, the determining an image processing mode of the image according to the image information includes:
determining, through a specified classifier, a category to which the image information belongs;
and determining the image processing mode of the image according to the category to which the image information belongs.
Optionally, before determining the category to which the image information belongs by the specified classifier, the method further includes:
acquiring a plurality of preset image information sets, wherein at least one preset image information included in each preset image information set in the plurality of preset image information sets belongs to the same category;
and training the classifier to be trained by using the plurality of preset image information sets to obtain the specified classifier.
Optionally, the method further comprises:
acquiring image information and an image processing mode of each image in a plurality of images obtained by shooting;
determining the category to which the image information of each image belongs according to the image processing mode of each image;
and updating the specified classifier according to the image information of each image in the plurality of images and the category to which the image information of each image belongs.
According to a second aspect of the embodiments of the present invention, there is provided an image processing apparatus applied to a terminal, the apparatus including:
a first acquisition module, used for acquiring image information of an image being shot when the terminal is in a shooting state;
the first determining module is used for determining the image processing mode of the image according to the image information;
and the processing module is used for processing the image according to the image processing mode.
Optionally, the first determining module includes:
the first obtaining submodule is used for obtaining an image processing mode corresponding to the image information from a stored correspondence between image information and image processing modes;
and the first determining submodule is used for determining an image processing mode corresponding to the image information as the image processing mode of the image.
Optionally, the first determining module includes:
a second determining submodule, used for determining, through a specified classifier, the category to which the image information belongs;
and the third determining submodule is used for determining the image processing mode of the image according to the category to which the image information belongs.
Optionally, the first determining module further includes:
the second obtaining submodule is used for obtaining a plurality of preset image information sets, and at least one piece of preset image information included in each preset image information set in the plurality of preset image information sets belongs to the same category;
and the training submodule is used for training the classifier to be trained by using the plurality of preset image information sets to obtain the specified classifier.
Optionally, the apparatus further comprises:
the second acquisition module is used for acquiring the image information and the image processing mode of each image in the plurality of images obtained by shooting;
the second determining module is used for determining the category of the image information of each image according to the image processing mode of each image;
and the updating module is used for updating the specified classifier according to the image information of each image in the plurality of images and the category to which the image information of each image belongs.
According to a third aspect of embodiments of the present invention, there is provided a terminal comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor being configured to perform the steps of the method of the first aspect.
According to a fourth aspect of embodiments of the present invention, there is provided a computer-readable storage medium having stored therein instructions which, when run on a computer, cause the computer to perform the method of the first aspect described above.
The technical scheme provided by the embodiment of the invention has the following beneficial effects:
In the embodiment of the invention, when the terminal is in a shooting state, this indicates that the user is shooting an image with the terminal. At this point the image information of the image being shot can be acquired, the image processing mode of the image can be determined according to that image information, and the terminal can then automatically process the image according to the determined mode. Frequent manual operation by the user is thus avoided, which makes the terminal convenient to use, improves the flexibility of image processing, and meets the user's image processing needs.
Drawings
To illustrate the technical solutions in the embodiments of the present invention more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a flowchart of an image processing method according to an embodiment of the present invention;
Fig. 2 is a flowchart of another image processing method according to an embodiment of the present invention;
Fig. 3A is a schematic structural diagram of an image processing apparatus according to an embodiment of the present invention;
Fig. 3B is a first schematic structural diagram of a first determining module according to an embodiment of the present invention;
Fig. 3C is a second schematic structural diagram of a first determining module according to an embodiment of the present invention;
Fig. 3D is a third schematic structural diagram of a first determining module according to an embodiment of the present invention;
Fig. 3E is a schematic structural diagram of another image processing apparatus according to an embodiment of the present invention;
Fig. 4A is a schematic structural diagram of a terminal according to an embodiment of the present invention;
Fig. 4B is a schematic structural diagram of another terminal according to an embodiment of the present invention;
Fig. 4C is a schematic structural diagram of a full screen according to an embodiment of the present invention;
Fig. 4D is a schematic structural diagram of a curved screen according to an embodiment of the present invention;
Fig. 4E is a schematic structural diagram of a first special-shaped screen according to an embodiment of the present invention;
Fig. 4F is a schematic structural diagram of a second special-shaped screen according to an embodiment of the present invention;
Fig. 4G is a schematic structural diagram of a third special-shaped screen according to an embodiment of the present invention;
Fig. 4H is a schematic structural diagram of a fourth special-shaped screen according to an embodiment of the present invention.
Detailed Description
To make the objects, technical solutions, and advantages of the present application clearer, the embodiments of the present application are described in further detail below with reference to the accompanying drawings.
Before explaining the embodiments of the present invention in detail, an application scenario related to the embodiments of the present invention will be described.
In daily life, people often use a terminal as a shooting tool, and to improve the aesthetic quality of shot images, the terminal often processes a shot image to achieve the beautification effect the user desires. At present, the user controls the terminal's processing of shot images by manually turning its image beautification function on or off, which makes the image processing process cumbersome and inconvenient. To this end, embodiments of the present invention provide an image processing method that can automatically process an image according to the image information of the image being shot, thereby improving the flexibility of image processing.
Next, an image processing method according to an embodiment of the present invention will be described in detail with reference to the drawings.
Fig. 1 is a flowchart of an image processing method according to an embodiment of the present invention, which is applied to a terminal. Referring to fig. 1, the method comprises the steps of:
step 101: when the terminal is in a photographing state, image information of an image being photographed is acquired.
Step 102: an image processing mode of the image is determined based on the image information.
Step 103: the image is processed according to an image processing mode.
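Taken together, the three steps form a simple pipeline. The following is a minimal sketch, illustrative only; the names ProcessingMode, ImageInfo, ModeResolver, and ShootingPipeline are assumptions of this description, not types defined by the patent or by any camera SDK.

```java
/** A minimal sketch of steps 101-103; every name here is hypothetical. */
enum ProcessingMode { PORTRAIT_BEAUTIFY, LANDSCAPE_BEAUTIFY, NONE }

/** Step 101: image information, reduced to two illustrative fields. */
record ImageInfo(String subject, String shootingMode) {}

/** Step 102: any strategy that maps image information to a processing mode. */
interface ModeResolver {
    ProcessingMode resolve(ImageInfo info);
}

final class ShootingPipeline {
    private final ModeResolver resolver;

    ShootingPipeline(ModeResolver resolver) {
        this.resolver = resolver;
    }

    /** Runs on the image being shot while the terminal is in the shooting state. */
    byte[] process(byte[] imageBytes, ImageInfo info) {
        ProcessingMode mode = resolver.resolve(info); // step 102
        return applyMode(imageBytes, mode);           // step 103
    }

    private byte[] applyMode(byte[] imageBytes, ProcessingMode mode) {
        // Placeholder: a real terminal would invoke its beautification
        // algorithms here; NONE leaves the image unchanged.
        return imageBytes;
    }
}
```

The two implementations of step 102 described below, a stored correspondence table and a specified classifier, are then simply two different ModeResolver strategies.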
In the embodiment of the invention, when the terminal is in a shooting state, this indicates that the user is shooting an image with the terminal. At this point the image information of the image being shot can be acquired, the image processing mode of the image can be determined according to that image information, and the terminal can then automatically process the image according to the determined mode. Frequent manual operation by the user is thus avoided, which makes the terminal convenient to use, improves the flexibility of image processing, and meets the user's image processing needs.
Optionally, determining an image processing mode of the image according to the image information includes:
acquiring an image processing mode corresponding to the image information from a stored correspondence between image information and image processing modes;
and determining the image processing mode corresponding to the image information as the image processing mode of the image.
Optionally, determining an image processing mode of the image according to the image information includes:
determining, through a specified classifier, a category to which the image information belongs;
and determining the image processing mode of the image according to the category to which the image information belongs.
Optionally, before the category to which the image information belongs is determined through the specified classifier, the method further includes:
acquiring a plurality of preset image information sets, wherein at least one preset image information included in each preset image information set in the plurality of preset image information sets belongs to the same category;
and training the classifier to be trained by using a plurality of preset image information sets to obtain the specified classifier.
Optionally, the method further comprises:
acquiring image information and an image processing mode of each image in a plurality of images obtained by shooting;
determining the category to which the image information of each image belongs according to the image processing mode of each image;
the specified classifier is updated according to the image information of each of the plurality of images and the category to which the image information of each image belongs.
All of the above optional technical solutions can be combined in any manner to form optional embodiments of the present application; these are not described here in detail.
Fig. 2 is a flowchart of an image processing method according to an embodiment of the present invention. The embodiment shown in Fig. 1 is discussed in more detail below in conjunction with Fig. 2. Referring to Fig. 2, the method comprises the following steps:
step 201: when the terminal is in a photographing state, image information of an image being photographed is acquired.
It should be noted that the terminal being in the shooting state refers to a state in which a camera provided in the terminal is turned on; in practical application, the image being shot is the image displayed in the viewfinder of the terminal.
In addition, for a given image, the image information may include various information related to the image, such as its shooting data and image data. The shooting data may include the shooting location, shooting device, shooting time, shooting mode, and the like; the image data may include the shooting subject of the image and the pixel values and pixel positions of its pixel points. For example, the image information of the image may be its EXIF (Exchangeable Image File Format) information.
Further, the shooting mode of an image may include a self-timer mode, a non-self-timer mode, a portrait mode, a landscape mode, and the like, each with different shooting parameters. For example, the self-timer mode may be the mode used when shooting with the front camera, the non-self-timer mode the mode used when shooting with the rear camera, the portrait mode the mode used when shooting with a large aperture, and the landscape mode the mode used when shooting with a small aperture.
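As a data structure, the image information described above could be sketched as follows. All type and field names are illustrative assumptions; a real terminal would typically populate the shooting data from EXIF tags and the camera session state.

```java
import java.time.Instant;

/** Illustrative shape of the image information described above. */
record ShootingData(
        String location,      // shooting location
        String device,        // shooting device
        Instant shotAt,       // shooting time
        String shootingMode   // self-timer, non-self-timer, portrait, landscape, ...
) {}

record ImageData(
        String subject,       // shooting subject, e.g. "portrait" or "landscape"
        int[] pixelValues,    // pixel values of the image's pixel points
        int width,            // with height, fixes each pixel's position
        int height
) {}

record FullImageInfo(ShootingData shootingData, ImageData imageData) {}
```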
When the terminal is in a shooting state, the user is shooting an image with the terminal, and the image information of the image being shot needs to be acquired so that the image can subsequently be processed automatically. For example, when the terminal is in a shooting state, the acquired image information of the image being shot may be: shooting subject: portrait; shooting mode: self-timer mode.
Step 202: and determining the image processing mode of the image according to the image information of the image.
The image processing mode of an image is the mode used when the image is processed, and may include portrait beautification, landscape beautification, no beautification, and other modes.
Specifically, step 202 may include the following two possible implementations.
A first possible implementation: acquiring the image processing mode corresponding to the image information of the image from a stored correspondence between image information and image processing modes, and determining the acquired image processing mode as the image processing mode of the image.
It should be noted that the terminal may store the correspondence between image information and image processing modes in advance and then determine the image processing mode of the image directly from this correspondence; the determination process is simple and fast.
For example, suppose the image information of the image includes image data and shooting data, where the image data is the shooting subject (here, a portrait) and the shooting data is the shooting mode (here, the self-timer mode). The image processing mode corresponding to this image information can then be acquired from the correspondence between image information and image processing modes shown in Table 1 below, namely portrait beautification, and portrait beautification is determined as the image processing mode of the image.
TABLE 1

| Image information | Image processing mode |
| --- | --- |
| Portrait, self-timer mode | Portrait beautification |
| Portrait, non-self-timer mode | No beautification |
| Landscape, self-timer mode | No beautification |
| Landscape, non-self-timer mode | Landscape beautification |
It should be noted that, in the embodiments of the present invention, only the correspondence between the image information and the image processing mode shown in table 1 is taken as an example for description, and table 1 does not limit the embodiments of the present invention.
Further, before the image processing mode corresponding to the image information of the image is acquired from the stored correspondence, the correspondence between image information and image processing modes may first be created. Specifically, when a setting instruction is detected, the image information and the image processing mode carried in the setting instruction may be acquired and stored correspondingly, yielding the correspondence between image information and image processing modes.
It should be noted that the setting instruction is used to create the correspondence between the image information and the image processing mode it carries. The setting instruction may be triggered by the user through a specified operation, which may be a single-click operation, a double-click operation, a voice operation, or the like.
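A minimal sketch of this first implementation: the stored correspondence is a map keyed by the relevant image information (here, the shooting subject plus shooting mode, as in Table 1), written when a setting instruction is detected and read during step 202. All names are hypothetical.

```java
import java.util.HashMap;
import java.util.Map;

/** Sketch of the stored correspondence of Table 1; all names are hypothetical. */
final class ModeCorrespondence {
    enum Mode { PORTRAIT_BEAUTIFY, LANDSCAPE_BEAUTIFY, NONE }

    private final Map<String, Mode> stored = new HashMap<>();

    private static String key(String subject, String shootingMode) {
        return subject + "/" + shootingMode;
    }

    /** Called when a setting instruction carrying (image information, mode) is detected. */
    void onSettingInstruction(String subject, String shootingMode, Mode mode) {
        stored.put(key(subject, shootingMode), mode);
    }

    /** First implementation of step 202: a direct lookup. */
    Mode resolve(String subject, String shootingMode) {
        return stored.getOrDefault(key(subject, shootingMode), Mode.NONE);
    }
}
```

For example, after calling onSettingInstruction("portrait", "self-timer", Mode.PORTRAIT_BEAUTIFY), a call to resolve("portrait", "self-timer") returns PORTRAIT_BEAUTIFY, matching the first row of Table 1.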
A second possible implementation: determining, through a specified classifier, the category to which the image information of the image belongs; and determining the image processing mode of the image according to that category.
It should be noted that the specified classifier is used to classify images. In practical application, after an image is input into the specified classifier, the classifier determines the category to which the image belongs from a plurality of preset categories and then outputs that category.
In addition, the plurality of preset categories may be preset, for example, the plurality of preset categories may include a portrait beautification category, a landscape beautification category, a non-beautification category, and the like, which is not limited in the embodiment of the present invention.
When determining the image processing mode of the image according to the category to which its image information belongs, the terminal may acquire the corresponding image processing mode from a stored correspondence between categories and image processing modes, and determine the acquired image processing mode as the image processing mode of the image.
For example, if the category to which the image information of the image belongs is the portrait beautification category, the corresponding image processing mode, portrait beautification, can be obtained from the correspondence between categories and image processing modes shown in Table 2 below, and portrait beautification is determined as the image processing mode of the image.
TABLE 2

| Category to which image information belongs | Image processing mode |
| --- | --- |
| Portrait beautification category | Portrait beautification |
| No beautification category | No beautification |
| Landscape beautification category | Landscape beautification |
It should be noted that the embodiment of the present invention is described by taking only the correspondence between the categories and the image processing modes shown in table 2 as an example, and table 2 is not intended to limit the embodiment of the present invention.
Further, before determining the category to which the image information of the image belongs by the designated classifier, the designated classifier may also be generated first. Specifically, a plurality of preset image information sets may be obtained first, and then the plurality of preset image information sets are used to train the classifier to be trained, so as to obtain the designated classifier.
It should be noted that the preset image information sets may be preset, and the preset image information sets may be stored in the terminal, or may be stored in other storage devices, and the terminal may obtain the preset image information sets from the other storage devices through a wired connection or a wireless connection.
In addition, at least one piece of preset image information included in each of the preset image information sets belongs to the same category. That is, each piece of preset image information carries a category identifier, and all of the preset image information within one set carries the same category identifier.
The classifier to be trained is then trained with the plurality of preset image information sets to obtain the specified classifier. The training may use supervised learning: given the classifier's inputs and expected outputs, a preset adjustment algorithm continuously adjusts the parameters of the classifier until it reaches the required performance, at which point it is the specified classifier.
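The patent fixes neither the classifier type nor the adjustment algorithm, so the following sketch substitutes a deliberately simple stand-in: a nearest-centroid classifier trained from the preset image information sets, with each piece of preset image information represented as a feature vector labeled by its category identifier. Any supervised model, for example a neural network trained by gradient descent, could take its place.

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

/** Illustrative stand-in for the specified classifier; not the patent's model. */
final class SpecifiedClassifier {
    private final Map<String, double[]> centroids = new HashMap<>();

    /** Trains from the preset sets: category identifier -> feature vectors. */
    static SpecifiedClassifier train(Map<String, List<double[]>> presetSets) {
        SpecifiedClassifier classifier = new SpecifiedClassifier();
        presetSets.forEach((category, vectors) -> {
            double[] mean = new double[vectors.get(0).length];
            for (double[] v : vectors) {
                for (int i = 0; i < mean.length; i++) {
                    mean[i] += v[i] / vectors.size();
                }
            }
            classifier.centroids.put(category, mean);
        });
        return classifier;
    }

    /** Second implementation of step 202: returns the nearest category. */
    String categoryOf(double[] features) {
        String best = null;
        double bestDistance = Double.POSITIVE_INFINITY;
        for (Map.Entry<String, double[]> entry : centroids.entrySet()) {
            double distance = 0;
            for (int i = 0; i < features.length; i++) {
                double d = features[i] - entry.getValue()[i];
                distance += d * d;
            }
            if (distance < bestDistance) {
                bestDistance = distance;
                best = entry.getKey();
            }
        }
        return best;
    }
}
```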
Further, the terminal can update the specified classifier according to the plurality of images obtained by shooting. Specifically, the terminal may acquire the image information and the image processing mode of each of the plurality of images; determine the category to which the image information of each image belongs according to the image processing mode of that image; and update the specified classifier according to the image information of each image and the category to which it belongs.
When the terminal determines the category to which the image information of each image belongs according to the image processing mode of each image, the terminal may acquire, for each of the plurality of images, a corresponding category from the stored correspondence between the image processing mode and the category according to the image processing mode of the image, and determine the acquired category as the category to which the image information of the image belongs.
For example, if the image processing mode of an image is portrait beautification, the corresponding category, the portrait beautification category, can be obtained from the correspondence between image processing modes and categories shown in Table 3 below, and the portrait beautification category is determined as the category to which the image information of the image belongs.
TABLE 3

| Image processing mode | Category |
| --- | --- |
| Portrait beautification | Portrait beautification category |
| No beautification | No beautification category |
| Landscape beautification | Landscape beautification category |
It should be noted that the embodiment of the present invention is described by taking the correspondence between image processing modes and categories shown in Table 3 only as an example; Table 3 does not limit the embodiment of the present invention.
The specific process of updating the specified classifier according to the image information of each of the plurality of images and the category to which each belongs is similar to the process, described in the second possible implementation of step 202, of training the classifier to be trained with the plurality of preset image information sets to obtain the specified classifier; it is not described again here.
It is to be noted that, in the embodiment of the present invention, the specified classifier can not only be obtained by training with the plurality of preset image information sets, but can also be updated according to the image information of images shot by the terminal. The specified classifier can therefore adapt flexibly to each user's photographing habits, giving it stronger pertinence and higher classification accuracy.
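A sketch of this update loop, reusing the illustrative SpecifiedClassifier above: each captured image becomes a (features, category) training pair by looking its processing mode up in a Table-3-style map, and the classifier is then retrained on the enlarged data. The CapturedImage record and the map contents are assumptions for illustration.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

/** A shot image paired with the mode the user actually applied to it. */
record CapturedImage(double[] features, String processingMode) {}

final class ClassifierUpdater {
    /** Table 3 as a map: image processing mode -> category. */
    private static final Map<String, String> MODE_TO_CATEGORY = Map.of(
            "portrait beautification", "portrait beautification category",
            "no beautification", "no beautification category",
            "landscape beautification", "landscape beautification category");

    /** Folds newly captured images into the training data and retrains. */
    static SpecifiedClassifier update(Map<String, List<double[]>> trainingSets,
                                      List<CapturedImage> captured) {
        for (CapturedImage image : captured) {
            String category = MODE_TO_CATEGORY.get(image.processingMode());
            trainingSets.computeIfAbsent(category, k -> new ArrayList<>())
                        .add(image.features());
        }
        return SpecifiedClassifier.train(trainingSets); // same training procedure
    }
}
```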
Step 203: the image is processed according to an image processing mode of the image.
Once the image processing mode of the image is determined, the terminal can automatically process the image according to it. For example, if the image processing mode of the image being shot is portrait beautification, the terminal automatically performs portrait beautification on the image being shot.
In the embodiment of the invention, when the terminal is in a shooting state, this indicates that the user is shooting an image with the terminal. At this point the image information of the image being shot can be acquired, the image processing mode of the image can be determined according to that image information, and the terminal can then automatically process the image according to the determined mode. Frequent manual operation by the user is thus avoided, which makes the terminal convenient to use, improves the flexibility of image processing, and meets the user's image processing needs.
Next, an image processing apparatus according to an embodiment of the present invention will be described.
Fig. 3A is a schematic structural diagram of an image processing apparatus according to an embodiment of the present invention. Referring to fig. 3A, the apparatus includes a first obtaining module 301, a first determining module 302, and a processing module 303.
A first obtaining module 301, configured to obtain image information of an image being photographed when the terminal is in a photographing state.
A first determining module 302, configured to determine an image processing mode of the image according to the image information.
A processing module 303, configured to process the image according to the image processing mode.
Optionally, referring to fig. 3B, the first determining module 302 includes:
a first obtaining sub-module 3021, configured to obtain an image processing mode corresponding to the image information from the correspondence between the stored image information and the image processing mode.
A first determining submodule 3022 configured to determine an image processing mode corresponding to the image information as an image processing mode of the image.
Optionally, referring to fig. 3C, the first determining module 302 includes:
a second determining sub-module 3023 for determining the category to which the image information belongs by specifying the classifier.
A third determining submodule 3024 configured to determine an image processing mode of the image according to the category to which the image information belongs.
Optionally, referring to fig. 3D, the first determining module 302 further includes:
the second obtaining sub-module 3025 is configured to obtain a plurality of preset image information sets, where at least one preset image information included in each of the preset image information sets belongs to the same category.
The training submodule 3026 is configured to train a classifier to be trained by using the plurality of preset image information sets, so as to obtain the specified classifier.
Optionally, referring to fig. 3E, the apparatus further comprises:
a second obtaining module 304, configured to obtain image information and an image processing mode of each of the plurality of captured images.
A second determining module 305, configured to determine a category to which the image information of each image belongs according to the image processing mode of each image.
An updating module 306, configured to update the specified classifier according to the image information of each image in the plurality of images and the category to which the image information of each image belongs.
In the embodiment of the invention, when the terminal is in a shooting state, this indicates that the user is shooting an image with the terminal. At this point the image information of the image being shot can be acquired, the image processing mode of the image can be determined according to that image information, and the terminal can then automatically process the image according to the determined mode. Frequent manual operation by the user is thus avoided, which makes the terminal convenient to use, improves the flexibility of image processing, and meets the user's image processing needs.
It should be noted that: in the image processing apparatus provided in the above embodiment, when processing an image, only the division of the above functional modules is taken as an example, and in practical applications, the above function distribution may be completed by different functional modules according to needs, that is, the internal structure of the apparatus is divided into different functional modules to complete all or part of the above described functions. In addition, the image processing apparatus and the image processing method provided by the above embodiments belong to the same concept, and specific implementation processes thereof are described in the method embodiments and are not described herein again.
Referring to fig. 4A and 4B, a schematic structural diagram of the terminal 100 according to an exemplary embodiment of the present application is shown. The terminal 100 may be a mobile phone, a tablet computer, a notebook computer, etc. The terminal 100 in the present application may include one or more of the following components: a processor 110, a memory 120, and a touch display screen 130.
The memory 120 may include a Random Access Memory (RAM) or a Read-Only Memory (ROM). Optionally, the memory 120 includes a non-transitory computer-readable medium. The memory 120 may be used to store instructions, programs, code, code sets, or instruction sets. The memory 120 may include a program storage area and a data storage area, where the program storage area may store instructions for implementing an operating system, instructions for at least one function (such as a touch function, a sound playing function, or an image playing function), instructions for implementing the various method embodiments described below, and the like; the data storage area may store data created according to the use of the terminal 100 (such as audio data or a phone book), and the like.
Taking the Android operating system as an example, the programs and data stored in the memory 120 are shown in Fig. 4A: the memory 120 stores a Linux kernel layer 220, a system runtime library layer 240, an application framework layer 260, and an application layer 280. The Linux kernel layer 220 provides underlying drivers for the various hardware of the terminal 100, such as a display driver, an audio driver, a camera driver, a Bluetooth driver, a Wi-Fi driver, and power management. The system runtime library layer 240 provides the main feature support for the Android system through a number of C/C++ libraries; for example, the SQLite library provides database support, the OpenGL/ES library provides 3D drawing support, and the WebKit library provides browser kernel support. The system runtime library layer 240 also provides the Android Runtime, which mainly supplies some core libraries that allow developers to write Android applications in the Java language. The application framework layer 260 provides various APIs that may be used in building applications, such as activity management, window management, view management, notification management, content providers, package management, session management, resource management, and location management; developers can build their own applications using these APIs. At least one application program runs in the application layer 280; these may be programs of the operating system itself, such as a contacts program, a short message program, a clock program, or a camera application, or applications developed by third-party developers, such as an instant messaging program or a photo beautification program.
Taking the iOS operating system as an example, the programs and data stored in the memory 120 are shown in Fig. 4B. The iOS system includes: a Core OS Layer 320, a Core Services Layer 340, a Media Layer 360, and a Cocoa Touch Layer 380. The Core OS Layer 320 includes the operating system kernel, drivers, and underlying program frameworks, which provide functionality closer to the hardware for use by the program frameworks in the Core Services Layer 340. The Core Services Layer 340 provides the system services and/or program frameworks that applications need, such as a Foundation framework, an account framework, an advertising framework, a data storage framework, a network connection framework, a geographic location framework, and a motion framework. The Media Layer 360 provides audio-visual interfaces for applications, such as graphics-related, audio-related, and video-related interfaces, as well as the AirPlay interface for audio/video transmission. The Cocoa Touch Layer 380 provides various common interface-related frameworks for application development and handles the user's touch interaction operations on the terminal 100; examples include a local notification service, a remote push service, an advertising framework, a game tool framework, a message user interface (UI) framework, a UIKit framework, and a map framework.
In the framework shown in Fig. 4B, the frameworks relevant to most applications include, but are not limited to: the Foundation framework in the Core Services Layer 340 and the UIKit framework in the Cocoa Touch Layer 380. The Foundation framework provides many basic object classes and data types and supplies the most basic system services to all applications, independent of the UI. The classes provided by the UIKit framework form a basic UI class library for creating touch-based user interfaces; iOS applications can build their UIs on the UIKit framework, which therefore supplies an application's infrastructure for constructing user interfaces, drawing, handling user interaction events, responding to gestures, and the like.
The touch display screen 130 is used for receiving touch operations by the user on or near it, using a finger, a stylus, or any other suitable object, and for displaying the user interface of each application program. The touch display screen 130 is generally disposed on the front panel of the terminal 100. It may be designed as a full screen, a curved screen, or a special-shaped screen; it may also be designed as a combination of a full screen and a curved screen, or of a special-shaped screen and a curved screen, which only requires that the touch display screen 130 be made of a flexible screen material. This embodiment does not limit this. Wherein:
Full screen
A full screen may refer to a screen design in which the touch display screen 130 occupies a proportion of the front panel of the terminal 100 that exceeds a threshold (e.g., 80%, 90%, or 95%). One way of calculating the screen occupation ratio is: (area of the touch display screen 130 / area of the front panel of the terminal 100) × 100%. Another is: (area of the actual display area of the touch display screen 130 / area of the front panel of the terminal 100) × 100%. A third is: (diagonal of the touch display screen 130 / diagonal of the front panel of the terminal 100) × 100%. In the illustrative example shown in Fig. 4C, almost the entire area of the front panel of the terminal 100 is the touch display screen 130: all areas of the front panel 40 except the edge formed by the middle frame 41 are the touch display screen 130. The four corners of the touch display screen 130 may be right-angled or rounded.
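For concreteness, the three calculations can be written as a small helper; this is a generic sketch, not an API of any platform.

```java
/** The three screen occupation ratio formulas above, returning percent. */
final class ScreenOccupationRatio {
    static double byArea(double displayArea, double frontPanelArea) {
        return displayArea / frontPanelArea * 100.0;
    }

    static double byActiveArea(double actualDisplayArea, double frontPanelArea) {
        return actualDisplayArea / frontPanelArea * 100.0;
    }

    static double byDiagonal(double displayDiagonal, double frontPanelDiagonal) {
        return displayDiagonal / frontPanelDiagonal * 100.0;
    }
}
```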
A full screen may also be a screen design that integrates at least one front panel component inside or underneath the touch display screen 130. Optionally, the at least one front panel component includes: a camera, a fingerprint sensor, a proximity light sensor, a distance sensor, and the like. In some embodiments, other components found on the front panel of a conventional terminal are integrated into all or part of the area of the touch display screen 130; for example, after the light-sensing element of the camera is split into a plurality of light-sensing pixels, each light-sensing pixel is integrated into a black area of a display pixel in the touch display screen 130. Because at least one front panel component is integrated inside the touch display screen 130, the full screen has a higher screen occupation ratio.
Of course, in other embodiments, the front panel components of a conventional terminal may instead be disposed on the side or back of the terminal 100, for example by placing an ultrasonic fingerprint sensor below the touch display screen 130, a bone-conduction earpiece inside the terminal 100, and a camera in a pluggable structure on the side of the terminal.
In some optional embodiments, when the terminal 100 uses a full screen, one side, two sides (e.g., the left and right sides), or four sides (e.g., the upper, lower, left, and right sides) of the middle frame of the terminal 100 are provided with an edge touch sensor 120, which is configured to detect at least one of a touch operation, a click operation, a press operation, a slide operation, and the like performed by the user on the middle frame. The edge touch sensor 120 may be any one of a touch sensor, a thermal sensor, a pressure sensor, and the like. The user may apply operations on the edge touch sensor 120 to control applications in the terminal 100.
Curved screen
A curved screen refers to a screen design in which the screen area of the touch display screen 130 does not lie in a single plane. Generally, a curved screen has at least one cross section of curved shape (for example, U-shaped), and the projection of the curved screen in any plane perpendicular to that cross section is planar. Alternatively, a curved screen is a screen design in which at least one side is curved. Alternatively, a curved screen means that at least one side of the touch display screen 130 extends to cover the middle frame of the terminal 100; since the middle frame, which originally had no display or touch function, is thereby covered as a displayable and/or operable area, the curved screen has a higher screen occupation ratio. Optionally, as in the example shown in Fig. 4D, a curved screen refers to a screen design in which the left and right sides 42 are curved; or a screen design in which the upper and lower sides are curved; or one in which the upper, lower, left, and right sides are all curved. In an alternative embodiment, the curved screen is made of a touch screen material with a certain flexibility.
Special-shaped screen
A special-shaped screen is a touch display screen with an irregular shape, where irregular means not a rectangle or a rounded rectangle. Optionally, a special-shaped screen is a screen design in which protrusions, notches and/or cutouts are formed on an otherwise rectangular or rounded-rectangular touch display screen 130. The protrusions, notches and/or cutouts may be located at the edge of the touch display screen 130, in the center of the screen, or both. When placed on one edge, they may be located in the middle or at either end of that edge; when placed in the center of the screen, they may be located in one or more of the upper, upper-left, left, lower-left, lower-right, right, and upper-right regions of the screen. When placed in several regions, the protrusions, notches, and cutouts may be concentrated or dispersed, and distributed symmetrically or asymmetrically. Optionally, the number of protrusions, notches and/or cutouts is also not limited.
Because the special-shaped screen covers the upper ("forehead") and/or lower ("chin") bezel areas of the front panel as displayable and/or operable areas, the touch display screen occupies more space on the front panel of the terminal, so the special-shaped screen also has a larger screen occupation ratio. In some embodiments, the notch and/or cutout is configured to receive at least one front panel component, including at least one of a camera, a fingerprint sensor, a proximity light sensor, a distance sensor, an earpiece, an ambient light sensor, and a physical key.
For example, the notch may be provided on one or more edges, and may be a semicircular notch, a right-angled rectangular notch, a rounded rectangular notch, or an irregularly shaped notch. In the example shown in Fig. 4E, the special-shaped screen may be a screen design with a semicircular notch 43 at the center of the upper edge of the touch display screen 130, where the semicircular notch 43 accommodates at least one of a camera, a distance sensor (also called a proximity sensor), an earpiece, and an ambient light sensor. In the example shown in Fig. 4F, the special-shaped screen may be a screen design with a semicircular notch 44 at the center of the lower edge of the touch display screen 130, where the semicircular notch 44 accommodates at least one of a physical key, a fingerprint sensor, and a microphone. In the example shown in Fig. 4G, the special-shaped screen may be a screen design with a semi-elliptical notch 45 at the center of the lower edge of the touch display screen 130, with another semi-elliptical notch formed on the front panel of the terminal 100; the two semi-elliptical notches enclose an elliptical area for accommodating a physical key or a fingerprint identification module. In the example shown in Fig. 4H, the special-shaped screen may be a screen design with at least one hole 45 in the upper half of the touch display screen 130, where the hole 45 accommodates at least one of a camera, a distance sensor, an earpiece, and an ambient light sensor.
In addition, those skilled in the art will appreciate that the structure of the terminal 100 illustrated in the above figures does not limit the terminal 100: the terminal may include more or fewer components than illustrated, may combine certain components, or may arrange the components differently. For example, the terminal 100 further includes components such as a radio frequency circuit, an input unit, a sensor, an audio circuit, a Wireless Fidelity (WiFi) module, a power supply, and a Bluetooth module, which are not described here.
In an exemplary embodiment, a non-transitory computer-readable storage medium comprising instructions, such as a memory comprising instructions, executable by a processor of an apparatus to perform the above-described method is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
That is, the instructions in the computer readable storage medium, when executed by the processor of the apparatus, may implement the method of the embodiment shown in fig. 1 or fig. 2.
In the above embodiments, the implementation may be realized wholly or partly by software, hardware, firmware, or any combination thereof. When implemented in software, it may be realized wholly or partly in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the invention are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored on a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, they may be transmitted from one website, computer, server, or data center to another by wire (e.g., coaxial cable, optical fiber, or Digital Subscriber Line (DSL)) or wirelessly (e.g., infrared, radio, or microwave). The computer-readable storage medium may be any available medium accessible to a computer, or a data storage device such as a server or data center integrating one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, a hard disk, or magnetic tape), an optical medium (e.g., a Digital Versatile Disc (DVD)), or a semiconductor medium (e.g., a Solid State Disk (SSD)), among others.
The above description is only exemplary of the present application and should not be taken as limiting the present application, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the protection scope of the present application.
Claims (4)
1. An image processing method applied to a terminal is characterized by comprising the following steps:
when the terminal is in a shooting state, acquiring image information of an image being shot, wherein the image information comprises shooting data and image data, the shooting data at least comprises a shooting mode, and the image data at least comprises a shooting object;
determining an image processing mode of the image according to the image information, wherein the image processing mode is used for indicating a mode of beautifying the image;
processing the image according to the image processing mode;
wherein the determining an image processing mode of the image according to the image information comprises:
determining the category to which the image information belongs through a specified classifier, wherein the specified classifier is obtained by training a classifier to be trained with a plurality of preset image information sets, and at least one piece of preset image information included in each of the plurality of preset image information sets belongs to the same category;
determining an image processing mode of the image according to the category to which the image information belongs;
the method further comprises the following steps:
acquiring image information and an image processing mode of each image in a plurality of images obtained by shooting, wherein the image processing mode of each image is the mode adopted by the user, according to the user's shooting habits, when performing image processing on that image;
determining the category to which the image information of each image belongs according to the image processing mode of each image;
and updating the specified classifier according to the image information of each image in the plurality of images and the category to which the image information of each image belongs.
2. An image processing apparatus applied to a terminal, the apparatus comprising:
a first acquisition module, used for acquiring image information of an image being shot when the terminal is in a shooting state, wherein the image information comprises shooting data and image data, the shooting data at least comprises a shooting mode, and the image data at least comprises a shooting object;
the first determining module is used for determining an image processing mode of the image according to the image information, wherein the image processing mode is used for indicating a mode of beautifying the image;
the processing module is used for processing the image according to the image processing mode;
wherein the first determination module comprises a second determination submodule and a third determination submodule;
the second determining submodule is used for determining the category to which the image information belongs through a specified classifier, wherein the specified classifier is obtained by training a classifier to be trained with a plurality of preset image information sets, and at least one piece of preset image information included in each of the plurality of preset image information sets belongs to the same category;
the third determining submodule is used for determining an image processing mode of the image according to the category to which the image information belongs;
the device further comprises:
the second acquisition module is used for acquiring image information and an image processing mode of each image in a plurality of images obtained by shooting, wherein the image processing mode of each image is the mode adopted by the user, according to the user's shooting habits, when performing image processing on that image;
a second determining module, configured to determine the category to which the image information of each image belongs according to the image processing mode of that image;
and an updating module, configured to update the specified classifier according to the image information of each image in the plurality of images and the category to which the image information of each image belongs. (See the second sketch after claim 4 for this update path.)
3. A terminal comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor is configured to perform the steps of the method of claim 1.
4. A computer-readable storage medium having stored therein instructions which, when executed on a computer, cause the computer to perform the method of claim 1.
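Because the claims are dense, a minimal runnable sketch of the flow recited in claim 1 may help: encode the image information (shooting data plus image data) as a feature vector, classify it with the pre-trained specified classifier, and map the predicted category to a beautification mode. Everything concrete below — the SHOOTING_MODES / SUBJECTS / MODE_BY_CATEGORY tables, the nearest-centroid classifier, and the function names — is a hypothetical stand-in; the claims deliberately leave the classifier and the category-to-mode mapping unspecified.

```python
from dataclasses import dataclass

# Hypothetical encodings of the two kinds of image information named in
# claim 1: shooting data (at least a shooting mode) and image data
# (at least a shooting object).
SHOOTING_MODES = {"auto": 0, "portrait": 1, "night": 2}
SUBJECTS = {"person": 0, "landscape": 1, "food": 2}

# Hypothetical mapping from a predicted category to a beautification mode.
MODE_BY_CATEGORY = {
    0: "skin_smoothing",    # e.g. portraits of a person
    1: "saturation_boost",  # e.g. landscapes
    2: "warm_filter",       # e.g. food shots
}

@dataclass
class ImageInfo:
    shooting_mode: str  # shooting data
    subject: str        # image data

def featurize(info: ImageInfo) -> tuple[float, float]:
    """Turn one piece of image information into a 2-D feature vector."""
    return (float(SHOOTING_MODES[info.shooting_mode]),
            float(SUBJECTS[info.subject]))

class SpecifiedClassifier:
    """Toy nearest-centroid classifier, trained on preset image
    information sets where every piece in a set shares one category."""

    def __init__(self) -> None:
        self.centroids: dict[int, tuple[float, float]] = {}

    def train(self, preset_sets: dict[int, list[ImageInfo]]) -> None:
        # One centroid per category; categories absent from
        # `preset_sets` keep any centroid they already had.
        for category, infos in preset_sets.items():
            feats = [featurize(i) for i in infos]
            self.centroids[category] = (
                sum(f[0] for f in feats) / len(feats),
                sum(f[1] for f in feats) / len(feats),
            )

    def predict(self, info: ImageInfo) -> int:
        x = featurize(info)
        return min(
            self.centroids,
            key=lambda c: (x[0] - self.centroids[c][0]) ** 2
                        + (x[1] - self.centroids[c][1]) ** 2,
        )

def choose_processing_mode(clf: SpecifiedClassifier, info: ImageInfo) -> str:
    """The claim 1 flow: classify the image information of the image
    being shot, then pick the mode mapped to the predicted category."""
    return MODE_BY_CATEGORY[clf.predict(info)]
```

A nearest-centroid model is used only because it makes "training on a plurality of preset image information sets, each set sharing one category" visible in a few lines; any model with the same train/predict contract would satisfy the claim language.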
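The update path (the second acquisition module, second determining module, and updating module of claim 2, mirroring the last three steps of claim 1) can be sketched the same way: the mode a user habitually applied to each shot image implies a category for that image's information, and those (information, category) pairs retrain the classifier. This continues the hypothetical sketch above; CATEGORY_BY_MODE is an assumed inverse of the earlier mode table, not something the patent defines.

```python
# Hypothetical inverse of MODE_BY_CATEGORY: the category implied by the
# processing mode the user actually applied.
CATEGORY_BY_MODE = {mode: cat for cat, mode in MODE_BY_CATEGORY.items()}

def update_from_user_habits(
    clf: SpecifiedClassifier,
    shot_images: list[tuple[ImageInfo, str]],  # (image information, applied mode)
) -> None:
    """Group each shot image's information under the category implied by
    the mode the user applied to it, then retrain those categories. The
    claims leave the exact update rule open; re-averaging the affected
    centroids is just the simplest choice here."""
    by_category: dict[int, list[ImageInfo]] = {}
    for info, applied_mode in shot_images:
        by_category.setdefault(CATEGORY_BY_MODE[applied_mode], []).append(info)
    clf.train(by_category)

if __name__ == "__main__":
    clf = SpecifiedClassifier()
    clf.train({0: [ImageInfo("portrait", "person")],
               1: [ImageInfo("auto", "landscape")]})
    # The user kept applying a warm filter to night-mode food shots, so
    # the classifier learns a centroid for that category as well.
    update_from_user_habits(clf, [(ImageInfo("night", "food"), "warm_filter")])
    print(choose_processing_mode(clf, ImageInfo("night", "food")))  # warm_filter
```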
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010357605.1A CN111526290B (en) | 2017-11-08 | 2017-11-08 | Image processing method, device, terminal and storage medium |
CN201711090044.8A CN107864333B (en) | 2017-11-08 | 2017-11-08 | Image processing method, device, terminal and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711090044.8A CN107864333B (en) | 2017-11-08 | 2017-11-08 | Image processing method, device, terminal and storage medium |
Related Child Applications (1)
Application Number | Priority Date | Filing Date | Title
---|---|---|---|
CN202010357605.1A Division CN111526290B (en) | 2017-11-08 | 2017-11-08 | Image processing method, device, terminal and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107864333A CN107864333A (en) | 2018-03-30 |
CN107864333B (en) | 2020-04-21
Family
ID=61701371
Family Applications (2)
Application Number | Priority Date | Filing Date | Title
---|---|---|---|
CN202010357605.1A Expired - Fee Related CN111526290B (en) | 2017-11-08 | 2017-11-08 | Image processing method, device, terminal and storage medium |
CN201711090044.8A Expired - Fee Related CN107864333B (en) | 2017-11-08 | 2017-11-08 | Image processing method, device, terminal and storage medium |
Family Applications Before (1)
Application Number | Priority Date | Filing Date | Title
---|---|---|---|
CN202010357605.1A Expired - Fee Related CN111526290B (en) | 2017-11-08 | 2017-11-08 | Image processing method, device, terminal and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (2) | CN111526290B (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109639932A (en) * | 2019-02-28 | 2019-04-16 | 努比亚技术有限公司 | Image processing method, mobile terminal and computer readable storage medium |
CN110225220A (en) * | 2019-04-26 | 2019-09-10 | 广东虎彩影像有限公司 | A kind of automatic photo fix system |
CN110225221A (en) * | 2019-04-26 | 2019-09-10 | 广东虎彩影像有限公司 | A kind of automatic photo fix method and system |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103413270A (en) * | 2013-08-15 | 2013-11-27 | 北京小米科技有限责任公司 | Method and device for image processing and terminal device |
CN106339719A (en) * | 2016-08-22 | 2017-01-18 | 微梦创科网络科技(中国)有限公司 | Image identification method and image identification device |
Family Cites Families (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN100591101C (en) * | 2006-11-30 | 2010-02-17 | 华晶科技股份有限公司 | Method for automatically adjusting shooting parameters according to shooting operation habit of the user |
CN101706872A (en) * | 2009-11-26 | 2010-05-12 | 上海交通大学 | Universal open type face identification system |
KR20120071192A (en) * | 2010-12-22 | 2012-07-02 | 삼성전자주식회사 | Digital photographing apparatus and control method thereof |
JP2014068227A (en) * | 2012-09-26 | 2014-04-17 | Nikon Corp | Image pick-up system |
JP6188400B2 (en) * | 2013-04-26 | 2017-08-30 | オリンパス株式会社 | Image processing apparatus, program, and image processing method |
CN103533241B (en) * | 2013-10-14 | 2017-05-10 | 厦门美图网科技有限公司 | Photographing method of intelligent filter lens |
CN103533244A (en) * | 2013-10-21 | 2014-01-22 | 深圳市中兴移动通信有限公司 | Shooting device and automatic visual effect processing shooting method thereof |
CN103617432B (en) * | 2013-11-12 | 2017-10-03 | 华为技术有限公司 | A kind of scene recognition method and device |
CN103810504B (en) * | 2014-01-14 | 2017-03-22 | 三星电子(中国)研发中心 | Image processing method and device |
CN105138963A (en) * | 2015-07-31 | 2015-12-09 | 小米科技有限责任公司 | Picture scene judging method, picture scene judging device and server |
CN105844287B (en) * | 2016-03-15 | 2019-06-07 | 民政部国家减灾中心 | A kind of the domain adaptive approach and system of classification of remote-sensing images |
CN106169081B (en) * | 2016-06-29 | 2019-07-05 | 北京工业大学 | A kind of image classification and processing method based on different illumination |
CN106657810A (en) * | 2016-09-26 | 2017-05-10 | 维沃移动通信有限公司 | Filter processing method and device for video image |
CN106791394A (en) * | 2016-12-20 | 2017-05-31 | 北京小米移动软件有限公司 | Image processing method and device |
CN107155060A (en) * | 2017-04-19 | 2017-09-12 | 北京小米移动软件有限公司 | Image processing method and device |
CN107231470B (en) * | 2017-05-15 | 2020-06-23 | 努比亚技术有限公司 | Image processing method, mobile terminal and computer readable storage medium |
CN107203978A (en) * | 2017-05-24 | 2017-09-26 | 维沃移动通信有限公司 | A kind of image processing method and mobile terminal |
- 2017
  - 2017-11-08 CN CN202010357605.1A patent/CN111526290B/en not_active Expired - Fee Related
  - 2017-11-08 CN CN201711090044.8A patent/CN107864333B/en not_active Expired - Fee Related
Also Published As
Publication number | Publication date |
---|---|
CN107864333A (en) | 2018-03-30 |
CN111526290A (en) | 2020-08-11 |
CN111526290B (en) | 2021-09-28 |
Similar Documents
Publication | Title
---|---
CN109034115B (en) | Video image recognizing method, device, terminal and storage medium
US20210048939A1 (en) | Icon display method, device, and terminal
CN110475072B (en) | Method, device, terminal and storage medium for shooting image
CN107506123B (en) | It adjusts operation and executes method, apparatus and terminal
CN109101157B (en) | Sidebar icon setting method and device, terminal and storage medium
US20210173550A1 (en) | Method for icon display, terminal, and storage medium
US11146739B2 (en) | Method for image shooting, terminal device, and storage medium
CN107632874B (en) | Interface display method and device and terminal
CN108803964B (en) | Buoy display method, device, terminal and storage medium
CN107688430B (en) | Wallpaper replacing method, device, terminal and storage medium
CN109656445B (en) | Content processing method, device, terminal and storage medium
WO2019233307A1 (en) | User interface display method and apparatus, and terminal and storage medium
CN107748656B (en) | Picture display method, device, terminal and storage medium
CN107864333B (en) | Image processing method, device, terminal and storage medium
US11102397B2 (en) | Method for capturing images, terminal, and storage medium
CN111127469A (en) | Thumbnail display method, device, storage medium and terminal
CN111352560B (en) | Screen splitting method and device, electronic equipment and computer readable storage medium
CN108845733B (en) | Screen capture method, device, terminal and storage medium
US11194598B2 (en) | Information display method, terminal and storage medium
CN109714474B (en) | Content copying method, device, terminal and storage medium
CN107644072B (en) | Data deleting method and device
CN107832682B (en) | Information display method and device and terminal
CN116095498A (en) | Image acquisition method, device, terminal and storage medium
Legal Events
Code | Title | Description
---|---|---
PB01 | Publication |
SE01 | Entry into force of request for substantive examination |
CB02 | Change of applicant information | Address after: No. 18 Wusha Beach Road, Chang'an Town, Dongguan 523860, Guangdong. Applicant after: GUANGDONG OPPO MOBILE TELECOMMUNICATIONS Corp.,Ltd. Address before: No. 18 Wusha Beach Road, Chang'an Town, Dongguan 523860, Guangdong. Applicant before: GUANGDONG OPPO MOBILE TELECOMMUNICATIONS Corp.,Ltd.
GR01 | Patent grant |
CF01 | Termination of patent right due to non-payment of annual fee | Granted publication date: 2020-04-21