CN108989555B - Image processing method and related product - Google Patents


Info

Publication number
CN108989555B
Authority
CN
China
Prior art keywords
image
scoring
unscored
target
images
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810751349.7A
Other languages
Chinese (zh)
Other versions
CN108989555A (en)
Inventor
曹威
陈标
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201810751349.7A priority Critical patent/CN108989555B/en
Publication of CN108989555A publication Critical patent/CN108989555A/en
Application granted granted Critical
Publication of CN108989555B publication Critical patent/CN108989555B/en

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04M: TELEPHONIC COMMUNICATION
    • H04M1/00: Substation equipment, e.g. for use by subscribers
    • H04M1/72: Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724: User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403: User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/7243: User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages
    • H04M1/72439: User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages for image or video messaging
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04M: TELEPHONIC COMMUNICATION
    • H04M1/00: Substation equipment, e.g. for use by subscribers
    • H04M1/72: Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724: User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448: User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72454: User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Environmental & Geological Engineering (AREA)
  • Multimedia (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • User Interface Of Digital Computer (AREA)
  • Studio Devices (AREA)

Abstract

The application discloses an image processing method and a related product. The method includes the following steps: when the electronic device detects that its operating state is a preset state type, it obtains image information of the unscored images in the image library, determines the scoring policy corresponding to the image information, and scores the unscored images according to that policy to obtain target score values. In this way, the electronic device scores images while it is not being used by the user, before a recall video is created, and does not need to score images during creation of the recall video. The creation process is therefore simplified, and the efficiency and speed with which the electronic device creates recall videos can be improved.

Description

Image processing method and related product
Technical Field
The present application relates to the field of electronic technologies, and in particular, to an image processing method and a related product.
Background
With the rapid development and increasing popularity of intelligent terminals (such as smartphones), these devices have become indispensable electronic products in users' daily lives. Album applications on mobile phones can filter pictures sharing a theme according to conditions such as time and place, forming a dedicated picture set that lets a user browse the pictures from a specific period or location.
Disclosure of Invention
The embodiment of the application provides an image processing method and a related product, provides a method for creating a recall video, and is beneficial to improving the efficiency and speed of creating the recall video by electronic equipment.
In a first aspect, an embodiment of the present application provides an image processing method, which is applied to an electronic device, and the method includes:
when the running state of the electronic equipment is detected to be a preset state type, acquiring image information of an unscored image in an image library of the electronic equipment;
determining a scoring strategy corresponding to the image information;
and scoring the unscored image according to the scoring strategy to obtain a target scoring value.
In a second aspect, an embodiment of the present application provides an image processing apparatus applied to an electronic device, the image processing apparatus including:
the acquisition unit is used for acquiring image information of an unscored image in an image library of the electronic equipment when the running state of the electronic equipment is detected to be a preset state type;
a determination unit configured to determine a scoring policy corresponding to the image information;
and the execution unit is used for grading the unscored image according to the grading strategy to obtain a target grading value.
In a third aspect, an embodiment of the present application provides an electronic device, including a processor, a memory, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the processor, and the program includes instructions for executing steps in any method of the first aspect of the embodiment of the present application.
In a fourth aspect, the present application provides a computer-readable storage medium, where the computer-readable storage medium stores a computer program for electronic data exchange, where the computer program makes a computer perform part or all of the steps described in any one of the methods of the first aspect of the present application.
It can be seen that in this embodiment of the application, when the electronic device detects that its operating state is a preset state type, it obtains image information of the unscored images in the image library, determines the scoring policy corresponding to the image information, and scores the unscored images according to that policy to obtain target score values. Images are thus scored while the device is idle, before a recall video is created, which simplifies the creation process and improves the efficiency and speed of creating recall videos.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments or the background art of the present application, the drawings required to be used in the embodiments or the background art of the present application will be described below.
Fig. 1A is a schematic structural diagram of an example electronic device provided in an embodiment of the present application;
fig. 1B is a schematic flowchart of an image processing method according to an embodiment of the present application;
FIG. 2 is a schematic flow chart of another image processing method provided in the embodiments of the present application;
FIG. 3 is a schematic flowchart of another image processing method provided in the embodiments of the present application;
fig. 4 is another schematic structural diagram of an electronic device according to an embodiment of the present disclosure;
fig. 5A is a schematic structural diagram of an image processing apparatus according to an embodiment of the present application;
fig. 5B is a modified structure of an image processing apparatus according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of another electronic device provided in the embodiment of the present application.
Detailed description of the invention
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings. It is apparent that the described embodiments are only some of the embodiments of the present application, not all of them. All other embodiments that a person skilled in the art can derive from the embodiments given herein without creative effort shall fall within the protection scope of the present application.
The following are detailed below.
The terms "first," "second," "third," and "fourth," etc. in the description and claims of this application and in the accompanying drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
The electronic devices involved in the embodiments of the present application may include various handheld devices, vehicle-mounted devices, wearable devices (e.g., smartwatches, smartbands, pedometers, etc.), computing devices or other processing devices connected to a wireless modem, as well as various forms of User Equipment (UE), Mobile Stations (MS), terminal equipment (terminal device), and so on, which have wireless communication functions. For convenience of description, the above-mentioned devices are collectively referred to as electronic devices.
The following describes embodiments of the present application in detail.
Referring to fig. 1A, fig. 1A is a schematic structural diagram of an electronic device 100 according to an embodiment of the present disclosure. The electronic device 100 includes: a casing 110, a circuit board 120 disposed in the casing 110, and a display screen 130 disposed on the casing 110. A processor 121 is disposed on the circuit board 120 and is connected to the display screen 130. The display screen may be a touch display screen, used to receive touch operations performed by the user on the electronic device.
Hereinafter, some terms in the present application are explained to facilitate understanding by those skilled in the art.
A recall video is generated from an image set in the electronic device and played in the form of a slideshow, allowing the user to look back through the image set. For example, a recall function may be added to the album application of an iOS or Android system; when the function is triggered, several pictures are selected from the album and combined into a recall video. Because the recall video is a file in a non-video format, similar to a slideshow, it occupies little storage space.
Referring to fig. 1B, fig. 1B is a schematic flowchart of an image processing method according to an embodiment of the present application, where the image processing method is applied to the electronic device shown in fig. 1A, and the method includes:
101. when the running state of the electronic equipment is detected to be a preset state type, acquiring image information of an unscored image in an image library of the electronic equipment.
In the embodiment of the application, the electronic device may detect a current running state, and when the running state is a preset state type, obtain image information of an unscored image in the image library, where the unscored image may be an image that is shot or downloaded by the electronic device within a recent period of time and then stored.
Optionally, the preset state type includes a screen-off state, a charging state, or a static state, where the static state is a state in which the electronic device has no application running in the foreground. When the electronic device is in the screen-off, charging, or static state, the user is not using it, and the process of the present scheme may be started; when the electronic device returns to a state of being used by the user, the process is terminated.
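The state gating described above can be sketched as follows. This is an illustrative assumption, not the patent's implementation: the state names, the `score_while_idle` helper, and the callbacks are all hypothetical, and a real device would query the operating system for screen, charging, and foreground-application status.

```python
from enum import Enum

class DeviceState(Enum):
    SCREEN_OFF = "screen_off"
    CHARGING = "charging"
    STATIC = "static"      # no application running in the foreground
    IN_USE = "in_use"

# Preset state types under which background scoring may proceed.
PRESET_STATE_TYPES = {DeviceState.SCREEN_OFF, DeviceState.CHARGING, DeviceState.STATIC}

def score_while_idle(current_state, unscored_images, score_fn):
    """Score pending images only while the device is not being used.

    current_state is a callable returning the device's DeviceState;
    scoring stops as soon as the state leaves the preset set.
    Returns the number of images scored.
    """
    scored = 0
    for image in unscored_images:
        if current_state() not in PRESET_STATE_TYPES:
            break  # the user started using the device: terminate the process
        score_fn(image)
        scored += 1
    return scored
```

Checking the state before each image, rather than once up front, matches the scheme's requirement that the process terminate as soon as the user starts using the device.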
102. And determining a grading strategy corresponding to the image information.
In this embodiment of the present application, a mapping relationship between image information and scoring policies may be preset, and the scoring policy corresponding to the image information is then determined according to that mapping relationship. The image information may include at least one of the following: a shooting object, a shooting time, a shooting location, a shooting scene, and a shooting environment. The shooting object is the subject contained in the image, for example the target focused on when the image was shot, or the main shooting content among a plurality of shooting contents in the image; the main shooting content may be the content located in the middle of the image or the content occupying the largest area of the image. The shooting scene is the scene in which the image was shot and may be, for example, daytime, night, sunlight, shadow, indoor, outdoor, sea, or street, without limitation. The shooting environment may include at least one of the following: a temperature environment, a lighting environment, and a weather environment, such as a sunny day, rainy day, cloudy day, snowy day, or wind.
Optionally, to determine the scoring policy corresponding to the image information, a feature parameter set corresponding to the image information may be determined, and the plurality of parameters in that set are then scored. The feature parameter set may be a combination of parameter features such as color, exposure, sharpness, and expressiveness. Different pieces of image information may be scored against different feature parameter sets. Specifically, the sets may be defined per shooting object: for example, when the shooting object is a person, a first feature parameter set corresponding to persons may be used, and when the shooting object is a scene, a second feature parameter set corresponding to scenes may be used, the first set being different from the second. Feature parameter sets may similarly be defined for different shooting times, shooting locations, shooting scenes, or shooting environments.
Optionally, when the image information is the shooting object: if the shooting object of the unscored image is a person, the feature parameter set includes at least one of the following feature parameters: color, exposure, sharpness, and expressiveness, where expressiveness refers to the facial feature information of the face image in each continuously shot picture; if the shooting object of the unscored image is a scene, the feature parameter set includes at least one of the following feature parameters: color, exposure, and sharpness.
Scoring different feature parameters for different shooting objects allows each image to be scored in a targeted way, which improves the accuracy of the scoring results.
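The per-object feature parameter sets can be sketched as a simple lookup. The dictionary keys and the fallback behavior for unrecognized objects are illustrative assumptions; only the parameter names (color, exposure, sharpness, expressiveness) come from the text above.

```python
# Hypothetical feature parameter sets keyed by shooting object; the
# mapping itself is an assumption made for illustration.
FEATURE_PARAMETER_SETS = {
    "person": ("color", "exposure", "sharpness", "expressiveness"),
    "scene": ("color", "exposure", "sharpness"),
}

def feature_parameters_for(shooting_object):
    """Return the feature parameter set for a shooting object,
    falling back to the scene parameters for unrecognized objects."""
    return FEATURE_PARAMETER_SETS.get(shooting_object, FEATURE_PARAMETER_SETS["scene"])
```

Analogous tables could be keyed by shooting time, location, scene, or environment, as the text notes.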
103. And scoring the unscored image according to the scoring strategy to obtain a target scoring value.
In this embodiment of the application, the unscored image may be scored according to the feature parameter set. Specifically, for each feature parameter in the set, a mapping relationship between that feature parameter and a score value may be preset, and the score value corresponding to the feature parameter is then determined according to the mapping relationship.
Optionally, scoring the unscored images according to a set of feature parameters may comprise the steps of:
31. determining a scoring rule set corresponding to the feature parameter set according to a mapping relation between preset feature parameters and scoring rules, wherein the plurality of scoring rules are in one-to-one correspondence with the plurality of feature parameters;
32. scoring the plurality of characteristic parameters in the characteristic parameter set according to a plurality of scoring rules in the scoring rule set to obtain a plurality of score values, wherein each characteristic parameter corresponds to one score value;
33. and determining a target score value according to the plurality of score values and a plurality of weight values corresponding to the plurality of characteristic parameters.
A scoring rule may be preset for each feature parameter in the feature parameter set. For example, when the feature parameter is color, the scoring rule may be a mapping relationship between pixels and score values; when it is exposure, a mapping between exposure and score values; when it is sharpness, a mapping between sharpness and score values; and when it is expressiveness, a mapping between expressiveness and score values. Optionally, for the same feature parameter, different mappings between the feature parameter and score values may be set according to different image information. For example, when the feature parameter is color: if the shooting object of the unscored image is a person, a first pixel may be set to correspond to score value a and a second pixel to score value b; if the shooting object of the unscored image is a scene, the first pixel may be set to correspond to score value c and the second pixel to score value d, where a > b > c > d.
Further, the score value of each feature parameter can be determined according to the scoring rule corresponding to that feature parameter, yielding a plurality of score values corresponding to the plurality of feature parameters. Finally, the target score value is determined according to the following formula:
target score value = first score value × first weight + second score value × second weight + third score value × third weight + …
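The weighted combination above can be sketched directly; this is a minimal illustration of the formula, with no assumptions about how the individual score values or weights are produced.

```python
def target_score(score_values, weights):
    """Weighted sum of per-parameter score values.

    score_values and weights are parallel sequences with one entry per
    feature parameter (e.g. color, exposure, sharpness, expressiveness).
    """
    if len(score_values) != len(weights):
        raise ValueError("one weight is required per score value")
    return sum(s * w for s, w in zip(score_values, weights))
```

For instance, per-parameter scores of 80, 60, and 90 with weights 0.5, 0.3, and 0.2 combine to a target score of 76.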
For example, the electronic device may score its unscored images while in the screen-off state, the charging state, or a state in which no application is running in the foreground or background. Specifically, for each unscored image, its image information may be obtained. Assuming the shooting object of the unscored image is a person, the image's feature parameters in four aspects (color, exposure, sharpness, and expressiveness) may be scored to obtain a first, second, third, and fourth score value, and the target score value of the unscored image is then determined using the first, second, third, and fourth weights corresponding to those four feature parameters. Further, by scoring the plurality of unscored images with the scheme provided by the present application, the electronic device does not need to score images each time a target video is created, and the already scored images in the image library are not scored repeatedly, which saves the power consumption of the electronic device.
It can be seen that in this embodiment of the application, when the electronic device detects that its operating state is a preset state type, it obtains image information of the unscored images in the image library, determines the scoring policy corresponding to the image information, and scores the unscored images according to that policy to obtain target score values. Images are thus scored while the device is idle, before a recall video is created, which simplifies the creation process and improves the efficiency and speed of creating recall videos.
Referring to fig. 2, fig. 2 is a schematic flow chart of another image processing method according to an embodiment of the present application, and the image processing method described in this embodiment is applied to the electronic device shown in fig. 1A, and the method may include the following steps:
201. and when the running state of the electronic equipment is detected to be a preset state type, acquiring the image information of the unscored image in the image library of the electronic equipment.
202. And determining a grading strategy corresponding to the image information.
203. And scoring the unscored image according to the scoring strategy to obtain a target scoring value.
The specific implementation process of the steps 201-203 can refer to the corresponding description in the method shown in fig. 1B, and will not be described herein again.
204. And acquiring a target video generation instruction.
In this embodiment of the application, the target video generation instruction may be triggered by the user through the electronic device, or generated automatically when the electronic device meets a preset condition; the preset condition may be, for example, a time condition, a location condition, or an operation condition, which is not limited in this application.
205. And screening a second number of target images from a first number of images in an image library of the electronic equipment according to the target video generation instruction, the target scoring value and scoring values of other scored images in the image library, wherein the first number is larger than the second number.
Optionally, the first number may be the number of all images in the image library; for example, if there are 500 images in total, the first number is 500. Alternatively, the first number may be the number of images within a preset image filtering range obtained by the electronic device; the user may set the filtering range through the electronic device. For example, if the range from the 101st to the 200th of the 500 images is selected, the first number is 100.
The second number of target images is selected from the first number of images, and may be selected in descending order of score value. The second number may be, for example, 5, 6, 10, 40, or 50 images, and is not limited to any particular value.
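The screening step can be sketched as a top-N selection by score. The `(image_id, score_value)` pair representation is an assumption for illustration; the text only requires that the second number of highest-scoring images be kept.

```python
def select_target_images(scored_images, second_number):
    """Screen the highest-scoring images from the candidate pool.

    scored_images: iterable of (image_id, score_value) pairs covering
    the first number of candidate images.
    Returns the second_number image ids with the largest score values.
    """
    ranked = sorted(scored_images, key=lambda pair: pair[1], reverse=True)
    return [image_id for image_id, _ in ranked[:second_number]]
```

Because every candidate already carries a precomputed score, this step is a cheap sort rather than a scoring pass, which is the efficiency gain the scheme claims.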
206. And forming an image set by the second quantity of target images, and generating a target video according to the image set.
In this embodiment of the application, the target video is a recall video, that is, a slideshow generated from an image set in the electronic device. The electronic device may receive a user-triggered instruction to create the target video and then create it, or the electronic device may start the creation process when a preset trigger condition is met; the trigger condition may be, for example, a preset time point, which is not limited in this embodiment.
Optionally, in this embodiment of the present application, a play policy for the target video may also be determined; the play policy may include at least one of the following: play duration, play order, title, cover, theme, background music, and the like.
In the embodiment of the application, the process of creating the recall video is separated from the process of scoring the image, so that tasks executed by the electronic equipment are reasonably distributed, and the efficiency and the speed of creating the recall video by the electronic equipment can be improved.
It can be seen that in this embodiment of the present application, when the electronic device detects that its operating state is a preset state type, it obtains image information of the unscored images in the image library, determines the corresponding scoring policy, and scores the unscored images according to that policy to obtain target score values. According to the target score values and the score values of the other, already scored images in the image library, a second number of target images is screened from a first number of images, the first number being greater than the second number; the target images form an image set, and a recall video is generated from the image set. The electronic device thus scores images while it is not being used by the user, before the recall video is created, and does not need to score during creation of the recall video. The tasks executed by the electronic device are therefore reasonably distributed, and the efficiency and speed of creating recall videos can be improved.
Referring to fig. 3, fig. 3 is a schematic flowchart of an image processing method according to an embodiment of the present application, applied to the electronic device shown in fig. 1A. The image processing method includes:
301. and when the running state of the electronic equipment is detected to be a preset state type, acquiring the image information of the unscored image in the image library of the electronic equipment.
302. And determining a characteristic parameter set corresponding to the image information.
303. And determining a grading rule set corresponding to the feature parameter set according to a preset mapping relation between the feature parameters and the grading rules, wherein the grading rules are in one-to-one correspondence with the feature parameters.
304. And scoring the plurality of characteristic parameters in the characteristic parameter set according to a plurality of scoring rules in the scoring rule set to obtain a plurality of scoring values, wherein each characteristic parameter corresponds to one scoring value.
305. And determining a target score value according to the plurality of score values and a plurality of weight values corresponding to the plurality of characteristic parameters.
306. And acquiring a target video generation instruction.
307. And screening a second number of target images from a first number of images in an image library of the electronic equipment according to the target video generation instruction, the target scoring value and scoring values of other scored images in the image library, wherein the first number is larger than the second number.
308. And forming an image set by the second quantity of target images, and generating a target video according to the image set.
The specific implementation process of steps 301-305 can refer to the corresponding description in the method shown in fig. 1B, and the specific implementation process of steps 306-308 can refer to the corresponding description in the method shown in fig. 2; details are not described herein again.
It can be seen that in this embodiment of the present application, when the electronic device detects that its operating state is a preset state type, it obtains image information of an unscored image in the image library and determines the feature parameter set corresponding to that image information. A scoring rule set corresponding to the feature parameter set is determined according to a preset mapping relationship between feature parameters and scoring rules, the scoring rules corresponding one to one with the feature parameters. The plurality of feature parameters in the set are scored according to the plurality of scoring rules in the scoring rule set to obtain a plurality of score values, each feature parameter corresponding to one score value, and a target score value is determined from the score values and the weight values corresponding to the feature parameters. Then, according to the target score value and the score values of other scored images in the image library, a second number of target images is screened from a first number of images in the library, the first number being greater than the second number; the target images form an image set, and a recall video is generated from the image set. The electronic device thus scores images while it is not being used by the user, before the recall video is created, and does not need to score during creation of the recall video; the tasks executed by the electronic device are reasonably distributed, and the efficiency and speed of creating recall videos can be improved.
The following is a device for implementing the image processing method, specifically as follows:
in accordance with the above, please refer to fig. 4, in which fig. 4 is an electronic device according to an embodiment of the present application, including: a processor and a memory; and one or more programs stored in the memory and configured to be executed by the processor, the programs including instructions for performing the steps of:
when the running state of the electronic equipment is detected to be a preset state type, acquiring image information of an unscored image in an image library of the electronic equipment;
determining a scoring strategy corresponding to the image information;
and scoring the unscored image according to the scoring strategy to obtain a target scoring value.
In one possible example, after scoring the unscored image according to the scoring policy to obtain a target score value, the program further comprises instructions for performing the following steps:
acquiring a target video generation instruction;
screening a second number of target images from a first number of images in an image library of the electronic equipment according to the target video generation instruction, the target scoring value and scoring values of other scored images in the image library, wherein the first number is larger than the second number;
and forming an image set by the second quantity of target images, and generating a target video according to the image set.
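The screening step above can be sketched as a top-N selection over the scored library. This is a minimal illustrative sketch, not the patent's implementation: the dictionary of per-image score values and the file names are hypothetical.

```python
def screen_target_images(scored_images: dict, second_number: int) -> list:
    """Screen the second number of target images from the first number of
    images (len(scored_images)) in the library, in descending score order."""
    assert second_number <= len(scored_images), "first number must exceed second number"
    # Sort image names by their score values, highest first.
    ranked = sorted(scored_images, key=scored_images.get, reverse=True)
    return ranked[:second_number]

# Hypothetical image library with already-computed score values.
library = {"a.jpg": 92.5, "b.jpg": 74.0, "c.jpg": 88.1, "d.jpg": 65.3}
image_set = screen_target_images(library, 2)
print(image_set)  # → ['a.jpg', 'c.jpg']
```

The resulting image set would then be handed to whatever routine assembles the target video; that assembly step is outside this sketch.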
In one possible example, in the aspect of determining the scoring policy corresponding to the image information, the program includes instructions for:
determining a characteristic parameter set corresponding to the image information;
in the aspect of scoring the unscored images according to the scoring policy to obtain a target score value, the program comprises instructions for:
and scoring the unscored images according to the feature parameter set.
In one possible example, the set of feature parameters comprises a plurality of feature parameters, and in the aspect of scoring the unscored image according to the set of feature parameters, the program comprises instructions for:
determining a scoring rule set corresponding to the feature parameter set according to a mapping relation between preset feature parameters and scoring rules, wherein the plurality of scoring rules are in one-to-one correspondence with the plurality of feature parameters;
scoring the plurality of characteristic parameters in the characteristic parameter set according to a plurality of scoring rules in the scoring rule set to obtain a plurality of score values, wherein each characteristic parameter corresponds to one score value;
and determining a target score value according to the plurality of score values and a plurality of weight values corresponding to the plurality of characteristic parameters.
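The three steps above (rule lookup, per-parameter scoring, weighted aggregation) can be sketched as follows. The concrete rules, input ranges, and weight values here are assumptions for illustration only; the patent does not specify them.

```python
# Hypothetical scoring rules, one per feature parameter (one-to-one mapping).
# Each rule maps a raw feature measurement to a score value in [0, 100].
SCORING_RULES = {
    "color":     lambda v: min(v * 100, 100),         # v: color richness in [0, 1]
    "exposure":  lambda v: 100 - abs(v - 0.5) * 200,  # v: mean luminance in [0, 1]
    "sharpness": lambda v: min(v, 100),               # v: edge-energy measure
}

# Assumed weight values corresponding to the feature parameters (sum to 1).
WEIGHTS = {"color": 0.3, "exposure": 0.3, "sharpness": 0.4}

def target_score(features: dict) -> float:
    """Score each feature parameter with its corresponding rule, then
    combine the score values using the per-parameter weight values."""
    score_values = {name: SCORING_RULES[name](value)
                    for name, value in features.items()}
    return sum(score_values[name] * WEIGHTS[name] for name in score_values)

print(round(target_score({"color": 0.8, "exposure": 0.5, "sharpness": 90}), 1))  # → 90.0
```

Any rule shapes and weights with the same structure would fit the described scheme; the weighted sum is the only part the text pins down.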
In one possible example, the image information includes a photographic subject;
if the shooting object of the unscored image is a person, the characteristic parameter set comprises at least one of the following characteristic parameters: color, exposure, sharpness, and expressiveness, wherein the expressiveness is facial feature information of the face image in each continuously shot (burst) picture;
if the shooting object of the unscored image is a scene, the characteristic parameter set comprises at least one of the following characteristic parameters: color, exposure, and sharpness.
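The subject-dependent choice of feature parameter set can be sketched as a simple dispatch. The parameter names are taken from the text; the function name and the string labels for the subjects are illustrative assumptions.

```python
def feature_parameter_set(shooting_subject: str) -> list:
    """Select the feature parameter set from the detected photographic subject.
    'expressiveness' (facial feature information across burst-shot pictures)
    applies only when the subject is a person."""
    if shooting_subject == "person":
        return ["color", "exposure", "sharpness", "expressiveness"]
    if shooting_subject == "scene":
        return ["color", "exposure", "sharpness"]
    raise ValueError(f"unknown shooting subject: {shooting_subject}")

print(feature_parameter_set("scene"))  # → ['color', 'exposure', 'sharpness']
```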
In one possible example, the preset state type includes a screen-off state, a charging state or a static state, where the static state refers to a state where the electronic device does not have an application running in the foreground.
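The preset-state check above amounts to testing three conditions and triggering background scoring when any holds. A minimal sketch, assuming a hypothetical device-state snapshot (the field names are illustrative, not part of the patent text):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class DeviceState:
    screen_off: bool
    charging: bool
    foreground_app: Optional[str]  # None when no application runs in the foreground

def is_preset_state(state: DeviceState) -> bool:
    """True in the screen-off, charging, or static state (no foreground app)."""
    static = state.foreground_app is None
    return state.screen_off or state.charging or static

# Charging while the gallery is open still qualifies; active use with
# no charging and the screen on does not.
print(is_preset_state(DeviceState(False, True, "gallery")))   # → True
print(is_preset_state(DeviceState(False, False, "gallery")))  # → False
```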
Referring to fig. 5A, fig. 5A is a schematic structural diagram of an image processing apparatus according to the present embodiment. The image processing apparatus is applied to an electronic device, and comprises an acquisition unit 501, a determination unit 502, and an execution unit 503, wherein,
the obtaining unit 501 is configured to obtain image information of an unscored image in an image library of the electronic device when it is detected that the operating state of the electronic device is a preset state type;
the determining unit 502 is configured to determine a scoring policy corresponding to the image information;
the executing unit 503 is configured to score the unscored image according to the scoring policy to obtain a target score value.
Alternatively, as shown in fig. 5B, fig. 5B is a modified structure of the image processing apparatus shown in fig. 5A, which may further include, compared with fig. 5A: a screening unit 504 and a creating unit 505, wherein,
the acquisition unit is also used for acquiring a target video generation instruction;
the screening unit 504 is configured to screen a second number of target images from a first number of images in an image library of the electronic device according to the target video generation instruction, the target score value, and score values of other scored images in the image library, where the first number is greater than the second number;
the creating unit 505 is configured to form an image set from the second number of target images, and generate a target video according to the image set.
Optionally, in terms of determining the scoring policy corresponding to the image information, the determining unit 502 is specifically configured to:
determining a characteristic parameter set corresponding to the image information;
in the aspect of scoring the unscored image according to the scoring policy to obtain the target score value, the execution unit is specifically configured to:
and scoring the unscored images according to the feature parameter set.
Optionally, the feature parameter set includes a plurality of feature parameters, and in the aspect of scoring the unscored image according to the feature parameter set, the executing unit 503 is specifically configured to:
determining a scoring rule set corresponding to the feature parameter set according to a mapping relation between preset feature parameters and scoring rules, wherein the plurality of scoring rules are in one-to-one correspondence with the plurality of feature parameters;
scoring the plurality of characteristic parameters in the characteristic parameter set according to a plurality of scoring rules in the scoring rule set to obtain a plurality of score values, wherein each characteristic parameter corresponds to one score value;
and determining a target score value according to the plurality of score values and a plurality of weight values corresponding to the plurality of characteristic parameters.
Optionally, the image information includes a photographic subject;
if the shooting object of the unscored image is a person, the characteristic parameter set comprises at least one of the following characteristic parameters: color, exposure, sharpness, and expressiveness, wherein the expressiveness is facial feature information of the face image in each continuously shot (burst) picture;
if the shooting object of the unscored image is a scene, the characteristic parameter set comprises at least one of the following characteristic parameters: color, exposure, and sharpness.
Optionally, the preset state type includes a screen-off state, a charging state, or a static state, where the static state refers to a state where the electronic device does not have an application running in the foreground.
It can be seen that, in the image processing apparatus described in the embodiment of the present application, when detecting that the operating state is the preset state type, the electronic device acquires image information of an unscored image in the image library, determines a scoring policy corresponding to the image information, scores the unscored image according to the scoring policy, and obtains a target scoring value.
It is to be understood that the functions of each program module of the image processing apparatus of this embodiment may be specifically implemented according to the method in the foregoing method embodiment, and the specific implementation process may refer to the relevant description of the foregoing method embodiment, which is not described herein again.
As shown in fig. 6, for convenience of description, only the portions related to the embodiments of the present application are shown; for specific technical details that are not disclosed, please refer to the method portion of the embodiments of the present application. The electronic device may be any terminal device, including a mobile phone, a tablet computer, a PDA (personal digital assistant), a POS (point of sale) terminal, a vehicle-mounted computer, and the like. The following takes a mobile phone as an example of the electronic device:
Fig. 6 is a block diagram illustrating a partial structure of an electronic device provided in an embodiment of the present application. As shown in fig. 6, the electronic device 610 may include control circuitry, which may include storage and processing circuitry 630. The storage and processing circuitry 630 may include memory, such as hard disk drive memory, non-volatile memory (e.g., flash memory or other electronically programmable read-only memory used to form a solid state drive, etc.), volatile memory (e.g., static or dynamic random access memory, etc.), and the like, and the embodiments of the present application are not limited thereto. The processing circuitry in the storage and processing circuitry 630 may be used to control the operation of the electronic device 610. The processing circuitry may be implemented based on one or more microprocessors, microcontrollers, digital signal processors, baseband processors, power management units, audio codec chips, application specific integrated circuits, display driver integrated circuits, and the like.
The storage and processing circuit 630 may be used to run software in the electronic device 610 such as an internet browsing application, a Voice Over Internet Protocol (VOIP) phone call application, an email application, a media playing application, operating system functions, and the like. Such software may be used to perform control operations such as, for example, camera-based image capture, ambient light measurement based on an ambient light sensor, proximity sensor measurement based on a proximity sensor, information display functionality based on status indicators such as status indicator lights of light emitting diodes, touch event detection based on a touch sensor, functionality associated with displaying information on multiple (e.g., layered) displays, operations associated with performing wireless communication functions, operations associated with collecting and generating audio signals, control operations associated with collecting and processing button press event data, and other functions in the electronic device 610, and the like, without limitation of embodiments of the present application.
The electronic device 610 may also include input-output circuitry 642. The input-output circuitry 642 may be used to enable the electronic device 610 to input and output data, i.e., to allow the electronic device 610 to receive data from external devices and to output data to external devices. The input-output circuitry 642 may further include a sensor 632. The sensor 632 may include an ambient light sensor, a proximity sensor based on light and capacitance, a touch sensor (e.g., an optical and/or capacitive touch sensor or an ultrasonic sensor, where the touch sensor may be part of a touch display screen or may be used independently as a touch sensor structure), an acceleration sensor, and other sensors.
Input-output circuitry 642 may also include one or more displays, such as display 614. Display 614 may include one or a combination of liquid crystal displays, organic light emitting diode displays, electronic ink displays, plasma displays, displays using other display technologies. Display 614 may include an array of touch sensors (i.e., display 614 may be a touch display screen). The touch sensor may be a capacitive touch sensor formed by a transparent touch sensor electrode (e.g., an Indium Tin Oxide (ITO) electrode) array, or may be a touch sensor formed using other touch technologies, such as acoustic wave touch, pressure sensitive touch, resistive touch, optical touch, and the like, and the embodiments of the present application are not limited thereto.
The electronic device 610 can also include an audio component 636. The audio component 636 can be used to provide audio input and output functionality for the electronic device 610. The audio component 636 in the electronic device 610 may include a speaker, a microphone, a buzzer, a tone generator, and other components for generating and detecting sound.
The communications circuitry 638 may be used to provide the electronic device 610 with the ability to communicate with external devices. The communication circuits 638 may include analog and digital input-output interface circuits, and wireless communication circuits based on radio frequency signals and/or optical signals. The wireless communication circuitry in communication circuitry 638 may include radio-frequency transceiver circuitry, power amplifier circuitry, low noise amplifiers, switches, filters, and antennas. For example, wireless communication circuitry in communication circuitry 638 may include circuitry to support Near Field Communication (NFC) by transmitting and receiving near field coupled electromagnetic signals. For example, the communication circuitry 638 may include a near field communication antenna and a near field communication transceiver. The communications circuit 638 may also include a cellular telephone transceiver and antenna, a wireless local area network transceiver circuit and antenna, and so forth.
The electronic device 610 may further include a battery, power management circuitry, and other input-output units 640. The input-output unit 640 may include buttons, joysticks, click wheels, scroll wheels, touch pads, keypads, keyboards, cameras, light emitting diodes and other status indicators, etc.
A user may input commands through the input-output circuitry 642 to control the operation of the electronic device 610, and may receive status information and other outputs from the electronic device 610 through the output data of the input-output circuitry 642.
In the foregoing embodiments shown in fig. 1B, fig. 2, or fig. 3, the method flows of the steps may be implemented based on the structure of the electronic device.
In the embodiments shown in fig. 4 and fig. 5A to fig. 5B, the functions of the units may be implemented based on the structure of the electronic device.
Embodiments of the present application also provide a computer storage medium, wherein the computer storage medium stores a computer program for electronic data exchange, and the computer program causes a computer to execute a part or all of the steps of any one of the image processing methods as described in the above method embodiments.
Embodiments of the present application also provide a computer program product comprising a non-transitory computer-readable storage medium storing a computer program operable to cause a computer to perform some or all of the steps of any one of the image processing methods as set forth in the above method embodiments.
It should be noted that, for simplicity of description, the above-mentioned method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present application is not limited by the order of acts described, as some steps may occur in other orders or concurrently depending on the application. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required in this application.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus may be implemented in other manners. For example, the above-described embodiments of the apparatus are merely illustrative, and for example, the above-described division of the units is only one type of division of logical functions, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of some interfaces, devices or units, and may be an electric or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit may be stored in a computer readable memory if it is implemented in the form of a software functional unit and sold or used as a stand-alone product. Based on such understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a memory and including several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned memory includes: a USB flash drive, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, an optical disk, and other media capable of storing program code.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by associated hardware instructed by a program, which may be stored in a computer-readable memory, and the memory may include: a flash memory disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, and the like.
The foregoing detailed description of the embodiments of the present application has been presented to illustrate the principles and implementations of the present application, and the above description of the embodiments is only provided to help understand the method and the core concept of the present application; meanwhile, for a person skilled in the art, according to the idea of the present application, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present application.

Claims (8)

1. An image processing method applied to an electronic device, the method comprising:
when the running state of the electronic equipment is detected to be a preset state type, acquiring image information of an unscored image in an image library of the electronic equipment; the preset state type comprises a screen-off state, a charging state or a static state, wherein the static state refers to a state that the electronic equipment does not have application running in the foreground; determining a scoring strategy corresponding to the image information; grading the unscored image according to the grading strategy to obtain a target grading value;
acquiring a target video generation instruction;
according to the target video generation instruction, the target scoring value and scoring values of other scored images in an image library of the electronic equipment, screening a second number of target images from a first number of images in the image library according to a descending order of the scoring values, wherein the first number is larger than the second number;
and forming an image set by the second quantity of target images, and generating a target video according to the image set.
2. The method of claim 1, wherein determining a scoring policy corresponding to the image information comprises:
determining a characteristic parameter set corresponding to the image information;
the scoring the unscored image according to the scoring strategy to obtain a target scoring value comprises the following steps:
and scoring the unscored images according to the feature parameter set.
3. The method of claim 2, wherein the set of feature parameters includes a plurality of feature parameters, and wherein scoring the unscored image according to the set of feature parameters comprises:
determining a grading rule set corresponding to the feature parameter set according to a mapping relation between preset feature parameters and grading rules, wherein the plurality of grading rules correspond to the plurality of feature parameters one to one;
scoring the plurality of characteristic parameters in the characteristic parameter set according to a plurality of scoring rules in the scoring rule set to obtain a plurality of score values, wherein each characteristic parameter corresponds to one score value;
and determining a target score value according to the plurality of score values and a plurality of weight values corresponding to the plurality of characteristic parameters.
4. The method according to claim 2 or 3, characterized in that the image information comprises a photographic subject;
if the shooting object of the unscored image is a person, the characteristic parameter set comprises at least one of the following characteristic parameters: color, exposure, sharpness, and expressiveness, wherein the expressiveness is facial feature information of the face image in each continuously shot (burst) picture;
if the shooting object of the unscored image is a scene, the characteristic parameter set comprises at least one of the following characteristic parameters: color, exposure, and sharpness.
5. An image processing apparatus applied to an electronic device, the image processing apparatus comprising:
the acquisition unit is used for acquiring image information of an unscored image in an image library of the electronic equipment when the running state of the electronic equipment is detected to be a preset state type; the preset state type comprises a screen-off state, a charging state or a static state, wherein the static state refers to a state that the electronic equipment does not have application running in the foreground; a determination unit configured to determine a scoring policy corresponding to the image information; the execution unit is used for grading the unscored image according to the grading strategy to obtain a target grading value;
the acquisition unit is also used for acquiring a target video generation instruction;
the screening unit is used for screening a second number of target images from a first number of images in the image library in descending order of the scoring values according to the target video generation instruction, the target scoring value and the scoring values of other scored images in the image library of the electronic equipment, wherein the first number is larger than the second number;
and the creating unit is used for forming an image set by the second quantity of target images and generating a target video according to the image set.
6. The apparatus according to claim 5, wherein, in said determining a scoring policy corresponding to the image information, the determining unit is specifically configured to:
determining a characteristic parameter set corresponding to the image information;
in the aspect of scoring the unscored image according to the scoring policy to obtain a score value, the execution unit is specifically configured to:
and scoring the unscored images according to the feature parameter set.
7. An electronic device comprising a processor, a memory, and one or more programs stored in the memory and configured to be executed by the processor, the programs comprising instructions for performing the steps in the method of any of claims 1-4.
8. A computer-readable storage medium, characterized in that a computer program for electronic data exchange is stored, wherein the computer program causes a computer to perform the method according to any one of claims 1-4.
CN201810751349.7A 2018-07-10 2018-07-10 Image processing method and related product Active CN108989555B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810751349.7A CN108989555B (en) 2018-07-10 2018-07-10 Image processing method and related product

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810751349.7A CN108989555B (en) 2018-07-10 2018-07-10 Image processing method and related product

Publications (2)

Publication Number Publication Date
CN108989555A CN108989555A (en) 2018-12-11
CN108989555B true CN108989555B (en) 2021-06-04

Family

ID=64537565

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810751349.7A Active CN108989555B (en) 2018-07-10 2018-07-10 Image processing method and related product

Country Status (1)

Country Link
CN (1) CN108989555B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110366043B (en) * 2019-08-20 2022-02-18 北京字节跳动网络技术有限公司 Video processing method and device, electronic equipment and readable medium
CN110825897A (en) * 2019-10-29 2020-02-21 维沃移动通信有限公司 Image screening method and device and mobile terminal

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104216672A (en) * 2014-08-20 2014-12-17 小米科技有限责任公司 Display control method and display control device
CN104484090A (en) * 2014-12-12 2015-04-01 深圳市财富之舟科技有限公司 Method for displaying pictures
CN105335473A (en) * 2015-09-30 2016-02-17 小米科技有限责任公司 Method and device for playing picture
CN107273510A (en) * 2017-06-20 2017-10-20 广东欧珀移动通信有限公司 Photo recommends method and Related product
CN107332981A (en) * 2017-06-14 2017-11-07 广东欧珀移动通信有限公司 Image processing method and related product

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104216672A (en) * 2014-08-20 2014-12-17 小米科技有限责任公司 Display control method and display control device
CN104484090A (en) * 2014-12-12 2015-04-01 深圳市财富之舟科技有限公司 Method for displaying pictures
CN105335473A (en) * 2015-09-30 2016-02-17 小米科技有限责任公司 Method and device for playing picture
CN107332981A (en) * 2017-06-14 2017-11-07 广东欧珀移动通信有限公司 Image processing method and related product
CN107273510A (en) * 2017-06-20 2017-10-20 广东欧珀移动通信有限公司 Photo recommends method and Related product

Also Published As

Publication number Publication date
CN108989555A (en) 2018-12-11

Similar Documents

Publication Publication Date Title
CN110543289B (en) Method for controlling volume and electronic equipment
CN110569095B (en) Method and electronic equipment for displaying page elements
CN109151338B (en) Image processing method and related product
CN108279950A (en) A kind of application program launching method and mobile terminal
CN105809647A (en) Automatic defogging photographing method, device and equipment
CN108495029A (en) A kind of photographic method and mobile terminal
CN111338725A (en) Interface layout method and related product
CN108833779B (en) Shooting control method and related product
CN108418916A (en) Image capturing method, mobile terminal based on double-sided screen and readable storage medium storing program for executing
CN111246102A (en) Shooting method, shooting device, electronic equipment and storage medium
CN110825897A (en) Image screening method and device and mobile terminal
CN111405180A (en) Photographing method, photographing device, storage medium and mobile terminal
CN108924439B (en) Image processing method and related product
CN108989555B (en) Image processing method and related product
CN109842723A (en) Terminal and its screen brightness control method and computer readable storage medium
CN108718389A (en) A kind of screening-mode selection method and mobile terminal
CN108650466A (en) The method and electronic equipment of photo tolerance are promoted when a kind of strong light or reversible-light shooting portrait
CN110198421B (en) Video processing method and related product
CN108616687A (en) A kind of photographic method, device and mobile terminal
CN111353946B (en) Image restoration method, device, equipment and storage medium
CN110955788A (en) Information display method and electronic equipment
CN111343321B (en) Backlight brightness adjusting method and related product
CN110266942B (en) Picture synthesis method and related product
CN110825288A (en) Image screening processing method and electronic equipment
CN111159551A (en) Display method and device of user-generated content and computer equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant