CN113962840A - Image processing method, image processing device, electronic equipment and storage medium


Info

Publication number
CN113962840A
CN113962840A (application CN202111222060.4A)
Authority
CN
China
Prior art keywords
image
parameter
user
target
preference value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111222060.4A
Other languages
Chinese (zh)
Inventor
屈松 (Qu Song)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN202111222060.4A priority Critical patent/CN113962840A/en
Publication of CN113962840A publication Critical patent/CN113962840A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00General purpose image data processing
    • G06T1/0007Image acquisition

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses an image processing method, an image processing apparatus, an electronic device, and a storage medium, belonging to the field of electronic technologies. The scheme includes: acquiring a first image; identifying a target object in the first image, and adjusting an image parameter of the region where the target object is located from a first parameter to a second parameter to obtain a second image, where the user preference value corresponding to the second parameter is higher than the user preference value corresponding to the first parameter.

Description

Image processing method, image processing device, electronic equipment and storage medium
Technical Field
The present application belongs to the field of electronic technologies, and in particular, to an image processing method and apparatus, an electronic device, and a storage medium.
Background
With the development of mobile communication technology, capturing images with the camera of an electronic device has become an indispensable part of daily life. In general, to enrich the presentation of an image, people perform color editing during post-processing, thereby presenting different visual effects.
In the prior art, a user can make an image show different color effects by editing the parameter values of various image parameters; however, this post-processing usually takes a long time.
Disclosure of Invention
An embodiment of the present application provides an image processing method, an image processing apparatus, an electronic device, and a storage medium, which can address the prior-art problem that editing an image is time-consuming.
In a first aspect, an embodiment of the present application provides an image processing method, including: acquiring a first image; identifying a target object in the first image, and adjusting the image parameter of the region where the target object is located from the first parameter to a second parameter to obtain a second image; and the user preference value corresponding to the second parameter is higher than the user preference value corresponding to the first parameter.
In a second aspect, an embodiment of the present application provides an image processing apparatus, including: the device comprises an acquisition module and a processing module. The acquisition module is used for acquiring a first image. The processing module is used for identifying a target object in the first image and adjusting the image parameter of the area where the target object is located from the first parameter to a second parameter to obtain a second image; and the user preference value corresponding to the second parameter is higher than the user preference value corresponding to the first parameter.
In a third aspect, an embodiment of the present application provides an electronic device, which includes a processor, a memory, and a program or instructions stored on the memory and executable on the processor, and when executed by the processor, the program or instructions implement the steps of the method according to the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium, on which a program or instructions are stored, which when executed by a processor implement the steps of the method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the method according to the first aspect.
In an embodiment of the present application, a first image may be acquired; identifying a target object in the first image, and adjusting the image parameter of the region where the target object is located from the first parameter to a second parameter to obtain a second image; and the user preference value corresponding to the second parameter is higher than the user preference value corresponding to the first parameter. According to the scheme, the user preference value corresponding to the second parameter is higher than the user preference value corresponding to the first parameter, so that the electronic equipment can determine the image parameter of the area where the target object is located according to the user preference value during image processing, the image can have a satisfactory effect of a user before the user manually edits the image, and the problem that the time consumed for editing the image is long can be avoided.
Drawings
Fig. 1 is a schematic diagram of an image processing method provided in an embodiment of the present application;
FIG. 2 is a schematic interface operation diagram of an image processing method according to an embodiment of the present disclosure;
FIG. 3 is a schematic diagram of a distribution of regions of a first image provided by an embodiment of the present application;
fig. 4 is a second schematic diagram of an image processing method according to an embodiment of the present application;
fig. 5 is a second schematic interface operation diagram of the image processing method according to the embodiment of the present application;
fig. 6 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present application;
FIG. 7 is a hardware diagram of an electronic device provided by an embodiment of the present application;
fig. 8 is a second hardware schematic diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments that can be derived by one of ordinary skill in the art from the embodiments given herein are intended to be within the scope of the present disclosure.
The terms "first", "second", and the like in the description and claims of the present application are used to distinguish between similar elements and do not necessarily describe a particular sequence or chronological order. It should be appreciated that the data so used may be interchanged under appropriate circumstances, so that embodiments of the application may be practiced in sequences other than those illustrated or described herein. The terms "first", "second", and the like are generally used in a generic sense and do not limit the number of objects; for example, a first object may be one or more than one. In addition, "and/or" in the specification and claims means at least one of the connected objects, and the character "/" generally indicates that the related objects before and after it are in an "or" relationship.
The image processing method provided by the embodiment of the present application is described in detail below with reference to the accompanying drawings through specific embodiments and application scenarios thereof.
As shown in fig. 1, an embodiment of the present application provides an image processing method, which may be applied to an electronic device, and the method may include steps 101-102.
Step 101, acquiring a first image.
Optionally, the first image may be an image captured instantly by the user with the electronic device, an image in the electronic device's gallery, or an image downloaded from a network by the user through the electronic device; this may be determined according to actual use requirements and is not limited in this embodiment of the application.
The following describes in detail the image processing method provided by the embodiment of the present application, taking the first image as an example of an image that is instantly taken by a user through an electronic device.
If the user wants to capture an image, the electronic device may be triggered to display a shooting preview interface of the camera, which may include a setting control 21 and a shooting control 22, as shown in (a) of fig. 2. The user can perform a click input on the setting control 21, and the electronic device can respond to the click input and display a setting interface, as shown in (b) of fig. 2. The setting interface can include a color style recommendation control 23. When the color style recommendation control 23 is in the on state, the electronic device can automatically perform color style recommendation processing on a captured image; when the color style recommendation control 23 is in the off state, the electronic device captures an image without color style recommendation, that is, a normal image. After the user triggers the color style recommendation control 23 into the on state, the user may aim the camera's shooting preview interface at the subject; the electronic device may then receive a click input from the user on the shooting control 22 and capture the first image in response to that click input.
And 102, identifying a target object in the first image, and adjusting the image parameter of the area where the target object is located from the first parameter to a second parameter to obtain a second image.
The user preference value corresponding to the second parameter is higher than the user preference value corresponding to the first parameter.
Since the first image is an image captured with the color style recommendation control 23 in the on state, the electronic device can perform the color style recommendation process on the first image. Specifically, the electronic device may first identify a target object in the first image, that is, a shooting object, and then adjust an image parameter of an area where the target object is located from the first parameter to the second parameter, so as to obtain the second image. The second image is an image that can be directly seen after the user clicks the shooting control 22, that is, the electronic device does not display the first image after shooting the first image, but directly displays the second image to the user after performing color style recommendation processing on the first image to obtain the second image.
Illustratively, as shown in fig. 3, the target object in the first image 30 is a stereo, and the first image includes a region 31 where the stereo is located and other regions 32. The electronic device may recognize that the target object in the first image 30 is the stereo, and adjust the image parameter of the region 31 where the stereo is located from the first parameter to the second parameter, thereby obtaining a second image.
Alternatively, the image parameters may include parameters for adjusting the color style of the image, such as hue, brightness, contrast, exposure, and the like.
Alternatively, the target object may include a specific subject such as grass, shrub, sky, building, sunset, neon, desk, carpet, and stereo.
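The adjustment in step 102 can be sketched in code. The following is a minimal hypothetical illustration (the patent gives no implementation; the function name, the grayscale-list image representation, and the boolean mask are all assumptions): the brightness of pixels inside the target-object region is shifted from the first parameter toward the second parameter, while pixels outside the region are left unchanged.

```python
# Hypothetical sketch of step 102: adjust an image parameter (here, brightness)
# only inside the region where the target object was detected.
# `image` is a grayscale image as a 2D list; `mask` marks the target-object region.

def adjust_region_brightness(image, mask, first_param, second_param):
    """Shift masked pixels by (second_param - first_param), clamped to [0, 255]."""
    delta = second_param - first_param
    return [
        [min(255, max(0, px + delta)) if mask[y][x] else px
         for x, px in enumerate(row)]
        for y, row in enumerate(image)
    ]

image = [[100, 100], [100, 100]]
mask = [[True, False], [False, False]]  # only the top-left pixel is in the region
second_image = adjust_region_brightness(image, mask, first_param=20, second_param=25)
# only the masked pixel is shifted by +5
```

In a real implementation the mask would come from the object-recognition step, and the adjustment could cover hue, contrast, exposure, and so on rather than brightness alone.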
Optionally, the electronic device may determine the second parameter before performing step 102. Specifically, the electronic device may obtain a target history image, where the target history image includes a first object, and a category of the first object is the same as a category of the target object; then, determining a user preference value corresponding to the image parameter of the area where the first object is located in the target historical image; and finally, determining the image parameter of the area where the first object with the highest user preference value in the target historical image is located as the second parameter.
It should be noted that one user preference value may correspond to one image parameter, and may also correspond to multiple image parameters. In the case that one user preference value corresponds to one image parameter, the electronic device may determine, as the second parameter, an image parameter of a region where an object having a highest user preference value is located in each of the image parameters; in the case that one user preference value corresponds to a plurality of image parameters, the electronic device may determine the user preference values corresponding to all the parameters of the region where the first object is located in each target history image, and then select all the parameters of the region where the first object of the target history image having the highest user preference value is located from all the target history images as the second parameters.
Based on the scheme, the image parameter of the area where the first object with the highest user preference value in the target historical image is located can be determined as the second parameter, so that the image parameter of the area where the target object in the first image is located can be adjusted to be the image parameter with the highest user preference value, and the second image can have the effect which is more satisfactory to users.
Alternatively, the user preference value may be determined by a user's gaze dwell time in an area of the target history image where the first object is located. Specifically, the electronic device may determine a first user gaze dwell time of the user in an area where the first object is located in the target history image; and then, determining a user preference value corresponding to the image parameter of the area where each first object is located according to the first user sight line dwell time.
Based on the above scheme, since the user preference value can be determined according to the user sight line dwell time, a basis can be provided for determining the second parameter according to the user preference value.
Illustratively, the image parameter is brightness, and the target object is grass. The electronic device may identify the target object in the first image as grass and then obtain target history images containing grass from the gallery, for example target history image 1, target history image 2, and target history image 3. The first user gaze dwell time corresponding to the region where the grass is located is 5 minutes in target history image 1, 8 minutes in target history image 2, and 10 minutes in target history image 3. If the brightness of the grass region is 20 in target history image 1, 20 in target history image 2, and 25 in target history image 3, then the grass regions of target history images 1 and 2 share the same brightness, so their first user gaze dwell times can be accumulated to 13 minutes. Accordingly, the user preference value corresponding to brightness 20 is 13, and the user preference value corresponding to brightness 25 is 10. Since the user preference value corresponding to brightness 20 is higher than the user preference value corresponding to brightness 25, the electronic device may determine brightness 20 as the second parameter.
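The selection described above reduces to grouping gaze dwell times by parameter value and taking the maximum. A minimal sketch, assuming the history is available as (brightness, dwell-minutes) pairs for the first-object region (the function and variable names are illustrative, not from the patent):

```python
# Sketch of determining the second parameter: accumulate gaze dwell time per
# brightness value, then pick the brightness with the highest total
# (i.e., the highest user preference value).

def pick_second_parameter(history):
    """history: list of (brightness, dwell_minutes) pairs for the first-object region."""
    preference = {}
    for brightness, dwell in history:
        preference[brightness] = preference.get(brightness, 0) + dwell
    # the parameter value with the highest accumulated dwell time wins
    return max(preference, key=preference.get), preference

# The grass example: target history images 1-3.
second_param, prefs = pick_second_parameter([(20, 5), (20, 8), (25, 10)])
# prefs accumulates to {20: 13, 25: 10}, so brightness 20 is the second parameter
```

The same grouping generalizes to the multi-parameter case described above by keying the dictionary on a tuple of all parameter values instead of brightness alone.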
In the embodiment of the application, the user preference value corresponding to the second parameter is higher than the user preference value corresponding to the first parameter, so that the electronic device can determine the image parameter of the region where the target object is located according to the user preference value during image processing, and thus, before the user manually edits the image, the image can have a satisfactory effect, and the problem that the time for editing the image is long can be avoided.
Optionally, with reference to fig. 1, as shown in fig. 4, after the step 101 is performed, the method may further include a step 103, and the step 102 may be specifically implemented by a step 102a described below.
Step 103, in the case where the first image includes a plurality of categories of objects, determines the object of the category with the highest priority as the target object.
When the first image includes a plurality of categories of objects, the electronic device may determine the priority of each category of objects first, and then determine the object of the category with the highest priority as the target object.
Optionally, the electronic device may determine the priority of the object of one category according to the user gaze dwell time corresponding to the object of the one category. Specifically, the electronic device may obtain a second user gaze dwell time corresponding to each category of object in the first image; and determining the priority corresponding to the object of each category according to the stay time of the second user sight line.
Illustratively, the first image includes an object 1 and an object 2. If the user's sight-line dwell time corresponding to the target history image including the object 1 is 5 minutes and the user's sight-line dwell time corresponding to the target history image including the object 2 is 7 minutes, the priority of the object 2 is higher than the priority of the object 1, and therefore, the object 2 can be determined as the target object.
Based on the above scheme, since the priority of each category of object can be determined according to the user gaze dwell time, when the first image includes objects of a plurality of categories, the object of the category with the highest priority can be determined as the target object.
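Step 103 can likewise be sketched as a ranking by dwell time. A hypothetical illustration (the mapping's shape and names are assumptions, not the patent's API):

```python
# Sketch of step 103: when the first image contains several object categories,
# rank them by their accumulated second-user gaze dwell time and take the
# top-ranked category as the target object.

def pick_target_object(category_dwell):
    """category_dwell maps a category name to its gaze dwell time (minutes)."""
    return max(category_dwell, key=category_dwell.get)

# The example from the text: object 2's history images were viewed longer.
target = pick_target_object({"object 1": 5, "object 2": 7})
# object 2 has the longer dwell time, so it becomes the target object
```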
Step 102a, adjusting the image parameters of all the areas of the first image to the second parameters.
After determining the target object in the first image, the electronic device may adjust the image parameters of all regions of the first image to the second parameter. That is, the image parameters of all image areas of the second image are the same.
In the embodiment of the application, since the object of the category with the highest priority can be determined as the target object, and the image parameters of all the areas of the first image are set as the second parameters, not only can the image effect of the area where the target object is located be ensured to meet the preference of the user, but also the overall harmony of the second image can be stronger.
Optionally, before performing step 101, the method may further include steps 104 and 105.
And 104, receiving a first input of a user to the area where the target object is located by the electronic equipment under the condition that the shooting preview interface comprises the target object.
In the case where the target object is included in the shooting preview interface, the user can perform color editing on the region where the target object is located. Specifically, the user may perform a first input on the region where the target object is located, and accordingly, the electronic device may receive the first input.
Alternatively, the first input may be an input for adjusting the size of the image parameter. For example, the first input may be a click input to the adjustment control or a swipe input to the interface.
For example, as shown in fig. 5, the first input is a swipe input on the interface, and interface area 1 includes the target object. The user can perform a swipe input in any of four directions, i.e., up, down, left, and right, starting from the center of interface area 1.
Step 105, the electronic device responds to the first input, and adjusts the image parameters of the area where the target object is located.
Alternatively, the user may associate each swipe direction with a corresponding image parameter. For example, the user may trigger the electronic device to associate swipe inputs in the "up" and "down" directions with hue, and swipe inputs in the "left" and "right" directions with brightness.
Illustratively, the image parameters include hue and brightness. With continued reference to fig. 5, when the user performs a swipe input to "up" starting from the center of the interface area 1, the electronic device may increase the color tone of the area where the target object is located, and when the user performs a swipe input to "down" starting from the center of the interface area 1, the electronic device may decrease the color tone of the area where the target object is located; when the user performs a swipe input to the "left" starting from the center of the interface area 1, the electronic device may decrease the brightness of the area where the target object is located, and when the user performs a swipe input to the "right" starting from the center of the interface area 1, the electronic device may increase the brightness of the area where the target object is located.
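The direction-to-parameter mapping just described can be sketched as a small dispatch table. This is a hypothetical illustration (function name, dictionary shape, and step size are assumptions): up/down adjusts hue, left/right adjusts brightness, exactly as in the example above.

```python
# Sketch of steps 104-105: map a swipe direction onto an image-parameter
# adjustment (up/down -> hue, left/right -> brightness).

def apply_swipe(params, direction, step=1):
    """params: dict with 'hue' and 'brightness'; returns an adjusted copy."""
    mapping = {
        "up":    ("hue", +step),
        "down":  ("hue", -step),
        "right": ("brightness", +step),
        "left":  ("brightness", -step),
    }
    name, delta = mapping[direction]
    adjusted = dict(params)  # leave the caller's dict untouched
    adjusted[name] += delta
    return adjusted

p = apply_swipe({"hue": 10, "brightness": 20}, "up")
# hue increases by one step; brightness is untouched
```

In practice the mapping itself would be user-configurable, as the text notes, rather than hard-coded.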
In the embodiment of the application, before the first image is acquired, a user can perform first input on the area where the target object is located in the shooting preview interface, so that the electronic device is triggered to adjust the image parameters of the area where the target object is located, that is, the user can freely adjust the image parameters in the shot image, and therefore, the final image effect can better accord with the preference of the user, so that the image parameters of the target historical image in the electronic device have differences, and a basis is provided for the electronic device to automatically add the image effect.
Optionally, after obtaining the second image, i.e., after performing step 102, the method may further include steps 106 and 107.
Step 106, acquiring a first time during which the user's gaze stays in the area where the target object is located, in the case where the second image is displayed.
After obtaining the second image, the user may trigger the electronic device to display it. While the electronic device displays the second image, it may acquire a first time during which the user's gaze stays in the area where the target object is located.
And step 107, updating the user preference value corresponding to the second parameter according to the first time.
Because the image parameter of the area where the target object is located in the second image is the second parameter, after the first time is obtained, the electronic device may accumulate the first time to the user sight dwell time corresponding to the second parameter, and update the user preference value corresponding to the second parameter according to the accumulated user sight dwell time.
Illustratively, the second parameter is the brightness 25, and the first time is 5 s. If the user's gaze dwell time corresponding to the image area with the brightness of 10 in the electronic device is 35s, and the user's gaze dwell time corresponding to the image area with the brightness of 25 is 22s, the electronic device may add the first time of 5s to the user's gaze dwell time 22s corresponding to the image area with the brightness of 25, so that the user's gaze dwell time corresponding to the image area with the brightness of 25 is 27s, and then, the electronic device may update the user preference value corresponding to the brightness of 25 from 22 to 27.
Optionally, to improve accuracy, the electronic device may automatically filter image areas where the user gaze dwell time is less than a first threshold. That is, the user gaze dwell time corresponding to the image area of the second parameter may be the sum of the user gaze dwell times of the plurality of image areas, and the user gaze dwell time of each image area is greater than the first threshold.
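Steps 106 and 107, including the threshold filtering just mentioned, can be sketched as follows. A hypothetical illustration (names and the threshold value are assumptions): a new dwell time is accumulated onto the second parameter's total only if it exceeds the first threshold.

```python
# Sketch of steps 106-107: accumulate a newly observed gaze dwell time onto
# the second parameter's running total, filtering out dwell times below a
# first threshold, as the text suggests for accuracy.

def update_preference(dwell_by_param, param, new_dwell, threshold=1.0):
    """Add new_dwell (seconds) to param's total unless it falls below threshold."""
    if new_dwell < threshold:
        return dwell_by_param  # too short a glance; filtered out
    updated = dict(dwell_by_param)
    updated[param] = updated.get(param, 0) + new_dwell
    return updated

# The example from the text: brightness 25 goes from 22 s to 27 s.
prefs = update_preference({10: 35, 25: 22}, param=25, new_dwell=5)
```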
In the embodiment of the application, the first time can be accumulated to the user sight line dwell time corresponding to the image area of the second parameter, so that the user sight line dwell time corresponding to the image area of each parameter value can be determined, and the second parameter can be determined according to the user sight line dwell time corresponding to each parameter value when the first image is processed.
It should be noted that, in the image processing method provided in the embodiment of the present application, the execution subject may be an image processing apparatus, or a control module in the image processing apparatus for executing the image processing method. The image processing apparatus provided in the embodiment of the present application is described with an example in which an image processing apparatus executes an image processing method.
As shown in fig. 6, an embodiment of the present application further provides an image processing apparatus 600, which may include: an acquisition module 601 and a processing module 602. An obtaining module 601, configured to obtain a first image; the processing module 602 may be configured to identify a target object in the first image, and adjust an image parameter of a region where the target object is located from a first parameter to a second parameter, so as to obtain a second image; the user preference value corresponding to the second parameter is higher than the user preference value corresponding to the first parameter.
Optionally, the obtaining module 601 may be further configured to obtain a target history image, where the target history image includes a first object, and a category of the first object is the same as a category of the target object; the processing module 602 may be further configured to determine a user preference value corresponding to an image parameter of an area where the first object is located in the target history image; and determining the image parameter of the area where the first object with the highest user preference value in the target historical image is located as the second parameter.
Optionally, the processing module 602 may be specifically configured to determine a first user gaze dwell time of an area in which the first object is located in the target history image; and determining a user preference value corresponding to the image parameter of the area where each first object is located according to the first user sight line dwell time.
Optionally, the processing module 602 may be further configured to determine, when the first image includes a plurality of categories of objects, an object of a category with a highest priority as the target object; and adjusting the image parameters of all the areas of the first image into the second parameters.
Optionally, the obtaining module 601 may be further configured to obtain a second user gaze dwell time corresponding to each category of object in the first image; the processing module 602 may further be configured to determine a priority corresponding to each category of objects according to the second user gaze dwell time.
In the embodiment of the application, the user preference value corresponding to the second parameter is higher than the user preference value corresponding to the first parameter, so that the electronic device can determine the image parameter of the region where the target object is located according to the user preference value during image processing, and thus, before the user manually edits the image, the image can have a satisfactory effect, and the problem that the time for editing the image is long can be avoided.
The image processing apparatus in the embodiment of the present application may be an apparatus, or may be a component, an integrated circuit, or a chip in a terminal. The device can be mobile electronic equipment or non-mobile electronic equipment. By way of example, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palm top computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a Personal Digital Assistant (PDA), and the like, and the non-mobile electronic device may be a server, a Network Attached Storage (NAS), a Personal Computer (PC), a Television (TV), a teller machine or a self-service machine, and the like, and the embodiments of the present application are not particularly limited.
The image processing apparatus in the embodiment of the present application may be an apparatus having an operating system. The operating system may be an Android (Android) operating system, an ios operating system, or other possible operating systems, and embodiments of the present application are not limited specifically.
The image processing apparatus provided in the embodiment of the present application can implement each process implemented by the method embodiments of fig. 1 to fig. 5, and is not described herein again to avoid repetition.
Optionally, as shown in fig. 7, an electronic device 700 is further provided in an embodiment of the present application, and includes a processor 701, a memory 702, and a program or an instruction stored in the memory 702 and executable on the processor 701, where the program or the instruction is executed by the processor 701 to implement each process of the image processing method embodiment, and can achieve the same technical effect, and no further description is provided here to avoid repetition.
It should be noted that the electronic device in the embodiment of the present application includes the mobile electronic device and the non-mobile electronic device described above.
Fig. 8 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 1000 includes, but is not limited to: a radio frequency unit 1001, a network module 1002, an audio output unit 1003, an input unit 1004, a sensor 1005, a display unit 1006, a user input unit 1007, an interface unit 1008, a memory 1009, and a processor 1010.
Those skilled in the art will appreciate that the electronic device 1000 may further include a power source (e.g., a battery) for supplying power to the various components, and the power source may be logically connected to the processor 1010 through a power management system, so that functions such as managing charging, discharging, and power consumption are implemented through the power management system. The electronic device structure shown in fig. 8 does not constitute a limitation on the electronic device; the electronic device may include more or fewer components than shown, combine some components, or use a different arrangement of components, which is not described here again.
The processor 1010 may be configured to acquire a first image, identify a target object in the first image, and adjust an image parameter of a region where the target object is located from a first parameter to a second parameter to obtain a second image; the user preference value corresponding to the second parameter is higher than the user preference value corresponding to the first parameter.
In the embodiment of the application, because the user preference value corresponding to the second parameter is higher than the user preference value corresponding to the first parameter, the electronic device can determine the image parameter of the region where the target object is located according to the user preference value during image processing. In this way, the image can already have an effect satisfactory to the user before the user manually edits it, which avoids lengthy manual editing.
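This adjustment step can be illustrated with a minimal sketch. Assuming, purely for illustration, that the region of the target object is given as a bounding box and that the adjusted image parameter is mean brightness (the embodiment does not fix a concrete parameter or region representation):

```python
import numpy as np

def adjust_region_parameter(first_image, region, second_parameter):
    # Adjust the image parameter (here: mean brightness, an assumed choice)
    # of the region where the target object is located from its current
    # value (the first parameter) to `second_parameter`, leaving the rest
    # of the first image unchanged, and return the second image.
    second_image = first_image.astype(np.float64)  # astype returns a copy
    x1, y1, x2, y2 = region
    patch = second_image[y1:y2, x1:x2]             # view into second_image
    first_parameter = patch.mean()                 # current parameter of the region
    if first_parameter > 0:
        patch *= second_parameter / first_parameter  # rescale region brightness in place
    return np.clip(second_image, 0, 255).astype(np.uint8)

first_image = np.full((8, 8), 100, dtype=np.uint8)
second_image = adjust_region_parameter(first_image, (2, 2, 6, 6), 150.0)
```

In this sketch only the pixels inside the bounding box are rescaled; pixels outside the region keep their original values, matching the requirement that only the area where the target object is located is adjusted.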
Optionally, the processor 1010 may be further configured to obtain a target history image, where the target history image includes a first object, and a category of the first object is the same as a category of the target object; determining a user preference value corresponding to an image parameter of an area where a first object is located in the target historical image; and determining the image parameter of the area where the first object with the highest user preference value in the target historical image is located as the second parameter.
In the embodiment of the application, since the image parameter of the region where the first object with the highest user preference value is located in the target history image can be determined as the second parameter, the image parameter of the region where the target object is located in the first image can be adjusted to the image parameter with the highest user preference value, so that the second image has a more satisfactory effect for the user.
Optionally, the processor 1010 may be specifically configured to determine a first user gaze dwell time of the user in an area where the first object is located in the target history image; and determining a user preference value corresponding to the image parameter of the area where each first object is located according to the first user sight line dwell time.
In the embodiment of the application, since the user preference value can be determined according to the user's gaze dwell time, a basis is provided for determining the second parameter according to the user preference value.
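The selection of the second parameter from the target history images can be sketched as follows. Taking the normalized first user gaze dwell time itself as the user preference value is an assumed mapping; the embodiment only requires that the preference value follow the dwell time:

```python
def second_parameter_from_history(history):
    # `history` holds, for each target history image, the image parameter of
    # the area where the first object is located and the first user gaze
    # dwell time (in seconds) measured on that area.
    total_dwell = sum(dwell for _, dwell in history)
    # User preference value per image parameter: normalized dwell time
    # (an assumed mapping consistent with the embodiment).
    preferences = [(param, dwell / total_dwell) for param, dwell in history]
    # The image parameter of the area with the highest user preference
    # value becomes the second parameter.
    best_param, _ = max(preferences, key=lambda pair: pair[1])
    return best_param

history = [(0.8, 1.2), (1.3, 4.5), (1.0, 2.1)]  # (image parameter, dwell time in s)
second_parameter = second_parameter_from_history(history)
```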
Optionally, the processor 1010 may be further configured to determine, as the target object, an object of a category with a highest priority if a plurality of categories of objects are included in the first image; and adjusting the image parameters of all the areas of the first image into the second parameters.
In the embodiment of the application, since the object of the category with the highest priority can be determined as the target object, and the image parameters of all the areas of the first image are set to the second parameter, not only can the image effect of the area where the target object is located be ensured to match the user's preference, but the second image is also more visually harmonious overall.
Optionally, the processor 1010 may be further configured to obtain a second user gaze dwell time corresponding to each category of object in the first image; and determining the priority corresponding to the object of each category according to the second user sight line dwell time.
In the embodiment of the present application, since the priority of each category of object can be determined according to the user gaze dwell time, in the case where objects of a plurality of categories are included in the first image, the object of the category with the highest priority can be determined as the target object.
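A sketch of this category prioritization follows; equating priority directly with the second user gaze dwell time is an assumed mapping consistent with, but not mandated by, the embodiment:

```python
def determine_target_category(dwell_by_category):
    # `dwell_by_category` maps each object category found in the first image
    # to the second user gaze dwell time (in seconds) for that category.
    # Priority is taken to increase with dwell time (an assumption), so the
    # category with the longest dwell time has the highest priority.
    ranked = sorted(dwell_by_category.items(), key=lambda kv: kv[1], reverse=True)
    return ranked[0][0]

dwell = {"person": 3.4, "sky": 0.9, "building": 1.6}
target_category = determine_target_category(dwell)  # "person" has the longest dwell
```

Objects of the returned category would then be treated as the target object, and the second parameter derived for that category would be applied to all areas of the first image as described above.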
It should be understood that in the embodiment of the present application, the input Unit 1004 may include a Graphics Processing Unit (GPU) 10041 and a microphone 10042, and the Graphics Processing Unit 10041 processes image data of still pictures or videos obtained by an image capturing device (such as a camera) in a video capturing mode or an image capturing mode. The display unit 1006 may include a display panel 10061, and the display panel 10061 may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like. The user input unit 1007 includes a touch panel 10071 and other input devices 10072. The touch panel 10071 is also referred to as a touch screen. The touch panel 10071 may include two parts, a touch detection device and a touch controller. Other input devices 10072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein. The memory 1009 may be used to store software programs as well as various data, including but not limited to application programs and operating systems. Processor 1010 may integrate an application processor that handles primarily operating systems, user interfaces, applications, etc. and a modem processor that handles primarily wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 1010.
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or the instruction is executed by a processor, the program or the instruction implements each process of the embodiment of the image processing method, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and so on.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to execute a program or an instruction to implement each process of the embodiment of the image processing method, and can achieve the same technical effect, and the details are not repeated here to avoid repetition.
It should be understood that the chip mentioned in the embodiments of the present application may also be referred to as a system-level chip, a chip system, or a system-on-chip, etc.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises that element. Further, it should be noted that the scope of the methods and apparatus of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed, but may include performing the functions in a substantially simultaneous manner or in a reverse order depending on the functions involved; for example, the described methods may be performed in an order different from that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a computer software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (12)

1. An image processing method, comprising:
acquiring a first image;
identifying a target object in the first image, and adjusting the image parameter of the area where the target object is located from a first parameter to a second parameter to obtain a second image;
the user preference value corresponding to the second parameter is higher than the user preference value corresponding to the first parameter.
2. The image processing method of claim 1, wherein after the acquiring the first image, the method further comprises:
acquiring a target historical image, wherein the target historical image comprises a first object, and the category of the first object is the same as that of the target object;
determining a user preference value corresponding to an image parameter of an area where a first object is located in the target historical image;
and determining the image parameter of the area where the first object with the highest user preference value in the target historical image is located as the second parameter.
3. The image processing method according to claim 2, wherein the determining of the user preference value corresponding to the image parameter of the region in which the first object is located in the target history image comprises:
determining a first user sight line dwell time of a user in an area where a first object is located in the target historical image;
and determining a user preference value corresponding to the image parameter of the area where each first object is located according to the first user sight line dwell time.
4. The image processing method according to any one of claims 1 to 3, wherein after the acquiring the first image, the method further comprises:
determining an object of a category having a highest priority as the target object in a case where objects of a plurality of categories are included in the first image;
the adjusting the image parameter of the region where the target object is located from the first parameter to the second parameter includes:
and adjusting the image parameters of all the areas of the first image into the second parameters.
5. The image processing method according to claim 4, wherein before determining the object of the highest priority class as the target object, the method further comprises:
acquiring second user sight line dwell time corresponding to each category of object in the first image;
and determining the priority corresponding to the object of each category according to the second user sight line dwell time.
6. An image processing apparatus characterized by comprising: the device comprises an acquisition module and a processing module;
the acquisition module is used for acquiring a first image;
the processing module is used for identifying a target object in the first image and adjusting the image parameter of the area where the target object is located from a first parameter to a second parameter to obtain a second image;
the user preference value corresponding to the second parameter is higher than the user preference value corresponding to the first parameter.
7. The image processing apparatus according to claim 6,
the acquisition module is further configured to acquire a target history image, where the target history image includes a first object, and a category of the first object is the same as a category of the target object;
the processing module is further configured to determine a user preference value corresponding to an image parameter of an area where the first object is located in the target history image; and determining the image parameter of the area where the first object with the highest user preference value in the target historical image is located as the second parameter.
8. The image processing apparatus according to claim 7, wherein the processing module is specifically configured to determine a first user gaze dwell time of a user in an area of the target history image where the first object is located; and determine a user preference value corresponding to the image parameter of the area where each first object is located according to the first user gaze dwell time.
9. The image processing apparatus according to any one of claims 6 to 8,
the processing module is further configured to determine, as the target object, an object of a category with a highest priority, if the first image includes objects of a plurality of categories; and adjusting the image parameters of all the areas of the first image into the second parameters.
10. The image processing apparatus according to claim 9,
the acquisition module is further configured to acquire a second user sight dwell time corresponding to each category of object in the first image;
the processing module is further configured to determine a priority corresponding to each category of object according to the second user gaze retention time.
11. An electronic device comprising a processor, a memory and a program or instructions stored on the memory and executable on the processor, the program or instructions, when executed by the processor, implementing the steps of the image processing method according to any one of claims 1 to 5.
12. A readable storage medium, on which a program or instructions are stored, which when executed by a processor implement the steps of the image processing method according to any one of claims 1 to 5.
CN202111222060.4A 2021-10-20 2021-10-20 Image processing method, image processing device, electronic equipment and storage medium Pending CN113962840A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111222060.4A CN113962840A (en) 2021-10-20 2021-10-20 Image processing method, image processing device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111222060.4A CN113962840A (en) 2021-10-20 2021-10-20 Image processing method, image processing device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN113962840A true CN113962840A (en) 2022-01-21

Family

ID=79464914

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111222060.4A Pending CN113962840A (en) 2021-10-20 2021-10-20 Image processing method, image processing device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113962840A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117389745A (en) * 2023-12-08 2024-01-12 荣耀终端有限公司 Data processing method, electronic equipment and storage medium
CN117389745B (en) * 2023-12-08 2024-05-03 荣耀终端有限公司 Data processing method, electronic equipment and storage medium

Similar Documents

Publication Publication Date Title
CN111654635A (en) Shooting parameter adjusting method and device and electronic equipment
CN112135046B (en) Video shooting method, video shooting device and electronic equipment
CN111857512A (en) Image editing method and device and electronic equipment
CN111835982B (en) Image acquisition method, image acquisition device, electronic device, and storage medium
CN113794834B (en) Image processing method and device and electronic equipment
CN112532885B (en) Anti-shake method and device and electronic equipment
CN113010126A (en) Display control method, display control device, electronic device, and medium
CN112269522A (en) Image processing method, image processing device, electronic equipment and readable storage medium
WO2023083089A1 (en) Photographing control display method and apparatus, and electronic device and medium
CN112887615B (en) Shooting method and device
CN112702531B (en) Shooting method and device and electronic equipment
CN113962840A (en) Image processing method, image processing device, electronic equipment and storage medium
CN112948048A (en) Information processing method, information processing device, electronic equipment and storage medium
CN112734661A (en) Image processing method and device
CN112416172A (en) Electronic equipment control method and device and electronic equipment
CN111835937A (en) Image processing method and device and electronic equipment
CN111901519A (en) Screen light supplement method and device and electronic equipment
CN112312021B (en) Shooting parameter adjusting method and device
CN113873168A (en) Shooting method, shooting device, electronic equipment and medium
CN114245017A (en) Shooting method and device and electronic equipment
CN113747076A (en) Shooting method and device and electronic equipment
CN112532904A (en) Video processing method and device and electronic equipment
CN113489901B (en) Shooting method and device thereof
CN114071016B (en) Image processing method, device, electronic equipment and storage medium
CN112468794B (en) Image processing method and device, electronic equipment and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination