WO2022142875A1 - Image processing method and apparatus, electronic device and storage medium - Google Patents

Image processing method and apparatus, electronic device and storage medium Download PDF

Info

Publication number
WO2022142875A1
WO2022142875A1 PCT/CN2021/132592 CN2021132592W WO2022142875A1 WO 2022142875 A1 WO2022142875 A1 WO 2022142875A1 CN 2021132592 W CN2021132592 W CN 2021132592W WO 2022142875 A1 WO2022142875 A1 WO 2022142875A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
adjusted
processed
image processing
target effect
Prior art date
Application number
PCT/CN2021/132592
Other languages
English (en)
French (fr)
Inventor
袁知洪
阮春雷
蔡文强
Original Assignee
北京字跳网络技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 北京字跳网络技术有限公司 filed Critical 北京字跳网络技术有限公司
Publication of WO2022142875A1 publication Critical patent/WO2022142875A1/zh
Priority to US18/344,759 priority Critical patent/US20230360286A1/en

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/04Context-preserving transformations, e.g. by using an importance map
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/90Dynamic range modification of images or parts thereof
    • G06T5/94Dynamic range modification of images or parts thereof based on local image properties, e.g. for local contrast enhancement
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/90Determination of colour characteristics

Definitions

  • the present disclosure relates to the technical field of image processing, and in particular, to an image processing method, an apparatus, an electronic device, and a storage medium.
  • the image-capture capabilities of intelligent terminals are becoming increasingly powerful, so more and more applications are available for processing the images captured by these terminals.
  • the present disclosure provides an image processing method, an apparatus, an electronic device and a storage medium, which are used to solve the technical problem that the current image processing form is relatively simple and cannot meet the needs of users for diversified processing of image details.
  • the present disclosure provides an image processing method, including:
  • acquiring the brightness value of each pixel in a layer to be processed, where the layer to be processed includes a feather layer corresponding to an area to be adjusted in an image to be processed;
  • adjusting a target effect parameter of a target pixel according to an input parameter of a target effect item and a preset mapping table to generate an adjusted image, wherein the brightness value of the target pixel and the brightness value of an anchor point in the area to be adjusted satisfy a preset relationship; and
  • determining a result image according to the adjusted image and the image to be processed.
  • an image processing apparatus including:
  • an acquisition module configured to acquire the brightness value of each pixel in the layer to be processed, where the layer to be processed is the feather layer corresponding to the area to be adjusted in the image to be processed;
  • an adjustment module configured to adjust the target effect parameters of the target pixel according to the input parameters of the target effect item and the preset mapping table, so as to generate an adjusted image, wherein the brightness value of the target pixel and the brightness value of the anchor point in the area to be adjusted satisfy a preset relationship;
  • a determination module configured to determine a result image according to the adjusted image and the to-be-processed image.
  • the present disclosure also provides an electronic device, comprising:
  • a processor; and a memory for storing executable instructions of the processor;
  • wherein the processor is configured to execute any one of the possible image processing methods in the first aspect by executing the executable instructions.
  • an embodiment of the present disclosure further provides a computer-readable storage medium on which a computer program is stored, and when the program is executed by a processor, implements any one of the possible image processing methods in the first aspect.
  • an embodiment of the present disclosure further provides a computer program product, including a computer program, which implements any one of the possible image processing methods in the first aspect when the computer program is executed by a processor.
  • an embodiment of the present disclosure provides a computer program, which implements any one of the possible image processing methods in the first aspect when the computer program is executed by a processor.
  • the present disclosure provides an image processing method, device, electronic device and storage medium.
  • the brightness value of each pixel in the layer to be processed is first obtained; then, the target effect parameters of the target pixel are adjusted according to the input parameters of the target effect item and the preset mapping table to generate an adjusted image; finally, the result image is determined according to the adjusted image and the image to be processed. In this way, various target effect items can be applied to a local area of the image to be processed to realize local adjustment of the image, which broadens the scope of application of image editing content and methods, so that users can adjust local areas and details to enrich image processing methods.
  • FIG. 1 is an application scenario diagram of an image processing method according to an exemplary embodiment of the present disclosure
  • FIG. 2 is a schematic flowchart of an image processing method according to an exemplary embodiment of the present disclosure
  • FIG. 3 is a schematic flowchart of an image processing method according to another exemplary embodiment of the present disclosure.
  • FIG. 4 is a schematic diagram of an interface for selecting a region to be adjusted shown in an embodiment of the present disclosure
  • FIG. 5 is a schematic diagram of an interface for adjusting the range of a region to be adjusted shown in an embodiment of the present disclosure
  • FIG. 6 is a schematic diagram of a layer to be processed shown in an embodiment of the present disclosure.
  • FIG. 7 is a schematic diagram of an image local single-item adjustment interface shown in an embodiment of the present disclosure.
  • FIG. 8 is a schematic diagram of an image local multi-item adjustment interface shown in an embodiment of the present disclosure.
  • FIG. 9 is a schematic structural diagram of an image processing apparatus according to an exemplary embodiment of the present disclosure.
  • FIG. 10 is a schematic structural diagram of an electronic device according to an exemplary embodiment of the present disclosure.
  • the term “including” and variations thereof are open-ended inclusions, ie, "including but not limited to”.
  • the term “based on” is “based at least in part on.”
  • the term “one embodiment” means “at least one embodiment”; the term “another embodiment” means “at least one additional embodiment”; the term “some embodiments” means “at least some embodiments”. Relevant definitions of other terms will be given in the description below.
  • at present, the image-capture capabilities of smart terminals are becoming increasingly powerful, so more and more applications are available for processing the images captured by these terminals.
  • these applications can beautify and add special effects to the captured image information.
  • however, the processing these existing applications perform on images is often limited in form: they can only adjust the image as a whole and cannot meet users' needs for diversified processing of image details.
  • specifically, existing image processing applications usually process the image globally, or process fixed, specific areas of the image (for example, the eyes, lips and cheeks), and users cannot adjust an arbitrary local range of the image according to their own processing needs.
  • in the embodiments provided by the present disclosure, the aim is to first obtain the brightness value of each pixel in the layer to be processed; then, adjust the target effect parameters of the target pixels according to the input parameters of the target effect item and the preset mapping table to generate an adjusted image; and finally, determine the result image according to the adjusted image and the image to be processed. In this way, various target effect items can be applied to a local area of the image to be processed to achieve local adjustment of the image, which broadens the scope of application of image editing content and methods and allows users to adjust local areas and details, so that basic editing items such as contrast, brightness, saturation, light perception, color temperature and hue can be applied within a local range of the image to be processed, thereby broadening the ways in which various image effects can be processed.
  • FIG. 1 is an application scenario diagram of an image processing method according to an exemplary embodiment of the present disclosure.
  • the image processing method provided in this embodiment can be applied to a terminal device 100, where the terminal device 100 can be a personal computer, a notebook computer, a tablet computer, a smart phone, a wearable electronic device, a smart home device or another such device.
  • when the user performs a local adjustment of the image to be processed, a control point can be added in the to-be-adjusted area of the image to be processed under the prompt of the add-control-point icon S01; the anchor point is determined from the control point, and the layer to be processed corresponding to the area to be adjusted is determined with the anchor point as its center point.
  • then, according to the input parameters of the target effect item and a preset mapping table, for example a preset color look-up table (Look Up Table, LUT for short), the target effect parameters of the target pixels are adjusted to generate the adjusted image.
  • a result image is determined according to the adjusted image and the to-be-processed image, so as to achieve a processing effect of locally adjusting the to-be-adjusted area of the to-be-processed image.
  • the image processing method is described in detail below through several specific implementations.
  • FIG. 2 is a schematic flowchart of an image processing method according to an exemplary embodiment of the present disclosure. As shown in FIG. 2, the image processing method provided by this embodiment includes:
  • Step 101 Obtain the brightness value of each pixel in the layer to be processed.
  • the brightness value of each pixel in the layer to be processed may be acquired, wherein the layer to be processed includes a feather layer corresponding to the area to be adjusted in the image to be processed.
  • the feathering layer has a feathering effect, in which the transparency of the feathering layer is gradually reduced from the center to the edge, so that the inner and outer connecting parts of the feathering layer are blurred and play a gradient effect to achieve the effect of natural connection. In this way, when the layer to be processed is adjusted, the effect of natural connection can be achieved between the layer to be processed and other unprocessed areas in the image to be processed.
  • Step 102 Adjust the target effect parameter of the target pixel point according to the input parameter of the target effect item and the preset mapping table, so as to generate an adjusted image.
  • specifically, after the brightness value of each pixel in the layer to be processed is acquired, the target effect parameters of the target pixel are adjusted according to the input parameters of the target effect item and the preset mapping table (for convenience of description, a preset LUT is selected as the preset mapping table below) to generate an adjusted image, wherein the brightness value of the target pixel and the brightness value of the anchor point in the layer to be processed satisfy a preset relationship, and the anchor point is the center point of the area to be adjusted.
  • since the layer to be processed is the feathering layer corresponding to the area to be adjusted, that is, the range corresponding to the layer to be processed and the range corresponding to the area to be adjusted are the same range, the above anchor point is both the center point of the area to be adjusted and the center point of the layer to be processed. It should be noted that, in order to make the effect of the local processing more natural, a pixel having a corresponding characteristic relationship with the center point of the area to be adjusted may be selected as the target pixel for processing.
  • the above-mentioned preset LUT can be configured according to the requirements of products and effects, so as to match different processing requirements.
  • Step 103 Determine a result image according to the adjusted image and the image to be processed.
  • a result image needs to be generated according to the adjusted image and the to-be-processed image, so that the processing effect of local adjustment is presented in the result image.
  • in this embodiment, the brightness value of each pixel in the layer to be processed is obtained first; then, the target effect parameter of the target pixel is adjusted according to the input parameter of the target effect item and the preset mapping table to generate an adjusted image; finally, the result image is determined according to the adjusted image and the to-be-processed image. Various target effect items can thus be applied to a local area of the to-be-processed image to achieve local adjustment of the image, thereby broadening the scope of application of image editing content and methods and allowing users to adjust local areas and details to enrich image processing methods.
  • FIG. 3 is a schematic flowchart of an image processing method according to another exemplary embodiment of the present disclosure. As shown in FIG. 3 , the image processing method provided by this embodiment includes:
  • Step 201 Acquire a trigger instruction acting on the image to be processed, and determine the anchor point according to the trigger instruction.
  • a control point can be added to the to-be-processed image, for example, by clicking and selecting on the touch screen of the terminal device, thereby generating a trigger instruction acting on the to-be-processed image.
  • the anchor point is determined according to the trigger instruction, and the anchor point may be the action point of clicking on the touch screen.
  • the touch screen mode is only one of the trigger modes, and may also be a cursor selection mode, a coordinate input mode, and the like, and the input form of the trigger instruction is not specifically limited in this embodiment.
  • FIG. 4 is a schematic diagram of an interface for selecting a region to be adjusted according to an embodiment of the present disclosure. As shown in FIG. 4 , the user can add a control point, that is, the first anchor point S02 , by clicking on the image to be processed.
  • Step 202 Acquire a sliding instruction acting on the second slider object, determine a second slider value according to the sliding instruction, and determine a range input parameter according to the second slider value.
  • the user can input parameters by sliding the sliding contacts on the sliding rod.
  • after acquiring the sliding instruction acting on the second slider object, the terminal device determines the second slider value according to the sliding instruction and determines the range input parameter according to the second slider value, wherein the range input parameter is used to determine the range area of the area to be adjusted; that is, by sliding the second slider object, the range of the area to be adjusted can be adjusted.
  • FIG. 5 is a schematic diagram of an interface for adjusting the range of an area to be adjusted according to an embodiment of the disclosure
  • FIG. 6 is a schematic diagram of a layer to be processed according to an embodiment of the disclosure.
  • the range input parameter can be input by sliding the slider S04 to determine the range of the layer S03 to be processed.
  • specifically, the range size of the area to be adjusted can be adjusted by sliding the slider S04, which in turn changes the range size of the layer to be processed S03 corresponding to the area to be adjusted, wherein the layer to be processed shown in FIG. 6 is the feather layer corresponding to the area to be adjusted.
  • Step 203 Obtain the brightness value of each pixel in the layer to be processed.
  • the brightness value of each pixel in the layer to be processed can be obtained.
  • anchorLum = anchorColor.rgb * vec3(0.333, 0.5, 0.167);
  • anchorLum is the brightness value of the current pixel
  • anchorColor.rgb is the pixel value of the current pixel
  • 0.333, 0.5, and 0.167 are the weight values of the three channels of red (R), green (G), and blue (B), respectively. Multiplying the RGB channels of the color image by the corresponding weight values above can convert the color image into a corresponding grayscale image.
  • the layer to be processed includes a feather layer corresponding to the area to be adjusted in the image to be processed.
  • the feathering layer has a feathering effect, in which the transparency of the feathering layer is gradually reduced from the center to the edge, which blurs the connection between the inner and outer parts of the feathered layer, and acts as a gradient to achieve the effect of natural connection. Therefore, when the layer to be processed is adjusted, the effect of natural connection can be achieved with other unprocessed areas in the image to be processed.
  • Step 204 Determine a pixel whose brightness value has a Gaussian distance from the brightness value of the anchor point smaller than the preset distance value as the target pixel point.
  • optionally, the preset relationship may include that the Gaussian distance between the brightness value of the target pixel point and the brightness value of the anchor point is smaller than the preset distance value; that is, if the Gaussian distance between the brightness value of a pixel in the area to be adjusted and the brightness value of the anchor point is smaller than the preset distance value, that pixel is determined to be the above target pixel point.
  • if the square of the Gaussian distance being smaller than the preset distance value is taken as the preset relationship, the RGB of the pixels satisfying this relationship (for example their brightness, contrast, etc.) is adjusted, so that the target effect of the pixels within this threshold range is affected, where the degree of influence is jointly determined by the slider value and the Gaussian distance. Specifically, the smaller the Gaussian distance, the greater the degree of influence; and the greater the value of the input parameter of the target effect item (for example, the greater the value corresponding to the slider value), the greater the degree of influence.
  • the RGB values of pixels that do not satisfy the preset relationship are left unchanged.
  • for example, when a local area such as a cheek is adjusted, the brightness value of the anchor point is large while areas with small brightness values (such as eyebrows) exist in the layer to be processed; by using the preset distance value as a threshold, the pixels of the cheek can be screened as target pixels for subsequent adjustment while the eyebrow area is left unprocessed, so that the effect of the local adjustment is more coordinated. If the preset distance value is not used as a threshold, all pixels in the layer to be processed are processed indiscriminately, which easily makes the locally processed image inconsistent.
  • the Gaussian distance can be calculated according to Formula 2: gaussDist = exp((srcLum - anchorLum)^2 / (-2.0)), where gaussDist is the Gaussian distance, srcLum is the brightness value of the pixel point, and anchorLum is the brightness value of the center point.
  • Step 205 Acquire a first sliding instruction acting on the first slider object, and determine the first slider value according to the first sliding instruction.
  • Step 206 Determine a corresponding preset mapping table from a plurality of candidate mapping tables according to the parameter range in which the first slider value is located.
  • in this step, the first sliding instruction acting on the first slider object may be acquired, the first slider value may be determined according to the first sliding instruction, and the corresponding LUT may then be determined according to the parameter range in which the first slider value lies.
  • specifically, each effect item may correspond to two LUT images, namely a positive LUT image and a negative LUT image, which are used to adjust basic effects of the image such as contrast, brightness, saturation, light perception, color temperature and hue. When the slider value is greater than 0, the positive LUT image is applied to obtain a positive adjustment; when the slider value is less than 0, the negative LUT image is applied to obtain a negative adjustment.
  • the input parameter of the target effect item may also be determined according to the first slider value. For example, the larger the first slider value, the larger the input parameter of the target effect item, and correspondingly, the stronger the achieved target effect.
  • the first slider object is used to adjust the presentation effect of the effect item. For example, by sliding the first slider object, the contrast, brightness, saturation, light perception, color temperature, hue, etc. of a local position in the image can be adjusted.
  • the above-mentioned second slider object is used to adjust the to-be-adjusted range of the effect item. For example, by sliding the second slider object, the area of the to-be-adjusted range can be expanded or reduced.
  • Step 207 Adjust the target effect parameter of the target pixel point according to the input parameter of the target effect item and the preset mapping table to generate an adjusted image.
  • the target effect parameter of the target pixel can be adjusted according to the input parameter of the target effect item and the preset LUT to generate an adjusted image, wherein, The brightness value of the target pixel point and the brightness value of the anchor point in the layer to be processed satisfy a preset relationship, and the anchor point is the center point of the area to be adjusted.
  • a pixel point having a corresponding characteristic relationship with the center point of the area to be adjusted may be selected as the target pixel point for processing.
  • the above-mentioned preset LUT can be configured according to the requirements of products and effects, so as to match different processing requirements.
  • Step 208 Determine a result image according to the adjusted image and the image to be processed.
  • a result image needs to be generated according to the adjusted image and the to-be-processed image, so that the processing effect of local adjustment is presented in the result image.
  • specifically, the mixing intensity of each pixel can be determined according to the transparency of each pixel in the feather layer, and the adjusted image and the image to be processed can then be mixed according to the mixing intensity of each pixel to obtain the above result image.
  • FIG. 7 is a schematic diagram of an image local single-item adjustment interface shown in an embodiment of the present disclosure.
  • the slider S04 can be slid to determine the brightness adjustment input parameter, and the local area can be adjusted according to the brightness adjustment input parameter.
  • the logo on the first anchor point S02 will be modified to the brightness icon S05 corresponding to the brightness adjustment.
  • the embodiments provided by the present disclosure can not only realize local single-item adjustment of the image, but also can perform local multi-item adjustment of the image.
  • for example, multiple control points can be added to the image to be processed, and a different target effect item can be adjusted for each control point. Specifically, a control point can be added to the image to be processed and the brightness of the area corresponding to that control point adjusted, and another control point can be added to the image to be processed and the saturation of the other area corresponding to that control point adjusted.
  • FIG. 8 is a schematic diagram of an image local multi-item adjustment interface shown in an embodiment of the present disclosure.
  • as shown in FIG. 8, brightness can be adjusted over a local range (corresponding to S05), structure over a local range (corresponding to S06), contrast over a local range (corresponding to S07), and saturation over a local range (corresponding to S08).
  • the specific principles of structure adjustment, contrast adjustment, and saturation adjustment are similar to those of brightness adjustment, and will not be repeated here.
  • it should be noted that, when local multi-item adjustment is performed on the image, the first target effect parameter of the target pixel can be adjusted first according to the first input parameter of the first target effect item and the LUT, and the second target effect parameter of the adjusted target pixel can then be further adjusted according to the second input parameter of the second target effect item and the LUT.
  • when the target pixel belongs to both the first area to be adjusted and the second area to be adjusted, the effect on each target pixel is superimposed on the effects already applied; that is, the output of the previous effect is the input of the next.
  • the first to-be-adjusted area and the second to-be-adjusted area may correspond to different anchor points, or may correspond to the same anchor point.
  • the first target effect item and the second target effect item may be the same target effect item, or may be different target effect items.
  • both the first target effect item and the second target effect item may be brightness adjustment.
  • the first target effect item may be brightness adjustment
  • the second target effect item may be saturation adjustment.
  • in this embodiment, an anchor point is selected on the image to be processed by inputting a trigger instruction, and the range area of the area to be adjusted is determined by sliding the second slider object, so that the anchor point and the range area jointly determine the area to be adjusted; the corresponding preset mapping table and the input parameter of the target effect item are then determined by sliding the first slider object, and the target effect parameters of the target pixels within the area are adjusted according to the input parameter of the target effect item and the preset mapping table, so that the effect of any local position in the image to be processed can be adjusted through simple user operations.
  • FIG. 9 is a schematic structural diagram of an image processing apparatus according to an exemplary embodiment of the present disclosure. As shown in FIG. 9 , the image processing apparatus 300 provided in this embodiment includes:
  • the obtaining module 301 is configured to obtain the brightness value of each pixel in the layer to be processed, where the layer to be processed is the feather layer corresponding to the area to be adjusted in the image to be processed;
  • the adjustment module 302 is configured to adjust the target effect parameters of the target pixel point according to the input parameters of the target effect item and the preset mapping table, so as to generate an adjusted image, wherein the brightness value of the target pixel point and the to-be-adjusted The brightness value of the anchor point in the area satisfies the preset relationship;
  • a determination module 303 configured to determine a result image according to the adjusted image and the to-be-processed image.
  • the preset relationship includes that the Gaussian distance is smaller than a preset distance value.
  • the anchor point is the center point of the area to be adjusted.
  • the obtaining module 301 is further configured to obtain a first sliding instruction acting on the first slider object, and determine the first slider value according to the first sliding instruction;
  • the determining module 303 is further configured to determine the corresponding preset mapping table from a plurality of candidate mapping tables according to the parameter range in which the first slider value is located.
  • the determining module 303 is further configured to determine the input parameter of the target effect item according to the first slider value.
  • the adjustment module 302 is specifically configured to:
  • adjust the first target effect parameter of the target pixel according to the first input parameter of the first target effect item and the preset mapping table; and
  • continue to adjust the second target effect parameter of the adjusted target pixel according to the second input parameter of the second target effect item and the preset mapping table.
  • the first target effect item acts on the first area to be adjusted
  • the second target effect item acts on the second area to be adjusted
  • the target pixel point is a common pixel point of the first to-be-adjusted area and the second to-be-adjusted area, and the first to-be-adjusted area and the second to-be-adjusted area correspond to different anchor points.
  • the determining module 303 is specifically configured to:
  • determine the mixing intensity of each pixel according to the transparency of each pixel in the feather layer; and
  • mix the adjusted image and the to-be-processed image according to the mixing intensity of each pixel to obtain the result image.
  • the target effect item includes: at least one of contrast, brightness, saturation, light perception, color temperature, and hue.
  • the determining module 303 is further configured to determine the region to be adjusted according to the anchor point and the range input parameters.
  • the obtaining module 301 is further configured to obtain a trigger instruction acting on the to-be-processed image
  • the determining module determines the anchor point according to the trigger instruction.
  • the determining module 303 is specifically configured to:
  • acquire a sliding instruction acting on the second slider object, and determine a second slider value according to the sliding instruction; and
  • determine the range input parameter according to the second slider value, the range input parameter being used to determine the range area of the region to be adjusted.
  • it should be noted that the image processing apparatus provided by the embodiment shown in FIG. 9 can be used to execute the method provided by any of the above embodiments; the specific implementation and technical effects are similar and are not repeated here.
  • FIG. 10 is a schematic structural diagram of an electronic device according to an exemplary embodiment of the present disclosure. As shown in FIG. 10 , it shows a schematic structural diagram of an electronic device 400 suitable for implementing an embodiment of the present disclosure.
  • terminal devices in the embodiments of the present disclosure may include, but are not limited to, mobile terminals with an image acquisition function such as mobile phones, notebook computers, digital broadcast receivers, personal digital assistants (Personal Digital Assistant, PDA for short), tablet computers (Portable Android Device, PAD for short), portable multimedia players (Portable Media Player, PMP for short), in-vehicle terminals (such as in-vehicle navigation terminals), wearable electronic devices and smart home devices, as well as fixed terminals to which an image acquisition device is connected, such as digital TVs and desktop computers.
  • the electronic device shown in FIG. 10 is only an example, and should not impose any limitation on the function and scope of use of the embodiments of the present disclosure.
  • the electronic device 400 may include a processing device (such as a central processing unit, a graphics processing unit, etc.) 401, which may execute the functions defined in the methods of the embodiments of the present disclosure according to a program stored in a read-only memory (Read-Only Memory, ROM for short) 402 or a program loaded from a storage device 408 into a random access memory (Random Access Memory, RAM for short) 403.
  • the RAM 403 also stores various programs and data required for the operation of the electronic device 400.
  • the processing device 401, the ROM 402, and the RAM 403 are connected to each other through a bus 404.
  • An Input/Output (I/O for short) interface 405 is also connected to the bus 404 .
  • the following devices may be connected to the I/O interface 405: an input device 406 including, for example, a touch screen, a touch pad, a keyboard, a mouse, a camera, a microphone, an accelerometer and a gyroscope; an output device 407 including, for example, a liquid crystal display (Liquid Crystal Display, LCD for short), a speaker and a vibrator; a storage device 408 including, for example, a magnetic tape and a hard disk; and a communication device 409.
  • Communication means 409 may allow electronic device 400 to communicate wirelessly or by wire with other devices to exchange data.
  • although FIG. 10 shows the electronic device 400 having various devices, it should be understood that not all of the illustrated devices are required to be implemented or provided; more or fewer devices may alternatively be implemented or provided.
  • embodiments of the present disclosure include a computer program product comprising a computer program carried on a non-transitory computer readable medium, the computer program containing program code for performing the method illustrated in the flowchart.
  • the computer program may be downloaded and installed from the network via the communication device 409, or from the storage device 408, or from the ROM 402.
  • when the computer program is executed by the processing apparatus 401, the above-mentioned functions defined in the methods of the embodiments of the present disclosure are executed.
  • the computer-readable medium mentioned above in the present disclosure may be a computer-readable signal medium or a computer-readable storage medium, or any combination of the above two.
  • the computer-readable storage medium can be, for example, but not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus or device, or a combination of any of the above.
  • more specific examples of computer-readable storage media may include, but are not limited to: an electrical connection with one or more wires, a portable computer disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above.
  • a computer-readable storage medium may be any tangible medium that contains or stores a program that can be used by or in conjunction with an instruction execution system, apparatus, or device.
  • a computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave with computer-readable program code embodied thereon. Such propagated data signals may take a variety of forms, including but not limited to electromagnetic signals, optical signals, or any suitable combination of the foregoing.
  • a computer-readable signal medium can also be any computer-readable medium other than a computer-readable storage medium that can transmit, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device .
  • the program code contained on the computer-readable medium can be transmitted by any suitable medium, including but not limited to: electric wire, optical cable, RF (Radio Frequency, radio frequency for short), etc., or any suitable combination of the above.
  • in some embodiments, the client and the server may communicate using any currently known or future-developed network protocol, such as the Hyper Text Transfer Protocol (HTTP), and may be interconnected with digital data communication (e.g., a communication network) in any form or medium.
  • examples of communication networks include a Local Area Network (LAN), a Wide Area Network (WAN), an internetwork (e.g., the Internet) and a peer-to-peer network (e.g., an ad hoc peer-to-peer network), as well as any network currently known or developed in the future.
  • the above-mentioned computer-readable medium may be included in the above-mentioned electronic device; or may exist alone without being assembled into the electronic device.
  • the above computer readable medium carries one or more programs, which when executed by the electronic device, cause the electronic device to perform the above functions defined in the methods of the embodiments of the present disclosure.
  • the electronic device when the above-mentioned one or more programs are executed by the electronic device, the electronic device can be made to execute: obtain the luminance value of each pixel in the layer to be processed, where the layer to be processed includes the image to be processed The feathering layer corresponding to the area to be adjusted in the middle; according to the input parameters of the target effect item and the preset mapping table, the target effect parameters of the target pixel are adjusted to generate an adjusted image, wherein the brightness value of the target pixel is The brightness value of the anchor point in the to-be-adjusted area satisfies a preset relationship; the result image is determined according to the adjusted image and the to-be-processed image.
  • computer program code for performing the operations of the present disclosure may be written in one or more programming languages or a combination thereof, including object-oriented programming languages such as Java, Smalltalk and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages.
  • the program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server.
  • the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
  • each block in the flowchart or block diagrams may represent a module, a program segment, or a portion of code that contains one or more executable instructions for implementing the specified logical functions.
  • the functions noted in the blocks may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
  • each block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations can be implemented in dedicated hardware-based systems that perform the specified functions or operations , or can be implemented in a combination of dedicated hardware and computer instructions.
  • the units involved in the embodiments of the present disclosure may be implemented in software or in hardware, and in some cases the name of a unit does not constitute a limitation on the unit itself.
  • exemplary types of hardware logic components that may be used include Field Programmable Gate Arrays (FPGA), Application Specific Integrated Circuits (ASIC), Application Specific Standard Parts (ASSP), Systems on Chip (SOC) and Complex Programmable Logic Devices (CPLD).
  • a machine-readable medium may be a tangible medium that may contain or store a program for use by or in connection with the instruction execution system, apparatus or device.
  • the machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium.
  • Machine-readable media may include, but are not limited to, electronic, magnetic, optical, electromagnetic, infrared, or semiconductor systems, devices, or devices, or any suitable combination of the foregoing.
  • machine-readable storage media would include one or more wire-based electrical connections, portable computer disks, hard disks, random access memory (RAM), read only memory (ROM), erasable programmable read only memory (EPROM or flash memory), fiber optics, compact disk read only memory (CD-ROM), optical storage, magnetic storage, or any suitable combination of the foregoing.
  • an image processing method including:
  • acquiring the brightness value of each pixel in a layer to be processed, the layer to be processed including a feather layer corresponding to an area to be adjusted in an image to be processed;
  • adjusting a target effect parameter of a target pixel according to an input parameter of a target effect item and a preset mapping table to generate an adjusted image, wherein the brightness value of the target pixel and the brightness value of an anchor point in the area to be adjusted satisfy a preset relationship;
  • a result image is determined according to the adjusted image and the to-be-processed image.
  • the preset relationship includes that the Gaussian distance is smaller than a preset distance value.
  • the anchor point is a center point of the area to be adjusted.
  • before the adjusting of the target effect parameter of the target pixel according to the input parameter of the target effect item and the preset mapping table, the method further includes:
  • acquiring a first sliding instruction acting on a first slider object, and determining a first slider value according to the first sliding instruction;
  • the corresponding preset mapping table is determined from a plurality of candidate mapping tables according to the parameter range in which the first slider value is located.
  • the method further includes:
  • the input parameter of the target effect item is determined according to the first slider value.
  • the adjusting of the target effect parameter of the target pixel according to the input parameter of the target effect item and the preset mapping table includes:
  • adjusting a first target effect parameter of the target pixel according to a first input parameter of a first target effect item and the preset mapping table; and
  • continuing to adjust a second target effect parameter of the adjusted target pixel according to a second input parameter of a second target effect item and the preset mapping table.
  • the first target effect item acts on the first area to be adjusted
  • the second target effect item acts on the second area to be adjusted
  • the target pixel point is a common pixel point of the first area to be adjusted and the second area to be adjusted, and the first area to be adjusted and the second area to be adjusted correspond to different anchor points.
  • the determining of a result image according to the adjusted image and the to-be-processed image includes:
  • determining the mixing intensity of each pixel according to the transparency of each pixel in the feather layer; and
  • mixing the adjusted image and the to-be-processed image according to the mixing intensity of each pixel to obtain the result image.
  • the target effect item includes: at least one of contrast, brightness, saturation, light perception, color temperature, and hue.
  • before the acquiring of the brightness value of each pixel in the layer to be processed, the method further includes:
  • the to-be-adjusted area is determined according to the anchor point and the range input parameter.
  • before the determining of the region to be adjusted according to the anchor point and the range input parameter, the method further includes:
  • acquiring a trigger instruction acting on the image to be processed;
  • the anchor point is determined according to the trigger instruction.
  • the determining of the area to be adjusted according to the anchor point and the range input parameter includes:
  • acquiring a sliding instruction acting on a second slider object, and determining a second slider value according to the sliding instruction;
  • the range input parameter is determined according to the second slider value, and the range input parameter is used to determine the range area of the region to be adjusted.
  • an image processing apparatus including:
  • an acquisition module configured to acquire the brightness value of each pixel in the layer to be processed, where the layer to be processed is the feather layer corresponding to the area to be adjusted in the image to be processed;
  • the adjustment module is configured to adjust the target effect parameters of the target pixel according to the input parameters of the target effect item and the preset mapping table, so as to generate an adjusted image, wherein the brightness value of the target pixel and the brightness value of the anchor point in the area to be adjusted satisfy a preset relationship;
  • a determination module configured to determine a result image according to the adjusted image and the to-be-processed image.
  • the preset relationship includes that the Gaussian distance is smaller than a preset distance value.
  • the anchor point is a center point of the area to be adjusted.
  • the obtaining module is further configured to obtain a first sliding instruction acting on the first slider object, and determine the first slider value according to the first sliding instruction;
  • the determining module is further configured to determine the corresponding preset mapping table from a plurality of candidate mapping tables according to the parameter range in which the first slider value is located.
  • the determining module is further configured to determine the input parameter of the target effect item according to the first slider value.
  • the adjustment module is specifically configured to:
  • adjust the first target effect parameter of the target pixel according to the first input parameter of the first target effect item and the preset mapping table; and
  • continue to adjust the second target effect parameter of the adjusted target pixel according to the second input parameter of the second target effect item and the preset mapping table.
  • the first target effect item acts on the first area to be adjusted
  • the second target effect item acts on the second area to be adjusted
  • the target pixel point is a common pixel point of the first to-be-adjusted area and the second to-be-adjusted area, and the first to-be-adjusted area and the second to-be-adjusted area correspond to different anchor points.
  • the determining module is specifically configured to:
  • determine the mixing intensity of each pixel according to the transparency of each pixel in the feather layer; and
  • mix the adjusted image and the to-be-processed image according to the mixing intensity of each pixel to obtain the result image.
  • the present disclosure also provides an electronic device, comprising:
  • a processor; and a memory for storing executable instructions of the processor;
  • wherein the processor is configured to execute any one of the possible image processing methods in the first aspect by executing the executable instructions.
  • an embodiment of the present disclosure further provides a computer-readable storage medium on which a computer program is stored, and when the program is executed by a processor, implements any one of the possible image processing methods in the first aspect.
  • an embodiment of the present disclosure further provides a computer program product, including a computer program, which implements any one of the possible image processing methods in the first aspect when the computer program is executed by a processor.
  • an embodiment of the present disclosure provides a computer program, which implements any one of the possible image processing methods in the first aspect when the computer program is executed by a processor.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)

Abstract

An image processing method and apparatus, an electronic device and a storage medium. In the image processing method, the brightness value of each pixel in a layer to be processed is first acquired (101); then, a target effect parameter of a target pixel is adjusted according to an input parameter of a target effect item and a preset mapping table to generate an adjusted image (102); finally, a result image is determined according to the adjusted image and the image to be processed (103). In this way, various target effect items can be applied to a local range of the image to be processed to achieve local adjustment of the image, which broadens the scope of image editing content and methods and allows users to adjust local areas and details, thereby enriching image processing methods.

Description

Image processing method and apparatus, electronic device and storage medium
Cross-reference to related applications
This application claims priority to the Chinese patent application No. 202011630469.5, entitled "Image processing method and apparatus, electronic device and storage medium", filed with the Chinese Patent Office on December 31, 2020, the entire contents of which are incorporated herein by reference.
Technical field
The present disclosure relates to the technical field of image processing, and in particular to an image processing method and apparatus, an electronic device and a storage medium.
Background
With the development of intelligent terminal technology, the image-capture capabilities of intelligent terminals are becoming increasingly powerful, and more and more applications are available for processing the images captured by these terminals.
Although these applications can beautify the captured image information and add special effects to it, the processing they perform is often limited in form: they can only adjust the image as a whole and cannot meet users' needs for diversified processing of image details.
Summary
The present disclosure provides an image processing method and apparatus, an electronic device and a storage medium, which are used to solve the technical problem that current image processing is limited in form and cannot meet users' needs for diversified processing of image details.
In a first aspect, the present disclosure provides an image processing method, including:
acquiring the brightness value of each pixel in a layer to be processed, the layer to be processed including a feather layer corresponding to an area to be adjusted in an image to be processed;
adjusting a target effect parameter of a target pixel according to an input parameter of a target effect item and a preset mapping table to generate an adjusted image, wherein the brightness value of the target pixel and the brightness value of an anchor point in the area to be adjusted satisfy a preset relationship; and
determining a result image according to the adjusted image and the image to be processed.
In a second aspect, the present disclosure provides an image processing apparatus, including:
an acquisition module configured to acquire the brightness value of each pixel in a layer to be processed, the layer to be processed being a feather layer corresponding to an area to be adjusted in an image to be processed;
an adjustment module configured to adjust a target effect parameter of a target pixel according to an input parameter of a target effect item and a preset mapping table to generate an adjusted image, wherein the brightness value of the target pixel and the brightness value of an anchor point in the area to be adjusted satisfy a preset relationship; and
a determination module configured to determine a result image according to the adjusted image and the image to be processed.
In a third aspect, the present disclosure further provides an electronic device, including:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform any one of the possible image processing methods of the first aspect by executing the executable instructions.
In a fourth aspect, an embodiment of the present disclosure further provides a computer-readable storage medium on which a computer program is stored, and the program, when executed by a processor, implements any one of the possible image processing methods of the first aspect.
In a fifth aspect, an embodiment of the present disclosure further provides a computer program product, including a computer program that, when executed by a processor, implements any one of the possible image processing methods of the first aspect.
In a sixth aspect, an embodiment of the present disclosure provides a computer program that, when executed by a processor, implements any one of the possible image processing methods of the first aspect.
The present disclosure provides an image processing method and apparatus, an electronic device and a storage medium. The brightness value of each pixel in a layer to be processed is first acquired; then, a target effect parameter of a target pixel is adjusted according to an input parameter of a target effect item and a preset mapping table to generate an adjusted image; finally, a result image is determined according to the adjusted image and the image to be processed. In this way, various target effect items can be applied to a local range of the image to be processed to achieve local adjustment of the image, which broadens the scope of image editing content and methods and allows users to adjust local areas and details, thereby enriching image processing methods.
Brief description of the drawings
In order to explain the technical solutions in the embodiments of the present disclosure or in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present disclosure, and other drawings can be obtained from them by those of ordinary skill in the art without creative effort.
FIG. 1 is an application scenario diagram of an image processing method according to an exemplary embodiment of the present disclosure;
FIG. 2 is a schematic flowchart of an image processing method according to an exemplary embodiment of the present disclosure;
FIG. 3 is a schematic flowchart of an image processing method according to another exemplary embodiment of the present disclosure;
FIG. 4 is a schematic diagram of an interface for selecting an area to be adjusted according to an embodiment of the present disclosure;
FIG. 5 is a schematic diagram of an interface for adjusting the range of an area to be adjusted according to an embodiment of the present disclosure;
FIG. 6 is a schematic diagram of a layer to be processed according to an embodiment of the present disclosure;
FIG. 7 is a schematic diagram of a local single-item image adjustment interface according to an embodiment of the present disclosure;
FIG. 8 is a schematic diagram of a local multi-item image adjustment interface according to an embodiment of the present disclosure;
FIG. 9 is a schematic structural diagram of an image processing apparatus according to an exemplary embodiment of the present disclosure;
FIG. 10 is a schematic structural diagram of an electronic device according to an exemplary embodiment of the present disclosure.
Detailed description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. Although some embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure can be implemented in various forms and should not be construed as being limited to the embodiments set forth here; rather, these embodiments are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the present disclosure are for exemplary purposes only and are not intended to limit the protection scope of the present disclosure.
It should be understood that the steps recorded in the method embodiments of the present disclosure may be performed in different orders and/or in parallel. In addition, the method embodiments may include additional steps and/or omit the steps shown. The scope of the present disclosure is not limited in this respect.
The term "including" and variations thereof used herein are open-ended, i.e., "including but not limited to". The term "based on" means "based at least in part on". The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments". Relevant definitions of other terms will be given in the description below.
It should be noted that concepts such as "first" and "second" mentioned in the present disclosure are only used to distinguish different apparatuses, modules or units, and are not used to limit the order of the functions performed by these apparatuses, modules or units or their interdependence.
It should be noted that the modifiers "one" and "multiple" mentioned in the present disclosure are illustrative rather than restrictive; those skilled in the art should understand that, unless the context clearly indicates otherwise, they should be understood as "one or more".
At present, the image-capture capabilities of intelligent terminals are becoming increasingly powerful, and more and more applications are available for processing the images captured by these terminals. For example, these applications can beautify the captured image information and add special effects to it. However, the processing these existing applications perform on images is often limited in form: they can only adjust the image as a whole and cannot meet users' needs for diversified processing of image details. Specifically, for ease of use, existing image processing applications usually process the image globally, or process fixed, specific areas of the image (for example, the eyes, lips and cheeks), and users cannot adjust an arbitrary local range of the image according to their own processing needs.
In the embodiments provided by the present disclosure, the brightness value of each pixel in a layer to be processed is first acquired; then, a target effect parameter of a target pixel is adjusted according to an input parameter of a target effect item and a preset mapping table to generate an adjusted image; finally, a result image is determined according to the adjusted image and the image to be processed. In this way, various target effect items can be applied to a local range of the image to be processed to achieve local adjustment of the image, which broadens the scope of image editing content and methods and allows users to adjust local areas and details, thereby enriching image processing methods. Basic editing items such as contrast, brightness, saturation, light perception, color temperature and hue can thus be applied within a local range of the image to be processed, widening the scope of the editing content so that users can adjust local areas and details and broadening the ways in which various image effects can be processed.
FIG. 1 is an application scenario diagram of an image processing method according to an exemplary embodiment of the present disclosure. As shown in FIG. 1, the image processing method provided in this embodiment can be applied to a terminal device 100, where the terminal device 100 can be a personal computer, a notebook computer, a tablet computer, a smart phone, a wearable electronic device, a smart home device or another such device. When the user performs a local adjustment of an image to be processed, a control point can be added in the area to be adjusted of the image to be processed under the prompt of the add-control-point icon S01; an anchor point is determined from the control point, and the layer to be processed corresponding to the area to be adjusted is determined with the anchor point as its center point. Then, a target effect parameter of a target pixel is adjusted according to an input parameter of a target effect item and a preset mapping table, for example a preset color look-up table (Look Up Table, LUT for short), to generate an adjusted image. Finally, a result image is determined according to the adjusted image and the image to be processed, so as to achieve the processing effect of locally adjusting the area to be adjusted of the image to be processed. The image processing method is described in detail below through several specific implementations.
FIG. 2 is a schematic flowchart of an image processing method according to an exemplary embodiment of the present disclosure. As shown in FIG. 2, the image processing method provided by this embodiment includes:
Step 101: Acquire the brightness value of each pixel in the layer to be processed.
In this step, the brightness value of each pixel in the layer to be processed can be acquired, where the layer to be processed includes the feather layer corresponding to the area to be adjusted in the image to be processed. It should be noted that the feather layer has a feathering effect: its transparency decreases gradually from the center toward the edge, so that the transition between the inside and the outside of the feather layer is softened and acts as a gradient, achieving a natural transition. In this way, when the layer to be processed is adjusted, a natural transition can be achieved between it and the other, unprocessed areas of the image to be processed.
Step 102: Adjust the target effect parameter of the target pixel according to the input parameter of the target effect item and the preset mapping table, so as to generate an adjusted image.
Specifically, after the brightness value of each pixel in the layer to be processed is acquired, the target effect parameter of the target pixel can be adjusted according to the input parameter of the target effect item and the preset mapping table (for convenience of description, a preset LUT is used as the preset mapping table below) to generate the adjusted image, wherein the brightness value of the target pixel and the brightness value of the anchor point in the layer to be processed satisfy a preset relationship, and the anchor point is the center point of the area to be adjusted. It can be understood that, since the layer to be processed is the feather layer corresponding to the area to be adjusted, that is, the range corresponding to the layer to be processed and the range corresponding to the area to be adjusted are the same range, the above anchor point is both the center point of the area to be adjusted and the center point of the layer to be processed. It should be noted that, in order to make the effect of the local processing more natural, a pixel having a corresponding characteristic relationship with the center point of the area to be adjusted may be selected as the target pixel for processing. In addition, in this embodiment, the above preset LUT can be configured according to the requirements of the product and the effect, so as to match different processing needs.
Step 103: Determine a result image according to the adjusted image and the image to be processed.
After the adjusted image is generated, a result image also needs to be generated according to the adjusted image and the image to be processed, so that the processing effect of the local adjustment is presented in the result image.
In this embodiment, the brightness value of each pixel in the layer to be processed is acquired first; then, the target effect parameter of the target pixel is adjusted according to the input parameter of the target effect item and the preset mapping table to generate the adjusted image; finally, the result image is determined according to the adjusted image and the image to be processed. Various target effect items can thus be applied to a local range of the image to be processed to achieve local adjustment of the image, which broadens the scope of image editing content and methods and allows users to adjust local areas and details, thereby enriching image processing methods.
FIG. 3 is a schematic flowchart of an image processing method according to another exemplary embodiment of the present disclosure. As shown in FIG. 3, the image processing method provided by this embodiment includes:
Step 201: Acquire a trigger instruction acting on the image to be processed, and determine the anchor point according to the trigger instruction.
When the user needs to perform local processing on the image to be processed, a control point can be added to the image to be processed, for example by clicking and selecting on the touch screen of the terminal device, thereby generating a trigger instruction acting on the image to be processed. The anchor point is then determined according to the trigger instruction, and the anchor point may be the point at which the touch screen is clicked. Of course, the touch-screen mode is only one of the trigger modes; a cursor selection mode, a coordinate input mode and the like may also be used, and the input form of the trigger instruction is not specifically limited in this embodiment.
FIG. 4 is a schematic diagram of an interface for selecting an area to be adjusted according to an embodiment of the present disclosure. As shown in FIG. 4, the user can add a control point, that is, the first anchor point S02, by clicking on the image to be processed.
Step 202: Acquire a sliding instruction acting on the second slider object, determine a second slider value according to the sliding instruction, and determine a range input parameter according to the second slider value.
After the anchor point is determined, the user can input parameters by sliding the contact on the slider. After acquiring the sliding instruction acting on the second slider object, the terminal device determines the second slider value according to the sliding instruction and determines the range input parameter according to the second slider value, where the range input parameter is used to determine the range area of the area to be adjusted; that is, by sliding the second slider object, the range of the area to be adjusted can be adjusted.
FIG. 5 is a schematic diagram of an interface for adjusting the range of an area to be adjusted according to an embodiment of the present disclosure, and FIG. 6 is a schematic diagram of a layer to be processed according to an embodiment of the present disclosure. As shown in FIG. 5 and FIG. 6, the range input parameter can be input by sliding the slider S04 to determine the range of the layer to be processed S03. Specifically, the range of the area to be adjusted can be adjusted by sliding the slider S04, which in turn changes the range of the layer to be processed S03 corresponding to the area to be adjusted, where the layer to be processed shown in FIG. 6 is the feather layer corresponding to the area to be adjusted.
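The disclosure does not prescribe how the feather layer itself is generated from the anchor point and the range input parameter. The following is a minimal sketch in Python/NumPy under that assumption, using a simple radial falloff whose radius would come from the second slider value; the function and parameter names are illustrative only:

    import numpy as np

    def feather_mask(height, width, anchor_xy, radius, feather=0.5):
        # Hypothetical feather layer: alpha is 1.0 at the anchor point and fades
        # smoothly to 0.0 toward the edge of the circular area to be adjusted.
        ys, xs = np.mgrid[0:height, 0:width]
        dist = np.hypot(xs - anchor_xy[0], ys - anchor_xy[1])
        inner = radius * (1.0 - feather)              # fully opaque core
        alpha = 1.0 - (dist - inner) / max(radius - inner, 1e-6)
        return np.clip(alpha, 0.0, 1.0)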
Step 203: Acquire the brightness value of each pixel in the layer to be processed.
In this step, the brightness value of each pixel in the layer to be processed can be acquired. Optionally, the pixel value of each pixel in the layer to be processed may be acquired first, and the pixel value may then be converted into a brightness value according to Formula 1:
Formula 1: anchorLum = anchorColor.rgb * vec3(0.333, 0.5, 0.167);
where anchorLum is the brightness value of the current pixel and anchorColor.rgb is the pixel value of the current pixel. Here, 0.333, 0.5 and 0.167 are the weight values of the red (R), green (G) and blue (B) channels, respectively. Multiplying the RGB channels of a color image by the corresponding weight values converts the color image into a corresponding grayscale image.
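As a minimal illustration of Formula 1, the per-pixel brightness map of a whole layer can be computed with NumPy as follows (assuming an H x W x 3 RGB array with values in [0, 1]; the function name is illustrative):

    import numpy as np

    def luminance(layer_rgb):
        # Formula 1: anchorLum = anchorColor.rgb * vec3(0.333, 0.5, 0.167)
        weights = np.array([0.333, 0.5, 0.167])
        return layer_rgb[..., :3] @ weights   # H x W map of brightness values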
In addition, the layer to be processed includes the feather layer corresponding to the area to be adjusted in the image to be processed. It should be noted that the feather layer has a feathering effect: its transparency decreases gradually from the center toward the edge, so that the transition between the inside and the outside of the feather layer is softened and acts as a gradient, achieving a natural transition. In this way, when the layer to be processed is adjusted, a natural transition can be achieved between it and the other, unprocessed areas of the image to be processed.
Step 204: Determine a pixel whose brightness value has a Gaussian distance from the brightness value of the anchor point smaller than a preset distance value as the target pixel.
Optionally, the preset relationship that the brightness value of the target pixel and the brightness value of the anchor point need to satisfy may include that the Gaussian distance between the brightness value of the target pixel and the brightness value of the anchor point is smaller than the preset distance value. That is, if the Gaussian distance between the brightness value of a pixel in the area to be adjusted and the brightness value of the anchor point is smaller than the preset distance value, that pixel is determined to be the above target pixel.
If the square of the Gaussian distance being smaller than the preset distance value is taken as the preset relationship, the RGB of the pixels satisfying this relationship (for example their brightness, contrast, etc.) is adjusted, so that the target effect of the pixels within this threshold range is affected, where the degree of influence is jointly determined by the slider value and the Gaussian distance. Specifically, the smaller the Gaussian distance, the greater the degree of influence; and the larger the value of the input parameter of the target effect item (for example, the larger the value corresponding to the slider value), the greater the degree of influence. The RGB values of pixels that do not satisfy the preset relationship are left unchanged. By using the preset distance value as a threshold to screen the target pixels, only the qualifying pixels are adjusted, so that the effect of the local adjustment is more coordinated. For example, when a local area such as a cheek is adjusted, the brightness value of the anchor point is large while areas with small brightness values (such as eyebrows) exist in the layer to be processed; by using the preset distance value as a threshold, the pixels of the cheek can be screened as target pixels for subsequent adjustment while the eyebrow area is left unprocessed, so that the local adjustment is more coordinated. If the preset distance value is not used as a threshold, all pixels in the layer to be processed are processed indiscriminately, which easily makes the locally processed image look inconsistent.
The above Gaussian distance can be calculated according to Formula 2:
Formula 2: gaussDist = exp((srcLum - anchorLum)^2 / (-2.0))
where gaussDist is the Gaussian distance, srcLum is the brightness value of the pixel, and anchorLum is the brightness value of the center point.
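A minimal sketch of Formula 2 and of the screening in step 204, again in NumPy; the threshold value is illustrative, and the comparison direction follows the wording of the embodiment ("smaller than the preset distance value"):

    import numpy as np

    def target_pixel_mask(src_lum, anchor_lum, preset_distance):
        # Formula 2: gaussDist = exp((srcLum - anchorLum)^2 / (-2.0))
        gauss_dist = np.exp((src_lum - anchor_lum) ** 2 / -2.0)
        # Pixels whose Gaussian distance to the anchor satisfies the preset
        # relationship are selected as target pixels; the others keep their RGB.
        return gauss_dist < preset_distance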
Step 205: Acquire a first sliding instruction acting on the first slider object, and determine the first slider value according to the first sliding instruction.
Step 206: Determine the corresponding preset mapping table from a plurality of candidate mapping tables according to the parameter range in which the first slider value lies.
In this step, the first sliding instruction acting on the first slider object may be acquired, the first slider value may be determined according to the first sliding instruction, and the corresponding LUT may then be determined according to the parameter range in which the first slider value lies. Specifically, each effect item may correspond to two LUT images, namely a positive LUT image and a negative LUT image, which are used to adjust basic effects of the image such as contrast, brightness, saturation, light perception, color temperature and hue. For example, when the slider value is greater than 0, the positive LUT image is applied to obtain a positive adjustment; when the slider value is less than 0, the negative LUT image is applied to obtain a negative adjustment.
In addition, the input parameter of the target effect item may also be determined according to the first slider value; for example, the larger the first slider value, the larger the input parameter of the target effect item and, correspondingly, the stronger the achieved target effect.
To distinguish the first slider object from the second slider object, their functions can be compared. The first slider object is used to adjust the presentation effect of the effect item; for example, by sliding the first slider object, the contrast, brightness, saturation, light perception, color temperature, hue and the like of a local position in the image can be adjusted. The second slider object is used to adjust the to-be-adjusted range of the effect item; for example, by sliding the second slider object, the range area of the area to be adjusted can be expanded or reduced.
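The embodiment does not spell out how a LUT image is sampled; the sketch below assumes a size x size x size x 3 lookup cube and a simple nearest-neighbour lookup, with the sign of the first slider value choosing between the positive and negative LUT and its magnitude (assumed to be normalised to [0, 1]) acting as the input parameter of the target effect item. All names are illustrative:

    import numpy as np

    def select_lut(slider_value, lut_pos, lut_neg):
        # Slider value > 0 -> positive LUT image, < 0 -> negative LUT image.
        lut = lut_pos if slider_value > 0 else lut_neg
        strength = abs(slider_value)      # input parameter of the target effect item
        return lut, strength

    def apply_lut(image_rgb, lut, strength):
        # Nearest-neighbour lookup into the LUT cube, blended by the strength.
        size = lut.shape[0]
        idx = np.clip(np.rint(image_rgb * (size - 1)).astype(int), 0, size - 1)
        mapped = lut[idx[..., 0], idx[..., 1], idx[..., 2]]
        return (1.0 - strength) * image_rgb + strength * mapped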
Step 207: Adjust the target effect parameter of the target pixel according to the input parameter of the target effect item and the preset mapping table, so as to generate an adjusted image.
Specifically, after the brightness value of each pixel in the layer to be processed is acquired, the target effect parameter of the target pixel can be adjusted according to the input parameter of the target effect item and the preset LUT to generate the adjusted image, wherein the brightness value of the target pixel and the brightness value of the anchor point in the layer to be processed satisfy the preset relationship, and the anchor point is the center point of the area to be adjusted. It should be noted that, in order to make the effect of the local processing more natural, a pixel having a corresponding characteristic relationship with the center point of the area to be adjusted may be selected as the target pixel for processing. In addition, in this embodiment, the above preset LUT can be configured according to the requirements of the product and the effect, so as to match different processing needs.
Step 208: Determine a result image according to the adjusted image and the image to be processed.
After the adjusted image is generated, a result image also needs to be generated according to the adjusted image and the image to be processed, so that the processing effect of the local adjustment is presented in the result image.
Specifically, the mixing intensity of each pixel may be determined according to the transparency of each pixel in the feather layer, and the adjusted image and the image to be processed may then be mixed according to the mixing intensity of each pixel to obtain the above result image.
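Step 208 can be sketched directly from the description: the per-pixel transparency of the feather layer is used as the mixing intensity between the adjusted image and the image to be processed (array names are illustrative):

    def determine_result_image(adjusted_rgb, original_rgb, feather_alpha):
        # feather_alpha is the H x W transparency of the feather layer in [0, 1];
        # it acts as the per-pixel mixing intensity of the adjusted image.
        alpha = feather_alpha[..., None]            # broadcast over the RGB channels
        return alpha * adjusted_rgb + (1.0 - alpha) * original_rgb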
FIG. 7 is a schematic diagram of a local single-item image adjustment interface according to an embodiment of the present disclosure. As shown in FIG. 7, under the brightness item, the slider S04 can be slid to determine the brightness adjustment input parameter, and the local area is adjusted according to the brightness adjustment input parameter. Moreover, after the brightness item adjustment is triggered, the mark on the first anchor point S02 is changed to the brightness icon S05 corresponding to the brightness adjustment.
On the basis of the above embodiments, the embodiments provided by the present disclosure can not only realize local single-item adjustment of the image but also perform local multi-item adjustment of the image; for example, multiple control points can be added to the image to be processed, and a different target effect item can be adjusted for each control point. Specifically, a control point can be added to the image to be processed and the brightness of the area corresponding to that control point adjusted, and another control point can be added to the image to be processed and the saturation of the other area corresponding to that control point adjusted.
FIG. 8 is a schematic diagram of a local multi-item image adjustment interface according to an embodiment of the present disclosure. As shown in FIG. 8, on the image to be processed, brightness can be adjusted over a local range (corresponding to S05), structure over a local range (corresponding to S06), contrast over a local range (corresponding to S07), and saturation over a local range (corresponding to S08). The specific principles of the structure, contrast and saturation adjustments are similar to that of the brightness adjustment and are not repeated here.
It should be noted that, when local multi-item adjustment is performed on the image, the first target effect parameter of the target pixel may be adjusted first according to the first input parameter of the first target effect item and the LUT, and the second target effect parameter of the adjusted target pixel may then be further adjusted according to the second input parameter of the second target effect item and the LUT. When the target pixel belongs to both the first area to be adjusted and the second area to be adjusted (that is, when the target pixel is a common pixel of the first area to be adjusted and the second area to be adjusted), the effect on each target pixel is superimposed on the effects already applied; that is, the output of the previous effect is the input of the next.
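The multi-item behaviour described above (the output of one effect is the input of the next) amounts to simple function composition; a minimal sketch, with each effect represented as an image-to-image callable:

    def apply_effect_items(image_rgb, effect_items):
        # Each element of effect_items maps an image to an adjusted image; where
        # target pixels overlap, later effects are superimposed on earlier ones.
        out = image_rgb
        for adjust in effect_items:
            out = adjust(out)                 # output of the previous effect is the
        return out                            # input of the next one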
Optionally, the first area to be adjusted and the second area to be adjusted may correspond to different anchor points, or may correspond to the same anchor point. In addition, the first target effect item and the second target effect item may be the same target effect item or different target effect items. For example, both the first target effect item and the second target effect item may be brightness adjustment; alternatively, the first target effect item may be brightness adjustment and the second target effect item may be saturation adjustment.
In this embodiment, an anchor point is selected on the image to be processed by inputting a trigger instruction, and the range area of the area to be adjusted is determined by sliding the second slider object, so that the anchor point and the range area jointly determine the area to be adjusted; the corresponding preset mapping table and the input parameter of the target effect item are then determined by sliding the first slider object, and the target effect parameters of the target pixels within the area are adjusted according to the input parameter of the target effect item and the preset mapping table. In this way, the effect of any local position in the image to be processed can be adjusted through simple user operations.
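Putting the pieces above together, a single local adjustment pass could be sketched as follows; this is only an illustrative composition of the earlier hypothetical helpers (feather_mask, luminance, target_pixel_mask, select_lut, apply_lut, determine_result_image), not the disclosure's reference implementation:

    import numpy as np

    def local_adjust(image_rgb, anchor_xy, radius, slider_value,
                     lut_pos, lut_neg, preset_distance=0.9):
        h, w = image_rgb.shape[:2]
        alpha = feather_mask(h, w, anchor_xy, radius)          # area to be adjusted
        lum = luminance(image_rgb)
        anchor_lum = lum[anchor_xy[1], anchor_xy[0]]           # brightness at the anchor
        target = target_pixel_mask(lum, anchor_lum, preset_distance)
        lut, strength = select_lut(slider_value, lut_pos, lut_neg)
        adjusted = image_rgb.copy()
        adjusted[target] = apply_lut(image_rgb, lut, strength)[target]
        return determine_result_image(adjusted, image_rgb, alpha)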
FIG. 9 is a schematic structural diagram of an image processing apparatus according to an exemplary embodiment of the present disclosure. As shown in FIG. 9, the image processing apparatus 300 provided by this embodiment includes:
an acquisition module 301, configured to acquire the brightness value of each pixel in a layer to be processed, the layer to be processed being the feather layer corresponding to the area to be adjusted in the image to be processed;
an adjustment module 302, configured to adjust a target effect parameter of a target pixel according to an input parameter of a target effect item and a preset mapping table to generate an adjusted image, wherein the brightness value of the target pixel and the brightness value of the anchor point in the area to be adjusted satisfy a preset relationship; and
a determination module 303, configured to determine a result image according to the adjusted image and the image to be processed.
In a possible design, the preset relationship includes that the Gaussian distance is smaller than a preset distance value.
In a possible design, the anchor point is the center point of the area to be adjusted.
In a possible design, the acquisition module 301 is further configured to acquire a first sliding instruction acting on a first slider object and determine a first slider value according to the first sliding instruction;
the determination module 303 is further configured to determine the corresponding preset mapping table from a plurality of candidate mapping tables according to the parameter range in which the first slider value lies.
In a possible design, the determination module 303 is further configured to determine the input parameter of the target effect item according to the first slider value.
In a possible design, the adjustment module 302 is specifically configured to:
adjust a first target effect parameter of the target pixel according to a first input parameter of a first target effect item and the preset mapping table; and
further adjust a second target effect parameter of the adjusted target pixel according to a second input parameter of a second target effect item and the preset mapping table.
In a possible design, the first target effect item acts on a first area to be adjusted, the second target effect item acts on a second area to be adjusted, the target pixel is a common pixel of the first area to be adjusted and the second area to be adjusted, and the first area to be adjusted and the second area to be adjusted correspond to different anchor points.
In a possible design, the determination module 303 is specifically configured to:
determine the mixing intensity of each pixel according to the transparency of each pixel in the feather layer; and
mix the adjusted image and the image to be processed according to the mixing intensity of each pixel to obtain the result image.
In a possible design, the target effect item includes at least one of contrast, brightness, saturation, light perception, color temperature and hue.
In a possible design, the determination module 303 is further configured to determine the area to be adjusted according to the anchor point and a range input parameter.
In a possible design, the acquisition module 301 is further configured to acquire a trigger instruction acting on the image to be processed;
the determination module determines the anchor point according to the trigger instruction.
In a possible design, the determination module 303 is specifically configured to:
acquire a sliding instruction acting on a second slider object and determine a second slider value according to the sliding instruction; and
determine the range input parameter according to the second slider value, the range input parameter being used to determine the range area of the area to be adjusted.
It should be noted that the image processing apparatus provided by the embodiment shown in FIG. 9 can be used to perform the method provided by any of the above embodiments; the specific implementation and technical effects are similar and are not repeated here.
FIG. 10 is a schematic structural diagram of an electronic device according to an exemplary embodiment of the present disclosure. As shown in FIG. 10, it illustrates an electronic device 400 suitable for implementing the embodiments of the present disclosure. The terminal device in the embodiments of the present disclosure may include, but is not limited to, mobile terminals with an image acquisition function such as mobile phones, notebook computers, digital broadcast receivers, personal digital assistants (PDA), tablet computers (Portable Android Device, PAD), portable multimedia players (PMP), vehicle-mounted terminals (for example, vehicle-mounted navigation terminals), wearable electronic devices and smart home devices, as well as fixed terminals to which an image acquisition device is externally connected, such as digital TVs and desktop computers. The electronic device shown in FIG. 10 is merely an example and should not impose any limitation on the functions and scope of use of the embodiments of the present disclosure.
As shown in FIG. 10, the electronic device 400 may include a processing apparatus (for example, a central processing unit, a graphics processing unit, etc.) 401, which can execute the functions defined in the methods of the embodiments of the present disclosure according to a program stored in a read-only memory (ROM) 402 or a program loaded from a storage apparatus 408 into a random access memory (RAM) 403. Various programs and data required for the operation of the electronic device 400 are also stored in the RAM 403. The processing apparatus 401, the ROM 402 and the RAM 403 are connected to one another through a bus 404. An input/output (I/O) interface 405 is also connected to the bus 404.
Generally, the following apparatuses may be connected to the I/O interface 405: an input apparatus 406 including, for example, a touch screen, a touch pad, a keyboard, a mouse, a camera, a microphone, an accelerometer, a gyroscope and the like; an output apparatus 407 including, for example, a liquid crystal display (LCD), a speaker, a vibrator and the like; a storage apparatus 408 including, for example, a magnetic tape, a hard disk and the like; and a communication apparatus 409. The communication apparatus 409 may allow the electronic device 400 to communicate wirelessly or by wire with other devices to exchange data. Although FIG. 10 shows the electronic device 400 with various apparatuses, it should be understood that it is not required to implement or provide all of the apparatuses shown; more or fewer apparatuses may alternatively be implemented or provided.
In particular, according to the embodiments of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, an embodiment of the present disclosure includes a computer program product comprising a computer program carried on a non-transitory computer-readable medium, the computer program containing program code for executing the method shown in the flowchart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication apparatus 409, installed from the storage apparatus 408, or installed from the ROM 402. When the computer program is executed by the processing apparatus 401, the above functions defined in the methods of the embodiments of the present disclosure are executed.
It should be noted that the computer-readable medium described above in the present disclosure may be a computer-readable signal medium, a computer-readable storage medium, or any combination of the two. The computer-readable storage medium may be, for example, but is not limited to, an electrical, magnetic, optical, electromagnetic, infrared or semiconductor system, apparatus or device, or any combination of the above. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection with one or more wires, a portable computer disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above. In the present disclosure, a computer-readable storage medium may be any tangible medium containing or storing a program that can be used by or in combination with an instruction execution system, apparatus or device. In the present disclosure, a computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, in which computer-readable program code is carried. Such a propagated data signal may take a variety of forms, including but not limited to an electromagnetic signal, an optical signal, or any suitable combination of the above. A computer-readable signal medium may also be any computer-readable medium other than a computer-readable storage medium; the computer-readable signal medium can send, propagate or transmit a program for use by or in combination with an instruction execution system, apparatus or device. The program code contained on the computer-readable medium may be transmitted by any suitable medium, including but not limited to an electric wire, an optical cable, RF (radio frequency) and the like, or any suitable combination of the above.
In some implementations, the client and the server may communicate using any currently known or future-developed network protocol such as the HyperText Transfer Protocol (HTTP), and may be interconnected with digital data communication in any form or medium (for example, a communication network). Examples of communication networks include a local area network (LAN), a wide area network (WAN), an internetwork (for example, the Internet) and a peer-to-peer network (for example, an ad hoc peer-to-peer network), as well as any currently known or future-developed network.
The above computer-readable medium may be contained in the above electronic device, or it may exist separately without being assembled into the electronic device.
The above computer-readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to execute the functions defined in the methods of the embodiments of the present disclosure. In one embodiment, when the one or more programs are executed by the electronic device, the electronic device may be caused to: acquire the luminance value of each pixel in a layer to be processed, where the layer to be processed includes the feather layer corresponding to an area to be adjusted in an image to be processed; adjust the target effect parameters of target pixels according to the input parameter of a target effect item and a preset mapping table, so as to generate an adjusted image, where the luminance value of a target pixel and the luminance value of an anchor point in the area to be adjusted satisfy a preset relationship; and determine a result image according to the adjusted image and the image to be processed.
The computer program code for performing the operations of the present disclosure may be written in one or more programming languages or a combination thereof, including but not limited to object-oriented programming languages such as Java, Smalltalk and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages. The program code may be executed entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case involving a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the accompanying drawings illustrate the possible architectures, functions and operations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowcharts or block diagrams may represent a module, a program segment or a portion of code that contains one or more executable instructions for implementing the specified logical function. It should also be noted that, in some alternative implementations, the functions noted in the blocks may occur in an order different from that noted in the drawings; for example, two blocks shown in succession may in fact be executed substantially in parallel, or they may sometimes be executed in the reverse order, depending on the functions involved. It should also be noted that each block of the block diagrams and/or flowcharts, and combinations of blocks in the block diagrams and/or flowcharts, can be implemented by a dedicated hardware-based system that performs the specified functions or operations, or by a combination of dedicated hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented by software or by hardware, and the name of a unit does not, in some cases, constitute a limitation on the unit itself.
The functions described herein above may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that can be used include: field programmable gate arrays (FPGA), application specific integrated circuits (ASIC), application specific standard products (ASSP), systems on chip (SOC), complex programmable logic devices (CPLD) and so on.
In the context of the present disclosure, a machine-readable medium may be a tangible medium that can contain or store a program for use by or in combination with an instruction execution system, apparatus or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared or semiconductor system, apparatus or device, or any suitable combination of the above. More specific examples of the machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above.
In a first aspect, according to one or more embodiments of the present disclosure, an image processing method is provided, including:
acquiring the luminance value of each pixel in a layer to be processed, where the layer to be processed includes the feather layer corresponding to an area to be adjusted in an image to be processed;
adjusting the target effect parameters of target pixels according to the input parameter of a target effect item and a preset mapping table, so as to generate an adjusted image, where the luminance value of a target pixel and the luminance value of an anchor point in the layer to be processed satisfy a preset relationship; and
determining a result image according to the adjusted image and the image to be processed.
According to one or more embodiments of the present disclosure, the preset relationship includes the Gaussian distance being less than a preset distance value.
According to one or more embodiments of the present disclosure, the anchor point is the center point of the area to be adjusted.
According to one or more embodiments of the present disclosure, before adjusting the target effect parameters of the target pixels according to the input parameter of the target effect item and the preset mapping table, the method further includes:
obtaining a first sliding instruction acting on a first slider object, and determining a first slider value according to the first sliding instruction; and
determining the corresponding preset mapping table from a plurality of candidate mapping tables according to the parameter range in which the first slider value falls.
According to one or more embodiments of the present disclosure, after determining the first slider value according to the first sliding instruction, the method further includes:
determining the input parameter of the target effect item according to the first slider value.
According to one or more embodiments of the present disclosure, adjusting the target effect parameters of the target pixels according to the input parameter of the target effect item and the preset mapping table includes:
adjusting a first target effect parameter of the target pixel according to a first input parameter of a first target effect item and the preset mapping table; and
further adjusting a second target effect parameter of the adjusted target pixel according to a second input parameter of a second target effect item and the preset mapping table.
According to one or more embodiments of the present disclosure, the first target effect item acts on a first area to be adjusted, the second target effect item acts on a second area to be adjusted, the target pixel is a common pixel of the first area to be adjusted and the second area to be adjusted, and the first area to be adjusted and the second area to be adjusted correspond to different anchor points.
According to one or more embodiments of the present disclosure, determining the result image according to the adjusted image and the image to be processed includes:
determining the blending strength of each pixel according to the transparency of each pixel in the feather layer; and
blending the adjusted image and the image to be processed according to the blending strength of each pixel to obtain the result image.
According to one or more embodiments of the present disclosure, the target effect item includes at least one of contrast, brightness, saturation, light sensitivity, color temperature and hue.
According to one or more embodiments of the present disclosure, before acquiring the luminance value of each pixel in the layer to be processed, the method further includes:
determining the area to be adjusted according to the anchor point and a range input parameter.
According to one or more embodiments of the present disclosure, before determining the area to be adjusted according to the anchor point and the range input parameter, the method further includes:
obtaining a trigger instruction acting on the image to be processed; and
determining the anchor point according to the trigger instruction.
According to one or more embodiments of the present disclosure, determining the area to be adjusted according to the anchor point and the range input parameter includes:
obtaining a second sliding instruction acting on a second slider object, and determining a second slider value according to the second sliding instruction; and
determining the range input parameter according to the second slider value, where the range input parameter is used to determine the area covered by the area to be adjusted.
In a second aspect, according to one or more embodiments of the present disclosure, an image processing apparatus is provided, including:
an acquisition module, configured to acquire the luminance value of each pixel in a layer to be processed, where the layer to be processed is the feather layer corresponding to an area to be adjusted in an image to be processed;
an adjustment module, configured to adjust the target effect parameters of target pixels according to the input parameter of a target effect item and a preset mapping table, so as to generate an adjusted image, where the luminance value of a target pixel and the luminance value of an anchor point in the area to be adjusted satisfy a preset relationship; and
a determination module, configured to determine a result image according to the adjusted image and the image to be processed.
In a possible design, the preset relationship includes the Gaussian distance being less than a preset distance value.
According to one or more embodiments of the present disclosure, the anchor point is the center point of the area to be adjusted.
In a possible design, the acquisition module is further configured to obtain a first sliding instruction acting on a first slider object and determine a first slider value according to the first sliding instruction;
and the determination module is further configured to determine the corresponding preset mapping table from a plurality of candidate mapping tables according to the parameter range in which the first slider value falls.
In a possible design, the determination module is further configured to determine the input parameter of the target effect item according to the first slider value.
In a possible design, the adjustment module is specifically configured to:
adjust a first target effect parameter of the target pixel according to a first input parameter of a first target effect item and the preset mapping table; and
further adjust a second target effect parameter of the adjusted target pixel according to a second input parameter of a second target effect item and the preset mapping table.
In a possible design, the first target effect item acts on a first area to be adjusted, the second target effect item acts on a second area to be adjusted, the target pixel is a common pixel of the first area to be adjusted and the second area to be adjusted, and the first area to be adjusted and the second area to be adjusted correspond to different anchor points.
In a possible design, the determination module is specifically configured to:
determine the blending strength of each pixel according to the transparency of each pixel in the feather layer; and
blend the adjusted image and the image to be processed according to the blending strength of each pixel to obtain the result image.
In a third aspect, the present disclosure further provides an electronic device, including:
a processor; and
a memory for storing instructions executable by the processor;
where the processor is configured to execute, via the executable instructions, any one of the possible image processing methods of the first aspect.
In a fourth aspect, an embodiment of the present disclosure further provides a computer-readable storage medium having a computer program stored thereon which, when executed by a processor, implements any one of the possible image processing methods of the first aspect.
In a fifth aspect, an embodiment of the present disclosure further provides a computer program product, including a computer program which, when executed by a processor, implements any one of the possible image processing methods of the first aspect.
In a sixth aspect, an embodiment of the present disclosure provides a computer program which, when executed by a processor, implements any one of the possible image processing methods of the first aspect.
The above description is only a description of the preferred embodiments of the present disclosure and of the technical principles employed. Those skilled in the art should understand that the scope of disclosure involved in the present disclosure is not limited to technical solutions formed by the specific combination of the above technical features, and should also cover other technical solutions formed by any combination of the above technical features or their equivalent features without departing from the disclosed concept, for example, technical solutions formed by replacing the above features with technical features having similar functions disclosed in (but not limited to) the present disclosure.
In addition, although the operations are depicted in a particular order, this should not be understood as requiring that these operations be performed in the particular order shown or in sequential order. In certain circumstances, multitasking and parallel processing may be advantageous. Likewise, although several specific implementation details are included in the above discussion, these should not be construed as limiting the scope of the present disclosure. Certain features described in the context of separate embodiments can also be implemented in combination in a single embodiment; conversely, various features described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable sub-combination.
Although the subject matter has been described in language specific to structural features and/or methodological logical acts, it should be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are merely example forms of implementing the claims.

Claims (17)

  1. An image processing method, comprising:
    acquiring the luminance value of each pixel in a layer to be processed, wherein the layer to be processed comprises a feather layer corresponding to an area to be adjusted in an image to be processed;
    adjusting target effect parameters of target pixels according to an input parameter of a target effect item and a preset mapping table, so as to generate an adjusted image, wherein the luminance value of a target pixel and the luminance value of an anchor point in the area to be adjusted satisfy a preset relationship; and
    determining a result image according to the adjusted image and the image to be processed.
  2. The image processing method according to claim 1, wherein the preset relationship comprises a Gaussian distance being less than a preset distance value.
  3. The image processing method according to claim 1 or 2, wherein the anchor point is a center point of the area to be adjusted.
  4. The image processing method according to any one of claims 1-3, wherein before the adjusting the target effect parameters of the target pixels according to the input parameter of the target effect item and the preset mapping table, the method further comprises:
    obtaining a first sliding instruction acting on a first slider object, and determining a first slider value according to the first sliding instruction; and
    determining the corresponding preset mapping table from a plurality of candidate mapping tables according to a parameter range in which the first slider value falls.
  5. The image processing method according to claim 4, wherein after the determining the first slider value according to the first sliding instruction, the method further comprises:
    determining the input parameter of the target effect item according to the first slider value.
  6. The image processing method according to any one of claims 1-5, wherein the adjusting the target effect parameters of the target pixels according to the input parameter of the target effect item and the preset mapping table comprises:
    adjusting a first target effect parameter of the target pixel according to a first input parameter of a first target effect item and the preset mapping table; and
    further adjusting a second target effect parameter of the adjusted target pixel according to a second input parameter of a second target effect item and the preset mapping table.
  7. The image processing method according to claim 6, wherein the first target effect item acts on a first area to be adjusted, the second target effect item acts on a second area to be adjusted, the target pixel is a common pixel of the first area to be adjusted and the second area to be adjusted, and the first area to be adjusted and the second area to be adjusted correspond to different anchor points.
  8. The image processing method according to any one of claims 1-7, wherein the determining the result image according to the adjusted image and the image to be processed comprises:
    determining a blending strength of each pixel according to a transparency of each pixel in the feather layer; and
    blending the adjusted image and the image to be processed according to the blending strength of each pixel to obtain the result image.
  9. The image processing method according to any one of claims 1-8, wherein the target effect item comprises at least one of contrast, brightness, saturation, light sensitivity, color temperature and hue.
  10. The image processing method according to any one of claims 1-9, wherein before the acquiring the luminance value of each pixel in the layer to be processed, the method further comprises:
    determining the area to be adjusted according to the anchor point and a range input parameter.
  11. The image processing method according to claim 10, wherein before the determining the area to be adjusted according to the anchor point and the range input parameter, the method further comprises:
    obtaining a trigger instruction acting on the image to be processed; and
    determining the anchor point according to the trigger instruction.
  12. The image processing method according to claim 10 or 11, wherein the determining the area to be adjusted according to the anchor point and the range input parameter comprises:
    obtaining a second sliding instruction acting on a second slider object, and determining a second slider value according to the second sliding instruction; and
    determining the range input parameter according to the second slider value, wherein the range input parameter is used to determine an area covered by the area to be adjusted.
  13. An image processing apparatus, comprising:
    an acquisition module, configured to acquire the luminance value of each pixel in a layer to be processed, wherein the layer to be processed is a feather layer corresponding to an area to be adjusted in an image to be processed;
    an adjustment module, configured to adjust target effect parameters of target pixels according to an input parameter of a target effect item and a preset mapping table, so as to generate an adjusted image, wherein the luminance value of a target pixel and the luminance value of an anchor point in the layer to be processed satisfy a preset relationship; and
    a determination module, configured to determine a result image according to the adjusted image and the image to be processed.
  14. An electronic device, comprising: at least one processor and a memory;
    wherein the memory stores computer-executable instructions; and
    the at least one processor executes the computer-executable instructions stored in the memory, so that the at least one processor performs the image processing method according to any one of claims 1-12.
  15. A computer-readable storage medium, wherein the computer-readable storage medium stores computer-executable instructions which, when executed by a processor, implement the image processing method according to any one of claims 1-12.
  16. A computer program product, comprising a computer program, wherein the computer program, when executed by a processor, implements the image processing method according to any one of claims 1-12.
  17. A computer program, wherein the computer program, when executed by a processor, implements the image processing method according to any one of claims 1-12.
PCT/CN2021/132592 2020-12-31 2021-11-23 Image processing method and apparatus, electronic device and storage medium WO2022142875A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US18/344,759 US20230360286A1 (en) 2020-12-31 2023-06-29 Image processing method and apparatus, electronic device and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011630469.5A CN112767238A (zh) 2020-12-31 2020-12-31 图像处理方法、装置、电子设备及存储介质
CN202011630469.5 2020-12-31

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/132594 Continuation-In-Part WO2022142876A1 (zh) Image processing method and apparatus, electronic device and storage medium

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US18/344,759 Continuation-In-Part US20230360286A1 (en) 2020-12-31 2023-06-29 Image processing method and apparatus, electronic device and storage medium

Publications (1)

Publication Number Publication Date
WO2022142875A1 true WO2022142875A1 (zh) 2022-07-07

Family

ID=75699577

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/132592 WO2022142875A1 (zh) Image processing method and apparatus, electronic device and storage medium

Country Status (2)

Country Link
CN (1) CN112767238A (zh)
WO (1) WO2022142875A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115619904A (zh) * 2022-09-09 2023-01-17 北京字跳网络技术有限公司 图像处理方法、装置及设备
CN115880168A (zh) * 2022-09-30 2023-03-31 北京字跳网络技术有限公司 图像修复方法、装置、设备、计算机可读存储介质及产品

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112767238A (zh) * 2020-12-31 2021-05-07 北京字跳网络技术有限公司 Image processing method and apparatus, electronic device and storage medium
CN113763287A (zh) * 2021-09-27 2021-12-07 北京市商汤科技开发有限公司 Image processing method and apparatus, electronic device and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160260203A1 (en) * 2014-07-16 2016-09-08 Shenzhen Tcl New Technology Co., Ltd Method for acquiring histogram, method for dynamically adjusting luminance and image processing apparatus
CN111127543A (zh) * 2019-12-23 2020-05-08 北京金山安全软件有限公司 Image processing method and apparatus, electronic device and storage medium
CN111489429A (zh) * 2020-04-16 2020-08-04 诚迈科技(南京)股份有限公司 Image rendering control method, terminal device and storage medium
CN112102154A (zh) * 2020-08-20 2020-12-18 北京百度网讯科技有限公司 Image processing method and apparatus, electronic device and storage medium
CN112767238A (zh) * 2020-12-31 2021-05-07 北京字跳网络技术有限公司 Image processing method and apparatus, electronic device and storage medium

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8538148B2 (en) * 2010-09-24 2013-09-17 Intel Corporation Brightness enhancement method, system and apparatus for low power architectures
CN107705247B (zh) * 2017-10-31 2021-06-15 努比亚技术有限公司 Image saturation adjustment method, terminal and storage medium
CN111583103B (zh) * 2020-05-14 2023-05-16 抖音视界有限公司 Face image processing method and apparatus, electronic device and computer storage medium
CN111598813B (zh) * 2020-05-25 2023-05-19 抖音视界有限公司 Face image processing method and apparatus, electronic device and computer-readable medium
CN111724329B (зh) * 2020-07-03 2022-03-01 北京字节跳动网络技术有限公司 Image processing method and apparatus, and electronic device


Also Published As

Publication number Publication date
CN112767238A (zh) 2021-05-07


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21913617

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 23.10.2023)