CN117522760B - Image processing method, device, electronic equipment, medium and product


Info

Publication number: CN117522760B
Application number: CN202311508982.0A
Authority: CN (China)
Prior art keywords: pixel, value, range, reserved, hue
Legal status: Active (granted)
Other versions: CN117522760A (application publication)
Other languages: Chinese (zh)
Inventor: 王萍萍
Assignee (original and current): Shuhang Technology Beijing Co ltd
Events: application filed by Shuhang Technology Beijing Co ltd with priority to CN202311508982.0A; publication of application CN117522760A; application granted; publication of CN117522760B.

Classifications

    • G06T 7/90 — Determination of colour characteristics (under G06T 7/00 Image analysis; G06T Image data processing or generation, in general; G06 Computing; calculating or counting; G Physics)
    • G06T 2207/10024 — Color image (under G06T 2207/10 Image acquisition modality; G06T 2207/00 Indexing scheme for image analysis or image enhancement)


Abstract

The embodiments of this application disclose an image processing method, apparatus, electronic device, medium, and product, applicable to the technical field of data processing. The method comprises the following steps: acquiring a business image that includes a target object, and acquiring the mask region of the target object in the business image; determining, from K hue value ranges, the hue value range to which each of the N pixel points belongs based on its hue value, and counting the number of pixel points corresponding to each of the K hue value ranges; determining the reserved hue value range of the business image from the K ranges based on those pixel counts; and performing special effect processing on the business image based on the initial pixel values of the pixel points corresponding to the reserved hue value range, the processed business image being determined as the target image. The embodiments of this application improve the efficiency of special effect processing on images.

Description

Image processing method, device, electronic equipment, medium and product
Technical Field
The present application relates to the field of data processing technologies, and in particular, to an image processing method, an image processing device, an electronic device, a medium, and a product.
Background
Currently, when applying a special effect to an image, the region requiring processing is either selected manually or identified by a complex deep learning model.
In practice, the inventor found that manual selection of the region to be processed is prone to inaccuracy, and the business object (such as a user) may have to repeat the selection operation, making special effect processing inefficient. When the region is instead identified by deep learning, deep computation over the image data is required; once the image content is complex or the image data is large, region identification becomes time-consuming, again making special effect processing inefficient. How to improve the efficiency of special effect processing on images is therefore a problem to be solved.
Disclosure of Invention
The embodiment of the application provides an image processing method, an image processing device, electronic equipment, a medium and a product, which are beneficial to improving the efficiency of special effect processing on images.
In one aspect, an embodiment of the present application discloses an image processing method, including:
Acquiring a business image comprising a target object, and acquiring a mask area of the target object in the business image; the pixel points included in the business image all have initial pixel values; the mask region comprises N pixel points, wherein N is a positive integer;
Acquiring a hue value of each pixel point in N pixel points, determining a hue value range of each pixel point from K hue value ranges based on the hue value of each pixel point in the N pixel points, and determining the number of pixel points corresponding to each hue value range based on the pixel points corresponding to each hue value range in the K hue value ranges; k is a positive integer greater than 1; the K hue value ranges are different; the hue value of each pixel is determined based on the initial pixel value of each pixel;
Determining reserved hue value ranges in the business image from the K hue value ranges based on the number of pixel points corresponding to each hue value range;
and carrying out special effect processing on the business image based on the initial pixel value of the pixel point corresponding to the reserved hue value range, and determining the business image after the special effect processing as a target image.
In one aspect, an embodiment of the present application discloses an image processing apparatus, including:
An acquisition unit, configured to acquire a service image including a target object, and acquire a mask area of the target object in the service image; the pixel points included in the business image all have initial pixel values; the mask region comprises N pixel points, wherein N is a positive integer;
The processing unit is used for acquiring the hue value of each pixel point in the N pixel points, determining the hue value range of each pixel point from K hue value ranges based on the hue value of each pixel point in the N pixel points, and determining the number of the pixel points corresponding to each hue value range based on the pixel points corresponding to each hue value range in the K hue value ranges; k is a positive integer greater than 1; the K hue value ranges are different; the hue value of each pixel is determined based on the initial pixel value of each pixel;
The processing unit is further configured to determine reserved hue value ranges in the business image from the K hue value ranges based on the number of pixel points corresponding to each hue value range;
And the processing unit is also used for carrying out special effect processing on the business image based on the initial pixel value of the pixel point corresponding to the reserved hue value range, and determining the business image after the special effect processing as a target image.
In one aspect, an embodiment of the present application provides an electronic device, including a processor and a memory, where the memory is configured to store a computer program including program instructions, and the processor is configured to invoke the program instructions to perform the image processing method described above.
In one aspect, embodiments of the present application provide a computer-readable storage medium having computer program instructions stored therein which, when executed by a processor, perform the image processing method described above.
In one aspect, embodiments of the present application provide a computer program product or computer program comprising computer instructions which, when executed by a processor, implement the method provided in the aspect above.
With the embodiments of the present application, the number of pixel points corresponding to each of the K hue value ranges can be determined from the hue values of the pixel points in the mask region of the target object in the business image; the reserved hue value range is then determined from these counts, and special effect processing is performed on the business image based on the reserved hue value range to obtain the target image. The reserved hue value range for special effect processing can thus be found simply by counting the pixel points corresponding to each hue value range, so neither manual selection of a precise special effect area nor computation through a complex deep network is required. Because the special effect processing is driven by per-pixel information of the image, the computation of the whole process is small, and the efficiency of special effect processing on images is improved.
Drawings
To illustrate the technical solutions of the embodiments of the present application more clearly, the drawings required for describing the embodiments are briefly introduced below. The drawings described below show some embodiments of the present application; a person skilled in the art may derive other drawings from them without inventive effort.
Fig. 1 is a schematic structural diagram of an image processing system according to an embodiment of the present application;
Fig. 2 is a schematic view of an image processing procedure according to an embodiment of the present application;
Fig. 3 is a schematic flowchart of an image processing method according to an embodiment of the present application;
Fig. 4 is a schematic diagram of a mask area determination process according to an embodiment of the present application;
Fig. 5 is a schematic diagram showing the effect of a hue value range according to an embodiment of the present application;
Fig. 6 is a schematic flowchart of an image processing method according to an embodiment of the present application;
Fig. 7 is a schematic diagram showing the effect of sub-ranges in a hue value range according to an embodiment of the present application;
Fig. 8 is a schematic flowchart of an image processing method according to an embodiment of the present application;
Fig. 9 is a schematic diagram of a special effect processing scenario according to an embodiment of the present application;
Fig. 10 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present application;
Fig. 11 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the accompanying drawings in the embodiments of the present application.
An embodiment of the present application provides an image processing method that determines the number of pixel points corresponding to each of K hue value ranges based on the hue values of the pixel points in the mask region of the target object in the business image, determines the reserved hue value range from those counts, and then performs special effect processing on the business image based on the reserved hue value range to obtain the target image. The reserved hue value range for special effect processing can thus be determined simply by counting the pixel points corresponding to each hue value range; neither manual selection of special effect areas nor computation of a complex deep network is needed, which improves the efficiency of special effect processing on images.
In one possible implementation, the above image processing scheme may be applied in an image processing system. Referring to Fig. 1, Fig. 1 is a schematic structural diagram of an image processing system according to an embodiment of the application. As shown in Fig. 1, the image processing system may include a client 11 and a server 12. The client 11 may run on a terminal device, and may be social software, audio/video software, a community e-commerce platform, or the like, which is not limited herein. The server 12 provides services to the client, such as supplying resources to the client and saving client data.
It will be appreciated that data interaction may be performed between the client 11 and the server 12. For example, when a business object (e.g., user A) publishes a message containing an image on a social platform through client A, client A may send the message data to the server, and the server, after receiving it, may forward the message data to client B corresponding to another user (e.g., user B). The server may likewise send messages published by other business objects to client A corresponding to user A.
Further, referring to Fig. 2, Fig. 2 is a schematic view of an image processing procedure according to an embodiment of the present application. As shown in Fig. 2, the acquired business image 21a may include a target object 211a, where the target object may be a person, an animal, an object, or the like, which is not limited herein. A mask region of the target object in the business image is acquired, as shown at 212a in Fig. 2. The business image comprises N pixel points, each with an initial pixel value, from which the hue value corresponding to each pixel point can be determined. The hue value range to which each pixel point belongs may then be determined from its hue value, and the number of pixel points corresponding to each hue value range counted. For example, as shown at 202a in Fig. 2, the K hue value ranges may include hue value range 1, hue value range 2, ..., hue value range 6, each associated with a corresponding pixel count; e.g., hue value range 1 corresponds to pixel count r1, hue value range 2 to pixel count r2, and so on.
Further, a reserved hue value range (shown as 203a in Fig. 2) is determined in the business image from the K hue value ranges based on the number of pixel points corresponding to each range; special effect processing is then performed on the business image based on the initial pixel values of the pixel points corresponding to the reserved hue value range, and the processed business image is determined as the target image, as shown by target image 22a in Fig. 2.
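The walkthrough above can be sketched end to end in Python. The selection rule used here — reserving the `top` most frequent range(s) — and the function name are illustrative assumptions; the embodiments only require that the reserved ranges be derived from the per-range pixel counts.

```python
def reserved_hue_ranges(mask_pixels, k=6, top=1):
    """Sketch of the pipeline: from the initial (r, g, b) pixel values of
    the N pixel points in the mask region, count pixel points per hue
    value range and return the indices of the reserved range(s).

    The "keep the `top` most frequent ranges" rule is an illustrative
    assumption, not mandated by the patent text.
    """
    counts = [0] * k
    width = 360.0 / k
    for r, g, b in mask_pixels:
        # Inline RGB -> hue conversion (standard HSV hue formula).
        r, g, b = r / 255.0, g / 255.0, b / 255.0
        c_max, c_min = max(r, g, b), min(r, g, b)
        delta = c_max - c_min
        if delta == 0:
            hue = 0.0                    # achromatic pixel
        elif c_max == r:
            hue = 60.0 * (((g - b) / delta) % 6)
        elif c_max == g:
            hue = 60.0 * (((b - r) / delta) + 2)
        else:
            hue = 60.0 * (((r - g) / delta) + 4)
        # Shift by half a range width so the range straddling 0 degrees
        # (330-30 for k = 6) is one contiguous bin, then count it.
        counts[int(((hue + width / 2.0) % 360.0) // width)] += 1
    # Reserve the `top` range(s) with the most pixel points.
    return sorted(range(k), key=lambda j: counts[j], reverse=True)[:top]
```

For a mask region dominated by red pixels, for example, the red-centered range (index 0, covering 330-30 degrees for k = 6) is reserved.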
It can be appreciated that the embodiment of the present application involves no complex computing logic, and the above image processing scheme may be implemented by a GPU (graphics processing unit) in an electronic device, which can effectively improve processing efficiency compared to a CPU (central processing unit) based implementation: image rendering is ultimately realized by the GPU, and when the GPU renders an image it does so by determining the pixel value corresponding to each pixel point. Moreover, the present application performs color-system classification of the picture (i.e., division into hue value ranges) using per-pixel information in HSV space (namely the hue value of each pixel point); the algorithm complexity is low and can meet the performance requirements of real-time rendering on mobile terminals. Furthermore, in actual software development, this scheme can be fully developed and implemented by image rendering personnel, shortening the development and production pipeline while preserving the quality of the special effect processing.
It should be noted that all object data collected by the present application (such as resources with which the target object interacts) are collected with the consent and authorization of the object (such as a person, enterprise, or organization), and the collection, use, and processing of the related object data must comply with the relevant laws, regulations, and standards of the relevant countries and regions. For example, the present application may display a prompt message before and during the collection of the object's related data, so that the steps of obtaining the object's related data are executed only after a confirmation operation on the prompt is obtained from the object; otherwise (i.e., no confirmation of the prompt is obtained), the relevant steps end and the object's related data are not obtained.
It will be appreciated that the above image processing scheme may be applied to an electronic device; for example, the electronic device may be the terminal device on which the above client 11 runs, or may be the server 12. If the scheme is applied to the terminal device on which a client (e.g., client 11) runs, the terminal device may directly acquire the business image, perform special effect processing on it, and directly display the resulting target image. If the scheme is applied to a server (e.g., server 12), the business object can select, in the terminal device, a business image that needs special effect processing; the terminal device sends the business image to the server, the server performs special effect processing on it to obtain the target image, and the target image is returned to the terminal device so that the terminal device can display it.
The electronic device can thus execute the image processing scheme: when the business image is acquired, the reserved hue value range is determined based on the hue values of the pixel points in the mask region where the target object is located, and special effect processing is then performed on the business image based on the reserved hue value range to obtain the target image. Special effect processing of the business image is therefore realized through a simple process, improving the efficiency of special effect processing on images. The terminal device may include, but is not limited to, mobile phones, computers, intelligent voice interaction devices, smart home appliances, vehicle-mounted terminals, aircraft, smart speakers, and the like. The server may be an independent physical server, a server cluster or distributed system formed by multiple physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, CDNs, big data, and artificial intelligence platforms.
It can be understood that the above scenario is merely an example, and does not constitute a limitation on the application scenario of the technical solution provided by the embodiment of the present application, and the technical solution of the present application may also be applied to other scenarios. For example, as one of ordinary skill in the art can know, with the evolution of the system architecture and the appearance of new service scenarios, the technical solution provided by the embodiment of the present application is also applicable to similar technical problems.
Based on the above description, an embodiment of the present application proposes an image processing method. Referring to fig. 3, fig. 3 is a flowchart of an image processing method according to an embodiment of the application. The method may be performed by the electronic device described above. The image processing method may include steps S101 to S104.
S101, acquiring a business image comprising a target object, and acquiring a mask area of the target object in the business image; the pixel points included in the business image all have initial pixel values; the mask region comprises N pixel points, wherein N is a positive integer.
The business image refers to an image to be subjected to special effect processing. The target object is an object to be subjected to special effect processing in the business image. The target object may be a person, an animal, an object, etc., without limitation. The target object may be an object in a real environment, or may be an object in a virtual environment, such as a three-dimensional modeled game object in a game scene, and the like, which is not limited herein.
It will be appreciated that the mask region is the region where the target object is located. The mask region may include N pixel points, where N is less than or equal to the total number of pixel points in the business image. The mask region may be determined based on a mask image associated with the target object. The mask image is a binary image composed of 0s and 1s whose size matches that of the business image; when the region of the target object in the business image is determined from the mask image, the mask region consists of the pixel points in the business image located at the same positions as the 1-valued pixel points of the mask image. In other words, the mask image covers the areas of the business image other than the region where the target object is located (a masking operation), so that subsequent processing can conveniently operate only on the pixel points in that region (i.e., the mask region), reducing the number of pixel points to be processed and improving image processing efficiency. Optionally, in some scenarios, the mask region may be the entire area of the business image, which is not limited herein. For example, the business image may show a local area of some object where the whole image needs special effect processing, or may be a landscape image where the target object is everything in the image; the mask region is then the entire area of the business image.
The mask image may be determined by a pre-trained neural network: for example, the network identifies the foreground region where the target object is located, the pixel values of the pixel points in the foreground region are set to 1, and those outside it are set to 0, yielding the mask image; the mask region is the identified foreground region. The mask image may also be determined from a region selection operation of a business object (such as a user): the pixel values of the pixel points in the region selected by the business object are set to 1 and the rest to 0, and the mask region is the selected region. It can be understood that the region determined for the mask image can be rough: only the approximate position of the target object is needed, and no precise delineation of every small area is required, so the computational cost and time are low and the efficiency of the whole special effect processing is preserved.
For example, referring to Fig. 4, Fig. 4 is a schematic diagram illustrating a mask region determination process according to an embodiment of the present application. As shown in Fig. 4, the business image 41a includes the target object 411a, and the mask image corresponding to the business image 41a is shown as mask image 42a. In mask image 42a, the pixel values of the pixel points in the area indicated by 421a are 1, and the pixel values of the pixel points in the remaining areas are 0. Masking business image 41a with mask image 42a yields mask region 43a: in the resulting masked image, all areas other than mask region 43a are covered.
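The masking operation described above can be sketched in Python. The sketch assumes the business image and mask image are equally sized nested lists; this data layout and the function name are illustrative, not taken from the patent.

```python
def mask_region_pixels(business_image, mask_image):
    """Collect the initial pixel values of the N pixel points inside
    the mask region.

    business_image: H x W grid of (r, g, b) initial pixel values.
    mask_image:     H x W grid of 0/1 values of the same size; a 1
                    marks a pixel point inside the region where the
                    target object is located.
    """
    region = []
    for image_row, mask_row in zip(business_image, mask_image):
        for pixel_value, mask_bit in zip(image_row, mask_row):
            if mask_bit == 1:       # pixel lies inside the mask region
                region.append(pixel_value)
    return region
```

Only the returned pixel values need to be processed in the later steps, which is exactly how masking reduces the number of pixel points to handle.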
The initial pixel value is the pixel value of a pixel point in the business image. It will be appreciated that the business image may be a three-channel image, in which case an initial pixel value includes a value on each of the three channels: a first channel value, a second channel value, and a third channel value. For example, if the business image is an RGB (red green blue) three-channel image, the first channel value corresponds to the R (red) channel, the second channel value to the G (green) channel, and the third channel value to the B (blue) channel.
It can be appreciated that when the mask region of the target object is obtained, it may be converted into a texture image (texture) and passed to a fragment shader in the GPU, so that the fragment shader of the electronic device can subsequently perform special effect processing on the business image.
S102, acquiring a hue value of each pixel point in N pixel points, determining a hue value range of each pixel point from K hue value ranges based on the hue value of each pixel point in N pixel points, and determining the number of pixel points corresponding to each hue value range based on the pixel points corresponding to each hue value range in K hue value ranges; k is a positive integer greater than 1; the K hue value ranges are different; the hue value of each pixel is determined based on the initial pixel value of each pixel.
The hue value is the value of a pixel point on the hue axis; hue is a basic attribute of color, corresponding to common color names such as red and yellow. Hue values range from 0° to 360°.
In one embodiment, the hue value of a pixel point may be computed from the first, second, and third channel values of its initial pixel value. Specifically, a hue conversion formula for converting an initial pixel value into a hue value may be obtained, and the initial pixel value of each pixel point substituted into it to obtain the pixel point's hue value. For example, if the business image is an RGB three-channel image and the first, second, and third channel values of a pixel point's initial pixel value are (r1, g1, b1), then r1, g1, and b1 are input into the hue conversion formula for converting RGB values into hue values, yielding the hue value of the pixel point.
Specifically, substituting the initial pixel value of each pixel point into the hue conversion formula to obtain its hue value may include: converting the first, second, and third channel values into a target value range to obtain a first, second, and third target value; determining the channel maximum and channel minimum among the three target values; and evaluating the hue conversion formula with the channel maximum, channel minimum, first target value, second target value, and third target value to obtain the hue value of each pixel point. It is understood that the computed hue value is converted from radians to degrees, with a value range of 0-360°.
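The steps above match the standard RGB-to-HSV hue computation. A minimal sketch for 8-bit channel values follows; the function name and the [0, 1] normalization convention are illustrative assumptions, not specified by the patent.

```python
def rgb_to_hue(r, g, b):
    """Convert 8-bit RGB channel values to a hue angle in [0, 360).

    Channels are first mapped into the [0, 1] target value range; the
    hue is then derived from the channel maximum, channel minimum, and
    their difference, per the standard HSV hue formula.
    """
    # Convert each channel value into the target value range [0, 1].
    r, g, b = r / 255.0, g / 255.0, b / 255.0
    c_max, c_min = max(r, g, b), min(r, g, b)
    delta = c_max - c_min
    if delta == 0:               # achromatic pixel: hue undefined, use 0
        return 0.0
    if c_max == r:
        hue = 60.0 * (((g - b) / delta) % 6)
    elif c_max == g:
        hue = 60.0 * (((b - r) / delta) + 2)
    else:                        # c_max == b
        hue = 60.0 * (((r - g) / delta) + 4)
    return hue
```

Pure red maps to 0°, pure green to 120°, and pure blue to 240°, consistent with the 0-360° hue circle used throughout the description.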
It is understood that a hue value range is a range of hue values, also called a color space. The K hue value ranges may be obtained by dividing the full range of hue values into K mutually non-overlapping ranges. The value of K may be determined by actual requirements, for example K = 6 or K = 12. Since hue values span 0-360°, that span can be divided uniformly into K ranges.
For example, referring to Fig. 5, Fig. 5 is a schematic diagram showing the effect of hue value ranges according to an embodiment of the present application. As shown in Fig. 5, taking K = 6 as an example, 6 hue value ranges are determined: hue value range 1 is 330-360° (360° coincides with 0°) together with 0-30°, hue value range 2 is 30-90°, hue value range 3 is 90-150°, hue value range 4 is 150-210°, hue value range 5 is 210-270°, and hue value range 6 is 270-330°. Which hue value range a boundary value belongs to may be decided according to practical requirements; for example, the boundary value 90° may be assigned to hue value range 2 or to hue value range 3.
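The K = 6 division of Fig. 5 can be expressed as a closed-form index computation. Ranges are 0-indexed below (index 0 corresponds to hue value range 1 in Fig. 5); the function name and the convention of assigning boundary values to the higher-numbered range are illustrative choices, which the text leaves open.

```python
def hue_range_index(hue, k=6):
    """Map a hue angle in [0, 360) to one of k equal hue value ranges.

    With k = 6 this reproduces the division in Fig. 5: index 0 covers
    330-360 and 0-30 degrees, index 1 covers 30-90 degrees, and so on.
    A boundary value falls into the higher-numbered range here; as the
    text notes, the boundary convention is a free design choice.
    """
    width = 360.0 / k
    # Shift by half a range width so the range straddling 0 degrees
    # (330-30 for k = 6) becomes a single contiguous interval.
    return int(((hue + width / 2.0) % 360.0) // width)
```

For example, hues of 340° and 10° both land in index 0 (the red-centered range), while 50° lands in index 1.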
The N pixel points include a pixel point i, where pixel point i may be any one of the N pixel points and i is a positive integer less than or equal to N. Determining, from the K hue value ranges, the hue value range to which each pixel point belongs based on its hue value may then include: traversing each of the K hue value ranges and, when the hue value of pixel point i falls within the traversed hue value range, determining the traversed range as the hue value range to which pixel point i belongs; when every one of the N pixel points has been taken as pixel point i, the hue value range to which each pixel point belongs is obtained.
It can be understood that, when determining the hue value range to which the pixel points belong, the number of the pixel points corresponding to each hue value range can be counted based on the hue value range to which each pixel point belongs. It is to be understood that, when determining the number of pixels corresponding to each hue value range, the number of pixels corresponding to each hue value range may be determined based on the hue value range to which each pixel belongs after determining all the hue value ranges to which all the pixels belong, or the number of pixels corresponding to each hue value range may be counted while determining the hue value range to which each pixel belongs, which is not limited herein.
Specifically, determining the number of pixels corresponding to each hue value range based on the pixels corresponding to each hue value range in the K hue value ranges includes: obtaining a target array for recording the number of pixels corresponding to each hue value range, wherein the target array comprises K array elements, and one array element is used for recording the number of pixels corresponding to one hue value range; when the hue value range to which the pixel point i belongs is determined to be the target hue value range, adding one to the value recording the number of pixel points corresponding to the target hue value range in the target array; and when each pixel point in the N pixel points has been taken as the pixel point i, determining the number of pixels corresponding to each hue value range based on the number recorded by each array element in the target array.
The target hue value range may be any hue value range among the K hue value ranges. The target array may be a one-dimensional array or a two-dimensional array, which is not limited herein. When the target array is a one-dimensional array, the number of pixels corresponding to each hue value range can be recorded according to a preset order of the hue value ranges, and when the target array is initialized, the size of the target array is larger than or equal to K. When the target array is a two-dimensional array, the number of pixels corresponding to each hue value range can likewise be recorded according to the preset order of the hue value ranges, and when the target array is initialized, the number of rows or columns of the target array is larger than or equal to K. In some scenarios, the initialization size of a two-dimensional target array may be the same as the size of the business image, i.e., the size of the mask image, but the number of array elements actually used to record the pixel counts of the hue value ranges is still K.
For example, the N pixel points include pixel point 1, pixel point 2, ..., and pixel point N, and the K hue value ranges include hue value range a, hue value range b, ..., and hue value range k. First, an initialized target array may be obtained, i.e., each array element in the target array is 0. When the pixel point 1 is determined to belong to the hue value range a, 1 is added to the number of pixel points corresponding to the hue value range a in the target array, i.e., the number of pixel points corresponding to the hue value range a is 0+1=1; when the pixel point 2 is determined to belong to the hue value range c, 1 is added to the number of pixel points corresponding to the hue value range c in the target array, i.e., the number of pixel points corresponding to the hue value range c is 0+1=1; when the pixel point 3 is determined to belong to the hue value range c, 1 is added to the number of pixel points corresponding to the hue value range c in the target array, i.e., the number of pixel points corresponding to the hue value range c is 1+1=2; and similarly, the hue value range of each pixel point is determined and the values of the array elements in the target array are updated, until the values of the array elements have been updated based on the hue value ranges of all the pixel points, and the number of pixel points corresponding to each hue value range is determined from the number recorded by each array element in the target array.
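The counting procedure above can be sketched with a one-dimensional target array (Python sketch; `count_pixels_per_range` is an illustrative name, and the hue-to-range mapping assumes the K equal ranges of fig. 5):

```python
def count_pixels_per_range(hues, k=6):
    """Tally the pixels falling into each of k hue value ranges.

    The target array has k elements, each initialized to 0; the element for
    a range is incremented by one each time a pixel's hue falls into it.
    """
    counts = [0] * k                       # initialized target array
    width = 360.0 / k
    for hue in hues:
        idx = int(((hue + width / 2) % 360) // width)  # range of this pixel
        counts[idx] += 1                   # add one to that array element
    return counts
```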
S103, determining reserved hue value ranges in the service image from K hue value ranges based on the number of pixel points corresponding to each hue value range.
The reserved hue value range may be a hue value range that needs to be reserved in the mask area. It is to be understood that the reserved hue value range may be a hue value range with a relatively large number of corresponding pixel points among the K hue value ranges; for example, the reserved hue value range may be the hue value range with the maximum number of corresponding pixel points among the K hue value ranges, or may be both the hue value range with the maximum number of corresponding pixel points and the hue value range with the second largest number of corresponding pixel points among the K hue value ranges.
Specifically, determining a reserved hue value range in the service image from the K hue value ranges based on the number of pixels corresponding to each hue value range may include the following steps: from the K hue value ranges, determining M hue value ranges with the maximum number of corresponding pixel points; m is a positive integer less than K; and determining the determined M hue value ranges as reserved hue value ranges in the service image.
The M hue value ranges with the largest number of corresponding pixel points refer to the hue value ranges whose corresponding pixel counts rank in the top M among the K hue value ranges. When M=1, the M hue value ranges with the largest number of corresponding pixel points reduce to the hue value range with the largest number of corresponding pixel points among the K hue value ranges. The value of M may be determined according to actual needs; for example, it may default to a certain value (for example, default to 1), or may be selected by a user, which is not limited herein.
For example, when K is 6, the numbers of pixel points corresponding to the 6 hue value ranges are, in order from large to small: 1500, 500, 200, 120, 100, 80. If M=1, the maximum number of pixel points (i.e., 1500) can be determined, and the hue value range corresponding to the pixel count 1500 is determined as the reserved hue value range. If M=2, the 2 largest pixel counts (i.e., 1500 and 500) can be determined, and the hue value ranges corresponding to the pixel counts 1500 and 500 are both determined as reserved hue value ranges.
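Selecting the M reserved hue value ranges from the pixel counts can be sketched as follows (illustrative Python; the test data mirrors the example above, with the counts stored in range order rather than sorted order):

```python
def reserved_range_indices(counts, m=1):
    """Indices of the m hue value ranges with the largest pixel counts."""
    order = sorted(range(len(counts)), key=lambda i: counts[i], reverse=True)
    return order[:m]
```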
In some embodiments, an additional variable (e.g., a target variable) may be used to record the maximum value of the number of pixels, and when the target array is traversed, the target variable is continuously updated, and when the traversal is finished, the value recorded by the target variable is the maximum value of the number of pixels corresponding to the K hue value ranges, so that the hue value range corresponding to the number of pixels recorded by the target variable is determined as the reserved hue value range.
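The single-pass variant with a target variable might look like this (sketch; it returns the range index together with the maximum count found when the traversal ends):

```python
def max_count_range(counts):
    """Traverse the target array once, keeping the running maximum in a
    target variable; on return, best holds the maximum pixel count."""
    best_idx, best = 0, counts[0]
    for i, c in enumerate(counts):
        if c > best:                       # update the target variable
            best_idx, best = i, c
    return best_idx, best
```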
S104, performing special effect processing on the business image based on the initial pixel value of the pixel point corresponding to the reserved hue value range, and determining the business image after the special effect processing as a target image.
The target image may be an image obtained after performing special effect processing on the service image.
It can be understood that, based on the initial pixel value of the pixel point corresponding to the reserved hue value range, special effect processing is performed on the service image, and gray processing can be performed on the initial pixel values of other pixel points except the pixel point corresponding to the reserved hue value range, so as to obtain the service image after special effect processing. Specifically, the performing special effect processing on the service image based on the initial pixel value of the pixel point corresponding to the reserved hue value range may include: determining other pixel points except the pixel point corresponding to the reserved hue value range as gray pixel points, and carrying out graying treatment on the initial pixel value of the gray pixel points to obtain gray pixel values corresponding to the gray pixel points; and determining the business image after special effect processing based on the gray pixel value corresponding to the gray pixel point and the initial pixel value of the pixel point corresponding to the reserved hue value range.
It can be understood that determining the business image after special effect processing based on the gray pixel value corresponding to the gray pixel point and the initial pixel value of the pixel point corresponding to the reserved hue value range means that the original color of the pixel points corresponding to the reserved hue value range is retained, while graying processing is performed on the other pixel points except the pixel points corresponding to the reserved hue value range. In other words, the pixel value of each gray pixel point in the service image can be adjusted from the initial pixel value to the gray pixel value, so as to obtain the service image after special effect processing.
The gray pixel points refer to pixel points to be subjected to graying processing. The graying processing performed on the initial pixel value of a gray pixel point may use average graying, maximum graying, or the like. Average graying takes the average of the three channel values of the pixel as the gray value; for example, when the three channels are R, G, B, then R''=G''=B''=(R+G+B)/3, where R'', G'', B'' represent the values of the three channels after graying, and R, G, B represent the three channel values included in the initial pixel value of the pixel. Maximum graying takes the maximum of the three channel values of the pixel as the gray value; for example, when the three channels are R, G, B, then R''=G''=B''=max([R, G, B]), where R'', G'', B'' represent the values of the three channels after graying, and R, G, B represent the three channel values included in the initial pixel value of the pixel.
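The two graying schemes can be sketched as follows (Python, integer RGB channels assumed; integer division is used for the average, one of several rounding choices):

```python
def grayscale(pixel, mode="mean"):
    """Gray an (R, G, B) pixel: R'' = G'' = B'' = mean(R, G, B) for average
    graying, or max(R, G, B) for maximum graying."""
    r, g, b = pixel
    v = (r + g + b) // 3 if mode == "mean" else max(r, g, b)
    return (v, v, v)
```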
In some embodiments, in order to avoid the situation that the color of the pixel point corresponding to the reserved hue value range in the target image is too hard and abrupt, the initial pixel value of the pixel point corresponding to the reserved hue value range in the target image and the value obtained by performing the gray-scale processing on the initial pixel value may be subjected to pixel mixing, so as to determine the pixel value of the pixel point corresponding to the reserved hue value range, and further determine the service image after the special effect processing based on the pixel value obtained by performing the pixel mixing of the pixel point corresponding to the reserved hue value range and the gray-scale pixel value corresponding to the gray-scale pixel point. Reference is made in particular to the description associated with the embodiment of fig. 6 which follows.
In some scenes, when the user needs to issue an image for special effect processing in the social application, a service image to be processed can be selected, and then the terminal device responds to the selection operation for the service image to execute steps S101-S104, so that a target image for special effect processing is obtained. Further, the terminal device may display the target image in the terminal page, so that the user may view the image effect after the special effect processing. In response to a user issuing confirmation operation for the target image, the target image may be issued, for example, the target image may be sent to the server, so that the server may push the target image to other users.
Referring to fig. 6, fig. 6 is a flowchart of an image processing method according to an embodiment of the application. The method may be performed by the electronic device described above. The image processing method may include steps S201 to S206.
S201, acquiring a business image comprising a target object, and acquiring a mask area of the target object in the business image; the pixel points included in the business image all have initial pixel values; the mask region comprises N pixel points, wherein N is a positive integer.
S202, acquiring a hue value of each pixel point in N pixel points, determining a hue value range to which each pixel point belongs from K hue value ranges based on the hue value of each pixel point in N pixel points, and determining the number of pixel points corresponding to each hue value range based on the pixel points corresponding to each hue value range in the K hue value ranges; k is a positive integer greater than 1; the K hue value ranges are different; the hue value of each pixel is determined based on the initial pixel value of each pixel.
S203, determining a reserved hue value range in the service image from the K hue value ranges based on the number of pixels corresponding to each hue value range.
The processing procedures of step S201 to step S203 may refer to the related descriptions of step S101 to step S103, which are not described herein.
S204, determining the pixel point corresponding to the reserved hue value range as a first reserved pixel point, and determining the reserved pixel value corresponding to the first reserved pixel point based on the initial pixel value of the first reserved pixel point.
The first reserved pixel point refers to a pixel point corresponding to the reserved hue value range. In other words, the first reserved pixel point refers to a pixel point whose hue value belongs to the reserved hue value range. The reserved pixel value corresponding to the first reserved pixel point refers to the pixel value of the first reserved pixel point after special effect processing is performed. For convenience of description, the reserved pixel value corresponding to the first reserved pixel point may be referred to as a first reserved pixel value.
It is understood that the reserved pixel value of the first reserved pixel point may be determined based on the initial pixel value of the first reserved pixel point. Alternatively, the initial pixel value of the first reserved pixel point may be directly used as the reserved pixel value corresponding to the first reserved pixel point. Alternatively, the reserved pixel value corresponding to the first reserved pixel point may be obtained by pixel-mixing the initial pixel value with the value obtained by performing graying processing on the initial pixel value. It can be understood that reserved pixel values obtained by pixel mixing can avoid the situation that the colors of the pixel points in the image after special effect processing are hard and abrupt, so that in the image after special effect processing the color transition is smoother, the picture is more balanced, and the special effect processing effect is improved.
Specifically, each hue value range includes a transition range and a center range; the transition range in any hue value range is determined based on a range boundary value of that hue value range, and the center range of any hue value range is the range other than its transition range; the transition range and the center range are different ranges. Then, determining the reserved pixel value corresponding to the first reserved pixel point based on the initial pixel value of the first reserved pixel point may include the steps of: determining, based on the hue value of the first reserved pixel point, a target sub-range to which the first reserved pixel point belongs in the reserved hue value range, the target sub-range being the transition range or the center range of the reserved hue value range; and determining the reserved pixel value corresponding to the first reserved pixel point based on the target sub-range and the initial pixel value of the first reserved pixel point.
The transition range and the center range can be two sub-ranges in the hue value range, the transition range and the center range are different ranges, the transition range and the center range are not overlapped, and the union of the transition range and the center range is the hue value range.
The transition range is a range of hue values of pixel points that need to be mixed in performing special effect processing, and is determined based on a range boundary value of the hue value range. The range boundary value refers to a value on a boundary of the hue range, and may be simply referred to as a boundary value. One hue value range includes 2 range boundary values, e.g., noted as a first range boundary value and a second range boundary value. For example, if a hue value range is 30-90, the hue value range boundary values include both 30 and 90 range boundary values. As another example, a hue value range of 330-360, and 0-30, then the hue value range boundary values include both 330 and 30 range boundary values.
The process of determining the transition range based on the range boundary values of a hue value range will be described here taking the target hue value range (i.e., any one of the K hue value ranges) as an example. The method specifically comprises the following steps: obtaining a preset transition range size, determining a first transition range based on a first range boundary value of the target hue value range and the preset transition range size, determining a second transition range based on a second range boundary value of the target hue value range and the preset transition range size, and determining the first transition range and the second transition range as the transition range in the target hue value range. The first range boundary value and the second range boundary value refer to the two range boundary values of the target hue value range. The first transition range and the second transition range are two non-overlapping sub-transition ranges included in the transition range; the preset transition range size refers to the preset size of each sub-transition range, and should be not more than half the size of the target hue value range. When the first transition range is determined based on the first range boundary value and the preset transition range size, the first range boundary value may be taken as one boundary of the transition range, a range whose size equals the preset transition range size is determined within the target hue value range, and the determined range is taken as the first transition range.
Similarly, when the second transition range is determined based on the second range boundary value of the target hue value range and the preset transition range size, the second range boundary value is taken as one boundary of the transition range, a range whose size equals the preset transition range size is determined within the target hue value range, and the determined range is taken as the second transition range. For example, if one hue value range is 30-90, its boundary values include the two range boundary values 30 and 90; if the preset transition range size is 20, the first transition range may be determined to be 30-50 based on the boundary value 30 (i.e., the first range boundary value), the second transition range may be determined to be 70-90 based on the boundary value 90 (i.e., the second range boundary value), and the center range is the remaining part of the hue value range, i.e., 50-70.
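For a non-wrapping hue value range such as 30-90, the two sub-transition ranges can be sketched as follows (illustrative Python; the wrap-around range 330-360/0-30 would need extra handling):

```python
def transition_ranges(lo, hi, t=20):
    """Sub-transition ranges of width t anchored at the two range boundary
    values of the hue value range [lo, hi]; t must not exceed (hi - lo) / 2."""
    assert t <= (hi - lo) / 2
    return (lo, lo + t), (hi - t, hi)
```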
The center range is the range of hue values of pixel points for which pixel blending is not required when performing special effect processing, and is the range other than the transition range in the hue value range. The center range may also be determined directly: a center range is determined based on the center value of the hue value range, and the transition range is then the range other than the center range in the hue value range. The center value of a hue value range refers to the value at the very center of the range, and one hue value range includes 1 center value. For example, if one hue value range is 30-90, the center value of the hue value range is 60. As another example, if one hue value range is 330-360 and 0-30, then the center value of the hue value range is 360 (or 0).
Here, the process of determining the center range based on the center value of a hue value range is explained, again taking the target hue value range among the K hue value ranges as an example. The method specifically comprises the following steps: obtaining a preset center range size, and determining the center range of the target hue value range based on the center value of the target hue value range and the preset center range size. Twice the preset transition range size plus the preset center range size equals the size of one hue value range. The preset center range size refers to the preset size of the center range, and should be smaller than the size of the hue value range. When the center range of the target hue value range is determined based on the center value and the preset center range size, the center value of the target hue value range may be taken as the center value of the center range, a range whose size equals the preset center range size is determined within the target hue value range, and the determined range is taken as the center range. For example, if one hue value range is 30-90, the center value of the hue value range is 60; if the preset center range size is 20, the center range can be determined to be 50-70 based on the center value 60, and accordingly, the transition ranges are the other ranges than the center range in the hue value range, namely 30-50 and 70-90.
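Equivalently, the center range can be derived from the center value (sketch; consistent with the 30-90 example, where a preset center range size of 20 yields 50-70):

```python
def center_range(lo, hi, c=20):
    """Center range of width c around the center value of the hue value
    range [lo, hi]; c must be smaller than the range size (hi - lo)."""
    mid = (lo + hi) / 2.0                  # center value of the range
    return (mid - c / 2.0, mid + c / 2.0)
```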
For example, the transition ranges and center ranges are set forth here in connection with the illustrations. Referring to fig. 7, fig. 7 is a schematic diagram showing the effect of sub-ranges in a hue value range according to an embodiment of the present application. As shown in fig. 7, the K hue value ranges include the 6 hue value ranges of hue value range 1, hue value range 2, ..., and hue value range 6. Hue value range 1 is 330-360° and 0-30°, with a first range boundary value of 330°, a second range boundary value of 30°, and a center value of 0° (i.e., 360°); its transition range includes a first transition range 330-350° and a second transition range 10-30°, and its center range is 350-360° and 0-10°. Similarly, hue value range 2 is 30-90°, with a first range boundary value of 30°, a second range boundary value of 90°, and a center value of 60°; its transition range includes a first transition range 30-50° and a second transition range 70-90°, and its center range is 50-70°. In the same way, the transition range and the center range in each of the 6 hue value ranges can be determined.
It is understood that the target sub-range refers to the sub-range of the reserved hue value range to which the hue value of the first reserved pixel point belongs. The target sub-range may be the transition range or the center range. Determining, based on the hue value of the first reserved pixel point, the target sub-range to which the first reserved pixel point belongs in the reserved hue value range may include: determining, as the target sub-range, the sub-range of the reserved hue value range to which the hue value of the first reserved pixel point belongs.
Specifically, determining the reserved pixel value corresponding to the first reserved pixel point based on the target sub-range and the initial pixel value of the first reserved pixel point may include the following steps: if the target sub-range is the center range of the reserved hue value range, determining the initial pixel value of the first reserved pixel point as the reserved pixel value corresponding to the first reserved pixel point.
It can be understood that if the target sub-range of the first reserved pixel is the central range, the original pixel value (i.e., the initial pixel value) of the first reserved pixel in the service image can be directly output when the special effect processing is performed.
Optionally, if the target sub-range is the transition range of the reserved hue value range, the initial pixel value of the first reserved pixel point and the value obtained by performing graying processing on the initial pixel value may be pixel-mixed, so as to obtain the pixel value in the image after special effect processing. The pixel mixing may mix the initial pixel value of the first reserved pixel point and the grayed value in equal proportion (i.e., 1:1 mixing), that is, the average of the two values is taken as the final output pixel value. Alternatively, the pixel mixing may mix the initial pixel value of the first reserved pixel point and the grayed value based on the position of the hue value of the first reserved pixel point within the transition range; for example, the closer to the range boundary value (i.e., the smaller the difference from the nearest range boundary value), the less the initial pixel value is mixed in, and the farther from the range boundary value (i.e., the larger the difference from the nearest range boundary value), the more the initial pixel value is mixed in. In other words, the closer to the center value, the more the initial pixel value is mixed in, and the farther from the center value, the less the initial pixel value is mixed in.
Specifically, determining the reserved pixel value corresponding to the first reserved pixel point based on the target sub-range and the initial pixel value of the first reserved pixel point may include the following steps: if the target sub-range is the transition range of the reserved hue value range, acquiring a gray pixel value to be processed corresponding to the first reserved pixel point, the gray pixel value to be processed being obtained by graying processing based on the initial pixel value of the first reserved pixel point; further, determining the range boundary value with the smallest difference from the hue value of the first reserved pixel point in the reserved hue value range as the target range boundary value; further, determining pixel mixing influence information corresponding to the first reserved pixel point based on the difference between the hue value of the first reserved pixel point and the target range boundary value, wherein the pixel mixing influence information comprises a first influence value associated with the gray pixel value to be processed and a second influence value associated with the target initial pixel value when the gray pixel value to be processed and the target initial pixel value are pixel-mixed, the target initial pixel value being the initial pixel value of the first reserved pixel point; and further, pixel-mixing the gray pixel value to be processed and the target initial pixel value based on the first influence value and the second influence value included in the pixel mixing influence information, to obtain the reserved pixel value corresponding to the first reserved pixel point.
The gray pixel value to be processed is obtained by performing graying processing based on the initial pixel value of the first reserved pixel point. The process of performing the graying processing on the first reserved pixel point may refer to the description of graying the gray pixel points in step S104, which is not repeated here.
It is understood that the target range boundary value refers to the range boundary value in the reserved hue value range whose difference from the hue value of the first reserved pixel point is smallest. For example, if the reserved hue value range is 30-90, with transition ranges 30-50 and 70-90, and the hue value of the first reserved pixel point is 77, the target range boundary value is the range boundary value closest to 77 among the range boundary values 30 and 90, i.e., 90.
It is understood that the pixel mixing influence information includes the influence values associated with the gray pixel value to be processed and the target initial pixel value when the two are pixel-mixed; in other words, the pixel mixing influence information indicates the mixing ratio of the gray pixel value to be processed and the target initial pixel value. The first influence value associated with the gray pixel value to be processed is equivalent to the weight of the gray pixel value to be processed in the pixel mixing, and the second influence value associated with the target initial pixel value is equivalent to the weight of the target initial pixel value in the pixel mixing. The first influence value characterizes the degree to which the gray pixel value to be processed influences the determined reserved pixel value, and the second influence value characterizes the degree to which the target initial pixel value influences the determined reserved pixel value. The larger an influence value (e.g., the first influence value or the second influence value), the larger the degree of influence of the corresponding pixel value (e.g., the gray pixel value to be processed or the target initial pixel value) on the reserved pixel value; conversely, the smaller the influence value, the smaller the degree of influence of the corresponding pixel value on the reserved pixel value.
It may be appreciated that determining the pixel mixing influence information corresponding to the first reserved pixel point based on the difference between the hue value of the first reserved pixel point and the target range boundary value may include: determining the pixel mixing influence information based on the ratio of that difference to the size of the transition range. For example, if the hue value of the first reserved pixel point is 75, the transition range to which it belongs is 70-90, and the size of the transition range is 20, then the difference between the hue value of the first reserved pixel point and the target range boundary value is 90-75=15, and since 15:20=3:4, the ratio between the first influence value and the second influence value may be obtained as 3:4. Further, normalization processing is performed based on this ratio to obtain the first influence value and the second influence value.
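One reading of the example above, sketched in Python: the ratio between the first and second influence values is taken as d : T (d being the difference from the target range boundary value, T the transition range size), then normalized so the two values sum to 1. This orientation merely reproduces the 3:4 arithmetic of the example and is an assumption; other normalization schemes are possible.

```python
def influence_values(hue, boundary, transition_size=20):
    """First/second influence values from the ratio d : transition_size,
    d being the difference between the hue and the target range boundary
    value, normalized to sum to 1 (assumed scheme)."""
    d = abs(boundary - hue)
    total = d + transition_size
    return d / total, transition_size / total  # (first, second)
```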
Pixel-mixing the gray pixel value to be processed and the target initial pixel value based on the first influence value and the second influence value included in the pixel mixing influence information, to obtain the reserved pixel value corresponding to the first reserved pixel point, may include: performing a weighted average of the gray pixel value to be processed and the target initial pixel value based on the first influence value and the second influence value, to obtain the pixel-mixed first reserved pixel value. For example, if the first influence value is 0.3, the second influence value is 0.7, the gray pixel value to be processed is (s1, s1, s1), and the target initial pixel value is (r1, g1, b1), the calculated first reserved pixel value is (0.3×s1+0.7×r1, 0.3×s1+0.7×g1, 0.3×s1+0.7×b1).
It can be understood that if the first influence value is the same as the second influence value, it is equivalent to average mixing the gray pixel value to be processed and the target initial pixel value to obtain the first reserved pixel value.
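The weighted average itself can be sketched as follows (per-channel mixing; values rounded to integers for 8-bit channels, a rounding choice made for illustration):

```python
def blend_pixels(gray, initial, first, second):
    """Pixel mixing: weighted average of the gray pixel value to be processed
    (weight first) and the target initial pixel value (weight second)."""
    return tuple(round(first * g + second * c) for g, c in zip(gray, initial))
```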
It can be understood that, based on the above description, when the hue value of the first reserved pixel point belongs to the transition range in the reserved hue value range, pixel mixing can be performed based on the initial pixel value (i.e., the target initial pixel value) of the first reserved pixel point and the pixel value obtained by performing the graying process on the first reserved pixel point (i.e., the gray pixel value to be processed), so as to obtain the reserved pixel value corresponding to the first reserved pixel point. When the hue value of the first reserved pixel point belongs to the center range in the reserved hue value range, the initial pixel value of the first reserved pixel point can be determined as the reserved pixel value corresponding to the first reserved pixel point.
S205, determining gray pixel points based on other pixel points except the first reserved pixel point in the pixel points included in the service image, performing graying processing based on the initial pixel values of the gray pixel points, and determining gray pixel values corresponding to the gray pixel points.
The process of determining the gray pixel value corresponding to the gray pixel point may refer to the description related to step S104, which is not described herein.
S206, determining a business image after special effect processing based on the reserved pixel value corresponding to the first reserved pixel point and the gray pixel value corresponding to the gray pixel point, and determining the business image after special effect processing as a target image.
The determining the business image after special effect processing based on the reserved pixel value corresponding to the first reserved pixel point and the gray pixel value corresponding to the gray pixel point may include: and adjusting the pixel value of the gray pixel point in the service image from the initial pixel value to the gray pixel value, and adjusting the pixel value of the first reserved pixel point from the initial pixel value to the corresponding reserved pixel value to obtain the service image after special effect processing.
It will be appreciated that, in determining the pixel value of the first reserved pixel point, the closer the hue value of the first reserved pixel point is to the center value of the reserved hue value range, the greater the proportion of the initial pixel value in the mix; conversely, a first reserved pixel point at the edge of the color system (i.e., near the boundary value of the reserved hue value range) shows almost the effect of graying. Therefore, a smooth edge transition can be constructed so that the color jump at the edge is not too abrupt, improving the special effect processing effect.
Referring to fig. 8, fig. 8 is a flowchart of an image processing method according to an embodiment of the application. The method may be performed by the electronic device described above. The image processing method may include steps S301 to S307.
S301, acquiring a business image comprising a target object, and acquiring a mask area of the target object in the business image; the pixel points included in the business image all have initial pixel values; the mask region comprises N pixel points, wherein N is a positive integer.
S302, acquiring a hue value of each pixel point in N pixel points, determining a hue value range of each pixel point from K hue value ranges based on the hue value of each pixel point in N pixel points, and determining the number of pixel points corresponding to each hue value range based on the pixel points corresponding to each hue value range in K hue value ranges; k is a positive integer greater than 1; the K hue value ranges are different; the hue value of each pixel is determined based on the initial pixel value of each pixel.
S303, determining a reserved hue value range in the service image from K hue value ranges based on the number of pixel points corresponding to each hue value range.
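Steps S302 and S303 amount to building a hue histogram over the mask region and then taking the range with the highest count. The sketch below assumes K equal-width hue value ranges partitioning 0-360 degrees, which is one plausible way of choosing the K ranges (the text does not fix how the ranges are defined):

```python
def reserved_hue_range(hues, k=8):
    """Count mask pixels per hue value range (k equal-width ranges over
    0-360 degrees) and return the (lo, hi) range containing the most pixels."""
    width = 360.0 / k
    counts = [0] * k
    for h in hues:
        counts[min(int(h // width), k - 1)] += 1  # clamp h == 360 into the last range
    idx = max(range(k), key=counts.__getitem__)
    return idx * width, (idx + 1) * width

# e.g. a mask whose hues cluster in the first range (0-45 degrees for k=8)
lo, hi = reserved_hue_range([10, 20, 30, 100, 200], k=8)
```

Selecting the M largest counts instead of a single maximum, as described later for the M reserved ranges, is a straightforward generalization of the arg-max here.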
S304, determining the pixel point corresponding to the reserved hue value range as a first reserved pixel point, and determining the reserved pixel value corresponding to the first reserved pixel point based on the initial pixel value of the first reserved pixel point.
The processing in steps S301 to S304 may refer to the descriptions related to steps S201 to S204, which are not described herein.
S305, determining gray pixel points based on other pixel points except the first reserved pixel point in the pixel points included in the service image, performing graying processing based on the initial pixel values of the gray pixel points, and determining gray pixel values corresponding to the gray pixel points.
It may be understood that determining the gray pixel points based on the other pixel points except the first reserved pixel point among the pixel points included in the service image may further include: determining reserved pixel points whose pixel information satisfies a pixel retention condition in the mask area, and determining the other pixel points in the service image except the first reserved pixel point and the pixel points whose pixel information satisfies the pixel retention condition as the gray pixel points.
The pixel information may include one or more of hue value, saturation, and brightness. That is, the pixel information may be information of the pixel point in HSL (a color expression form) space, and may be calculated based on the initial pixel value. As described above, the hue value refers to the value of a pixel point on hue; hue is a basic attribute of a color, such as the commonly known color names red, yellow, and the like, and the hue value may range from 0 to 360 degrees. Saturation refers to the purity of the color: the higher the saturation, the purer the color, and the lower the saturation, the more it shifts toward gray; the value range of saturation is 0-100%. Brightness (lightness) is used to express how light or dark the color is, and its value range is 0-100%. Determining the hue value, saturation, and brightness based on the initial pixel value of one pixel point may be: obtaining a hue conversion formula for converting the initial pixel value into a hue value, a saturation conversion formula for converting the initial pixel value into saturation, and a brightness conversion formula for converting the initial pixel value into brightness, and calculating each formula respectively to obtain the hue value, saturation, and brightness of the pixel point.
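One concrete form of these conversion formulas is the standard RGB-to-HSL conversion. A sketch using Python's standard `colorsys` module (note that `colorsys` returns components in HLS order, i.e. hue, lightness, saturation):

```python
import colorsys

def pixel_info(rgb):
    """Hue value (0-360 degrees), saturation and lightness (both 0-1)
    of a pixel computed from its 8-bit initial pixel value (r, g, b)."""
    r, g, b = (c / 255.0 for c in rgb)
    h, l, s = colorsys.rgb_to_hls(r, g, b)  # colorsys order: hue, lightness, saturation
    return h * 360.0, s, l

# e.g. pure red: hue 0 degrees, saturation 1.0, lightness 0.5
hue, sat, light = pixel_info((255, 0, 0))
```

Saturation and lightness are returned here on a 0-1 scale; multiplying by 100 gives the 0-100% ranges used in the text.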
The pixel retention conditions may include one or more of a retention condition for hue values (denoted as a first retention condition), a retention condition for saturation (denoted as a second retention condition), and a retention condition for brightness (denoted as a third retention condition). It is understood that, when the pixel retention condition includes any one of the first retention condition, the second retention condition, or the third retention condition, it is only necessary to satisfy the one included condition, and it is determined that the pixel information satisfies the pixel retention condition; when the pixel retention condition includes a plurality of items of the first retention condition, the second retention condition, or the third retention condition, each item of the condition included in the pixel retention condition needs to be satisfied at the same time to determine that the pixel information satisfies the pixel retention condition.
It is understood that the first retention condition may be that the hue value belongs to a reference hue range, the second retention condition may be that the saturation belongs to a reference saturation range, and the third retention condition may be that the brightness belongs to a reference brightness range. The reference hue range, the reference saturation range, and the reference brightness range all belong to the pixel reference range. The pixel reference range refers to the range used to judge whether pixel information satisfies the pixel retention condition. The reference hue range refers to the range to which the hue value belongs when the pixel information satisfies the pixel retention condition; the reference saturation range refers to the range to which the saturation belongs when the pixel information satisfies the pixel retention condition; the reference brightness range refers to the range to which the brightness belongs when the pixel information satisfies the pixel retention condition.
Specifically, the pixel information includes the hue value, saturation, and brightness; the hue value, saturation, and brightness of any pixel point are all determined based on the initial pixel value of that pixel point. Then, determining the gray pixel points based on the other pixel points except the first reserved pixel point among the pixel points included in the service image includes: acquiring a pixel reference range, wherein the pixel reference range includes a reference hue range, a reference saturation range, and a reference brightness range; acquiring the hue value, saturation, and brightness included in the pixel information of each of the N pixel points; among the N pixel points, determining the pixel points whose hue value belongs to the reference hue range, whose saturation belongs to the reference saturation range, and whose brightness belongs to the reference brightness range as the pixel points whose pixel information satisfies the pixel retention condition, and determining those pixel points as second reserved pixel points; and determining the other pixel points except the first reserved pixel point and the second reserved pixel point among the pixel points included in the service image as gray pixel points.
The second reserved pixel point is a pixel point whose pixel information satisfies the pixel retention condition, and the other pixel points except the first reserved pixel point and the second reserved pixel point can then be determined as gray pixel points. For example, suppose the reference hue range is 0-50, the reference saturation range is 0.2-1.0, the reference brightness range is 0.3-0.9, and a pixel point has a hue value of 25, a saturation of 0.5, and a brightness of 0.7. Since the hue value 25 belongs to the reference hue range (0-50), the saturation 0.5 belongs to the reference saturation range (0.2-1.0), and the brightness 0.7 belongs to the reference brightness range (0.3-0.9), it can be determined that the pixel information of this pixel point satisfies the pixel retention condition. By analogy, whether the pixel information of each pixel point in the mask region satisfies the pixel retention condition can be determined, thereby determining the second reserved pixel points in the mask region.
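The retention check in this example reduces to three interval tests. A sketch with the example ranges supplied as default parameters (the ranges themselves are illustrative values from the text, not fixed by the method):

```python
def meets_retention(hue, saturation, brightness,
                    hue_range=(0, 50),
                    saturation_range=(0.2, 1.0),
                    brightness_range=(0.3, 0.9)):
    """True when all three components fall in their reference ranges,
    i.e. the pixel information satisfies the pixel retention condition."""
    return (hue_range[0] <= hue <= hue_range[1]
            and saturation_range[0] <= saturation <= saturation_range[1]
            and brightness_range[0] <= brightness <= brightness_range[1])

# the worked example: hue 25, saturation 0.5, brightness 0.7 -> retained
retained = meets_retention(25, 0.5, 0.7)
```

When the pixel retention condition includes fewer than three items, the corresponding interval tests are simply dropped, matching the one-or-more reading above.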
It can be understood that when the hue value of a pixel belongs to the reserved hue value range and the pixel information satisfies the pixel reserved condition, the pixel may be determined as the first reserved pixel, or the pixel may be determined as the second reserved pixel, which may be specifically determined according to the actual requirement, and is not limited herein.
Alternatively, the pixel information may include one or more of hue value, saturation, and brightness in HSV (a color expression form) space, which may likewise be calculated based on the initial pixel value. Here the brightness (value) expresses how bright the color is, and its value range is 0-100%. The process of determining the pixel points whose pixel information satisfies the pixel retention condition in HSV space may refer to the foregoing description of determining, in HSL space, the pixel points whose pixel information satisfies the pixel retention condition, which will not be described herein.
It is understood that the steps of determining the first reserved pixel points and determining the second reserved pixel points from the mask area may be performed simultaneously or sequentially, which is not limited herein.
S306, determining the initial pixel value of the second reserved pixel point as the reserved pixel value corresponding to the second reserved pixel point.
The reserved pixel value corresponding to the second reserved pixel point refers to the pixel value of the second reserved pixel point after special effect processing is performed. For convenience of description, the reserved pixel value corresponding to the second reserved pixel point may be referred to as a second reserved pixel value.
Optionally, when determining the second reserved pixel value, the initial pixel value of the second reserved pixel point may be mixed with a value obtained by performing the graying processing on the initial pixel value to obtain the second reserved pixel value. The specific processing procedure may refer to the above-mentioned pixel mixing of the initial pixel value of the first reserved pixel point and the value obtained by performing the graying processing on the initial pixel value (i.e. the gray pixel value to be processed) to obtain the relevant description of the first reserved pixel value, which is not described herein.
S307, determining the business image after special effect processing based on the reserved pixel value corresponding to the second reserved pixel point, the reserved pixel value corresponding to the first reserved pixel point and the gray pixel value corresponding to the gray pixel point.
Determining the business image after special effect processing based on the reserved pixel value corresponding to the second reserved pixel point, the reserved pixel value corresponding to the first reserved pixel point, and the gray pixel value corresponding to the gray pixel point may include: adjusting the pixel value of the gray pixel point in the service image from the initial pixel value to the gray pixel value, adjusting the pixel value of the first reserved pixel point to the corresponding first reserved pixel value, and adjusting the pixel value of the second reserved pixel point to the corresponding second reserved pixel value, so as to obtain the service image after special effect processing.
An application scenario of the present solution is described herein with reference to the drawings. Referring to fig. 9, fig. 9 is a schematic diagram of a special effect processing scenario provided in an embodiment of the present application. As shown in fig. 9, the business image 91a includes the target object 911a, and further, the special effect processing can be performed on the business image 91a according to the embodiment of the present application to obtain the target image 92a. In the target image 92a, the area indicated by 921a is the area where the first remaining pixel is located, that is, the area where the pixel corresponding to the hue value range where the number of pixels is the largest is located. The area indicated by 922a is the area where the second reserved pixel point is located, that is, the area where the pixel information satisfies the pixel reservation condition. Therefore, the color with the largest hue value range ratio on the target object and the color of the pixel point meeting the preset reservation condition can be automatically reserved during special effect processing.
Referring to fig. 10, fig. 10 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present application. Alternatively, the image processing apparatus may be provided in the above-described electronic device. As shown in fig. 10, the image processing apparatus described in the present embodiment may include:
an acquiring unit 1001, configured to acquire a service image including a target object, and acquire a mask area of the target object in the service image; the pixel points included in the business image all have initial pixel values; the mask region comprises N pixel points, wherein N is a positive integer;
The processing unit 1002 is configured to obtain a hue value of each of the N pixel points, determine a hue value range to which each pixel point belongs from K hue value ranges based on the hue value of each of the N pixel points, and determine the number of pixel points corresponding to each hue value range based on the pixel points corresponding to each hue value range of the K hue value ranges; k is a positive integer greater than 1; the K hue value ranges are different; the hue value of each pixel is determined based on the initial pixel value of each pixel;
the processing unit 1002 is further configured to determine a reserved hue value range in the service image from the K hue value ranges based on the number of pixels corresponding to each hue value range;
The processing unit 1002 is further configured to perform special effect processing on the service image based on the initial pixel value of the pixel point corresponding to the reserved hue value range, and determine the service image after the special effect processing as the target image.
In one implementation, the processing unit 1002 is specifically configured to:
determining a pixel point corresponding to the reserved hue value range as a first reserved pixel point, and determining a reserved pixel value corresponding to the first reserved pixel point based on an initial pixel value of the first reserved pixel point;
determining gray pixel points based on other pixel points except the first reserved pixel point in the pixel points included in the service image, and carrying out graying treatment on the initial pixel value of the gray pixel points to obtain a gray pixel value corresponding to the gray pixel points;
And determining the business image after special effect processing based on the reserved pixel value corresponding to the first reserved pixel point and the gray pixel value corresponding to the gray pixel point, and determining the business image after special effect processing as a target image.
In one implementation, each hue value range includes a transition range and a center range, the transition range in any hue value range being determined based on range boundary values for any hue value range, the center range of any hue value range being other than the transition range of any hue value range; the transition range and the central range are different ranges;
the processing unit 1002 is specifically configured to:
determining a target sub-range to which the first reserved pixel belongs in the reserved hue value range based on the hue value of the first reserved pixel; the target sub-range is a transition range or a central range of the reserved hue value range;
And determining the reserved pixel value corresponding to the first reserved pixel point based on the target sub-range and the initial pixel value of the first reserved pixel point.
In one implementation, the processing unit 1002 is specifically configured to:
if the target sub-range is the center range of the reserved hue value range, determining the initial pixel value of the first reserved pixel point as the reserved pixel value corresponding to the first reserved pixel point.
In one implementation, the processing unit 1002 is specifically configured to:
If the target sub-range is a transition range of the reserved hue value range, acquiring a gray pixel value to be processed corresponding to the first reserved pixel point; the gray pixel value to be processed is obtained by carrying out gray processing based on the initial pixel value of the first reserved pixel point;
determining a range boundary value with the minimum difference value between the hue value of the reserved hue value range and the hue value of the first reserved pixel point as a target range boundary value;
Determining pixel mixing influence information corresponding to a first reserved pixel point based on a difference value between a hue value of the first reserved pixel point and a target range boundary value, wherein the pixel mixing influence information comprises a first influence value associated with a gray pixel value to be processed and a second influence value associated with a target initial pixel value when the gray pixel value to be processed and the target initial pixel value are subjected to pixel mixing; the target initial pixel value is the initial pixel value of the first reserved pixel point;
And carrying out pixel mixing on the gray pixel value to be processed and the target initial pixel value based on the first influence value and the second influence value included in the pixel mixing influence information, and obtaining a reserved pixel value corresponding to the first reserved pixel point.
In one implementation manner, the mask region in the service image includes a second reserved pixel point, where the second reserved pixel point refers to a pixel point where pixel information of N pixel points meets a pixel reservation condition; the gray pixel points are other pixel points except the first reserved pixel point and the second reserved pixel point in the pixel points included in the service image;
the processing unit 1002 is specifically configured to:
Determining the initial pixel value of the second reserved pixel point as the reserved pixel value corresponding to the second reserved pixel point;
and determining the business image after special effect processing based on the reserved pixel value corresponding to the second reserved pixel point, the reserved pixel value corresponding to the first reserved pixel point and the gray pixel value corresponding to the gray pixel point.
In one implementation, the pixel information includes hue value, saturation, and brightness; the hue value, saturation and brightness of any pixel point are all determined based on the initial pixel value of any pixel point;
the processing unit 1002 is specifically configured to:
Acquiring a pixel reference range, wherein the pixel reference range comprises a reference hue range, a reference saturation range and a reference brightness range;
acquiring hue, saturation and brightness included in pixel information of each pixel point in the N pixel points;
Among the N pixel points, determining the pixel points with hue values belonging to a reference hue range, saturation belonging to a reference saturation range and brightness belonging to a reference brightness range as pixel points with pixel information meeting pixel reservation conditions, and determining the pixel points with pixel information meeting the pixel reservation conditions as second reserved pixel points;
And determining other pixel points except the first reserved pixel point and the second reserved pixel point in the pixel points included in the service image as gray pixel points.
In one implementation, the processing unit 1002 is specifically configured to:
From the K hue value ranges, determining M hue value ranges with the maximum number of corresponding pixel points; m is a positive integer less than K;
And determining the determined M hue value ranges as reserved hue value ranges in the service image.
Referring to fig. 11, fig. 11 is a schematic structural diagram of an electronic device according to an embodiment of the application. The electronic device described in the present embodiment includes: processor 1101, memory 1102. Optionally, the electronic device may further include a network interface or a power module. Data may be exchanged between the processor 1101 and the memory 1102.
The processor 1101 may be a Central Processing Unit (CPU), or may be another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The network interface may include input devices, such as a control panel, microphone, receiver, etc., and/or output devices, such as a display screen, transmitter, etc., which are not shown.
The memory 1102 may include read-only memory and random access memory, and provides program instructions and data to the processor 1101. A portion of memory 1102 may also include non-volatile random access memory. Wherein the processor 1101, when calling the program instructions, is configured to perform:
Acquiring a business image comprising a target object, and acquiring a mask area of the target object in the business image; the pixel points included in the business image all have initial pixel values; the mask region comprises N pixel points, wherein N is a positive integer;
Acquiring a hue value of each pixel point in N pixel points, determining a hue value range of each pixel point from K hue value ranges based on the hue value of each pixel point in the N pixel points, and determining the number of pixel points corresponding to each hue value range based on the pixel points corresponding to each hue value range in the K hue value ranges; k is a positive integer greater than 1; the K hue value ranges are different; the hue value of each pixel is determined based on the initial pixel value of each pixel;
Determining reserved hue value ranges in the service image from K hue value ranges based on the number of pixel points corresponding to each hue value range;
and carrying out special effect processing on the business image based on the initial pixel value of the pixel point corresponding to the reserved hue value range, and determining the business image after the special effect processing as a target image.
In one implementation, the processor 1101 is specifically configured to:
determining a pixel point corresponding to the reserved hue value range as a first reserved pixel point, and determining a reserved pixel value corresponding to the first reserved pixel point based on an initial pixel value of the first reserved pixel point;
determining gray pixel points based on other pixel points except the first reserved pixel point in the pixel points included in the service image, and carrying out graying treatment on the initial pixel value of the gray pixel points to obtain a gray pixel value corresponding to the gray pixel points;
And determining the business image after special effect processing based on the reserved pixel value corresponding to the first reserved pixel point and the gray pixel value corresponding to the gray pixel point, and determining the business image after special effect processing as a target image.
In one implementation, each hue value range includes a transition range and a center range, the transition range in any hue value range being determined based on range boundary values for any hue value range, the center range of any hue value range being other than the transition range of any hue value range; the transition range and the central range are different ranges;
the processor 1101 is specifically configured to:
determining a target sub-range to which the first reserved pixel belongs in the reserved hue value range based on the hue value of the first reserved pixel; the target sub-range is a transition range or a central range of the reserved hue value range;
And determining the reserved pixel value corresponding to the first reserved pixel point based on the target sub-range and the initial pixel value of the first reserved pixel point.
In one implementation, the processor 1101 is specifically configured to:
if the target sub-range is the center range of the reserved hue value range, determining the initial pixel value of the first reserved pixel point as the reserved pixel value corresponding to the first reserved pixel point.
In one implementation, the processor 1101 is specifically configured to:
If the target sub-range is a transition range of the reserved hue value range, acquiring a gray pixel value to be processed corresponding to the first reserved pixel point; the gray pixel value to be processed is obtained by carrying out gray processing based on the initial pixel value of the first reserved pixel point;
determining a range boundary value with the minimum difference value between the hue value of the reserved hue value range and the hue value of the first reserved pixel point as a target range boundary value;
Determining pixel mixing influence information corresponding to a first reserved pixel point based on a difference value between a hue value of the first reserved pixel point and a target range boundary value, wherein the pixel mixing influence information comprises a first influence value associated with a gray pixel value to be processed and a second influence value associated with a target initial pixel value when the gray pixel value to be processed and the target initial pixel value are subjected to pixel mixing; the target initial pixel value is the initial pixel value of the first reserved pixel point;
And carrying out pixel mixing on the gray pixel value to be processed and the target initial pixel value based on the first influence value and the second influence value included in the pixel mixing influence information, and obtaining a reserved pixel value corresponding to the first reserved pixel point.
In one implementation manner, the mask region in the service image includes a second reserved pixel point, where the second reserved pixel point refers to a pixel point where pixel information of N pixel points meets a pixel reservation condition; the gray pixel points are other pixel points except the first reserved pixel point and the second reserved pixel point in the pixel points included in the service image;
the processor 1101 is specifically configured to:
Determining the initial pixel value of the second reserved pixel point as the reserved pixel value corresponding to the second reserved pixel point;
and determining the business image after special effect processing based on the reserved pixel value corresponding to the second reserved pixel point, the reserved pixel value corresponding to the first reserved pixel point and the gray pixel value corresponding to the gray pixel point.
In one implementation, the pixel information includes hue value, saturation, and brightness; the hue value, saturation and brightness of any pixel point are all determined based on the initial pixel value of any pixel point;
the processor 1101 is specifically configured to:
Acquiring a pixel reference range, wherein the pixel reference range comprises a reference hue range, a reference saturation range and a reference brightness range;
acquiring hue, saturation and brightness included in pixel information of each pixel point in the N pixel points;
Among the N pixel points, determining the pixel points with hue values belonging to a reference hue range, saturation belonging to a reference saturation range and brightness belonging to a reference brightness range as pixel points with pixel information meeting pixel reservation conditions, and determining the pixel points with pixel information meeting the pixel reservation conditions as second reserved pixel points;
And determining other pixel points except the first reserved pixel point and the second reserved pixel point in the pixel points included in the service image as gray pixel points.
In one implementation, the processor 1101 is specifically configured to:
From the K hue value ranges, determining M hue value ranges with the maximum number of corresponding pixel points; m is a positive integer less than K;
And determining the determined M hue value ranges as reserved hue value ranges in the service image.
Optionally, the program instructions may further implement other steps of the method in the above embodiment when executed by the processor, which is not described herein.
The present application also provides a computer readable storage medium storing a computer program comprising program instructions which, when executed by a processor, cause the processor to perform the above method, such as the method performed by the above electronic device, which is not described herein.
Alternatively, a storage medium such as a computer-readable storage medium to which the present application relates may be nonvolatile or may be volatile.
Alternatively, the computer-readable storage medium may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function, and the like; the storage data area may store data created from the use of blockchain nodes, and the like. The blockchain is a novel application mode of computer technologies such as distributed data storage, point-to-point transmission, consensus mechanism, encryption algorithm and the like. The blockchain (Blockchain), essentially a de-centralized database, is a string of data blocks that are generated in association using cryptographic methods, each of which contains information from a batch of network transactions for verifying the validity (anti-counterfeit) of its information and generating the next block. The blockchain may include a blockchain underlying platform, a platform product services layer, an application services layer, and the like.
It should be noted that, for simplicity of description, the foregoing method embodiments are each expressed as a series of combined actions, but those skilled in the art should understand that the present application is not limited by the described order of actions, as some steps may be performed in another order or simultaneously. Further, those skilled in the art will also appreciate that the embodiments described in the specification are all preferred embodiments, and that the actions and modules involved are not necessarily required by the present application.
Those of ordinary skill in the art will appreciate that all or part of the steps in the various methods of the above embodiments may be implemented by a program instructing the relevant hardware. The program may be stored in a computer-readable storage medium, and the storage medium may include a flash disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or the like.
Embodiments of the present application also provide a computer program product or computer program comprising computer instructions which, when executed by a processor, implement some or all of the steps of the above methods. For example, the computer instructions are stored in a computer-readable storage medium; a processor of a computer device (i.e., the electronic device described above) reads the computer instructions from the computer-readable storage medium and executes them, causing the computer device to perform the steps performed in the above method embodiments. For example, the computer device may be a terminal or a server.
The image processing methods, apparatuses, electronic devices, media, and products of the embodiments of the present application have been described in detail above, and specific examples have been applied herein to illustrate the principles and implementations of the present application; the above examples are provided only to help understand the methods and core ideas of the present application. Meanwhile, those skilled in the art may make changes to the specific implementations and the application scope in accordance with the ideas of the present application; in view of the above, the content of this specification should not be construed as limiting the present application.

Claims (12)

1. An image processing method, the method comprising:
acquiring a service image comprising a target object, and acquiring a mask region of the target object in the service image, wherein each pixel point of the service image has an initial pixel value, the mask region comprises N pixel points, and N is a positive integer;
acquiring a hue value of each of the N pixel points, determining, from K hue value ranges, the hue value range to which each pixel point belongs based on its hue value, and determining the number of pixel points corresponding to each of the K hue value ranges; K is a positive integer greater than 1, the K hue value ranges are mutually distinct, and the hue value of each pixel point is determined based on its initial pixel value;
determining reserved hue value ranges in the service image from the K hue value ranges based on the number of pixel points corresponding to each hue value range; each hue value range comprises a transition range and a central range, the transition range of any hue value range is determined based on the range boundary values of that hue value range, and the central range of any hue value range is the remainder of that range outside its transition range; the pixel points corresponding to a reserved hue value range are first reserved pixel points;
determining a reserved pixel value corresponding to each first reserved pixel point based on the target sub-range to which the first reserved pixel point belongs within the reserved hue value range and the initial pixel value of the first reserved pixel point; if the target sub-range is the transition range of the reserved hue value range, the reserved pixel value is determined by pixel-mixing a gray pixel value to be processed with the initial pixel value of the first reserved pixel point, the gray pixel value to be processed being obtained by gray processing of the initial pixel value of the first reserved pixel point; and
performing special effect processing on the service image based on the reserved pixel values corresponding to the first reserved pixel points and the gray pixel values corresponding to the pixel points other than the first reserved pixel points, and determining the service image after the special effect processing as a target image.
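The hue-binning step of claim 1 can be sketched as follows. The function name, the use of Python's `colorsys` for the RGB-to-HSV conversion, and the choice K = 6 are illustrative assumptions — the claim only requires K > 1 mutually distinct hue value ranges:

```python
import colorsys

def hue_histogram(mask_pixels, k=6):
    """Count, for each of k equal-width hue value ranges, how many of the
    mask's pixel points fall inside it.

    Each pixel point's hue value is derived from its initial RGB value,
    as in the claim; k = 6 is only an illustrative choice."""
    counts = [0] * k
    for r, g, b in mask_pixels:
        hue, _, _ = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
        counts[min(int(hue * k), k - 1)] += 1  # map hue in [0, 1) to a range index
    return counts
```

With six ranges, a pure red pixel lands in range 0 and a pure cyan pixel in range 3; the resulting counts then feed the reserved-range selection below.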
2. The method according to claim 1, wherein the performing special effect processing on the service image based on the reserved pixel values corresponding to the first reserved pixel points and the gray pixel values corresponding to the pixel points other than the first reserved pixel points, and determining the service image after the special effect processing as the target image comprises:
determining gray pixel points based on the pixel points of the service image other than the first reserved pixel points, and performing gray processing on the initial pixel values of the gray pixel points to obtain the gray pixel values corresponding to the gray pixel points; and
determining the service image after the special effect processing based on the reserved pixel values corresponding to the first reserved pixel points and the gray pixel values corresponding to the gray pixel points, and determining the service image after the special effect processing as the target image.
3. The method of claim 1, wherein the transition range and the central range are different ranges; and
the determining the reserved pixel value corresponding to the first reserved pixel point based on the target sub-range to which the first reserved pixel point belongs within the reserved hue value range and the initial pixel value of the first reserved pixel point comprises:
determining, based on the hue value of the first reserved pixel point, the target sub-range to which the first reserved pixel point belongs within the reserved hue value range, the target sub-range being the transition range or the central range of the reserved hue value range; and
determining the reserved pixel value corresponding to the first reserved pixel point based on the target sub-range and the initial pixel value of the first reserved pixel point.
4. The method of claim 3, wherein the determining the reserved pixel value corresponding to the first reserved pixel point based on the target sub-range and the initial pixel value of the first reserved pixel point comprises:
if the target sub-range is the central range of the reserved hue value range, determining the initial pixel value of the first reserved pixel point as the reserved pixel value corresponding to the first reserved pixel point.
5. The method of claim 3, wherein the determining the reserved pixel value corresponding to the first reserved pixel point based on the target sub-range and the initial pixel value of the first reserved pixel point comprises:
if the target sub-range is the transition range of the reserved hue value range, acquiring the gray pixel value to be processed corresponding to the first reserved pixel point;
determining, as a target range boundary value, the range boundary value of the reserved hue value range whose difference from the hue value of the first reserved pixel point is smallest;
determining pixel mixing influence information corresponding to the first reserved pixel point based on the difference between the hue value of the first reserved pixel point and the target range boundary value, the pixel mixing influence information comprising a first influence value associated with the gray pixel value to be processed and a second influence value associated with a target initial pixel value when the gray pixel value to be processed and the target initial pixel value are pixel-mixed, the target initial pixel value being the initial pixel value of the first reserved pixel point; and
pixel-mixing the gray pixel value to be processed and the target initial pixel value based on the first influence value and the second influence value comprised in the pixel mixing influence information, to obtain the reserved pixel value corresponding to the first reserved pixel point.
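The transition-range mixing of claim 5 can be sketched as below. The linear weighting is a hypothetical choice — the claim only requires that the two influence values be derived from the difference between the pixel's hue value and the target range boundary value — and the gray formula and `transition_width` parameter are likewise assumptions:

```python
def blend_transition(initial_rgb, hue, range_lo, range_hi, transition_width):
    """Pixel-mix the gray pixel value to be processed with the pixel point's
    initial pixel value inside a transition band of a reserved hue value range.

    Hypothetical linear weighting: fully gray at the range boundary, fully
    colored once the hue is transition_width away from it."""
    r, g, b = initial_rgb
    gray = 0.299 * r + 0.587 * g + 0.114 * b  # gray pixel value to be processed
    # target range boundary value: the boundary with the smallest hue difference
    target = range_lo if abs(hue - range_lo) <= abs(hue - range_hi) else range_hi
    w_color = min(abs(hue - target) / transition_width, 1.0)  # second influence value
    w_gray = 1.0 - w_color                                    # first influence value
    return tuple(w_gray * gray + w_color * c for c in (r, g, b))
```

Under this weighting the effect falls off smoothly from full color in the central range to full gray at the range boundary, which is the visual purpose the transition range serves.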
6. The method according to claim 2, wherein the mask region in the service image comprises second reserved pixel points, a second reserved pixel point being a pixel point, among the N pixel points, whose pixel information meets a pixel retention condition; the gray pixel points are the pixel points of the service image other than the first reserved pixel points and the second reserved pixel points; and
the determining the service image after the special effect processing based on the reserved pixel values corresponding to the first reserved pixel points and the gray pixel values corresponding to the gray pixel points comprises:
determining the initial pixel value of each second reserved pixel point as the reserved pixel value corresponding to that second reserved pixel point; and
determining the service image after the special effect processing based on the reserved pixel values corresponding to the second reserved pixel points, the reserved pixel values corresponding to the first reserved pixel points, and the gray pixel values corresponding to the gray pixel points.
7. The method of claim 6, wherein the pixel information comprises a hue value, a saturation, and a brightness, each determined based on the initial pixel value of the corresponding pixel point; and
the determining gray pixel points based on the pixel points of the service image other than the first reserved pixel points comprises:
acquiring a pixel reference range comprising a reference hue range, a reference saturation range, and a reference brightness range;
acquiring the hue value, saturation, and brightness comprised in the pixel information of each of the N pixel points;
determining, among the N pixel points, the pixel points whose hue values belong to the reference hue range, whose saturations belong to the reference saturation range, and whose brightnesses belong to the reference brightness range as the pixel points whose pixel information meets the pixel retention condition, and determining those pixel points as the second reserved pixel points; and
determining, as gray pixel points, the pixel points of the service image other than the first reserved pixel points and the second reserved pixel points.
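The pixel retention condition of claim 7 is a simple conjunction of three range tests. In the sketch below, the concrete reference ranges (roughly skin-tone-like, with h, s, v each normalized to [0, 1]) are purely illustrative — the claim leaves the reference ranges open:

```python
def meets_retention_condition(h, s, v,
                              hue_ref=(0.0, 0.14),
                              sat_ref=(0.2, 0.7),
                              val_ref=(0.3, 1.0)):
    """Second-reserved-pixel test: the pixel information meets the pixel
    retention condition only when hue value, saturation, and brightness
    all fall inside their respective reference ranges.

    The default reference ranges are illustrative assumptions."""
    return (hue_ref[0] <= h <= hue_ref[1]
            and sat_ref[0] <= s <= sat_ref[1]
            and val_ref[0] <= v <= val_ref[1])
```

A pixel passing this test keeps its initial pixel value even when its hue value falls outside every reserved hue value range.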
8. The method according to claim 1, wherein the determining reserved hue value ranges in the service image from the K hue value ranges based on the number of pixel points corresponding to each hue value range comprises:
determining, from the K hue value ranges, the M hue value ranges with the largest numbers of corresponding pixel points, M being a positive integer less than K; and
determining the M hue value ranges as the reserved hue value ranges in the service image.
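The top-M selection of claim 8 can be sketched in a few lines; `counts[i]` is assumed to be the number of mask pixel points whose hue value falls in range i (e.g. as produced by a hue histogram):

```python
def top_m_ranges(counts, m):
    """Indices of the m hue value ranges with the largest pixel counts.

    Ties keep the lower range index first because Python's sort is stable
    (reverse=True does not reorder equal keys)."""
    return sorted(range(len(counts)), key=lambda i: counts[i], reverse=True)[:m]
```

The returned indices identify the reserved hue value ranges; all other ranges feed the gray-processing path.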
9. An image processing apparatus, characterized in that the apparatus comprises:
an acquisition unit, configured to acquire a service image comprising a target object and acquire a mask region of the target object in the service image, wherein each pixel point of the service image has an initial pixel value, the mask region comprises N pixel points, and N is a positive integer; and
a processing unit, configured to acquire a hue value of each of the N pixel points, determine, from K hue value ranges, the hue value range to which each pixel point belongs based on its hue value, and determine the number of pixel points corresponding to each of the K hue value ranges; K is a positive integer greater than 1, the K hue value ranges are mutually distinct, and the hue value of each pixel point is determined based on its initial pixel value;
the processing unit is further configured to determine reserved hue value ranges in the service image from the K hue value ranges based on the number of pixel points corresponding to each hue value range; each hue value range comprises a transition range and a central range, the transition range of any hue value range is determined based on the range boundary values of that hue value range, and the central range of any hue value range is the remainder of that range outside its transition range; the pixel points corresponding to a reserved hue value range are first reserved pixel points;
the processing unit is further configured to determine a reserved pixel value corresponding to each first reserved pixel point based on the target sub-range to which the first reserved pixel point belongs within the reserved hue value range and the initial pixel value of the first reserved pixel point; if the target sub-range is the transition range of the reserved hue value range, the reserved pixel value is determined by pixel-mixing a gray pixel value to be processed with the initial pixel value of the first reserved pixel point, the gray pixel value to be processed being obtained by gray processing of the initial pixel value of the first reserved pixel point; and
the processing unit is further configured to perform special effect processing on the service image based on the reserved pixel values corresponding to the first reserved pixel points and the gray pixel values corresponding to the pixel points other than the first reserved pixel points, and determine the service image after the special effect processing as a target image.
10. An electronic device comprising a processor and a memory, wherein the memory is configured to store a computer program comprising program instructions, and the processor is configured to invoke the program instructions to perform the method of any one of claims 1-8.
11. A computer-readable storage medium storing a computer program comprising program instructions which, when executed by a processor, cause the processor to perform the method of any one of claims 1-8.
12. A computer program product comprising computer instructions which, when executed by a processor, implement the method of any one of claims 1-8.
CN202311508982.0A 2023-11-13 2023-11-13 Image processing method, device, electronic equipment, medium and product Active CN117522760B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311508982.0A CN117522760B (en) 2023-11-13 2023-11-13 Image processing method, device, electronic equipment, medium and product

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311508982.0A CN117522760B (en) 2023-11-13 2023-11-13 Image processing method, device, electronic equipment, medium and product

Publications (2)

Publication Number Publication Date
CN117522760A CN117522760A (en) 2024-02-06
CN117522760B true CN117522760B (en) 2024-06-25

Family

ID=89745108

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311508982.0A Active CN117522760B (en) 2023-11-13 2023-11-13 Image processing method, device, electronic equipment, medium and product

Country Status (1)

Country Link
CN (1) CN117522760B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118321184B (en) * 2024-05-28 2024-10-22 深圳市希购科技有限公司 Logistics express automatic sorting method based on bar code identification

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112861661A (en) * 2021-01-22 2021-05-28 深圳市慧鲤科技有限公司 Image processing method and device, electronic equipment and computer readable storage medium
CN113096022A (en) * 2019-12-23 2021-07-09 RealMe重庆移动通信有限公司 Image blurring processing method and device, storage medium and electronic equipment
CN117455753A (en) * 2023-10-12 2024-01-26 书行科技(北京)有限公司 Special effect template generation method, special effect generation device and storage medium

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4539280B2 (en) * 2004-10-19 2010-09-08 ソニー株式会社 Image processing apparatus, image processing method, and computer program
US7689038B2 (en) * 2005-01-10 2010-03-30 Cytyc Corporation Method for improved image segmentation
JP2008236223A (en) * 2007-03-19 2008-10-02 Kyocera Mita Corp Image processing apparatus and image processing program
CN106447638A (en) * 2016-09-30 2017-02-22 北京奇虎科技有限公司 Beauty treatment method and device thereof
CN111724430A (en) * 2019-03-22 2020-09-29 株式会社理光 Image processing method and device and computer readable storage medium
CN110706187B (en) * 2019-05-31 2022-04-22 成都品果科技有限公司 Image adjusting method for uniform skin color
CN110689488B (en) * 2019-08-22 2022-03-04 稿定(厦门)科技有限公司 Image toning method, medium, device and apparatus
CN114882125A (en) * 2021-02-05 2022-08-09 武汉Tcl集团工业研究院有限公司 Method and device for graying image data, terminal device and readable storage medium
CN112767285B (en) * 2021-02-23 2023-03-10 北京市商汤科技开发有限公司 Image processing method and device, electronic device and storage medium
CN113223070A (en) * 2021-05-13 2021-08-06 深圳地理人和科技有限公司 Depth image enhancement processing method and device
CN113450282B (en) * 2021-07-12 2023-01-06 上海交通大学 Method and system for beautifying image
US20230298212A1 (en) * 2022-03-17 2023-09-21 Advanced Micro Devices, Inc. Locking mechanism for image classification
CN116485967A (en) * 2022-11-22 2023-07-25 腾讯科技(深圳)有限公司 Virtual model rendering method and related device
CN116664436A (en) * 2023-05-31 2023-08-29 深圳市国微电子有限公司 Image tone mapping method, system, terminal device and storage medium
CN116843566A (en) * 2023-06-29 2023-10-03 峰米(北京)科技有限公司 Tone mapping method, tone mapping device, display device and storage medium
CN116664603B (en) * 2023-07-31 2023-12-12 腾讯科技(深圳)有限公司 Image processing method, device, electronic equipment and storage medium


Also Published As

Publication number Publication date
CN117522760A (en) 2024-02-06

Similar Documents

Publication Publication Date Title
CN113763296B (en) Image processing method, device and medium
CN117522760B (en) Image processing method, device, electronic equipment, medium and product
CN109871845B (en) Certificate image extraction method and terminal equipment
CN104364825A (en) Visual conditioning for augmented-reality-assisted video conferencing
CN101529495A (en) Image mask generation
CN112001274A (en) Crowd density determination method, device, storage medium and processor
US20230074060A1 (en) Artificial-intelligence-based image processing method and apparatus, electronic device, computer-readable storage medium, and computer program product
CN110599554A (en) Method and device for identifying face skin color, storage medium and electronic device
CN114529490B (en) Data processing method, device, equipment and readable storage medium
CN111275784A (en) Method and device for generating image
CN113112422B (en) Image processing method, device, electronic equipment and computer readable medium
CN104093010B (en) A kind of image processing method and device
CN109194942A (en) A kind of naked eye 3D video broadcasting method, terminal and server
CN111339315B (en) Knowledge graph construction method, system, computer readable medium and electronic equipment
CN113573044A (en) Video data processing method and device, computer equipment and readable storage medium
CN117058030A (en) Image processing method, apparatus, device, readable storage medium, and program product
CN113435515B (en) Picture identification method and device, storage medium and electronic equipment
Wang et al. Blind photograph watermarking with robust defocus‐based JND model
CN114339252A (en) Data compression method and device
CN114120002A (en) Image color extraction method and device, electronic equipment and storage medium
CN113887495A (en) Video labeling method and device based on transfer learning
CN117714712B (en) Data steganography method, equipment and storage medium for video conference
CN110086687B (en) Flexible network communication sensitive data visualization processing method
CN117541883B (en) Image generation model training, image generation method, system and electronic equipment
CN109901897A (en) A kind of method and apparatus of the color of match views in the application

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant