CN112069339A - Background picture processing and search result display method, device, equipment and medium

Info

Publication number
CN112069339A
CN112069339A (application CN202010923452.2A)
Authority
CN
China
Prior art keywords
color
main
picture
target picture
colors
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010923452.2A
Other languages
Chinese (zh)
Inventor
吴培培
苏铎
丁明旭
李华夏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing ByteDance Network Technology Co Ltd
Original Assignee
Beijing ByteDance Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing ByteDance Network Technology Co Ltd filed Critical Beijing ByteDance Network Technology Co Ltd
Priority to CN202010923452.2A priority Critical patent/CN112069339A/en
Publication of CN112069339A publication Critical patent/CN112069339A/en
Pending legal-status Critical Current

Classifications

    • G06F 16/538 — Information retrieval of still image data; Querying; Presentation of query results
    • G06F 16/5838 — Retrieval characterised by using metadata automatically derived from the content, using colour
    • G06F 16/5866 — Retrieval characterised by using manually generated metadata, e.g. tags, keywords, comments, manually generated location and time information
    • G06F 18/25 — Pattern recognition; Analysing; Fusion techniques
    • G06V 10/56 — Extraction of image or video features relating to colour
    • G06V 10/751 — Image or video pattern matching; Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Library & Information Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Artificial Intelligence (AREA)
  • Multimedia (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Image Processing (AREA)

Abstract

The disclosure provides a background picture processing and search result display method, device, equipment and medium, wherein the method comprises the following steps: extracting at least two main colors from a target picture; dividing the target picture into a plurality of blocks according to the number of the extracted main colors; for each extracted main color, determining the block corresponding to the main color according to a first spatial distance between the main color and the colors of the pixels contained in each block; determining an adjustment color corresponding to each main color according to a predefined color value mapping interval; and generating a content display background picture according to each adjustment color and the block corresponding to each main color. By generating the content display background picture in this way, the background color and the target picture in the page are visually fused, which improves the display quality and display effect of the search results.

Description

Background picture processing and search result display method, device, equipment and medium
Technical Field
The invention relates to the technical field of computers, in particular to a method, a device, equipment and a medium for processing a background picture and displaying a search result.
Background
At present, many social and information APPs (applications) use image-text display pages to provide information to users. An image-text display page is generally shown together with pictures, which enriches the page content and makes the layout more varied. In some image-text display pages, summary text and an introductory picture are the main elements, and a large picture is usually paired with the summary text to attract the user to click into the page for viewing.
However, in current image-text display pages the picture is visually separated from the search result display background, so the user hardly feels immersed in the displayed content, which affects the search result display effect and degrades the user experience.
Disclosure of Invention
The embodiment of the disclosure at least provides a method, a device, equipment and a medium for processing a background picture and displaying a search result, which can visually fuse background colors and target picture colors in a page to achieve the effect of improving the texture of a page design.
In a first aspect, an embodiment of the present disclosure provides a background picture processing method, including:
extracting at least two main colors from a target picture;
dividing the target picture into a plurality of blocks according to the number of the extracted main colors;
for each extracted main color, determining a block corresponding to the main color according to a first spatial distance between the main color and each block color;
determining an adjustment color corresponding to each main color according to a predefined color value mapping interval;
and generating a content display background picture according to each adjusted color and the block corresponding to each main color.
In an optional embodiment, the extracting at least two main colors from the target picture includes:
carrying out fuzzy processing on the target picture;
counting the number of pixels corresponding to each color in the target picture after the fuzzy processing;
and extracting at least two main colors from the target picture according to the number of pixels corresponding to each color.
In an optional embodiment, extracting at least two main colors from the target picture according to the counted number of pixels corresponding to each color includes:
according to the counted number of pixels corresponding to each color, arranging the colors in a descending order to obtain a color set;
sequentially selecting a color from the color set, and determining a second spatial distance between the currently selected color and each color contained in the main color list;
and if the second spatial distance is greater than a first preset threshold value, adding the currently selected color into the main color list.
In an optional embodiment, before determining the second spatial distance between the selected color and other colors in the color set, the method further includes:
determining a third spatial distance between the selected color and the target color;
and if the third spatial distance is smaller than a second preset threshold value, filtering out the currently selected color.
In an optional implementation manner, for each extracted main color, determining a tile corresponding to the main color according to a first spatial distance between the main color and each tile color includes:
for each extracted main color, determining a first spatial distance between the main color and the color of each pixel in the block;
counting the number of pixels of which the first spatial distance from each pixel to the main color is smaller than a third preset threshold value;
and determining the block corresponding to the current main color according to the counted number of the pixels.
In an alternative embodiment, the dominant color is represented by any one of: red, green, blue (RGB); hue, saturation, value (HSV); and hue, saturation, lightness (HSL); and if the dominant color is represented in RGB, then
Determining an adjustment color corresponding to each main color according to a predefined color value mapping interval, including:
converting the primary color from an RGB space to an HSL space;
according to a predefined color value mapping interval, respectively mapping a saturation value and a brightness value in the HSL space to a first interval and a second interval to obtain a mapped HSL value;
converting the mapped HSL value into an RGB value;
and representing the adjusted color of the main color by using the RGB value obtained by conversion.
In an optional implementation manner, generating a content display background picture according to each of the adjusted colors and the block corresponding to each of the main colors includes:
generating a block picture corresponding to each main color by using a linear gradient tool according to the number of the pixels and the adjustment color corresponding to each main color;
and splicing the block pictures corresponding to the main colors based on the blocks corresponding to the main colors to generate the content display background picture.
In an optional embodiment, before extracting at least two main colors from the target picture, the method further includes:
respectively determining a saturation mean value and a gray mean value of the target picture;
when the target picture saturation mean value and the target picture gray scale mean value are smaller than a fourth preset threshold value, determining that the target picture is a gray scale picture;
when the target picture saturation mean value and/or the target picture gray scale mean value are not smaller than the fourth preset threshold value, determining that the target picture is not a gray scale picture;
in the case where it is determined that the target picture is not a grayscale picture, the step of extracting at least two dominant color features from the target picture is performed.
In an optional embodiment, before extracting at least two main colors from the target picture, the method further includes:
performing color matching pretreatment on the target picture, wherein the color matching pretreatment comprises at least one of the following steps: tone scale adjustment processing, saturation adjustment processing, color contrast processing, and blur processing.
In a second aspect, an embodiment of the present disclosure further provides a search result display method, including:
initiating a search request based on the obtained search keyword;
acquiring at least one target resource matched with the search keyword, wherein the target resource comprises a target picture;
and displaying a search result card generated according to the at least one target resource on a search result page, wherein a content display background picture of the search result card is generated based on the target picture.
In an optional implementation manner, the color of the content display background picture is a flowing gradient color, which is generated from the adjusted main colors of the target picture and gradually fades to transparent in a predetermined direction.
In a third aspect, an embodiment of the present disclosure further provides a background picture processing apparatus, including:
the extraction unit is used for extracting at least two main colors from the target picture;
the dividing unit is used for dividing the target picture into a plurality of blocks according to the number of the extracted main colors;
a first determining unit, configured to determine, for each extracted main color, a block corresponding to the main color according to a first spatial distance between the main color and each block color;
the second determining unit is used for determining the adjustment color corresponding to each main color according to a predefined color value mapping interval;
and the creating unit is used for generating a content display background picture according to each adjusting color and the block corresponding to each main color.
In an alternative embodiment, the apparatus further comprises a processing unit and a statistics unit, wherein:
the processing unit is used for carrying out fuzzy processing on the target picture;
the counting unit is used for counting the number of pixels corresponding to each color in the target picture after the fuzzy processing;
the extraction unit is used for extracting at least two main colors from the target picture according to the number of pixels corresponding to each color.
In an alternative embodiment, the apparatus further comprises a sorting unit, wherein:
the sorting unit is used for sorting the colors in a descending order according to the counted number of pixels corresponding to the colors to obtain a color set;
the extracting unit is specifically configured to select one color from the color set in sequence, and determine a second spatial distance between the currently selected color and each color included in the main color list; and if the second spatial distance is greater than a first preset threshold value, adding the currently selected color into the main color list.
In an optional embodiment, the extracting unit is further configured to determine a third spatial distance between the selected color and the target color; and if the third spatial distance is smaller than a second preset threshold value, filter out the currently selected color.
In an optional embodiment, the first determining unit is specifically configured to determine, for each extracted main color, a first spatial distance between the main color and a color of each pixel in the block; counting the number of pixels of which the first spatial distance from each pixel to the main color is smaller than a third preset threshold value; and determining the block corresponding to the current main color according to the counted number of the pixels.
In an alternative embodiment, the dominant color is represented by any one of: red, green, blue (RGB); hue, saturation, value (HSV); and hue, saturation, lightness (HSL); and if the dominant color is represented in RGB, then
The second determining unit is specifically configured to convert the main color from an RGB space to an HSL space; according to a predefined color value mapping interval, respectively mapping a saturation value and a brightness value in the HSL space to a first interval and a second interval to obtain a mapped HSL value; converting the mapped HSL value into an RGB value; and representing the adjusted color of the main color by using the RGB value obtained by conversion.
In an optional implementation manner, the creating unit is specifically configured to generate, by using a linear gradient tool, a block picture corresponding to each of the main colors according to the number of pixels and an adjustment color corresponding to each of the main colors;
and splicing the block pictures corresponding to the main colors based on the blocks corresponding to the main colors to generate the content display background picture.
In an alternative embodiment, the apparatus further comprises: the third determining unit is specifically configured to determine a saturation mean value and a gray scale mean value of the target picture respectively; when the target picture saturation mean value and the target picture gray scale mean value are smaller than a fourth preset threshold value, determining that the target picture is a gray scale picture; when the target picture saturation mean value and/or the target picture gray scale mean value are not smaller than the fourth preset threshold value, determining that the target picture is not a gray scale picture; in the case where it is determined that the target picture is not a grayscale picture, the step of extracting at least two dominant color features from the target picture is performed.
In an alternative embodiment, the apparatus further comprises:
a preprocessing unit, configured to perform color matching preprocessing on the target picture, where the color matching preprocessing includes at least one of: tone scale adjustment processing, saturation adjustment processing, color contrast processing, and blur processing.
In a fourth aspect, an embodiment of the present disclosure further provides a search result display apparatus, including:
a search unit for initiating a search request based on the acquired search keyword;
the acquisition unit is used for acquiring at least one target resource matched with the search keyword, and the target resource comprises a target picture;
and the display unit is used for displaying a search result card generated according to the at least one target resource on a search result page, wherein the content display background picture of the search result card is generated based on the target picture.
In an optional implementation manner, the color of the content display background picture is a flowing gradient color, which is generated from the adjusted main colors of the target picture and gradually fades to transparent in a predetermined direction.
In a fifth aspect, an embodiment of the present disclosure further provides a computer device, including: a processor, a memory and a bus, the memory storing machine-readable instructions executable by the processor, the processor and the memory communicating via the bus when the computer device is running, the machine-readable instructions when executed by the processor performing the steps of any one of the possible implementations of the first or second aspect.
In a sixth aspect, the disclosed embodiments further provide a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and the computer program, when executed by a processor, performs the steps in any possible implementation manner of the first aspect or the second aspect.
For the description of the effects of the background picture processing and search result displaying apparatus, device and medium, reference is made to the description of the background picture processing and search result displaying method, and details are not repeated here.
According to the background picture processing and search result display method, device, equipment and medium provided by the embodiment of the disclosure, the main color is extracted from the target picture, the target picture is divided into the plurality of blocks based on the number of the extracted main color, the block corresponding to the main color is determined according to the space distance between the main color and each divided block color, the main color is adjusted according to the predefined color mapping interval to obtain the adjusted color, and finally, the content display background picture is generated according to each adjusted color and the block corresponding to each main color.
Further, the background picture processing and search result display method, device, equipment and medium provided by the embodiment of the disclosure can also perform preprocessing before extracting the target picture, including color level adjustment processing, saturation adjustment processing, color contrast processing, blurring processing and the like, and the generated blurring texture can improve the texture of the picture design.
Further, according to the background picture processing and search result display method, device, equipment and medium provided by the embodiments of the disclosure, undesired colors in the target picture can be masked according to predefined target colors, so that dirty colors as well as black, gray and other relatively dark undesirable colors are avoided, and the processing adapts to different pictures to produce a light, clean background that matches the color atmosphere of the picture.
In order to make the aforementioned objects, features and advantages of the present disclosure more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present disclosure, the drawings required for the embodiments are briefly described below. The drawings, which are incorporated in and form a part of the specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the technical solutions of the present disclosure. It should be appreciated that the following drawings depict only certain embodiments of the disclosure and are therefore not to be considered limiting of its scope; those skilled in the art can derive additional related drawings from them without inventive effort.
Fig. 1a shows a flowchart of a background picture processing method provided by an embodiment of the present disclosure;
fig. 1b is a schematic diagram illustrating an effect of generating a content presentation background picture for a target picture by dividing and splicing blocks according to an embodiment of the present disclosure;
fig. 2a shows a flowchart of a method for removing redundant colors from a target picture according to an embodiment of the present disclosure;
fig. 2b illustrates a flowchart of another method for removing redundant colors from a target picture according to an embodiment of the present disclosure;
fig. 3a is a schematic diagram illustrating a color level adjustment process in the color matching process provided by the embodiment of the disclosure;
fig. 3b is a schematic diagram illustrating a saturation adjustment process in the color matching process provided by the embodiment of the disclosure;
FIG. 3c is a schematic diagram illustrating a color contrast process in a toning process provided by an embodiment of the present disclosure;
FIG. 3d is a schematic diagram illustrating a blurring process in a color matching process provided by an embodiment of the disclosure;
fig. 4 is a flowchart illustrating a specific method for masking undesired colors in a background picture processing method provided by an embodiment of the present disclosure;
FIG. 5 is a flow chart illustrating determining whether a target picture is a grayscale chart according to an embodiment of the disclosure;
fig. 6 shows an application flowchart of a background picture processing method provided by the embodiment of the present disclosure;
FIG. 7a is a flowchart illustrating a method for presenting search results provided by an embodiment of the present disclosure;
FIG. 7b illustrates a schematic view of a search page provided for display to a client in accordance with an embodiment of the present disclosure;
fig. 7c is a schematic diagram illustrating a possible effect of a search result page processed by applying a background picture processing method according to an embodiment of the disclosure;
fig. 8 is a schematic diagram illustrating a background picture processing apparatus provided in an embodiment of the present disclosure;
FIG. 9 is a schematic diagram of a search result presentation apparatus provided by an embodiment of the present disclosure;
fig. 10 shows a schematic diagram of a computer device provided by an embodiment of the present disclosure.
Detailed Description
First, some terms related to the embodiments of the present invention are explained to facilitate understanding by those skilled in the art.
The RGB color model (red, green, blue) is an industry color standard: a wide range of colors is obtained by varying the three color channels of red (Red), green (Green) and blue (Blue) and superimposing them on one another, and RGB stands for the colors of the red, green and blue channels.
HSV (Hue, Saturation, Value) is a color space created according to the intuitive characteristics of color, also called the hexagonal cone model (Hexcone Model). The parameters of a color in this model are: hue (Hue), saturation (Saturation) and value (Value).
HSL is a representation of points in the RGB color model in a cylindrical coordinate system. Both representations attempt to be more intuitive than the Cartesian-coordinate RGB geometry. HSL stands for hue (Hue), saturation (Saturation) and lightness (Lightness). Hue (H) is the basic attribute of a color, i.e. the commonly known color name such as red or yellow; saturation (S) is the purity of the color: the higher the value, the purer the color, while lower values shift it towards gray, and it ranges from 0 to 100%; lightness (L) also ranges from 0 to 100%.
The LAB mode is a color mode. The LAB color model makes up for the deficiencies of both the RGB and CMYK color models. It is a device-independent color model and is also a color model based on human physiological characteristics. The LAB color model consists of three components: L is the luminance (lightness), while A and B are two color channels. A ranges from dark green (low value) through gray (medium value) to bright pink-red (high value); B ranges from bright blue (low value) through gray (medium value) to yellow (high value). Colors mixed in this space therefore produce bright effects.
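The spatial distances used throughout the embodiments (the first, second and third spatial distances) are later described as LAB space distances. A minimal Python sketch of the conversion and distance computation, assuming sRGB input and a D65 white point, could look as follows; the function names are illustrative, not taken from the patent:

```python
import math

def _srgb_to_linear(c):
    # Undo the sRGB gamma curve; c is a channel value in 0-255.
    c = c / 255.0
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def rgb_to_lab(rgb):
    # sRGB -> linear RGB -> XYZ (D65 white point) -> CIELAB.
    r, g, b = (_srgb_to_linear(c) for c in rgb)
    x = (0.4124 * r + 0.3576 * g + 0.1805 * b) / 0.95047
    y = (0.2126 * r + 0.7152 * g + 0.0722 * b) / 1.00000
    z = (0.0193 * r + 0.1192 * g + 0.9505 * b) / 1.08883

    def f(t):
        return t ** (1.0 / 3.0) if t > (6 / 29) ** 3 else t / (3 * (6 / 29) ** 2) + 4 / 29

    fx, fy, fz = f(x), f(y), f(z)
    return (116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz))

def lab_distance(lab1, lab2):
    # Plain Euclidean (CIE76) distance; the text only says "LAB space
    # distance" without naming a specific metric.
    return math.dist(lab1, lab2)
```

The helpers rgb_to_lab and lab_distance are reused by the later sketches in this description.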
In order to make the objects, technical solutions and advantages of the embodiments of the present disclosure more clear, the technical solutions of the embodiments of the present disclosure will be described clearly and completely with reference to the drawings in the embodiments of the present disclosure, and it is obvious that the described embodiments are only a part of the embodiments of the present disclosure, not all of the embodiments. The components of the embodiments of the present disclosure, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present disclosure, presented in the figures, is not intended to limit the scope of the claimed disclosure, but is merely representative of selected embodiments of the disclosure. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the disclosure without making creative efforts, shall fall within the protection scope of the disclosure.
Furthermore, the terms "first," "second," and the like in the description and in the claims, and in the drawings described above, in the embodiments of the present disclosure are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It will be appreciated that the data so used may be interchanged under appropriate circumstances such that the embodiments described herein may be practiced otherwise than as specifically illustrated or described herein.
Reference herein to "a plurality or a number" means two or more. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship.
Research shows that the page displayed in the information flow mode is usually displayed in a mode of matching a large picture with abstract characters so as to attract a user to click to enter the page for viewing. However, the background color of the page is single, and the picture in the page is separated from the background, so that the visual effect is monotonous, the user hardly has immersion feeling on the display content, dirty colors are also displayed when the picture color and the background color are not matched, the display effect of the search result is influenced, and the user experience is reduced.
Based on this research, the disclosure provides a background picture processing and search result display method, device, equipment and medium, which visually fuse the background color with the picture in a page, while the generated blur texture can improve the texture of the page design. In addition, dirty colors are avoided through parameter adjustment, black, gray and darker undesirable colors are avoided, and the processing adapts to different pictures to produce a light, clean background that matches the color atmosphere of the picture.
The above-mentioned drawbacks are the results of the inventor after practical and careful study, and therefore, the discovery process of the above-mentioned problems and the solutions proposed by the present disclosure to the above-mentioned problems should be the contribution of the inventor in the process of the present disclosure.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.
To facilitate understanding of the present embodiment, first, a detailed description is given to a background image processing method disclosed in an embodiment of the present disclosure, where an execution subject of the background image processing method provided in the embodiment of the present disclosure is generally a computer device with certain computing capability, and the computer device includes, for example: a terminal device, which may be a User Equipment (UE), a mobile device, a User terminal, a cellular phone, a cordless phone, a Personal Digital Assistant (PDA), a handheld device, a computing device, a vehicle mounted device, a wearable device, or a server or other processing device. In some possible implementations, the background picture processing method may be implemented by a processor calling computer readable instructions stored in a memory.
The following describes a background picture processing method provided by the embodiment of the present disclosure by taking an execution subject as a computer device as an example.
Example one
Referring to fig. 1a, it is a flowchart of a background picture processing method provided in the embodiment of the present disclosure, the method includes steps S101 to S105, where:
s101: at least two main colors are extracted from the target picture.
According to the embodiment of the disclosure, the target picture can be a picture displayed together with text in a page. If the page contains multiple pictures, the first picture of the page can be selected as the target picture, the picture with the highest click rate can be selected as the target picture according to the click rate of each picture, the target picture can be manually specified, and the like.
Generally, either a grayscale picture or a color picture can be represented by a two-dimensional numerical matrix. Wherein the color picture may be represented by a two-dimensional matrix of RGB (red green blue) triplets. Typically, each value of the triplet is between 0-255, with 0 indicating that the corresponding primary color is not present in the pixel and 255 indicating that the corresponding primary color takes the maximum value in the pixel, in which case each pixel may be represented by three bytes.
An HSV space and an HSL space can also be used in picture processing. Representing a color picture in HSV uses three components: hue (Hue), saturation (Saturation, color purity) and value (Value, lightness). HSL is similar to HSV in that it also has three components: hue, saturation and lightness. The three color models can be converted into one another.
Based on the above description, when embodied, the main color may be represented by any one of RGB (red green blue), HSV (hue saturation brightness), and HSL (hue saturation luminance). When HSV or HSL is adopted to represent the main color, the main color of the target picture can be converted into HSV space or HSL space from RGB space, and corresponding HSV or HSL is obtained.
As can be seen from the above description, each pixel in the target picture may be represented by any one of RGB, HSV, or HSL. Based on this, in this step, the number of pixels corresponding to each color included in the target picture may be counted, and at least two main colors may be extracted from the target picture according to the number of pixels corresponding to each color.
In one embodiment, for each color, the colors may be sorted according to the corresponding number of pixels in the target picture, and N colors are sequentially selected as the main colors of the target picture, where N is an integer greater than or equal to 1.
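As a rough illustration of this step, the following Python sketch counts the pixels of each color and keeps the N most frequent ones as candidate main colors. It assumes Pillow is available and omits the blurring, color masking and de-duplication described in the later embodiments:

```python
from collections import Counter
from PIL import Image

def top_colors(path, n=4):
    # S101 sketch: count how many pixels each exact RGB colour occupies and
    # keep the n most frequent ones as candidate main colours.
    img = Image.open(path).convert("RGB")
    counts = Counter(img.getdata())          # {(r, g, b): number of pixels}
    return [color for color, _ in counts.most_common(n)]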
S102: and dividing the target picture into a plurality of blocks according to the number of the extracted main colors.
In this step, the target picture may be divided into a plurality of blocks according to the formula N/2, where N denotes the number of extracted main colors and N is an integer greater than or equal to 1. In specific implementation, the value of N may be set according to actual needs or empirical values, which is not limited in the embodiments of the present disclosure.
In one embodiment, the number of dominant colors may be the same as the number of blocks into which the target picture is divided. For example, N = 4 may be set; in the case where the number of main colors is 4, as shown in fig. 1b, the target picture may be divided into 2 × 2 blocks.
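A minimal sketch of the block division for the N = 4 / 2 × 2 case described above, again assuming Pillow, could be:

```python
from PIL import Image

def split_into_blocks(img, rows=2, cols=2):
    # S102 sketch: divide the target picture into rows x cols equally sized
    # blocks; with N = 4 main colours the example above uses a 2 x 2 grid.
    w, h = img.size
    bw, bh = w // cols, h // rows
    blocks = []
    for r in range(rows):
        for c in range(cols):
            blocks.append(img.crop((c * bw, r * bh, (c + 1) * bw, (r + 1) * bh)))
    return blocks   # blocks 1..4 in row-major order
```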
S103: and for each extracted main color, determining a block corresponding to the main color according to a first spatial distance between the main color and the color of the pixel contained in each block.
In specific implementation, aiming at each extracted main color, determining a first spatial distance between the main color and the color of each pixel in the block; counting the number of pixels of which the first spatial distance from each pixel to the main color is smaller than a third preset threshold value; and determining the block corresponding to the current main color according to the counted number of the pixels.
For example, for the N main colors extracted in step S101, calculating, pixel by pixel, an LAB space distance between the main color and a pixel included in the current block, respectively counting the number of pixels whose LAB space distance between the pixel color included in each block and the main color is smaller than a preset threshold, and taking the block with the largest number of pixels satisfying the above condition as the block corresponding to the main color.
For example, take the 4 main colors A, B, C and D extracted in step S101. As shown in fig. 1b, the target picture is divided into block 1, block 2, block 3 and block 4. For each pixel contained in block 1, the LAB space distance between the pixel and A is calculated, and the number of pixels in block 1 whose LAB space distance to A is smaller than the preset threshold is counted; in this example it is assumed that 105 pixels in block 1 are within the threshold of A. In the same way, the numbers of pixels in block 2, block 3 and block 4 whose LAB space distance to A is smaller than the preset threshold can be counted, assumed here to be 22, 312 and 86 respectively. Similarly, the numbers of pixels in each block whose LAB space distance to B, C and D is smaller than the preset threshold can be counted. The statistical results are shown in table 1:
TABLE 1
      Block 1   Block 2   Block 3   Block 4
A     105       22        312       86
B     523       110       20        5
C     45        26        85        426
D     16        458       52        236
According to table 1, it can be determined that the block corresponding to the dominant color a is block 3, the block corresponding to the dominant color B is block 1, the block corresponding to the dominant color C is block 4, and the block corresponding to the dominant color D is block 2.
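Reusing the rgb_to_lab and lab_distance helpers sketched earlier, the block assignment of step S103 could be approximated as follows; the distance threshold is an assumed value, not one given in the patent:

```python
def assign_blocks(main_colors, blocks, threshold=20.0):
    # S103 sketch: for each main colour, count per block how many pixels lie
    # within the LAB-distance threshold (the first spatial distance), then
    # keep the block with the largest count.
    assignment = {}
    for color in main_colors:
        c_lab = rgb_to_lab(color)
        counts = [sum(1 for px in block.getdata()
                      if lab_distance(c_lab, rgb_to_lab(px)) < threshold)
                  for block in blocks]
        assignment[color] = counts.index(max(counts))   # 0-based block index
    return assignment
```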
S104: and determining the adjustment color corresponding to each main color according to the predefined color value mapping interval.
In this step, the main color may be mapped to a predefined color value mapping interval according to the following formula, where the color mapping interval may be set according to empirical values: V_target = (Vmax' - Vmin') / (Vmax - Vmin) × Value + Vmin'. Wherein: Vmin' and Vmax' are the lower limit value and the upper limit value of the predefined color value mapping interval, Vmax and Vmin are the upper limit value and the lower limit value of the interval to which the main color value currently belongs, and Value is the main color value.
Generally, the target picture is usually represented by using an RGB color model, that is, taking the primary color as RGB as an example, in an embodiment, the adjustment color of the primary color may be determined according to the following procedure:
step 1, converting the main color from the RGB space to the HSL space.
And 2, mapping the saturation value and the brightness value in the HSL space to a first interval and a second interval respectively according to a predefined color mapping interval to obtain a mapped HSL value.
And 3, converting the mapped HSL value into an RGB value.
And 4, adopting the RGB value obtained by conversion to express the adjustment color of the main color.
In specific implementation, different sensory colors can be unified into the same sensory by the main color adjusting color, the color is controlled within a range required by design, the dirty color is removed, the effect picture of the background picture is displayed by the content generated by adjusting the color, and the quality of the display color of the target picture is further improved.
For example, the N extracted main colors may be converted from the RGB space to the HSL space. The saturation and lightness values after conversion usually lie in the interval 0-100, and the saturation value and the lightness value in the HSL space are mapped to the intervals 50-100 and 80-100 respectively using the above formula V_target = (Vmax' - Vmin') / (Vmax - Vmin) × Value + Vmin'.
Taking a current saturation value of 58 and a current lightness value of 65 as an example, the mapped saturation value is V_target = (100 - 50) / (100 - 0) × 58 + 50 = 79, and the mapped lightness value is V_target = (100 - 80) / (100 - 0) × 65 + 80 = 93. Finally, the mapped HSL value is converted into an RGB value, and the converted RGB value is used as the adjusted color of the main color.
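A small sketch of step S104 using Python's standard colorsys module (which works in H, L, S order and 0-1 ranges) is shown below; the mapping intervals 50-100 and 80-100 follow the example above:

```python
import colorsys

def map_value(v, vmin, vmax, vmin_t, vmax_t):
    # V_target = (Vmax' - Vmin') / (Vmax - Vmin) * Value + Vmin'
    return (vmax_t - vmin_t) / (vmax - vmin) * v + vmin_t

def adjust_color(rgb):
    # Convert the main colour RGB -> HSL, map saturation into [50, 100] and
    # lightness into [80, 100], then convert back to RGB.
    r, g, b = (c / 255.0 for c in rgb)
    h, l, s = colorsys.rgb_to_hls(r, g, b)        # colorsys uses H, L, S order
    s = map_value(s * 100, 0, 100, 50, 100) / 100
    l = map_value(l * 100, 0, 100, 80, 100) / 100
    r2, g2, b2 = colorsys.hls_to_rgb(h, l, s)
    return tuple(round(c * 255) for c in (r2, g2, b2))

# Worked example from the text: saturation 58 -> 79, lightness 65 -> 93.
assert map_value(58, 0, 100, 50, 100) == 79.0
assert map_value(65, 0, 100, 80, 100) == 93.0
```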
S105: and generating a content display background picture according to the adjusting colors and the blocks corresponding to the main colors.
In one embodiment, a block picture corresponding to each main color may be generated by using a linear gradient tool according to the number of pixels and the adjustment color corresponding to each main color; and splicing the block pictures corresponding to the main colors based on the blocks corresponding to the main colors to generate a content display background picture.
In specific implementation, for each main color, the number of qualifying pixels in the block corresponding to that main color counted in step S103 and the adjusted color determined for that main color in step S104 are used to generate a block picture with the linear-gradient API (linear gradient application program interface) of CSS (Cascading Style Sheets); specifically, the number of qualifying pixels may be used as an input parameter of the CSS linear-gradient API to generate the block picture corresponding to each main color. Based on the block corresponding to each main color, the block pictures can then be spliced according to the block identifiers to generate the content display background picture. For example, in the above example, the block corresponding to main color A is block 3, the block corresponding to main color B is block 1, the block corresponding to main color C is block 4, and the block corresponding to main color D is block 2. The 312 pixels of main color A contained in block 3 are used as an input parameter of the CSS linear-gradient API to generate block picture 3 corresponding to main color A; in the same way, block picture 1 corresponding to main color B, block picture 4 corresponding to main color C and block picture 2 corresponding to main color D can be obtained, and the content display background picture is obtained by splicing each block picture at the position of its corresponding block in the target picture.
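One possible way to produce the per-block gradients is to emit CSS linear-gradient declarations from the adjusted colors; the sketch below only builds the CSS strings, and the gradient direction and stops are assumptions rather than values taken from the patent:

```python
def block_css(adjusted_rgb, direction="to bottom"):
    # One block picture expressed as a CSS linear-gradient declaration from
    # the adjusted main colour to transparent.
    r, g, b = adjusted_rgb
    return f"background: linear-gradient({direction}, rgb({r},{g},{b}), transparent);"

def compose_background(assignment, adjusted_colors, num_blocks=4):
    # Splice the per-block declarations together in block order, following
    # the block-to-main-colour assignment produced in S103.
    cells = [""] * num_blocks
    for color, block_idx in assignment.items():
        cells[block_idx] = block_css(adjusted_colors[color])
    return cells   # one CSS background declaration per block
```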
By processing the target picture in the steps S101-S105, the background color and the target picture in the page can be visually fused, so that the visual immersion effect of the display content of the search result is increased, and the quality and the display effect of the display background of the search result are improved.
Example two
In step S101, the number of pixels of each color appearing in the target picture is counted. A target picture usually contains some colors that are very similar to one another, and such similar colors do not produce a qualitative change in visual effect. To avoid a monotonous background display caused by the extracted N main colors being too close to one another, redundant colors can be removed as follows.
In one embodiment, based on the number of pixels corresponding to each color included in the target picture counted in step S101, redundant colors may be removed according to the flow shown in fig. 2 a.
S21: and according to the counted number of pixels corresponding to each color, arranging the colors in a descending order to obtain a color set.
In this step, the colors contained in the target picture are arranged in descending order of their corresponding pixel counts, from large to small, to obtain a color set.
S22: one color is selected in turn from the set of colors, and a second spatial distance between the currently selected color and each color included in the master color list is determined.
In this step, a color is sequentially selected from the color set according to the arrangement order of the colors in the color set, and a second spatial distance between the currently selected color and each color included in the color list is calculated.
S23: and if the second spatial distance is larger than the first preset threshold value, adding the currently selected color into the main color list.
Starting from the second color in the color set, the second spatial distance between the color and each color contained in the main color list is determined and compared with the first preset threshold; if the second spatial distance is greater than the first preset threshold, the color is added to the main color list, and otherwise the color is deleted from the color set.
In one embodiment, the redundant color in the target picture can be removed according to the following process, as shown in fig. 2b, including the following steps:
s201: and according to the counted number of pixels corresponding to each color, arranging the colors in a descending order to obtain a color set.
S202: and sequentially traversing the obtained color sets.
In this step, traversal is performed in sequence according to the arrangement order of the colors included in the color set obtained in step S201.
S203: a second spatial distance between the currently traversed color and the colors contained in the master color list is determined.
In this step, a second spatial distance between the currently traversed color and the color included in the master color list is calculated. In one embodiment, the second spatial distance may be represented in terms of an LAB distance.
It should be appreciated that initially, since no color has been selected, the first-ranked color may be added directly to the list of dominant colors. In particular, LAB distances between other colors in the color set and the colors in the dominant color list (i.e., the determined dominant colors) are calculated sequentially.
S204: and judging whether a second spatial distance between the currently traversed color and any color in the main color list is larger than a first preset threshold, if so, executing the step S205, otherwise, executing the step S206.
In specific implementation, if the second spatial distance between the currently traversed color and any color already in the main color list is smaller than the first preset threshold, the currently traversed color is determined to be a redundant color; otherwise, it is determined to be a main color.
S205: and adding the currently traversed color into the main color list, and ending the process.
In this step, if it is determined that the second spatial distance between the currently traversed color and every color contained in the main color list is greater than the first preset threshold, the currently traversed color is determined to be a main color and is added to the main color list.
S206: and judging whether the number of the selected main color characteristic values reaches a preset number, if so, executing step S207, and if not, executing step S202.
S207: the flow ends.
By executing the above-described flow, N primary colors can be selected.
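The redundant-color removal flow of steps S201-S207 can be summarized in a few lines, again reusing the LAB helpers from the earlier sketch; the distance threshold and the number of main colors are assumed values:

```python
def dedup_main_colors(sorted_colors, max_n=4, min_lab_dist=15.0):
    # Walk the colour set in descending pixel-count order and keep a colour
    # only if its LAB distance to every colour already in the main-colour
    # list exceeds the threshold (the second spatial distance).
    main_colors = []
    for color in sorted_colors:
        c_lab = rgb_to_lab(color)
        if all(lab_distance(c_lab, rgb_to_lab(m)) > min_lab_dist
               for m in main_colors):      # vacuously true for the first colour
            main_colors.append(color)
        if len(main_colors) >= max_n:
            break
    return main_colors
```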
In one embodiment, in order to improve the texture of the search result display background, in a specific implementation, before counting the number of pixels corresponding to each color included in the target picture in step S101, the target picture may be blurred, so that the search result display background picture obtained based on the target picture has a frosted glass blurring effect, and the display quality and the display effect of the search result display background picture are improved.
In a specific implementation, a filtering core with a preset size may be used to perform blurring processing on the target picture by using a linear operation smooth weighted summation method on pixels in a neighborhood. The size of the filter kernel may be set according to actual needs or empirical values, which is not limited in the embodiment of the present invention.
For example, a 5 × 5 filter kernel may be predefined to blur pixels of a target picture using linear operation smooth weighted summation; counting the color types of the blurred target picture pixel by pixel, counting the number of pixels corresponding to each color, and arranging the pixels in a descending order; and determining the LAB space distance between the selected color and other colors in the color set aiming at any color selected from the color set, adding the currently selected color into the main color list if the LAB space distance between the selected color and other colors in the color set is greater than a first preset threshold, and repeating the process until the number of main colors contained in the main color list reaches a set number N.
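A simple stand-in for the blurring step, assuming Pillow, uses a box filter whose radius of 2 corresponds to a 5 × 5 neighborhood:

```python
from PIL import Image, ImageFilter

def blur_for_color_stats(path, radius=2):
    # Blur sketch: a box filter with radius 2 averages over a 5 x 5
    # neighbourhood, standing in for the "linear smooth weighted summation"
    # over a neighbourhood described above.
    return Image.open(path).convert("RGB").filter(ImageFilter.BoxBlur(radius))
```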
EXAMPLE III
In specific implementation, in order to further improve the display quality and the display effect of the background picture displayed by the search result obtained based on the target picture, the third embodiment adds a processing step of preprocessing the target picture and/or shielding an undesired color on the basis of the first embodiment and the second embodiment.
In this embodiment, before extracting at least two main colors from the target picture, the target picture may be subjected to color matching preprocessing and/or to masking an undesired color, for example, a dirty color may be avoided through parameter adjustment, black gray and a darker undesired color are avoided, and it is ensured that the processed effect of adapting to different pictures is light and clean and meets the background effect of the main color atmosphere.
In one embodiment, the toning process may include a tone scale adjustment process shown in fig. 3a, a saturation adjustment process shown in fig. 3b, a color contrast process shown in fig. 3c, and a blurring process shown in fig. 3 d.
For example, in the step of adjusting the color gradation, the color gradation output portion may be adjusted to 180-. In the saturation adjustment processing step, the saturation +70 may be set to increase the overall color purity, and the lightness value +10 may be slightly increased to balance the visual effect. In the color contrast processing step, the curve can be adjusted to slightly improve the image contrast, and finally the fuzzy processing is carried out to obtain the preprocessed target image.
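The color-matching preprocessing could be approximated with Pillow's enhancers as sketched below; the enhancement factors are rough assumptions standing in for the adjustments described above (tone-scale remap, saturation +70, lightness +10, contrast curve, blur), not values taken from the patent:

```python
from PIL import Image, ImageEnhance, ImageFilter

def tone_preprocess(img):
    # Colour-matching preprocessing sketch using Pillow enhancers.
    img = ImageEnhance.Color(img).enhance(1.7)        # saturation boost
    img = ImageEnhance.Brightness(img).enhance(1.1)   # slight lightness lift
    img = ImageEnhance.Contrast(img).enhance(1.1)     # mild contrast increase
    return img.filter(ImageFilter.GaussianBlur(4))    # final blur pass
```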
Referring to fig. 4, which is a flowchart of a specific method for masking an undesired color in a background picture processing method provided by the embodiment of the present disclosure, the method may include the following steps:
s401: a third spatial distance between the selected color and the target color is determined.
In specific implementation, a color set is obtained based on the colors obtained in steps S201 and S202 and arranged in descending order according to the number of pixels corresponding to each color, and a third spatial distance between the currently selected color and a target color is determined for any color selected from the color set, where the target color may be a custom mask color. In one embodiment, the third spatial distance may be represented in terms of an LAB distance.
S402: judging whether the third space distance is smaller than a second preset threshold value, if not, executing the step S403; if so, step S404 is performed.
S403: and keeping the currently selected color, and ending the process.
S404: the currently selected color is masked.
In a specific implementation, the currently selected color may be deleted from the color set, that is, the undesired color in the target picture is masked according to the predefined masking color. Through the above processing, the colors reserved in the color set can avoid the colors which are dirty, and avoid black and gray, darker undesirable colors and the like.
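The masking flow of steps S401-S404 can be sketched as a filter over the sorted color set, reusing the LAB helpers from the earlier sketch; the mask colors and the threshold are assumed values:

```python
def mask_undesired(sorted_colors, mask_colors, min_lab_dist=25.0):
    # S401-S404 sketch: drop any candidate colour whose LAB distance to one
    # of the predefined mask colours (e.g. near-black, dark grey) falls
    # below the threshold (the third spatial distance).
    return [color for color in sorted_colors
            if all(lab_distance(rgb_to_lab(color), rgb_to_lab(t)) >= min_lab_dist
                   for t in mask_colors)]
```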
Example four
The background picture processing method provided by the embodiment of the disclosure aims to visually fuse picture background colors and picture colors in a page, so as to achieve the effect of improving the texture of a page design product, and therefore, the processing effect of the background picture processing method on color pictures is better. In order to ensure the display quality and the display effect of the search result display background picture obtained based on the target picture, in specific implementation, whether the target picture is a gray-scale picture or not can be judged according to the flow shown in fig. 5 for the target picture included in the current display page, and in the case that the target picture is judged not to be the gray-scale picture, the method provided by the first to third embodiments and the combination thereof is adopted to perform the visual fusion processing on the target picture and the search result display background picture.
Referring to fig. 5, which is a schematic view of an implementation flow of determining whether a target picture is a grayscale map in the embodiment of the present disclosure, the method includes steps S501 to S505, where:
s501: and determining the saturation mean value of the target picture.
In one embodiment, the target picture may be converted from an RGB space to an HSV space, HSV channels are separated into an H channel, an S channel, and a V channel, an S channel value is extracted, and an S channel pixel mean value is calculated according to the S channel value to obtain a saturation mean value of the target picture.
S502: and determining the gray average value of the target picture.
In the step, separating target picture RGB channels into R channels, G channels and B channels, calculating a target picture RGB three-channel mean value, traversing each channel, calculating a Manhattan distance between each pixel contained in the target picture in the channel and the RGB three-channel mean value, and summing and averaging Manhattan distances corresponding to all pixels contained in the target picture to obtain a gray level mean value of the target picture.
S503: and judging whether the saturation mean value and the gray mean value of the target picture are smaller than a fourth preset threshold value, if so, executing step S504, and if not, executing step S505.
In this step, when the target picture saturation mean value and the target picture gray scale mean value are smaller than the preset threshold value, S504 is executed; and when the target picture saturation mean value and/or the target picture gray scale mean value are not less than the preset threshold value, sequentially executing S505.
S504: and determining that the target picture is a gray-scale picture, and ending the process.
In this step, when the target picture is determined to be a gray-scale picture, the process ends, and the visual fusion processing described in the first to third embodiments is not performed on the target picture.
S505: determining that the target picture is not a grayscale picture.
In this step, if at least one of the saturation mean value and the gray mean value of the target picture is not smaller than the fourth preset threshold value, it is determined that the target picture is not a gray-scale picture, and the step of extracting at least two main colors from the target picture, that is, step S101, is performed.
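Combining the saturation_mean and gray_mean helpers sketched above, steps S501 to S505 reduce to a single comparison; the fourth preset threshold of 20 is a placeholder, since the disclosure does not fix its value.

def is_grayscale_picture(path, fourth_threshold=20.0):
    """S501-S505: the target picture is treated as a grayscale map only when both the
    saturation mean and the gray mean fall below the (placeholder) fourth threshold."""
    return (saturation_mean(path) < fourth_threshold
            and gray_mean(path) < fourth_threshold)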
For a better understanding of the embodiment of the present disclosure, the following describes an implementation process of the embodiment of the present disclosure with reference to the flow of determining whether a picture is a grayscale map and the flow of the method for masking undesired colors, as shown in fig. 6, which includes steps S601 to S612, where:
S601: And determining a target picture contained in the current display page.
S602: and judging whether the input target picture is a gray-scale picture.
In this step, the implementation process of determining whether the input target picture is a grayscale map is the same as the flow of determining whether the target picture is a grayscale map in fig. 5, and its specific implementation may refer to steps S501 to S505, which are not described herein again. If the target picture is not a grayscale map, S604 is performed; if the target picture is a grayscale map, S603 is performed.
S603: and matching a background picture according to a preset template, and ending the process.
S604: and carrying out color matching processing on the target picture.
In this step, the specific process of the color matching processing may refer to fig. 3a, fig. 3b and fig. 3c, and is not described herein again.
S605: and counting the number of pixels corresponding to each color in the target picture.
S606: and according to the counted number of pixels corresponding to each color, arranging the colors in a descending order to obtain a color set.
S607: and masking undesired colors in the target picture from the obtained color set.
In this step, the implementation flow of masking the undesired color in the target picture is the same as the implementation flow of masking the undesired color in fig. 4, and the specific implementation manner thereof may refer to the implementation of steps S401 to S404, which is not described herein again.
S608: at least two main colors are extracted from the target picture.
In this step, the specific process of extracting at least two main colors from the target picture refers to steps S202 to S205, and is not described herein again.
S609: and dividing the target picture into a plurality of blocks according to the number of the extracted main colors.
In this step, the specific process of dividing the target picture into a plurality of blocks according to the number of the extracted main colors refers to step S102, which is not described herein again.
S610: and for each extracted main color, determining a block corresponding to the main color according to a first spatial distance between the main color and the color of each block.
In this step, for each extracted main color, the specific process of determining the block corresponding to the main color according to the first spatial distance between the main color and the color of each block refers to step S103, which is not described herein again.
S611: and determining the adjustment color corresponding to each main color according to the predefined color value mapping interval.
In this step, the specific process refers to step S104, which is not described herein again.
S612: and generating a content display background picture according to the adjusting colors and the blocks corresponding to the main colors.
In this step, referring to step S105, the specific process of generating the content display background picture according to each adjustment color and the block corresponding to each main color is not described herein again.
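For orientation only, the whole flow of steps S601 to S612 can be strung together roughly as below; every helper name here (match_preset_template_background, tone_adjust, count_pixels_per_color, split_into_blocks, assign_block, compose_background and the rest) is a hypothetical placeholder standing in for the step of the same number, not an API defined by this disclosure.

def build_content_background(target_picture, mask_color=(0, 0, 0)):
    """Rough outline of steps S601-S612; every helper called here is a hypothetical
    placeholder for the corresponding step."""
    if is_grayscale_picture(target_picture):                       # S602
        return match_preset_template_background(target_picture)    # S603
    picture = tone_adjust(target_picture)                          # S604: color matching
    counts = count_pixels_per_color(picture)                       # S605: {color: pixel count}
    color_set = sorted(counts, key=counts.get, reverse=True)       # S606: descending order
    color_set = mask_undesired_colors(color_set, mask_color)       # S607: drop undesired colors
    main_colors = extract_main_colors(color_set)                   # S608
    blocks = split_into_blocks(picture, len(main_colors))          # S609
    block_of = {c: assign_block(c, blocks) for c in main_colors}   # S610
    adjusted = [adjust_color(c) for c in main_colors]              # S611
    return compose_background(adjusted, block_of)                  # S612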
After the content display background picture is generated according to the background picture processing method provided by the embodiment of the disclosure, the content display background picture can be superimposed on the existing page background for display. It should be noted that the background picture processing method provided by the embodiment of the present disclosure may be applied to displaying information in a page in an information flow manner, in particular in an information flow with a card design: in response to a search request initiated by a user, related search results are displayed on a search result page in the form of aggregated cards, and the display background picture of the main card is generated according to the target picture in the main card.
Example five
Referring to fig. 7a, a flowchart of a search result displaying method provided in the embodiment of the present disclosure includes the following steps:
S701: And initiating a search request based on the acquired search keyword.
In specific implementation, a user submits a search request through a client application program according to his or her own requirements, and the submitted search request carries a search keyword. It should be noted that the user may submit the search request in any manner such as text, voice or pictures, which is not limited in this disclosure. Fig. 7b is a schematic view of a search page displayed by the client: taking the search keyword 'big fish begonia' input by the user as an example, the user inputs and submits the search keyword in the search box displayed in the search page, and the client initiates a search request to the server based on the search keyword submitted by the user.
S702: and acquiring at least one target resource matched with the search keyword, wherein the target resource comprises a target picture.
In specific implementation, after receiving the search request sent by the client, the server searches for matched target resources according to the search keyword 'big fish begonia' carried in the search request. The target resources matched by the server may be of multiple types, for example, video resources, picture resources, music resources and information resources related to 'big fish begonia', and the matched target resources include a target picture.
S703: and displaying a search result card generated according to at least one target resource on a search result page.
A content display background picture of the search result card is generated based on the target picture. In specific implementation, the content display background picture color may be a stream-type gradient color, the stream-type gradient color is generated according to the adjustment colors of the main colors of the target picture, and the content display background picture color gradually fades to transparent along a predetermined direction.
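As one hedged illustration of such a stream-type gradient, the Pillow sketch below fades an adjustment color to full transparency from top to bottom; the fade direction, the card size and the use of Pillow are assumptions, since the disclosure only requires a gradient toward transparency along a predetermined direction.

from PIL import Image

def gradient_to_transparent(adjust_color, width=360, height=200):
    """RGBA strip whose alpha fades from opaque at the top edge to fully transparent
    at the bottom edge, so the card background blends into the page below it."""
    r, g, b = adjust_color
    img = Image.new("RGBA", (width, height))
    for y in range(height):
        alpha = round(255 * (1 - y / max(height - 1, 1)))   # 255 at the top, 0 at the bottom
        img.paste((r, g, b, alpha), (0, y, width, y + 1))    # fill one row with the faded color
    return img

Overlaying the resulting RGBA strip on the existing page background then yields the blending effect described above.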
In specific implementation, different search result cards may be generated for different types of target resources acquired in step S702.
For example, in one embodiment, the server generates a plurality of search result cards according to the searched target resources of different types. In specific implementation, a corresponding search result card may be generated for each search result, or a plurality of search results may be aggregated to generate one search result card, which is not limited in the embodiment of the present disclosure. In the case that a search result card contains a target picture, the server may further generate the content display background picture of the search result card by adopting the background picture processing method provided by the embodiment of the disclosure according to the target picture, send the search result card containing the content display background picture to the client, and the client displays the search result card on the search result page to the user. Fig. 7c is a schematic diagram illustrating a possible effect of the search result page after being processed by the background picture processing method.
In another embodiment, after finding the matched at least one target resource according to the search keyword, the server may send the found at least one target resource to the client, and the client generates a plurality of search result cards. Under the condition that the search result card contains the target picture, the client generates the content display background picture of the search result card by adopting the background picture processing method provided by the embodiment of the disclosure according to the target picture and displays the content display background picture on the search result page.
In this embodiment, at least one target resource matched with the search keyword is acquired based on the acquired search keyword, and search results generated according to the target resources are displayed on a search result page. The target resources are matched at the server, and for the main search result contained in the search results, a content display background picture is generated according to the target picture included in that target resource and displayed in the page.
It will be understood by those skilled in the art that, in the method of the present disclosure, the order in which the steps are written does not imply a strict order of execution or any limitation on the implementation; the specific order of execution of the steps should be determined by their functions and possible inherent logic.
Based on the same inventive concept, a background picture processing apparatus corresponding to the background picture processing method is also provided in the embodiments of the present disclosure, and since the principle of the apparatus in the embodiments of the present disclosure for solving the problem is similar to the above background picture processing method in the embodiments of the present disclosure, the implementation of the apparatus may refer to the implementation of the method, and repeated details are not described again.
Example six
Referring to fig. 8, which is a schematic diagram of a background picture processing apparatus according to a sixth embodiment of the present disclosure, the apparatus includes:
an extracting unit 801, configured to extract at least two main colors from a target picture;
a dividing unit 802, configured to divide the target picture into a plurality of blocks according to the number of the extracted main colors;
a first determining unit 803, configured to determine, for each extracted main color, a block corresponding to the main color according to a first spatial distance between the main color and the color of each block;
a second determining unit 804, configured to determine, according to a predefined color value mapping interval, an adjustment color corresponding to each of the main colors;
a creating unit 805, configured to generate a content display background picture according to each adjusted color and the block corresponding to each main color.
In a possible embodiment, the apparatus further comprises a processing unit and a statistics unit, wherein:
the processing unit is used for carrying out fuzzy processing on the target picture;
the counting unit is used for counting the number of pixels corresponding to each color in the target picture after the fuzzy processing;
the extracting unit 801 is configured to extract at least two main colors from the target picture according to the number of pixels corresponding to each color.
In a possible implementation, the apparatus further comprises a sorting unit, wherein:
the sorting unit is used for sorting the colors in a descending order according to the counted number of pixels corresponding to the colors to obtain a color set;
the extracting unit 801 is specifically configured to select one color from the color set in sequence, and determine a second spatial distance between the currently selected color and each color included in the primary color list; and if the second spatial distance is greater than a first preset threshold value, adding the currently selected color into the main color list.
In a possible implementation, the extracting unit 801 is further configured to determine a third spatial distance between the currently selected color and the target color; and if the third spatial distance is smaller than a second preset threshold value, filter out the currently selected color.
In a possible implementation manner, the first determining unit 803 is specifically configured to determine, for each extracted main color, a first spatial distance between the main color and the color of each pixel in each block; count the number of pixels whose first spatial distance to the main color is smaller than a third preset threshold value; and determine the block corresponding to the current main color according to the counted number of pixels.
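The matching rule of the first determining unit 803 could be sketched as follows, again assuming a Euclidean RGB distance as the first spatial distance, numpy arrays for the block pixels, and a placeholder third preset threshold.

import numpy as np

def assign_block(main_color, blocks, third_threshold=60.0):
    """For one main color, pick the block containing the most pixels whose first
    spatial distance to that color is below the (placeholder) threshold."""
    best_index, best_count = 0, -1
    color = np.asarray(main_color, dtype=np.float64)
    for index, block in enumerate(blocks):              # block: H x W x 3 numpy array
        pixels = block.reshape(-1, 3).astype(np.float64)
        distances = np.linalg.norm(pixels - color, axis=1)
        count = int((distances < third_threshold).sum())
        if count > best_count:
            best_index, best_count = index, count
    return best_index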
In a possible embodiment, the main color is represented by any one of the following: red-green-blue RGB, hue-saturation-value HSV, or hue-saturation-lightness HSL; and if the main color is represented in RGB, then
The second determining unit 804 is specifically configured to convert the main color from an RGB space to an HSL space; according to a predefined color value mapping interval, respectively mapping a saturation value and a brightness value in the HSL space to a first interval and a second interval to obtain a mapped HSL value; converting the mapped HSL value into an RGB value; and representing the adjusted color of the main color by using the RGB value obtained by conversion.
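Python's standard colorsys module (which names the HSL space HLS) is enough to sketch the mapping performed by the second determining unit 804; the interval endpoints below are placeholders, and clamping the values into the intervals is only one reading of "mapping" (a linear rescale would be equally consistent with the text).

import colorsys

def adjust_color(rgb, s_interval=(0.25, 0.45), l_interval=(0.75, 0.90)):
    """Clamp the main color's saturation and lightness into predefined intervals
    (placeholder endpoints), then convert back to RGB as the adjustment color."""
    r, g, b = (c / 255.0 for c in rgb)
    h, l, s = colorsys.rgb_to_hls(r, g, b)            # colorsys uses the HLS ordering
    s = min(max(s, s_interval[0]), s_interval[1])     # map saturation into the first interval
    l = min(max(l, l_interval[0]), l_interval[1])     # map lightness into the second interval
    r2, g2, b2 = colorsys.hls_to_rgb(h, l, s)
    return tuple(round(c * 255) for c in (r2, g2, b2))

Raising the lightness and capping the saturation in this way tends to produce the light, low-contrast background tones described in the embodiments.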
In a possible implementation manner, the creating unit 805 is specifically configured to generate, according to the number of pixels and an adjustment color corresponding to each of the main colors, a block picture corresponding to each of the main colors by using a linear gradient tool;
and splicing the block pictures corresponding to the main colors based on the blocks corresponding to the main colors to generate the content display background picture.
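A rough Pillow sketch of this compositing step is shown below; the horizontal splicing order, the vertical gradient toward white and the block dimensions are all assumptions standing in for whatever "linear gradient tool" an implementation actually uses.

from PIL import Image

def block_picture(adjust_color, width, height, end_color=(255, 255, 255)):
    """Vertical linear gradient from the adjustment color to an end color."""
    img = Image.new("RGB", (width, height))
    for y in range(height):
        t = y / max(height - 1, 1)
        row = tuple(round(a + (b - a) * t) for a, b in zip(adjust_color, end_color))
        img.paste(row, (0, y, width, y + 1))           # fill one row with the blended color
    return img

def splice_blocks(adjust_colors, block_widths, height):
    """Place each main color's gradient picture side by side to form the background."""
    background = Image.new("RGB", (sum(block_widths), height))
    x = 0
    for color, w in zip(adjust_colors, block_widths):
        background.paste(block_picture(color, w, height), (x, 0))
        x += w
    return background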
In a possible embodiment, the apparatus further comprises: a third determining unit, specifically configured to respectively determine a saturation mean value and a gray mean value of the target picture; determine that the target picture is a grayscale picture when the target picture saturation mean value and the target picture gray mean value are both smaller than a fourth preset threshold value; determine that the target picture is not a grayscale picture when the target picture saturation mean value and/or the target picture gray mean value is not smaller than the fourth preset threshold value; and, in the case where it is determined that the target picture is not a grayscale picture, perform the step of extracting at least two main colors from the target picture.
In an alternative embodiment, the apparatus further comprises:
a preprocessing unit, configured to perform color matching preprocessing on the target picture, where the color matching preprocessing includes at least one of: tone scale adjustment processing, saturation adjustment processing, color contrast processing, and blur processing.
Example seven
The embodiment of the present disclosure further provides a search result display apparatus corresponding to the search result display method, and as the principle of the apparatus in the embodiment of the present disclosure for solving the problem is similar to the search result display method in the embodiment of the present disclosure, the implementation of the apparatus may refer to the implementation of the method, and repeated details are not repeated.
Referring to fig. 9, which is a schematic diagram of a search result display apparatus provided in a seventh embodiment of the present disclosure, the apparatus includes:
a search unit 901 configured to initiate a search request based on the acquired search keyword;
an obtaining unit 902, configured to obtain at least one target resource that matches the search keyword, where the target resource includes a target picture;
a presentation unit 903, configured to present, on a search result page, a search result card generated according to the at least one target resource, where a content presentation background picture of the search result card is generated based on the target picture.
The content display background picture color is a stream-type gradient color, the stream-type gradient color is generated according to the adjustment colors of the main colors of the target picture, and the content display background picture color gradually fades to transparent along a predetermined direction.
The description of the processing flow of each unit in the device and the interaction flow between each unit may refer to the related description in the above method embodiments, and will not be described in detail here.
Example eight
Based on the same technical concept, an embodiment of the present application further provides a computer device. Referring to fig. 10, a schematic structural diagram of the computer device provided in the embodiment of the present application includes a processor 101, a memory 102 and a bus 103. The memory 102 is used for storing execution instructions, and includes an internal memory 1021 and an external memory 1022; the internal memory 1021 is used for temporarily storing operation data in the processor 101 and data exchanged with the external memory 1022 such as a hard disk. The processor 101 exchanges data with the external memory 1022 through the internal memory 1021, and when the computer device is running, the processor 101 communicates with the memory 102 through the bus 103, so that the processor 101 executes the instructions mentioned in the above method embodiments.
The embodiment of the present disclosure further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the steps of the method for processing a background picture or displaying a search result in the foregoing method embodiment are executed. The storage medium may be a volatile or non-volatile computer-readable storage medium.
The computer program product of the background picture processing method or the search result displaying method provided by the embodiments of the present disclosure includes a computer-readable storage medium storing a program code, where instructions included in the program code may be used to execute steps of the background picture processing method or the search result displaying method described in the embodiments of the foregoing methods, and reference may be made to the embodiments of the foregoing methods specifically, and details are not repeated here.
The disclosed embodiments also provide a computer program, which when executed by a processor implements any one of the methods of the preceding embodiments. The computer program product may be embodied in hardware, software or a combination thereof. In an alternative embodiment, the computer program product is embodied in a computer storage medium, and in another alternative embodiment, the computer program product is embodied in a Software product, such as a Software Development Kit (SDK), or the like.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the system and the apparatus described above may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again. In the several embodiments provided in the present disclosure, it should be understood that the disclosed apparatus and method may be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one logical division, and there may be other divisions when actually implemented, and for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of devices or units through some communication interfaces, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present disclosure may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a non-volatile computer-readable storage medium executable by a processor. Based on such understanding, the technical solution of the present disclosure may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present disclosure. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
Finally, it should be noted that: the above-mentioned embodiments are merely specific embodiments of the present disclosure, which are used for illustrating the technical solutions of the present disclosure and not for limiting them, and the scope of the present disclosure is not limited thereto. Although the present disclosure has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that any person skilled in the art may still modify the technical solutions described in the foregoing embodiments, or readily conceive of changes, or replace some of their technical features with equivalents, within the technical scope of the present disclosure; such modifications, changes or substitutions do not depart from the spirit and scope of the embodiments of the present disclosure and shall be construed as being included therein. Therefore, the protection scope of the present disclosure shall be subject to the protection scope of the claims.

Claims (15)

1. A background picture processing method is characterized by comprising the following steps:
extracting at least two main colors from a target picture;
dividing the target picture into a plurality of blocks according to the number of the extracted main colors;
for each extracted main color, determining a block corresponding to the main color according to a first spatial distance between the main color and the colors of the pixels contained in each block;
determining an adjustment color corresponding to each main color according to a predefined color value mapping interval;
and generating a content display background picture according to each adjusted color and the block corresponding to each main color.
2. The method of claim 1, wherein extracting at least two main colors from the target picture comprises:
carrying out fuzzy processing on the target picture;
counting the number of pixels corresponding to each color in the target picture after the fuzzy processing;
and extracting at least two main colors from the target picture according to the number of pixels corresponding to each color.
3. The method according to claim 2, wherein extracting at least two main colors from the target picture according to the counted number of pixels corresponding to each color comprises:
according to the counted number of pixels corresponding to each color, arranging the colors in a descending order to obtain a color set;
sequentially selecting a color from the color set, and determining a second spatial distance between the currently selected color and each color contained in the main color list;
and if the second spatial distance is greater than a first preset threshold value, adding the currently selected color into the main color list.
4. The method of claim 3, further comprising, prior to determining the second spatial distance between the currently selected color and each color contained in the main color list:
determining a third spatial distance between the selected color and the target color;
and if the third spatial distance is smaller than a second preset threshold value, filtering out the currently selected color.
5. The method according to claim 1, wherein for each extracted main color, determining the block corresponding to the main color according to a first spatial distance between the main color and colors of pixels included in each block comprises:
for each extracted main color, determining a first spatial distance between the main color and the color of each pixel in the block;
counting the number of pixels of which the first spatial distance from each pixel to the main color is smaller than a third preset threshold value;
and determining the block corresponding to the current main color according to the counted number of the pixels.
6. The method of claim 1, wherein the main color is represented by any one of: red, green and blue RGB, hue saturation value HSV, or hue saturation lightness HSL; and if the main color is represented in RGB, then
Determining an adjustment color corresponding to each main color according to a predefined color value mapping interval, including:
converting the primary color from an RGB space to an HSL space;
according to a predefined color value mapping interval, respectively mapping a saturation value and a brightness value in the HSL space to a first interval and a second interval to obtain a mapped HSL value;
converting the mapped HSL value into an RGB value;
and representing the adjusted color of the main color by using the RGB value obtained by conversion.
7. The method of claim 5, wherein generating a content display background picture according to each of the adjusted colors and the block corresponding to each of the main colors comprises:
generating a block picture corresponding to each main color by using a linear gradient tool according to the number of the pixels and the adjustment color corresponding to each main color;
and splicing the block pictures corresponding to the main colors based on the blocks corresponding to the main colors to generate the content display background picture.
8. The method of claim 1, wherein before extracting the at least two main colors from the target picture, further comprising:
respectively determining a saturation mean value and a gray mean value of the target picture;
when the target picture saturation mean value and the target picture gray scale mean value are smaller than a fourth preset threshold value, determining that the target picture is a gray scale picture;
and when the target picture saturation mean value and/or the target picture gray scale mean value are not smaller than the fourth preset threshold value, determining that the target picture is not a gray scale picture.
9. The method according to any one of claims 1 to 8, further comprising, before extracting at least two main colors from the target picture:
performing color matching pretreatment on the target picture, wherein the color matching pretreatment comprises at least one of the following steps: tone scale adjustment processing, saturation adjustment processing, color contrast processing, and blur processing.
10. A search result display method is characterized by comprising the following steps:
initiating a search request based on the obtained search keyword;
acquiring at least one target resource matched with the search keyword, wherein the target resource comprises a target picture;
and displaying a search result card generated according to the at least one target resource on a search result page, wherein a content display background picture of the search result card is generated based on the target picture.
11. The method according to claim 10, wherein the content presentation background picture color is a stream gradient color, the stream gradient color is generated according to a main color adjustment color of the target picture, and the content presentation background picture color is gradually changed to be transparent according to a predetermined direction.
12. A background picture processing apparatus, comprising:
the extraction unit is used for extracting at least two main colors from the target picture;
the dividing unit is used for dividing the target picture into a plurality of blocks according to the number of the extracted main colors;
a first determining unit, configured to determine, for each extracted main color, a block corresponding to the main color according to a first spatial distance between the main color and each block color;
the second determining unit is used for determining the adjustment color corresponding to each main color according to a predefined color value mapping interval;
and the creating unit is used for generating a content display background picture according to each adjusting color and the block corresponding to each main color.
13. A search result presentation apparatus, comprising:
a search unit for initiating a search request based on the acquired search keyword;
the acquisition unit is used for acquiring at least one target resource matched with the search keyword, and the target resource comprises a target picture;
and the display unit is used for displaying a search result card generated according to the at least one target resource on a search result page, wherein the content display background picture of the search result card is generated based on the target picture.
14. A computer device, comprising: a processor, a memory and a bus, the memory storing machine-readable instructions executable by the processor, the processor and the memory communicating via the bus when a computer device is running, the machine-readable instructions being executed by the processor to perform the steps of a background picture processing method according to any one of claims 1 to 9, or the steps of a search result presentation method according to claim 10 or 11.
15. A computer-readable storage medium, characterized in that the computer-readable storage medium has stored thereon a computer program which, when being executed by a processor, performs the steps of a background picture processing method according to any one of claims 1 to 9, or the steps of a search result presentation method according to claim 10 or 11.
CN202010923452.2A 2020-09-04 2020-09-04 Background picture processing and search result display method, device, equipment and medium Pending CN112069339A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010923452.2A CN112069339A (en) 2020-09-04 2020-09-04 Background picture processing and search result display method, device, equipment and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010923452.2A CN112069339A (en) 2020-09-04 2020-09-04 Background picture processing and search result display method, device, equipment and medium

Publications (1)

Publication Number Publication Date
CN112069339A true CN112069339A (en) 2020-12-11

Family

ID=73665564

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010923452.2A Pending CN112069339A (en) 2020-09-04 2020-09-04 Background picture processing and search result display method, device, equipment and medium

Country Status (1)

Country Link
CN (1) CN112069339A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN304333123S (en) * 2017-10-27
CN104965631A (en) * 2015-05-26 2015-10-07 深圳市万普拉斯科技有限公司 Desktop color matching method, desktop color matching apparatus and intelligent terminal
CN106934838A (en) * 2017-02-08 2017-07-07 广州阿里巴巴文学信息技术有限公司 Picture display method, equipment and programmable device
CN109947973A (en) * 2018-09-21 2019-06-28 北京字节跳动网络技术有限公司 Background configuration method, device, equipment and the readable medium of display area
CN111242836A (en) * 2018-11-29 2020-06-05 阿里巴巴集团控股有限公司 Method, device and equipment for generating target image and advertising image

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022142222A1 (en) * 2020-12-30 2022-07-07 北京搜狗科技发展有限公司 Method and apparatus for setting application program
CN113031675A (en) * 2021-03-05 2021-06-25 中南大学 Self-adaptive control method, system and device for visible light and storage medium
CN113031675B (en) * 2021-03-05 2022-03-29 中南大学 Self-adaptive control method, system and device for visible light and storage medium
CN113592963A (en) * 2021-07-08 2021-11-02 深圳Tcl新技术有限公司 Image generation method and device, computer equipment and computer readable storage medium
CN113592963B (en) * 2021-07-08 2024-06-04 深圳Tcl新技术有限公司 Image generation method, device, computer equipment and computer readable storage medium
CN113742025A (en) * 2021-09-17 2021-12-03 北京字跳网络技术有限公司 Page generation method, device, equipment and storage medium
WO2023093721A1 (en) * 2021-11-26 2023-06-01 维沃移动通信有限公司 Resource recall method and apparatus, and network-side device
WO2023185431A1 (en) * 2022-03-29 2023-10-05 北京字跳网络技术有限公司 Card display method and apparatus, electronic device, storage medium, and program product
WO2024099284A1 (en) * 2022-11-07 2024-05-16 北京字跳网络技术有限公司 Page display method and apparatus, device, storage medium, and program product
CN116401359A (en) * 2023-06-09 2023-07-07 深圳前海环融联易信息科技服务有限公司 Document extraction method and device, medium and equipment

Similar Documents

Publication Publication Date Title
CN112069339A (en) Background picture processing and search result display method, device, equipment and medium
CN110780873B (en) Interface color adaptation method, device, computer equipment and storage medium
CN112069341A (en) Background picture generation and search result display method, device, equipment and medium
KR20200014842A (en) Image illumination methods, devices, electronic devices and storage media
JP2005202469A (en) Image processor, image processing method and program
JP2005151282A (en) Apparatus and method of image processing, and program
US11347792B2 (en) Video abstract generating method, apparatus, and storage medium
CN110990617B (en) Picture marking method, device, equipment and storage medium
CN108806638B (en) Image display method and device
CN105981360A (en) Image processing apparatus, image processing system, image processing method and recording medium
WO2023005743A1 (en) Image processing method and apparatus, computer device, storage medium, and computer program product
CN112328345A (en) Method and device for determining theme color, electronic equipment and readable storage medium
CN109064525A (en) A kind of picture format conversion method, device, equipment and storage medium
Li et al. Directive local color transfer based on dynamic look-up table
WO2016197705A1 (en) Image processing method and device
CN110020645A (en) A kind of image processing method and device, a kind of calculating equipment and storage medium
CN117112090A (en) Business page theme generation method, device, computer equipment, medium and product
CN114707013A (en) Image color matching method and device, terminal and readable storage medium
CN114494467A (en) Image color migration method and device, electronic equipment and storage medium
CN111338627B (en) Front-end webpage theme color adjustment method and device
CN113240760A (en) Image processing method and device, computer equipment and storage medium
CN115147259A (en) Image color migration method, system and computer medium
CN112069340A (en) Background picture processing and search result display method, device, equipment and medium
CN110097070B (en) Chinese painting characteristic color set acquisition method based on human visual perception
CN111369431A (en) Image processing method and device, readable medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 100041 B-0035, 2 floor, 3 building, 30 Shixing street, Shijingshan District, Beijing.

Applicant after: Douyin Vision Co.,Ltd.

Address before: 100041 B-0035, 2 floor, 3 building, 30 Shixing street, Shijingshan District, Beijing.

Applicant before: Tiktok vision (Beijing) Co.,Ltd.

Address after: 100041 B-0035, 2 floor, 3 building, 30 Shixing street, Shijingshan District, Beijing.

Applicant after: Tiktok vision (Beijing) Co.,Ltd.

Address before: 100041 B-0035, 2 floor, 3 building, 30 Shixing street, Shijingshan District, Beijing.

Applicant before: BEIJING BYTEDANCE NETWORK TECHNOLOGY Co.,Ltd.

CB02 Change of applicant information